View Full Version : Radeon dynamic per-pixel lighting demo



Humus
03-24-2001, 05:08 PM
Hi guys!

I've written a small demo of volumetric textures in action. It renders dynamic per-pixel lighting with multiple lights using a volumetric texture and dot3 bumpmapping. Cool, huh? http://www.opengl.org/discussion_boards/ubb/smile.gif
Only Radeon cards will be able to run it at this time.

It's available here (http://www.geocities.com/SiliconValley/Cable/9070/3d.html) for anyone who wants to try it.

Gorg
03-24-2001, 07:51 PM
neat!

paddy
03-24-2001, 08:23 PM
This looks great, Humus !
Way to go http://www.opengl.org/discussion_boards/ubb/smile.gif

davepermen
03-25-2001, 03:58 AM
hm.. the demo looks cool (the one of your game.. i'm a non-Radeon user so i can't take a look at the other one.. would be cool if you could support both.. perhaps you can contact me and i'll do the GF2 version (and the GF3 one automatically then..))

is it in one pass or two? it should be possible in one, no?
tex0 = attenuation3d
tex1 = normalmap
tex2 = normalizedlightdircube

right?

Lars
03-25-2001, 06:42 AM
But doesn't the GF2 lack support for volumetric textures?
(OK, you could do it without them, but not as efficiently)

Lars

davepermen
03-25-2001, 07:05 AM
it's not that much of a mess... you can do it with 2 textures, one 1D and one 2D..

you need 2 passes, one for the attenuation and one for the dot product, because you have too few textures for one pass (on the GF3 you can then do it in one.. with the same settings, just tex0 = 1D and tex1 = 2D and the rest shifted up one tex unit)

gf2:

pass0:
tex0 = 1dzattenuation
tex1 = 2dxyattenuation

pass1:
tex0 = normalizedpointtolight
tex1 = normalmap


but depending on my other question about the clamping, it would be possible to do it without any texture.. meaning in one pass, combined with pass1..

Lars
03-25-2001, 09:39 AM
Just read your question.
It should be impossible to do real per-pixel lighting (I think that's what you want to do) by passing the vertex position and light position to the combiners and then computing the correct lighting values there (what you currently do at the vertex level). The combiners compute with only 9-bit precision, which is not enough for such complex computations.

But there is a way of doing the lighting in one pass with dot3, using only one texture.
I do this in my engine; it works the following way:
1. Per vertex, compute the light direction vector in local polygon coordinates (for the dot3 bumpmapping).
2. Use the x and z coords of the transformed light to generate texture coordinates for the attenuation map (just a 2D texture).
3. Compute the z-distance from the light to the vertex, using the y component of the light.
4. Encode the dot3 light vector into the primary color's RGB part.
5. Put the distance into the alpha part.

The code for this looks like this:



// precompute some values
float rangeMult = 1.0f / light->Range;
float rangeMultHalf = rangeMult * 0.5f;

// go through all vertices
for (int iV = 0; iV < Mesh->VertexCount; iV++)
{
    VertexFormatDot3 *vertex = &vfd[iV];
    D3DXVECTOR3 lightDir = lightCoord - vertex->point;

    // transform the light into the local coordinate system of the current vertex (polygon)
    Diffuse.x = -D3DXVec3Dot(&lightDir, &vertex->s);
    Diffuse.z = -D3DXVec3Dot(&lightDir, &vertex->t);
    Diffuse.y = D3DXVec3Dot(&lightDir, &vertex->sXt);

    // compute texture coordinates
    uvTexs[iV].x = 0.5f + (Diffuse.x * rangeMultHalf);
    uvTexs[iV].y = 0.5f + (Diffuse.z * rangeMultHalf);

    // compute the remaining distance
    float det = fMax(0.0f, 1.0f - (Diffuse.y * rangeMult));

    D3DXVec3Normalize(&Diffuse, &Diffuse);

    // transform it so that it can be easily expanded in the combiners
    Diffuse = (Diffuse + D3DXVECTOR3(1.0f, 1.0f, 1.0f)) * 0.5f;

    // put the light vector and the distance into the color value
    LightVector2Dword(&Mesh->colorVals[0][iV], &Diffuse, det);
}


Now you have all the per-vertex information. In the combiners you do your normal bumpmapping, and modulate it with the second texture (the attenuation map) and with the alpha value from the primary color. My combiner setup looks like this:



// the constant color holds the color of the light source
glCombinerParameterfvNV(GL_CONSTANT_COLOR0_NV, (float*)&c1);

glCombinerInputNV(GL_COMBINER0_NV, GL_RGB, GL_VARIABLE_A_NV, GL_TEXTURE0_ARB, GL_EXPAND_NORMAL_NV, GL_RGB);
glCombinerInputNV(GL_COMBINER0_NV, GL_RGB, GL_VARIABLE_B_NV, GL_PRIMARY_COLOR_NV, GL_EXPAND_NORMAL_NV, GL_RGB);
glCombinerInputNV(GL_COMBINER0_NV, GL_RGB, GL_VARIABLE_C_NV, GL_ZERO, GL_UNSIGNED_IDENTITY_NV, GL_RGB);
glCombinerInputNV(GL_COMBINER0_NV, GL_RGB, GL_VARIABLE_D_NV, GL_ZERO, GL_UNSIGNED_IDENTITY_NV, GL_RGB);
glCombinerOutputNV(GL_COMBINER0_NV, GL_RGB, GL_SPARE0_NV, GL_DISCARD_NV, GL_DISCARD_NV, GL_NONE, GL_NONE, GL_TRUE, GL_FALSE, GL_FALSE);

glCombinerInputNV(GL_COMBINER1_NV, GL_RGB, GL_VARIABLE_C_NV, GL_SPARE0_NV, GL_UNSIGNED_IDENTITY_NV, GL_RGB);
glCombinerInputNV(GL_COMBINER1_NV, GL_RGB, GL_VARIABLE_D_NV, GL_CONSTANT_COLOR0_NV, GL_UNSIGNED_IDENTITY_NV, GL_RGB);
glCombinerInputNV(GL_COMBINER1_NV, GL_RGB, GL_VARIABLE_B_NV, GL_TEXTURE1_ARB, GL_UNSIGNED_IDENTITY_NV, GL_RGB);
glCombinerInputNV(GL_COMBINER1_NV, GL_RGB, GL_VARIABLE_A_NV, GL_PRIMARY_COLOR_NV, GL_UNSIGNED_IDENTITY_NV, GL_ALPHA);
glCombinerOutputNV(GL_COMBINER1_NV, GL_RGB, GL_SPARE1_NV, GL_SPARE0_NV, GL_DISCARD_NV, GL_NONE, GL_NONE, GL_FALSE, GL_FALSE, GL_FALSE);

glFinalCombinerInputNV(GL_VARIABLE_A_NV, GL_SPARE1_NV, GL_UNSIGNED_IDENTITY_NV, GL_RGB);

glFinalCombinerInputNV(GL_VARIABLE_B_NV, GL_SPARE0_NV, GL_UNSIGNED_IDENTITY_NV, GL_RGB);


Of course the bumpmapping is not the highest quality, since the interpolated light vector isn't renormalized per pixel (no normalization cube map), but I've never had problems with artifacts yet.

Lars


[This message has been edited by Lars (edited 03-25-2001).]

davepermen
03-25-2001, 10:02 AM
just a stupid comment while reading your code snippets.. looks like you copied them straight out of your engine.. you use D3DX in your OpenGL code.. funny http://www.opengl.org/discussion_boards/ubb/smile.gif

Humus
03-25-2001, 10:09 AM
Originally posted by davepermen:
it's not that much of a mess... you can do it with 2 textures, one 1D and one 2D..

you need 2 passes, one for the attenuation and one for the dot product, because you have too few textures for one pass (on the GF3 you can then do it in one.. with the same settings, just tex0 = 1D and tex1 = 2D and the rest shifted up one tex unit)

gf2:

pass0:
tex0 = 1dzattenuation
tex1 = 2dxyattenuation

pass1:
tex0 = normalizedpointtolight
tex1 = normalmap


but depending on my other question about the clamping, it would be possible to do it without any texture.. meaning in one pass, combined with pass1..

I do it in one pass per light; I don't think you can handle multiple lights if you need more than one pass per light.

Tex0 = attenuation-adjusted normal map.
Tex1 = DOT3 bumpmap

For each polygon I have a matrix that defines a transform for the texture coords, so texture coordinate generation does all the work for me; I only need to load the predefined texture matrix for the polygon. Out of the transform I get a 3D coord. If I were to do this without 3D textures, I'd need to do all that work myself and then finally choose one 2D slice and do a texture bind for each light on each polygon; not efficient.

davepermen
03-25-2001, 10:09 AM
to make your code work i need the s and t vectors of the vertex, right? (and sXt, which is the normal..)

now how do you calculate them? i wrote a code snippet in the post Binormals And Tangents (down somewhere http://www.opengl.org/discussion_boards/ubb/smile.gif ), but i haven't had time to try it yet, so i'm interested in how you calculate them..

for the ati freaks out there.. i love your texture3D, do you see the stress we have without it? http://www.opengl.org/discussion_boards/ubb/smile.gif but anyway, i like the register combiners on the other hand, great stuff too.. why oh why doesn't the GeForce3 have texture3D.. it would be perfect (with 4 texture stages and all.. wow..)

tex0 = attenuation3d
tex1 = normalmap
tex2 = normalizedpointtolightCUBE
tex3 = diffusemap

register combiners: tex0 * tex3 * ( tex1 dot tex2 )


simple it is, simple and precise (when you use HILO for tex1, even more precise..)

Humus
03-25-2001, 10:14 AM
Originally posted by davepermen:

for the ati freaks out there.. i love your texture3D, do you see the stress we have without it? http://www.opengl.org/discussion_boards/ubb/smile.gif but anyway, i like the register combiners on the other hand, great stuff too.. why oh why doesn't the GeForce3 have texture3D.. it would be perfect (with 4 texture stages and all.. wow..)


Yeah, I was really disappointed; I'd love to have a GF3 with 3D textures ... I hope ATi can bring out some damn good hardware for the Radeon II, and some good drivers and some great developer support DAMNIT!

davepermen
03-25-2001, 10:40 AM
yeah, for the first time a REAL programmable pixel pipeline, like the vertex_program, with the same instructions (and no clamping to 0-1 or -1-1 while processing..)

would be so nice

(oh, btw, and several loops in the program, or a txl instruction, whatever.. some resampling method..)

*dreaming*

kaber0111
03-25-2001, 11:37 AM
>now how do you calculate them? i wrote a
>codesnipped in the Post Binormals And
>how you calculate em..
http://www.angelfire.com/ab3/nobody/calc_tangent.c

it compiles all by itself; just make sure to define the vertex structure in your code base, build that structure from your poly info, then plug that in..
the binormal is just the cross between the normal and tangent vectors.

btw, you need to use the alpha channel on the diffuse map or your surfaces come out looking kind of funky.

here is an example with no gloss map (alpha on the diffuse): http://www.angelfire.com/ab3/nobody/temp/glossneeded.jpg

here is a scene with the channel with the changes.. http://www.angelfire.com/ab3/nobody/lugi1.jpg

the alpha channel acts like a mask on that diffuse texture..

more info about it at http://www.angelfire.com/ab3/nobody/pplRendering.html

nothing special...
_almost_ done with new edge tracking code though.


jeeesh, i don't know why i can't stop focusing on 3D http://www.opengl.org/discussion_boards/ubb/wink.gif
the game i'm doing is 2D http://www.opengl.org/discussion_boards/ubb/smile.gif

-akbar A.




[This message has been edited by kaber0111 (edited 03-25-2001).]

kaber0111
03-25-2001, 11:45 AM
>you need 2 passes, one for the attenuation
>and one for the dot product, because you have
>too few textures for one pass (on the GF3
>you can do it in one then.. with the same
>settings, just tex0 = 1D and tex1 = 2D and

I think someone also mentioned using dependent texture reads ...
if you use those, you fragment your consumer base one more time.

and from what i heard, dependent texture reads are really not all that fast on the GeForce3.

like Doom is not using them, at least as of a few weeks ago..
not sure if it is now..

but honestly, i would urge caution in this area..
move to dot3 lighting, and use multiple passes to get the effect.
and you're going to want to code support for the extra texture units...

but the dependent texture reads...
i'd say no, at least not yet.

okay, back to working on my 2D game http://www.opengl.org/discussion_boards/ubb/biggrin.gif

-akbar A.

kaber0111
03-25-2001, 12:10 PM
>geforce3 a texture3d.. it would be perfect
>( with 4 texturestages and all.. wow.. )

use 2 textures to do attenuation.
3D textures are an expensive resource.

davepermen
03-25-2001, 12:26 PM
about the ati way

tex0 = 3d
tex0.rgb = normalizeddirectionvector
tex0.alpha = distanceattenuation

tex1 = 2d
tex1.rgb = normal
tex1.alpha = somethingelse http://www.opengl.org/discussion_boards/ubb/smile.gif

tex2 = 2d
tex2.rgb = ambienttexture
tex2.alpha = transparency

but you need register combiners to calculate this i think http://www.opengl.org/discussion_boards/ubb/smile.gif

Humus
03-25-2001, 12:29 PM
Originally posted by kaber0111:
>geforce3 a texture3d.. it would be perfect
>( with 4 texturestages and all.. wow.. )

use 2 textures to do attenuation.
3D textures are an expensive resource.



3D textures don't need to be expensive. In my demo I use a 24-bit 64x64x64 texture, which is 768 kB ... or 1 MB if it's expanded to 32-bit. I could use a 32x32x32 too; slightly more banding, but not that much ... it comes at the huge price of 96 kB.

kaber0111
03-25-2001, 12:47 PM
neat
> In my demo I use a 24-bit 64x64x64 texture,
>which is 768 kB ... or 1 MB if it's expanded
>to 32-bit. I could use a 32x32x32 too,
>slightly more banding but not that much ...
>comes at the huge price of 96 kB.

maybe you could give us screenshots with different 3D texture sizes...
so we can see the differences....

Lars
03-25-2001, 02:28 PM
To davepermen :
Look back into the Thread, my code is right above yours http://www.opengl.org/discussion_boards/ubb/smile.gif

Lars

[This message has been edited by Lars (edited 03-25-2001).]

LordKronos
03-25-2001, 03:49 PM
Originally posted by kaber0111:
use 2 textures to do attentuation.
3D textures are an expensive resource.


I say use a single 2D texture to do distance attenuation and thus free up a texture unit. I know, calculating 3D distance attenuation from only a 2D texture seems impossible, but I assure you it is quite possible. I described the technique on my site:
http://www.ronfrazier.net/apparition/research/advanced_per_pixel_lighting.html

A quick summary of the technique:
When doing bumpmapping, you need to move the light into the tangent space of the poly. Since you are already performing this calculation, you might as well get everything you can out of it. So, you have a tangent-space light vector (tx, ty, tz). Next, scale this by the light's radius to get (stx, sty, stz). You can use (stx, sty) to generate your (u, v) for the 2D radial map. Then you can use stz as a constant distance from the light to the polygon (since this is in tangent space for the poly, the stz distance will be the same across the entire poly). Just throw stz into your primary and secondary color, square it in the combiners, and you essentially have the value you would have pulled from a 1D map.

jwatte
03-25-2001, 04:37 PM
I hope ATi can bring up some damn good hardware for the Radeon II

Apparently, when ATI bought the FireGL people, they also acquired a team of l33t GL driver writers. Some say that's the reason they bought them in the first place (as FireGL uses IBM chips on their boards).

As for the expense of 3D textures, 16x16x16 should be enough, because the texturing unit will still interpolate between the values (although I think the fancier interpolation schemes are unavailable for 3D textures).

davepermen
03-26-2001, 06:34 AM
hm cool, LordKronos, i always thought you were somehow talking the same **** i'd already read on THE per-pixel lighting site for GeForce2 coders.. hm, it's YOUR site! http://www.opengl.org/discussion_boards/ubb/smile.gif that's why, got it now.. http://www.opengl.org/discussion_boards/ubb/smile.gif

have you played with vertex_program yet? i think it's a cool way to set up your texture coordinates, so that you don't have to do anything more than glVertex3f and glNormal3f ( and in case of ppl glTangent3f == glMultiTexCoord2fARB( GL_TEXTURE_2D, .. hm, where do i get the binormal in?! looks like i have to calculate it in realtime, no? or can i send more texture coordinates to the vertex_program than are supported at the end? let's see... )

hm, this ati demo somehow inspired us, didn't it? it just looks great.. and it's one pass...

let's try to get it going on my own gf2.. hope to see you soon in my own post: GeForce2 dynamic per-pixel lighting demo.. http://www.opengl.org/discussion_boards/ubb/smile.gif

Lars, it's nice to see your code, but isn't this Humus' demo? so why do YOU explain to me how Humus does it? http://www.opengl.org/discussion_boards/ubb/smile.gif anyway, nice day..

davepermen
03-26-2001, 06:51 AM
btw, the binormal, is it nXt or tXn?

Lars
03-26-2001, 08:47 AM
davepermen: No no, Humus does it on the Radeon with 3D textures, i do it on the GeForce with 2D textures http://www.opengl.org/discussion_boards/ubb/smile.gif

I think you should be careful with vertex programs, at least on GeForce cards below the 3, because they only accelerate the standard pipeline. This means if you compute your light vectors in the vertex program and also do your basic transforms there, it all runs on the CPU. If you instead compute your lighting vectors on the CPU, you still get the hardware-accelerated transform of the vertex coordinates.

Lars

davepermen
03-26-2001, 10:21 AM
yeah, i know, but i want to use the vertex_program anyway..

but somehow my tangents and binormals are wrong.. and now i'm stressed.. when i render the normals, they look right.. when i render the tangents.. they.. yeah, look somehow funny, and they don't follow the texture direction (and i think they should, no?)

kaber0111
03-26-2001, 11:17 AM
neat trick, lord necro.
i'm going to try that technique later today, probably.


btw, dude with problems calculating the tangent vector derived from the uv offsets...
i posted code.

and vertex programs...
they're for doing operations on _data_..
you're going to want to feed your tangents, binormals and normals into the vertex_program.

-akbar A.

davepermen
03-26-2001, 11:50 AM
i have to calculate and store them first, logically.. but then i want to store them per vertex (as described in some specs..)

and THEN i want to rotate my light directly in the vertex program etc..

kaber0111
03-26-2001, 01:26 PM
Originally posted by LordKronos:
Just throw stz into your primary and secondary color, square it in the combiners and you essentially have the value you would have pulled from a 1D map.


I'm just thinking out loud here, but this would mean lots of calls to your register_combiner setup functions.

because you're using the color value as your distance..
so whenever the stz value changes, you're going to have to re-issue the combiner calls to draw correctly.

and when you're drawing, this will be a problem...

am i missing something here?


just coded it up real quick trying that technique..
it's not rendering correctly just yet, but I'm sure it'll be fixed later tonight.
http://www.angelfire.com/ab3/nobody/temp/pointlight.jpg

haven't eaten yet, so i' got to go..

-akbar A.

Lars
03-26-2001, 01:56 PM
Why? The primary and secondary colors are passed to the combiners automatically and interpolated for each pixel that is drawn.
So you only need to set up the combiners once, compute all the color values, and then send the vertices into the pipeline with drawElements or something else.

Lars

kaber0111
03-26-2001, 01:59 PM
?
didn't follow what you said.


btw, think i forget to mention.
props to lord necro for writing really nicely http://www.opengl.org/discussion_boards/ubb/smile.gif

realtimerendering quality nice

-akbar A.



[This message has been edited by kaber0111 (edited 03-26-2001).]

davepermen
03-27-2001, 12:25 AM
it's simple:

the primary color is glColor3f
the secondary color you set with glSecondaryColor3fEXT

and you can set these per vertex

but the combiner setup doesn't change; it's always (for example) col0 * col1.. there is no difference between the vertices

LordKronos
03-27-2001, 01:59 PM
Originally posted by kaber0111:
props to lord necro for writing really nicely http://www.opengl.org/discussion_boards/ubb/smile.gif


Please, the name is Lord Kronos. You make it sound like I'm some sick perverted freak of nature. What are you gonna call me next? Lord Sodomy? Lord Chicken Stuffer? http://www.opengl.org/discussion_boards/ubb/smile.gif

kaber0111
03-27-2001, 09:00 PM
hehe
in my code i've got a var

int lordNecro;

if it's true, it does the method you described http://www.opengl.org/discussion_boards/ubb/wink.gif

i think i should change it now.
http://www.opengl.org/discussion_boards/ubb/smile.gif

okay, well i've got to get back to reading about Black & White http://www.opengl.org/discussion_boards/ubb/biggrin.gif
stores in my area shelve it tomorrow morning.

davepermen
03-28-2001, 02:18 AM
heh, guys, what bout this?

i have the nv20 emulator enabled, and i have this line in my code:

LOGFILE::log( "glTexImage3D is %ssupported!", wglGetProcAddress( "glTexImage3D" ) ? "" : "not " );

and what do i see in the logfile.txt ?

::LOGFILE::

by davepermen, compiled @ Mar 28 2001
GL_ARB_multitexture extension supported and initialized
GL_NV_vertex_program extension supported and initialized
GL_NV_register_combiners extension supported and initialized
glTexImage3D is supported!

heh guys, sounds like 3D textures are in there somewhere.. hm.. Humus, i'm interested in how the program would look on my gf2 http://www.opengl.org/discussion_boards/ubb/smile.gif could you compile it without the EXT in the wglGetProcAddress calls and without checking the extension? perhaps it would run, perhaps it would crash here.. but if it runs (very slowly, i suppose) it would be great... just to see it one time.. would be enough..


[This message has been edited by davepermen (edited 03-28-2001).]

davepermen
03-28-2001, 02:20 AM
hm, i don't even need to enable the emulator! cool.. looks like 1.2 is really implemented.. perhaps we'll get 3D textures on the gf3 .. *dreaaaaaaaaaaaam*

zed
03-28-2001, 03:16 AM
i shouldn't be writing this (under the influence + what not)
but isn't
LOGFILE::log( "glTexImage3D is %ssupported!", wglGetProcAddress( "glTexImage3D" ) ? "" : "not " );
gonna give you "" if it is supported?
i see in the 10.xx nvidia specs that 3D textures are supported (why they weren't before, even though it was a 1.2 driver, is for the fairies to decide). my next extension demo will have to do this; the old noise trick has been sitting around for a while waiting for 3D textures. must rush, i'm off

Michail Bespalov
03-28-2001, 03:23 AM
Originally posted by davepermen:
hm, i don't even need to enable the emulator! cool.. looks like 1.2 is really implemented.. perhaps we'll get 3D textures on the gf3 .. *dreaaaaaaaaaaaam*

1.2 is *really implemented* even on a TNT, but in SW. According to nvOpenGLspecs.pdf, 3D textures on the GF3 are in SW too.
One thing I don't understand: Carmack is talking about HW 3D textures on the GF3 in his .plan.
Any ideas?

Gorg
03-28-2001, 05:06 AM
About 3D tex on the GF3: looks like they removed it at the last minute for some reason.

I don't know if you were here a while ago, but someone got his hands on a draft version of the latest specs and it said HW for 3D textures. In the official specs, it says SW.

Since I am a big X-Files fan I have a crazy theory that they removed it because they have a secret deal with Microsoft that says the chips released before the XBox must not be better than the one in the XBox (which will probably support 3D tex) http://www.opengl.org/discussion_boards/ubb/smile.gif and please, don't start believing this is true! http://www.opengl.org/discussion_boards/ubb/smile.gif

Humus
03-28-2001, 05:38 AM
Originally posted by davepermen:
heh, guys, what bout this?

i have the nv20 emulator enabled, and i have this line in my code:

LOGFILE::log( "glTexImage3D is %ssupported!", wglGetProcAddress( "glTexImage3D" ) ? "" : "not " );

and what do i see in the logfile.txt ?

::LOGFILE::

by davepermen, compiled @ Mar 28 2001
GL_ARB_multitexture extension supported and initialized
GL_NV_vertex_program extension supported and initialized
GL_NV_register_combiners extension supported and initialized
glTexImage3D is supported!

heh guys, sounds like 3D textures are in there somewhere.. hm.. Humus, i'm interested in how the program would look on my gf2 http://www.opengl.org/discussion_boards/ubb/smile.gif could you compile it without the EXT in the wglGetProcAddress calls and without checking the extension? perhaps it would run, perhaps it would crash here.. but if it runs (very slowly, i suppose) it would be great... just to see it one time.. would be enough..


Drop me a mail and I can send you the sources. I've changed it slightly, but not much ... just some optimizations and brighter lights.

davepermen
03-28-2001, 08:47 AM
hm.. with the current drivers i can use glTexImage3D.. but glEnable( GL_TEXTURE_3D ) just has the effect that i don't have ANY textures left.. i converted ati code from the "slice" demo.. but it does not work.. i just have the plane to slice around.. nothing else..

boring.. HEY NVIDIA, PUT THE EMULATOR IN!

would be funny http://www.opengl.org/discussion_boards/ubb/smile.gif

Humus
03-28-2001, 11:30 AM
I think I read that nVidia's emulated 3D textures cannot be larger than 64x64x64 ... while the ATi demo is using 128x128x128.

davepermen
03-28-2001, 11:34 AM
hm.. i'll take another look..

davepermen
03-28-2001, 11:54 AM
i don't get it http://www.opengl.org/discussion_boards/ubb/frown.gif i don't even get 2D textures in the demo http://www.opengl.org/discussion_boards/ubb/smile.gif.. anyway.. i'll wait, filled with hope, for the gf3 to support tex3d.. (*dreaming*)..