3D textures

How do I map a 3D texture to a cube?
In 2D (a square) it's simple:
glBegin(GL_QUADS);
glTexCoord2f(…); glVertex3f(…);
…
glEnd();

What do I do in the 3D case? (I guess I should use glTexCoord3f.) But how do I map the texture onto the cube?
Many thanks,
Yossi

You still use glTexCoord2f(), since the texture is still 2-dimensional. Imagine the cube as 6 separate quads, each with texture coordinates running from (0,0) to (1,1), and you've got it.
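For one face, that looks like this (a minimal sketch, assuming a cube of side 2 centered at the origin; the other five faces repeat the pattern with their own corners):

// One face of the cube: pin the texture corners (0,0)..(1,1)
// to the quad's corners, then do the same for the other 5 faces.
glBegin(GL_QUADS);
glTexCoord2f(0.0f, 0.0f); glVertex3f(-1.0f, -1.0f, 1.0f);
glTexCoord2f(1.0f, 0.0f); glVertex3f( 1.0f, -1.0f, 1.0f);
glTexCoord2f(1.0f, 1.0f); glVertex3f( 1.0f,  1.0f, 1.0f);
glTexCoord2f(0.0f, 1.0f); glVertex3f(-1.0f,  1.0f, 1.0f);
glEnd();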

Work through NeHe's tutorials: http://nehe.gamedev.net/opengl.asp
They cover most topics, range from beginner to advanced, and explain most of the theory and practice of OpenGL.

I guess he actually means using real 3D textures…

Then for the six faces you can, for instance, do something like this:

glBegin(GL_QUADS);
// front face (z = +1): each texcoord is the vertex mapped from [-1,1] to [0,1]
glTexCoord3f(0,1,1);
glVertex3f(-1,1,1);
glTexCoord3f(1,1,1);
glVertex3f(1,1,1);
glTexCoord3f(1,0,1);
glVertex3f(1,-1,1);
glTexCoord3f(0,0,1);
glVertex3f(-1,-1,1);
glEnd();

glBegin(GL_QUADS);
// right face (x = +1)
glTexCoord3f(1,1,1);
glVertex3f(1,1,1);
glTexCoord3f(1,1,0);
glVertex3f(1,1,-1);
glTexCoord3f(1,0,0);
glVertex3f(1,-1,-1);
glTexCoord3f(1,0,1);
glVertex3f(1,-1,1);
glEnd();

… and so on …

I don't know if this code works … try it.

How should that work without specifying a 3D texture in memory? And that's not possible so far. Even if it were, it would require an immense amount of memory. Just imagine a 128x128x128 RGBA texture: that's 128³ = 2,097,152 texels at 4 bytes each, i.e. 8 MByte! Maybe the future here is some kind of calculated 3D textures (like in POV).

But the best approach at the moment is still to map the textures onto the cube yourself.

It is possible to specify 3D textures in memory; there is an extension for this. And as for the high memory consumption, there is also texture compression for 3D textures.
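Uploading a volume through that extension looks roughly like this (an untested sketch: the PFNGLTEXIMAGE3DEXTPROC typedef and tokens come from glext.h, and "texels" is a hypothetical pointer to 64x64x64x4 bytes of image data):

GLuint texID;
PFNGLTEXIMAGE3DEXTPROC glTexImage3DEXT =
    (PFNGLTEXIMAGE3DEXTPROC)wglGetProcAddress("glTexImage3DEXT");

if (glTexImage3DEXT != NULL)
{
    glGenTextures(1, &texID);
    glBindTexture(GL_TEXTURE_3D_EXT, texID);
    /* 64x64x64 RGBA volume = 1 MByte; mip level 0, no border */
    glTexImage3DEXT(GL_TEXTURE_3D_EXT, 0, GL_RGBA8, 64, 64, 64, 0,
                    GL_RGBA, GL_UNSIGNED_BYTE, texels);
    glEnable(GL_TEXTURE_3D_EXT);
}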

But you are right that it is better to assign a specific texture to each side of a cube, if you have a perfect cube.

For more complex models, 3D textures could be useful.

Lars

But glTexCoord3f and 4f have been around since before the extension (am I right?). So what's the point of them?

Originally posted by Kilam Malik:
[b]How should that work without specifying a 3D texture in memory? And that's not possible so far. Even if it were, it would require an immense amount of memory. Just imagine a 128x128x128 RGBA texture: that's 128³ = 2,097,152 texels at 4 bytes each, i.e. 8 MByte! Maybe the future here is some kind of calculated 3D textures (like in POV).

But the best approach at the moment is still to map the textures onto the cube yourself.[/b]

Of course I omitted the texture loading/binding code; I assume most of the people on this board know how to bind a texture.

The Radeon actually does 3D texturing in hardware. Yes, it takes a lot of memory, but if you have symmetrical textures you can use some of the mirror extensions to save on memory usage. There are actually a few examples of using 3D textures in the Radeon SDK:
http://www.ati.com/na/pages/resource_cen…otProduct3.html
http://www.ati.com/na/pages/resource_cen…ne3SpecMap.html
http://www.ati.com/na/pages/resource_cen…deonVolVis.html
http://www.ati.com/na/pages/resource_cen…umeTexture.html
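The mirror trick is just a wrap-mode setting; something like this, assuming a mirrored-repeat wrap extension is available (GL_ARB_texture_mirrored_repeat here; ATI also exposed GL_ATI_texture_mirror_once), so you only store one octant of a symmetric volume:

/* Store one octant and let the hardware mirror it across each axis,
   cutting memory for a symmetric volume to 1/8th. */
glBindTexture(GL_TEXTURE_3D_EXT, texID);
glTexParameteri(GL_TEXTURE_3D_EXT, GL_TEXTURE_WRAP_S, GL_MIRRORED_REPEAT_ARB);
glTexParameteri(GL_TEXTURE_3D_EXT, GL_TEXTURE_WRAP_T, GL_MIRRORED_REPEAT_ARB);
glTexParameteri(GL_TEXTURE_3D_EXT, GL_TEXTURE_WRAP_R_EXT, GL_MIRRORED_REPEAT_ARB);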

Does the GF3 support 3D textures in hardware? I've heard some Nvidia people say no, and I've heard that others have said yes.

Anyone done any cool demos of what you can do with 3D textures?

Nutty

Nutty -

I have asked this before. I got a straight "No" out of Cass about it (in this forum). It also says "software" if you look at the table at the front of nvidia's OpenGL extensions PDF.

However, during the initial announcement of the GF3, John Carmack made reference to the GF3's 3D texture-map support and said that it was better than the Radeon's because it didn't tie up two texture units (or something like that). I also saw a recent review of the Hercules 3D Prophet III that talked about its volumetric texture capability, so there is confusion on the issue.

<begin speculation>
This leads me to believe that the chip has the capability but that nvidia decided to disable it on the regular GF3s. I'm thinking they'll release a Quadro-type card with the ability re-enabled and sell it for about $400 more. This is unfortunate, since I have been wanting 3D texmap capability for a long time, and I refuse to go to ATI because of nvidia's great Linux and OpenGL support.
</end speculation>

– Zeno

Originally posted by Nutty:
[b]Does the GF3 support 3D textures in hardware? I've heard some Nvidia people say no, and I've heard that others have said yes.

Anyone done any cool demos of what you can do with 3D textures?

Nutty[/b]

Yup, get my per-pixel lighting demo here: http://hem.passagen.se/emiper/3d.html

It uses a 3D texture to create a light field, sort of. It's very easy to do dynamic lighting with 3D textures.
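The usual trick (my guess at the general approach, not necessarily exactly what the demo does) is to precompute a radial falloff into the volume and aim the texcoords at the light; lightPos and radius are hypothetical app variables here:

/* Map each vertex's offset from the light into the [0,1]^3 texture
   cube, where the precomputed radial falloff lives. */
float s = (x - lightPos[0]) / (2.0f * radius) + 0.5f;
float t = (y - lightPos[1]) / (2.0f * radius) + 0.5f;
float r = (z - lightPos[2]) / (2.0f * radius) + 0.5f;
glTexCoord3f(s, t, r);
glVertex3f(x, y, z);

Moving the light then only changes the texcoords; no texture re-upload is needed.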

GL_EXT_texture3D

This extension is NOT SUPPORTED AT ALL, but as nvidia said, the GF3 should be OpenGL 1.2, so it should be able to use texture3D (NOT the EXT version!), and it is even possible to get the pointers to the functions! Try it, Nutty, if it works on your GF3; I don't have one, and on the emulator I get no errors if I grab the pointers with wglGetProcAddress("glTexImage3D") like that (without the EXT, as I said), but I had nothing on screen…

So, Humus: write a version where you don't check for GL_EXT_texture3D and grab the function pointers without the EXT at the end… put in a lot of error checks to make sure every pointer is valid, upload it, and then Nutty and co. can try it… it "should" work… but let's see… testing is better than thinking.
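Something like this, i.e. skip the extension-string check and just ask for the core 1.2 entry point, guarding every pointer (a sketch; the typedef is hand-rolled rather than taken from a header):

/* Grab the core GL 1.2 function (no EXT suffix) and verify it. */
typedef void (APIENTRY * TEXIMAGE3DPROC)(GLenum target, GLint level,
    GLint internalformat, GLsizei width, GLsizei height, GLsizei depth,
    GLint border, GLenum format, GLenum type, const GLvoid *pixels);

TEXIMAGE3DPROC pTexImage3D =
    (TEXIMAGE3DPROC)wglGetProcAddress("glTexImage3D");
if (pTexImage3D == NULL)
{
    fprintf(stderr, "driver does not export glTexImage3D\n");
    return;   /* bail out instead of crashing on a NULL pointer */
}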

Originally posted by Kilam Malik:
But the glTexCoord3f and 4f has been there before the extension (am I right?). So whats the sense of them?

You can use them with the texture matrix. I've done some stuff before where I input an object-space (x,y,z) vector as the (s,t,r) texture coord and have the texture matrix set up to do a transformation on it, so the final tex coords end up in the (s,t) components while the (r) is discarded. As for the fourth component, I've never used it, but I think it can be used in projective texturing.
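In code that looks roughly like this (a sketch; "m" is a hypothetical 4x4 column-major matrix the app builds, and nx/ny/nz is the object-space vector):

/* Load the transform into the texture matrix... */
glMatrixMode(GL_TEXTURE);
glLoadMatrixf(m);
glMatrixMode(GL_MODELVIEW);

/* ...then feed the object-space vector in as (s,t,r); after the
   texture matrix runs, a 2D texture only uses the resulting (s,t),
   so (r) is effectively discarded. */
glTexCoord3f(nx, ny, nz);
glVertex3f(px, py, pz);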

Originally posted by Zeno:
This leads me to believe that the chip has the capability but that nvidia decided to disable it on the regular gf3’s. I am thinking they’ll release a quadro type card with the ability reenabled, and sell it for about $400 more.

Well, I have heard a rumor that Microsoft wanted certain features "reserved" for the Xbox, and since 3D textures had already been designed into the NV20, they just crippled the chip to satisfy them. Not that I'm saying I believe this, as most rumors are actually BS, but I thought this was an interesting one. I personally don't feel that nvidia would typically be the type of company to do this, yet I fully believe that Microsoft WOULD be the type of company to request it, and I think the Xbox deal was important enough to nvidia that they would have agreed.

Originally posted by davepermen:
[b]GL_EXT_texture3D

This extension is NOT SUPPORTED AT ALL, but as nvidia said, the GF3 should be OpenGL 1.2, so it should be able to use texture3D (NOT the EXT version!), and it is even possible to get the pointers to the functions! Try it, Nutty, if it works on your GF3; I don't have one, and on the emulator I get no errors if I grab the pointers with wglGetProcAddress("glTexImage3D") like that (without the EXT, as I said), but I had nothing on screen…

So, Humus: write a version where you don't check for GL_EXT_texture3D and grab the function pointers without the EXT at the end… put in a lot of error checks to make sure every pointer is valid, upload it, and then Nutty and co. can try it… it "should" work… but let's see… testing is better than thinking.[/b]

Fixed!
Tell me if it works. I guess it should work on all GF-based cards now, even though it will be in software mode.
http://hem.passagen.se/emiper/3d.html

Originally posted by LordKronos:
Well, I have heard a rumor that Microsoft wanted certain features "reserved" for the Xbox, and since 3D textures had already been designed into the NV20, they just crippled the chip to satisfy them. Not that I'm saying I believe this, as most rumors are actually BS, but I thought this was an interesting one. I personally don't feel that nvidia would typically be the type of company to do this, yet I fully believe that Microsoft WOULD be the type of company to request it, and I think the Xbox deal was important enough to nvidia that they would have agreed.

While I don't know whether there's any truth behind this theory, it sounds like there could be, at least when one considers Carmack's early comments on the GF3, where he said that "3d textures were implemented" and that, unlike on the Radeon, they were "supported in a fully orthogonal way". This suggests that he had tried them and they worked as expected. But officially nVidia says they dropped it because it took too much time, which really doesn't fit well with Carmack's comment, which would suggest it was already finished and working.

To answer one question: the third and fourth (r and q) texture coordinates are useful if you use the texture matrix. Read up on texture coordinate generation in the spec.
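For example, object-linear texgen can produce all three coordinates for you; a small sketch of the spec feature mentioned above, generating s = x, t = y, r = z so the geometry slices straight through a volume texture:

static const GLfloat sp[4] = {1, 0, 0, 0};
static const GLfloat tp[4] = {0, 1, 0, 0};
static const GLfloat rp[4] = {0, 0, 1, 0};

/* s/t/r = dot(object plane, vertex position) */
glTexGeni(GL_S, GL_TEXTURE_GEN_MODE, GL_OBJECT_LINEAR);
glTexGenfv(GL_S, GL_OBJECT_PLANE, sp);
glTexGeni(GL_T, GL_TEXTURE_GEN_MODE, GL_OBJECT_LINEAR);
glTexGenfv(GL_T, GL_OBJECT_PLANE, tp);
glTexGeni(GL_R, GL_TEXTURE_GEN_MODE, GL_OBJECT_LINEAR);
glTexGenfv(GL_R, GL_OBJECT_PLANE, rp);
glEnable(GL_TEXTURE_GEN_S);
glEnable(GL_TEXTURE_GEN_T);
glEnable(GL_TEXTURE_GEN_R);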

To answer another question: it's perfectly valid to bind a 3D texture (and yes, a high-resolution one uses lots of memory). However, drawing geometry with this texture bound only samples the texels that the geometry's surface intersects; OpenGL does NOT do volumetric rendering.

The best uses for 3D texturing so far seem to be for certain light sources, and to give you an extra degree of freedom when using dependent texture reads.

Sigh. Actually, I've just been corrected by some folks at NVidia. Apparently they do support volume textures in GeForce 3. I was just quoting what I'd been erroneously told by one of their dev rel engineers (who shall remain nameless). Wish they'd make their minds up…

Apologies for the misinformation and confusion.

“posted by some ms guy”

So it does look like the GeForce3 does support 3D textures in hardware.

Well, I'm happy to say I just got my GF3 and installed it today.

The demos are amazing, particularly the chameleon and zoltar.

Anyway, I tried your 3D texture prog, Humus: it popped up a blank (white) screen and set my desktop resolution to 1024x768 @ 60Hz. That's it, though.

I have some code that I wrote using 3D texture maps in the past. Within the next day or two I'll give it a try and let everyone know for sure whether there is hardware support for 3D tex maps (at least with my current drivers).

– Zeno

Hmm … that was bad.
One thing though: did you let the app run for more than, say, 15 seconds? Since it runs in software mode, it could take a while before anything comes up on screen; it's quite fillrate limited.