
View Full Version : 3D textures

yossi
05-03-2001, 03:40 AM
How do I map a 3D texture to a cube?
In 2D (a square) it's simple:
glBegin(GL_QUADS);
glTexCoord2f(....); glVertex3f(......);
...
glEnd();

What do I do in the 3D case? (I guess I should use glTexCoord3f.) But how do I map the texture to the cube?
Many thanks,
Yossi

Dodger
05-03-2001, 03:57 AM
You still use glTexCoord2f(), since the texture is still 2-dimensional. Imagine the cube as 6 single quads, each having texture coordinates of 0,0 to 1,1 and you've got it.

Work through nehe's tutorials http://nehe.gamedev.net/opengl.asp
they cover most topics and range from beginner to advanced, explaining most theoretical and practical stuff about OpenGL.

Humus
05-03-2001, 08:24 AM
I guess he actually means using real 3d textures ...

Then for the six faces you can, for instance, do something like this:

glBegin(GL_QUADS);
glTexCoord3f(0,1,1);
glVertex3f(-1,1,1);
glTexCoord3f(1,1,1);
glVertex3f(1,1,1);
glTexCoord3f(1,0,1);
glVertex3f(1,-1,1);
glTexCoord3f(0,0,1);
glVertex3f(-1,-1,1);
glEnd();

glBegin(GL_QUADS);
glTexCoord3f(1,1,1);
glVertex3f(1,1,1);
glTexCoord3f(1,1,0);
glVertex3f(1,1,-1);
glTexCoord3f(1,0,0);
glVertex3f(1,-1,-1);
glTexCoord3f(1,0,1);
glVertex3f(1,-1,1);
glEnd();

... and so on ...

I don't know if this code works ... try it :P

Kilam Malik
05-04-2001, 01:02 AM
How should that work without specifying a 3D texture in memory? And that's not possible so far. Even if it were, it would require an immense amount of memory. Just imagine a 128x128x128 RGBA texture: that would be 8 MByte! Maybe the future is some kind of procedurally calculated 3D textures (like in POV).

But the best approach at the moment is still to map the textures onto the cube yourself.

Lars
05-04-2001, 01:27 AM
It is possible to specify 3D textures in memory; there is an extension for this. And as for the high memory consumption, there is also texture compression for 3D textures.

But you are right that it is better to assign a specific texture to each side of a cube, if you have a perfect cube.

For more complex models, 3D textures could be useful.

Lars

Kilam Malik
05-04-2001, 04:51 AM
But glTexCoord3f and 4f have been there since before the extension (am I right?). So what's the point of them?

Humus
05-04-2001, 05:30 AM
Originally posted by Kilam Malik:
How should that work without specifying a 3D texture in memory? And that's not possible so far. Even if it were, it would require an immense amount of memory. Just imagine a 128x128x128 RGBA texture: that would be 8 MByte! Maybe the future is some kind of procedurally calculated 3D textures (like in POV).

But the best approach at the moment is still to map the textures onto the cube yourself.

Of course I omitted the texture loading/binding code; I assume that most of the people on this board know how to bind a texture.

daveg
05-04-2001, 06:34 AM
The Radeon actually does 3D texturing in hardware. Yes, it takes a lot of memory, but if you have symmetrical textures you can use some of the mirror extensions to save on memory usage. There are actually a few examples of using 3D textures in the Radeon SDK.

Nutty
05-04-2001, 07:25 AM
Does the GF3 support 3D textures in hardware? I've heard some Nvidia people say no, and I've heard that some Nvidia people have said yes.

Anyone done any cool demos of what you can do with 3D textures? :)

Nutty

Zeno
05-04-2001, 10:02 AM
Nutty -

I have asked this before. I got a straight "No" out of Cass about it (in this forum). It also says "software" if you look at the table in the front of nvidia's opengl extensions pdf.

However, during the initial announcement of the GF3, John Carmack made reference to the 3D texture-map support of the GF3 and said that it was better than the Radeon's because it didn't tie up 2 texture units (or something like that). I also saw a recent review of the Hercules 3D Prophet III that talked about the volumetric texture capability, so there is confusion on the issue.

<begin speculation>
This leads me to believe that the chip has the capability but that nvidia decided to disable it on the regular GF3s. I am thinking they'll release a Quadro-type card with the ability re-enabled, and sell it for about $400 more. This is unfortunate, since I have been wanting 3D texmap capability for a long time, and I refuse to go to ATI because of nvidia's great Linux and OpenGL support.
</end speculation>

-- Zeno

Humus
05-04-2001, 10:14 AM
Originally posted by Nutty:
Does the GF3 support 3D textures in hardware? I've heard some Nvidia people say no, and I've heard that some Nvidia people have said yes.

Anyone done any cool demos of what you can do with 3D textures? :)

Nutty

Yup, get my per pixel lighting demo here: http://hem.passagen.se/emiper/3d.html

It uses a 3d texture to create a lightfield, sort of. It's very easy to do dynamic lighting with 3d textures.

davepermen
05-04-2001, 11:13 AM
GL_EXT_texture3D

This extension is NOT SUPPORTED AT ALL, but as nvidia said, the GF3 should be OpenGL 1.2, so it should be able to use 3D textures through the core entry points (NOT the EXT ones!), and it is even possible to get pointers to the functions! Try it, Nutty, if it works on your GF3; I don't have one. On the emulator I get no errors if I grab the pointers with wglGetProcAddress( "glTexImage3D" ) like that (without the EXT, as I said), but I got nothing on screen.

So, Humus: write a version where you don't check for GL_EXT_texture3D and grab the function pointers without the EXT at the end. Put in a lot of error checks to make sure every pointer is valid, upload it, and then Nutty and co. can try it. It "should" work, but let's see; testing is better than thinking :)

LordKronos
05-04-2001, 11:18 AM
Originally posted by Kilam Malik:
But glTexCoord3f and 4f have been there since before the extension (am I right?). So what's the point of them?

You can use them with the texture matrix. I've done some stuff before where I input an object-space (x,y,z) vector as the (s,t,r) texture coordinate and have the texture matrix set up to do a transformation on it; the final texture coordinates end up in the (s,t) components, while the (r) is discarded. As for the fourth component, I've never used it, but I think it can be used for projective texturing.

LordKronos
05-04-2001, 11:33 AM
Originally posted by Zeno:
This leads me to believe that the chip has the capability but that nvidia decided to disable it on the regular GF3s. I am thinking they'll release a Quadro-type card with the ability re-enabled, and sell it for about $400 more.

Well, I have heard a rumor that Microsoft wanted certain features "reserved" for the Xbox, and since 3D textures had already been designed into the NV20, they just crippled the chip to satisfy them. Not that I'm saying I believe this, as most rumors are actually BS, but I thought this was an interesting one. I personally don't feel that nvidia would typically be the type of company to do this, yet I fully believe that Microsoft WOULD be the type of company to request it, and I think the Xbox deal was important enough to nvidia that they would have agreed.

Humus
05-04-2001, 01:33 PM
Originally posted by davepermen:
GL_EXT_texture3D

This extension is NOT SUPPORTED AT ALL, but as nvidia said, the GF3 should be OpenGL 1.2, so it should be able to use 3D textures through the core entry points (NOT the EXT ones!), and it is even possible to get pointers to the functions! Try it, Nutty, if it works on your GF3; I don't have one. On the emulator I get no errors if I grab the pointers with wglGetProcAddress( "glTexImage3D" ) like that (without the EXT, as I said), but I got nothing on screen.

So, Humus: write a version where you don't check for GL_EXT_texture3D and grab the function pointers without the EXT at the end. Put in a lot of error checks to make sure every pointer is valid, upload it, and then Nutty and co. can try it. It "should" work, but let's see; testing is better than thinking :)

Fixed!
Tell me if it works. I guess it should work on all GF-based cards now, even though it will be in software mode.
http://hem.passagen.se/emiper/3d.html

Humus
05-04-2001, 01:39 PM
Originally posted by LordKronos:
Well, I have heard a rumor that Microsoft wanted certain features "reserved" for the Xbox, and since 3D textures had already been designed into the NV20, they just crippled the chip to satisfy them. Not that I'm saying I believe this, as most rumors are actually BS, but I thought this was an interesting one. I personally don't feel that nvidia would typically be the type of company to do this, yet I fully believe that Microsoft WOULD be the type of company to request it, and I think the Xbox deal was important enough to nvidia that they would have agreed.

While I don't know whether there's any truth behind this theory, it sounds like there could be, at least when one considers Carmack's early comments on the GF3, where he said that "3d textures were implemented" and that, unlike on the Radeon, they were "supported in a fully orthogonal way". This suggests that he had tried them and they worked as expected. But officially nVidia says they dropped it because it took too much time, which really doesn't fit with Carmack's comment, which would suggest it was already finished and working.

jwatte
05-04-2001, 06:08 PM
To answer one question: the z and w of the texture coordinates are useful if you use the texture matrix. Read up on texture coordinate generation in the spec.

To answer another question: it's perfectly valid to bind a 3D texture (and yes, a high-resolution one uses lots of memory). However, drawing geometry with this texture bound only draws the texels that the rendered polygons intersect -- OpenGL does *NOT* do volumetric rendering.

The best uses for 3D texturing so far seem to be for certain light sources, and to give you an extra degree of freedom when using dependent texture reads.

zed
05-04-2001, 06:55 PM
Sigh. Actually, I've just been corrected by some folks at NVidia.
Apparently they *do* support volume textures in GeForce 3. I was just
quoting what I'd been erroneously told by one of their dev rel engineers
(who shall remain nameless). Wish they'd make their minds up...

Apologies for the misinformation and confusion.

"posted by some ms guy"

So it does look like the GeForce3 does support 3D textures in hardware.

Zeno
05-04-2001, 11:31 PM
Well, I'm happy to say I just got my GF3 and installed it today :)

The demos are amazing, particularly the chameleon and zoltar.

Anyway, I tried your 3D texture prog, Humus - it popped up a blank (white) screen and set my desktop resolution to 1024x768 @ 60Hz. That's it, though :(

I have some code that I made using 3d texturemaps in the past. Within the next day or two I'll give it a try and let everyone know for sure whether there is hardware support for 3d tex maps (at least with my current drivers).

-- Zeno

Humus
05-05-2001, 04:54 AM
Hmm ... that's bad :(
One thing, though: did you let the app run for more than, say, 15 seconds? Since it runs in software mode, I guess it could take a while before anything comes up on screen, as it's quite fill-rate limited.

DaViper
05-07-2001, 12:09 AM
Hmm, looking at the NVIDIA extensions spec, it says that 3D textures are only supported in software...

Chris

Zeno
05-07-2001, 12:56 AM
Well, the first test is in. I tried my 3D texture-mapping program under Linux with the 0.9-769 drivers, and it's definitely software rendering. However, the drivers seem to lack a lot of GF3 features (although they do identify the card correctly at X startup). For instance, I didn't see the vertex program extension when I checked the extensions string. I hope drivers with that feature come out soon :)

Tomorrow I will port the program to Windows, where the (leaked) drivers are more advanced, and let you know whether I get hardware or software rendering.

-- Zeno

moichi
05-23-2001, 08:55 AM
A new version of nvOpenGLspecs.pdf is available, and the extension support table has been updated to say "EXT_texture_3D is supported in the NV2x family (not software)".
I don't know what this change means.
http://www.nvidia.com/marketing/Developer/DevRel.nsf/pages/A86B9D846E815D628825681E007AA680

Humus
05-23-2001, 05:07 PM
So, it is supported after all?

Zeno
05-23-2001, 09:04 PM
Ooh :) It is indeed checked off as hardware accelerated in this new pdf.

It also says "The extension support columns are based on the latest & greatest Nvidia driver release". Can anyone say for sure whether this works under 12.4? I don't remember if I tried my 3D texmap program after I installed that version, but I had to bump back down to 12.2 due to some crashes.

-- Zeno

ffish
05-24-2001, 03:12 AM
Any comments on this new pdf, if you're allowed to say, Matt or Cass? I'm itching to go down to the shop and order my new GeForce3! :) I just want to hear confirmation from the horse's mouth. I promise it won't go further than this discussion board :) I just hope the pdf is correct.

[This message has been edited by ffish (edited 05-24-2001).]

ffish
05-24-2001, 03:26 AM
Just read a little more of the specs. The imaging subset is hardware accelerated too!? I thought you said a little while ago that that was way too expensive for consumer hardware, Matt? Whoa, now I _really_ have to get the GeForce3. I _need_ convolution filters and 3D textures in hardware for my volume rendering thesis. I didn't think I'd get them both this year! <fingers crossed/>

05-24-2001, 02:41 PM
You're misreading the meaning of the ARB_imaging row. We do accelerate blend color, blend subtract, and blend minmax, which are part of ARB_imaging. However, that in no way suggests that we accelerate some of the fancier things.

Further clarifications relating to 3D textures should be forthcoming. I'm confused myself by all the different contradictory statements we've put out.

- Matt

ffish
05-24-2001, 03:57 PM
Thanks for the clarification re the imaging subset, Matt. I guess I'll have to wait for convolution filters in hardware :(. The confusion about 3D textures implies to me that they can be accelerated in hardware; I just hope you're allowed to enable it in a (current?) future driver release. As I've said before, there'll be at least a couple of us here who will buy a GeForce3 on the basis of accelerated 3D textures alone - otherwise I'll wait until the next generation of cards to upgrade my GeForce2 GTS.

cass
05-24-2001, 06:16 PM
I apologize that the story on 3D textures has been ambiguous and frustrating to all. My suggestion would be to not waste time speculating. We will make a definitive statement soon, but exactly when is someone else's call.

Cass

djmk-ultra
06-08-2001, 01:29 PM
Originally posted by ffish:
I _need_ convolution filters and 3D textures in hardware for my volume rendering thesis. <fingers crossed/>

I can only assume you have read the Rezk-Salama/Engel paper on volume rendering on PC hardware. They have a nice hack for getting 3D interpolation out of 2D textures for volume rendering; I can think of a few hacks to make this work for general geometry, an exercise left to the reader. As for convolution, I have my fingers crossed as well.
--joey

ffish
06-08-2001, 06:33 PM
Thanks, joey. Yeah, I've read it. I just like the ease of implementation of 3D textures. If they ever become as well supported and optimized as 2D textures, volume rendering will be so much easier, along with a lot of other useful effects.

davepermen
06-09-2001, 02:06 AM
Humus, your 3D texture demo is running smoothly on my GF2 MX!

Smoothly at about 4 fpm - frames per minute.

On a Pentium 3 500... not too bad ;) For what do I need an ATI Radeon, if my GF can do it in software? ;) ;)

davepermen
06-09-2001, 02:08 AM
Oh, by the way, could you please add the possibility to set the resolution of the demo? I would like to try 320x240 or something like that; perhaps it will be a little faster then.

jwatte
06-09-2001, 08:46 AM
Let's see, it's been several weeks since "an answer to the 3D texture question will be forthcoming."

Bump.

PS: Convolution in hardware probably ain't gonna happen in the GF3 series -- and I'd rather see an accumulation buffer than convolution in the GF4 :-)

djmk-ultra
06-13-2001, 07:13 AM
Originally posted by jwatte:
Let's see, it's been several weeks since "an answer to the 3D texture question will be forthcoming."
Bump.
PS: Convolution in hardware probably ain't gonna happen in the GF3 series -- and I'd rather see an accumulation buffer than convolution in the GF4 :-)

Hi, this is from a D3D discussion forum:

"... The GeForce3 support for DirectX will include true volume textures run
in hardware, on the GPU without intervention from the CPU. There will be no
hackery involved, the hardware natively supports volume textures and that
feature will be exposed in a forthcoming driver..."
"..."We intend to expose full and robust hardware support for volume textures,
including compressed volume textures, on GeForce3 in a future driver."

You can quote me on that.

Thanks,

Richard "7 of 5" Huddy
Developer Relations, NVIDIA Corporation."

OK, so that doesn't say anything about OpenGL, but it's close. Cass, any comment? I have heard that the delay is due to a marketing agreement with a, oh let's say, former graphics giant.

luv
joey

Zeno
06-13-2001, 03:16 PM
Mmmmm...juicy info.

I can't wait :)

-- Zeno

cass
06-14-2001, 02:10 PM
All the information from Richard is accurate. GeForce3 3D texture support will be turned on in a future driver, and that future is Soon.

Thanks -
Cass