Using 3D Textures

Hi,

I want to use 3D textures. I have an nVidia GeForce2 GTS 64 MB card. It seems to be pretty nice. I'm expecting there must be some way (even if it is relatively slow) to do 3D textures. Could someone please help me here? Do I need an extension? (WinNT comes with 1.1, right?) What if I can't find the extension? Arggh!

Thank you
Justin Voshell

(Please don’t tell me I have to buy a Radeon… we just bought two of these cards and they are $$$.)

Yes, you need the following extension:

GL_EXT_texture3D

You can find info on extensions at:
http://oss.sgi.com/projects/ogl-sample/registry/

And the good news is that your card supports them very well!

I don’t have any links to 3D texture tutorials, but if you know how to use 2D textures, it shouldn’t be too difficult for you to use 3D ones!

Regards.

Eric

Oooooooopppppppsssssssss!

Big mistake on my part…
I have just checked, and it seems that only the ATI Radeon can use 3D textures at the moment (at the consumer level…).

I am sorry I said the GTS could…

Eric

OK, let’s make things clear (funnily enough, there is actually another thread going on right now about 3D textures! I am only reporting what they are saying there in case you missed it…).

So, as long as your drivers are OpenGL 1.2 compliant (which is the case for nVidia’s drivers), you can use 3D textures (that’s good news, isn’t it?!).

BUT the only consumer-level card that will actually handle these textures in hardware is the ATI Radeon…

Which means that even your GTS will crawl when using 3D textures (they will be done in software)…

But you can at least try them!

OK, I suppose this closes the discussion!

Regards.

Eric

P.S.: I should have pointed you to that other thread; it would have been quicker!

OK, thanks for sorting this out for me. I am curious how I go about using OpenGL 1.2 functions when all I have is Microsoft’s OpenGL 1.1 opengl32.lib. Do I treat everything as an extension and get pointers to the functions, or do I just call them and link against some nVidia .lib?

Thanks
Justin Voshell

Yes, just treat all OpenGL 1.2 entry points as though they are extension entry points. You’ll need to do a wglGetProcAddress on glTexImage3D in this case. You’ll also need a glext.h that has the appropriate enumerants.
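
In practice that boils down to something like this (just a rough sketch — the pointer variable, the upload helper and the texture parameters here are only illustrative, and error checking is omitted):

#include <windows.h>
#include <GL/gl.h>
#include "glext.h"

// Function pointer for the OpenGL 1.2 entry point, fetched at runtime.
PFNGLTEXIMAGE3DPROC glTexImage3D = NULL;

void InitTexture3D(void)
{
    // Needs a current rendering context.
    glTexImage3D = (PFNGLTEXIMAGE3DPROC) wglGetProcAddress("glTexImage3D");
}

void UploadVolume(const GLubyte *texels, GLsizei w, GLsizei h, GLsizei d)
{
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_3D, tex);
    glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage3D(GL_TEXTURE_3D, 0, GL_RGBA8, w, h, d, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, texels);
}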

3D textures definitely do work on our drivers, but don’t expect anything much in the way of performance.

  • Matt

As I was fed up with that kind of problem, I wrote a class that wraps an OpenGL rendering context and its associated extensions.

When you call the Initialize member function, it loads all the available extensions. Then, when you use the MakeCurrent function, it copies the RC function pointers to global variables that are accessible from anywhere in the program.

Of course, there is a performance hit (because I copy all the function pointers each time I do a MakeCurrent), but then I can use extensions as if they were normal GL calls…
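
Stripped down, the idea looks something like this (an illustrative sketch, not my actual class — only one entry point is shown):

#include <windows.h>
#include <GL/gl.h>
#include "glext.h"

// Globals the rest of the program calls as if they were ordinary GL functions.
PFNGLTEXIMAGE3DPROC glTexImage3D = NULL;

class CGLContext
{
public:
    bool Initialize(HDC hDC)
    {
        m_hDC = hDC;
        m_hRC = wglCreateContext(hDC);
        if (m_hRC == NULL)
            return false;
        wglMakeCurrent(m_hDC, m_hRC);
        // Load the entry points that belong to this rendering context.
        m_glTexImage3D = (PFNGLTEXIMAGE3DPROC) wglGetProcAddress("glTexImage3D");
        return true;
    }

    void MakeCurrent()
    {
        wglMakeCurrent(m_hDC, m_hRC);
        // Copy this RC's function pointers into the globals.
        glTexImage3D = m_glTexImage3D;
    }

private:
    HDC   m_hDC;
    HGLRC m_hRC;
    PFNGLTEXIMAGE3DPROC m_glTexImage3D;
};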

You should probably do something similar!

But do not forget: you still have to test whether an extension is available before using it! For example, I use the GL_EXT_separate_specular_color extension for rendering specular highlights on top of textures in one pass. The thing is, when I render to a bitmap (for saving to a file), I use Microsoft’s GDI Generic OpenGL implementation, which does not have this extension!

So somewhere in my code I have:

if (bGL_EXT_separate_specular_color)
    DoItInOnePass();
else
    DoItInTwoPassesAndTellThisGuyToBuyAGraphicsCard();
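
The bGL_EXT_separate_specular_color flag itself just comes from a one-time check of the GL_EXTENSIONS string after the RC is made current, something along these lines (the helper name is only illustrative):

#include <string.h>
#include <GL/gl.h>

// Returns true if 'name' appears as a whole, space-delimited token in the
// GL_EXTENSIONS string. Needs a current rendering context.
bool HasExtension(const char *name)
{
    const char *ext = (const char *) glGetString(GL_EXTENSIONS);
    size_t len = strlen(name);
    while (ext != NULL && *ext != '\0')
    {
        const char *found = strstr(ext, name);
        if (found == NULL)
            return false;
        // Reject partial matches (e.g. GL_EXT_texture inside GL_EXT_texture3D).
        if ((found == ext || found[-1] == ' ') &&
            (found[len] == ' ' || found[len] == '\0'))
            return true;
        ext = found + len;
    }
    return false;
}

// e.g. bGL_EXT_separate_specular_color = HasExtension("GL_EXT_separate_specular_color");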

Anyway, I think this is a really nice way to use extensions with minimal headache!

Best regards.

Eric