Can I set the graphics card to always use 32-bit depth for textures in OpenGL?

Is there any way to set this in my program? On every ATI card this is set to 16-bit by default, and even if I initialize the window with 32-bit color depth, textures are still 16-bit and it looks bad.

Use the internalformat parameter in the glTexImage2D call… that's always the correct way. ATI should still fix the quality slider in the control panel, but that doesn't matter if you ask for the correct texture format to begin with.
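For example, a minimal sketch in Delphi (assuming Delphi's stock OpenGL unit or a similar binding that declares glTexImage2D; GL_RGBA8 is declared by hand in case the header unit predates the sized formats):

uses OpenGL;

const
  GL_RGBA8 = $8058; // sized internal format, 8 bits per component

procedure Upload32BitTexture(Width, Height: GLsizei; Pixels: Pointer);
begin
  // Passing the sized format GL_RGBA8 (instead of the generic GL_RGBA)
  // asks the driver to store 8 bits per component, so the control-panel
  // quality slider can no longer silently downsample the texture to 16 bit.
  glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, Width, Height, 0,
               GL_RGBA, GL_UNSIGNED_BYTE, Pixels);
end;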

I’m programming in Delphi, and it doesn’t recognize formats like GL_RGB16, although I call glTexImage2D from opengl32.dll.
And is there any way to let OpenGL build mipmaps when I use glTexImage3D?

First of all, GL_RGB16 would mean 16 bits per component, which you don’t have in any HW at the moment… second, GL_RGB4, GL_RGB8 and so on are specified in the OpenGL specification and in glext.h; it’s not that complicated to read a C/C++ #define and make a Delphi const out of it (I’m doing all my OpenGL development in Delphi as well).
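For example, the sized-format defines (these particular ones have been core since OpenGL 1.1) translate one-to-one:

// C: #define GL_RGB8 0x8051
// Delphi:
const
  GL_RGB4  = $804F;
  GL_RGB5  = $8050;
  GL_RGB8  = $8051;
  GL_RGB16 = $8054;
  GL_RGBA8 = $8058;

Pass one of these as the internalformat argument of glTexImage2D and the driver will honour the requested precision, or pick the closest format the hardware actually supports.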