Does anyone know if there is a way to
detect the maximum internal format bit depth an implementation supports for a texture? For example,
when I set the internal format of a texture to GL_RGBA, the ICDs on both the RagePro and the Riva TNT2
converted the textures to 4 bits per component.
To get better results, the bit depth had to be set explicitly, to a
maximum of GL_RGB5_A1 on the RagePro and GL_RGBA8 on the TNT2…
Is there any way to detect the maximum for
a given implementation? (Choosing GL_RGBA8 on the RagePro leads it to pick GL_RGBA4 as
the internal format, not GL_RGB5_A1, which is closer to the one I want.)