GL isn't telling me the truth about texture internal formats

What I do is create a texture with a specific internal format, e.g. GL_RGB4, and then query the internal format like so:
GLint format;
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_INTERNAL_FORMAT, &format);
and I get back the answer I expect: GL_RGB4.

Now if I load and draw something with GL_RGB4 and with GL_RGBA4, they should look the same on screen, right? I.e. they both use the same number of bits (4) per component for the RGB data. But they look different. GL_RGB5_A1, though, looks the same as GL_RGB4. I notice the spec says GL is only required to choose an approximate internal representation of what you ask for, but surely when I use glGetTexLevelParameteriv it should return the true internal representation, not just the one I asked for?

You’re actually querying what you requested the internal format to be, not something that corresponds to the true HW internal format.

To get information about bit depths, you should be querying one of GL_TEXTURE_RED_SIZE, GL_TEXTURE_GREEN_SIZE, GL_TEXTURE_BLUE_SIZE, GL_TEXTURE_ALPHA_SIZE, GL_TEXTURE_LUMINANCE_SIZE, or GL_TEXTURE_INTENSITY_SIZE.
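For example, something like this fragment reads back the component sizes the driver actually chose (a sketch, assuming a current GL context with a texture bound to GL_TEXTURE_2D, and querying mip level 0):

```c
/* Query the actual per-component bit depths of the bound 2D texture.
   Assumes a current GL context and a texture image at level 0. */
GLint red, green, blue, alpha;
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_RED_SIZE,   &red);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_GREEN_SIZE, &green);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_BLUE_SIZE,  &blue);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_ALPHA_SIZE, &alpha);
printf("R%d G%d B%d A%d\n", red, green, blue, alpha);
```

With a 565 texture you'd see R5 G6 B5 A0; with 4444 you'd see R4 G4 B4 A4.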

In our case, GL_RGB4 and GL_RGB5 both correspond to a 565 format, while GL_RGBA4 is 4444 and GL_RGB5_A1 is 1555.

  • Matt

Cheers, Matt.
Hehe, GL_R3_G3_B2 gives me better quality than GL_RGBA4 (probably because it gets promoted to r5g6b5). Looks like I'm gonna have to write a program that enumerates all the formats, comparing what I ask for with what I actually get given.
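A sketch of such an enumerator, based on Matt's advice above (assumes a current GL context already exists, e.g. from GLUT or your windowing code; the list of formats probed here is just a sample, not exhaustive):

```c
#include <stdio.h>
#include <GL/gl.h>

/* Upload a dummy texture with each requested internal format, then
   read back the per-component bit depths the driver actually chose.
   Uses the default texture object; assumes a current GL context. */
static void probe(GLenum fmt, const char *name)
{
    GLubyte dummy[4 * 4 * 4] = {0};  /* 4x4 RGBA source pixels */
    GLint r, g, b, a;
    glTexImage2D(GL_TEXTURE_2D, 0, fmt, 4, 4, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, dummy);
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_RED_SIZE,   &r);
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_GREEN_SIZE, &g);
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_BLUE_SIZE,  &b);
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_ALPHA_SIZE, &a);
    printf("%-12s -> R%d G%d B%d A%d\n", name, r, g, b, a);
}

#define PROBE(f) probe(f, #f)   /* stringify the enum name for the log */

void enumerate_formats(void)
{
    PROBE(GL_R3_G3_B2);
    PROBE(GL_RGB4);
    PROBE(GL_RGB5);
    PROBE(GL_RGBA4);
    PROBE(GL_RGB5_A1);
    PROBE(GL_RGB8);
    PROBE(GL_RGBA8);
}
```

Call enumerate_formats() once after context creation and compare the reported sizes against the formats you requested; any format the driver "rounds" to a different layout will show up immediately.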