Problem with 24-bit textures

I have a problem with my ATI Radeon 9700 card: when I load a 24-bit texture, it is rendered as a 16-bit texture (I can clearly see the banding where the colors change over).

I am creating the texture with the following call:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, texture.width, texture.height, 0, GL_RGB, GL_UNSIGNED_BYTE, texture.data);

I have a 32-bit viewport; I tried the same scene with plain polygon shading and it looks fine, and the bitmap data is definitely 24-bit.
What is more, the same code seems to work fine on NVIDIA cards (a GeForce2, for example), but not on my Radeon.

Try to use

glTexImage2D(GL_TEXTURE_2D, 0, 3, texture.width, texture.height, 0, GL_RGB, GL_UNSIGNED_BYTE, texture.data);

GL_RGB is just a hint. It could be GL_RGB8 or something else. If you are using 3 instead of GL_RGB, you are telling OpenGL that you want 3 bytes. Hopefully it will work now…

Originally posted by fuxiulian:

GL_RGB is just a hint. It could be GL_RGB8 or something else. If you are using 3 instead of GL_RGB, you are telling OpenGL that you want 3 bytes. Hopefully it will work now…

3 does not mean 3 bytes; it has exactly the same meaning as GL_RGB. Both mean three color components, and neither specifies any particular component order or bit depth.
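
If you want to see what the driver actually stored, you can query the texture after uploading it. Here is a rough, untested sketch; it assumes the texture in question is currently bound and that the usual GL and stdio headers are included:

/* Ask the driver what level 0 of the currently bound texture
   was actually allocated as. */
GLint internalFormat, redBits, greenBits, blueBits;

glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_INTERNAL_FORMAT, &internalFormat);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_RED_SIZE,   &redBits);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_GREEN_SIZE, &greenBits);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_BLUE_SIZE,  &blueBits);

/* With an unsized format (GL_RGB or 3) the driver is free to pick
   something like 5/6/5; with a sized format such as GL_RGB8 you
   should see 8/8/8 here. */
printf("internal format 0x%x, R%d G%d B%d bits\n",
       internalFormat, redBits, greenBits, blueBits);

With the unsized format I would expect the Radeon driver to report something like 5/6/5 here, which would explain the banding you are seeing.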

Bob’s right.

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8,
   texture.width, texture.height, 0, GL_RGB, GL_UNSIGNED_BYTE,
   texture.data);

This will correct the issue.
It’s a driver regression (though technically not a spec violation). ATI are aware of it.

Thank you very much for your reply, I will try it.