I have a problem with my ATI Radeon 9700 card: when I load a 24-bit texture, it gets rendered as a 16-bit texture (I can clearly see the "change-over" borders between colors, i.e. banding).
I am creating the texture with the following call:
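(Judging from the replies below, the call was presumably something like this; width, height and pixels stand in for the actual variables:)

glTexImage2D(GL_TEXTURE_2D,    /* target                                   */
             0,                /* mip level                                */
             GL_RGB,           /* unsized internal format ("3 components") */
             width, height,    /* placeholder texture dimensions           */
             0,                /* border                                   */
             GL_RGB,           /* format of the source data                */
             GL_UNSIGNED_BYTE, /* type of the source data                  */
             pixels);          /* placeholder pointer to the 24-bit data   */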
I have a 32-bit viewport; I tried it with polygon shading and it looks OK. The bitmap data is definitely 24-bit.
What is more, the same code works fine on nVidia cards (like the GeForce2), but not on my Radeon.
GL_RGB is just a hint. It could be GL_RGB8 or something else. If you are using 3 instead of GL_RGB, you are telling OpenGL that you want 3 bytes. Hopefully it will work now…
Originally posted by fuxiulian:
GL_RGB is just a hint. It could be GL_RGB8 or something else. If you are using 3 instead of GL_RGB, you are telling OpenGL that you want 3 bytes. Hopefully it will work now…
3 does not mean 3 bytes. It has exactly the same meaning as GL_RGB: both mean 3 color components, and neither specifies a particular component order or bit depth.
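If you want the driver to actually allocate 8 bits per channel, pass a sized internal format such as GL_RGB8. A sketch, with width, height and pixels as placeholders; note that even a sized format is only a request, though drivers normally honour it where the hardware can:

glTexImage2D(GL_TEXTURE_2D,    /* target                                 */
             0,                /* mip level                              */
             GL_RGB8,          /* sized format: 8 bits per component     */
             width, height,    /* placeholder texture dimensions         */
             0,                /* border                                 */
             GL_RGB,           /* format of the source data              */
             GL_UNSIGNED_BYTE, /* type of the source data                */
             pixels);          /* placeholder pointer to the 24-bit data */

/* You can query what the driver actually allocated: */
GLint redBits;
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_RED_SIZE, &redBits);

If redBits comes back as 8 rather than 4 or 5, the texture really is stored at full precision. Also check the texture quality slider in the Catalyst driver control panel; at lower settings ATI drivers may downgrade unsized formats like GL_RGB to 16-bit.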