I have a problem with my ATI Radeon 9700 card: when I load a 24-bit texture, it is rendered as a 16-bit texture (I can clearly see the banded "change-over" borders between colors).
I am creating the texture with the following call:
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, texture.width, texture.height, 0, GL_RGB, GL_UNSIGNED_BYTE, texture.data);
I have a 32-bit viewport, and I tried the same scene with plain polygon shading, which looks fine. The bitmap data really is 24-bit.
What is more, the same code seems to work fine on NVIDIA cards (e.g. a GeForce2), but not on my Radeon.
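For reference, one thing I am wondering about: as far as I understand, the generic `GL_RGB` internal format only asks the driver for "some RGB format", and the driver is free to pick a 16-bit representation (e.g. R5G6B5). A sized internal format such as `GL_RGB8` explicitly requests 8 bits per channel. A sketch of the changed call, assuming the same `texture` struct as above:

```c
/* Request an explicitly sized internal format (GL_RGB8) instead of the
 * generic GL_RGB, so the driver should not silently store the texture
 * at 16-bit precision: */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8,   /* sized internal format */
             texture.width, texture.height, 0,
             GL_RGB, GL_UNSIGNED_BYTE, texture.data);
```

The actual bits the driver allocated can be queried afterwards with `glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_RED_SIZE, &bits)` (and likewise for green/blue) to confirm what storage was chosen.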