
problem with 24-bit textures



Platinum
11-02-2003, 02:20 AM
I have a problem with my ATI Radeon 9700 card: when I load a 24-bit texture, it is rendered as a 16-bit texture (I can clearly see the banding borders between colors).

I am creating the texture with the following call:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, texture.width, texture.height, 0, GL_RGB, GL_UNSIGNED_BYTE, texture.data);

I have a 32-bit viewport; I tried it with polygon shading and it looks OK. The bitmap data really are 24-bit.
What is more, the same code works fine on NVIDIA cards (like a GeForce2), but not on my Radeon.
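A minimal sanity-check sketch (not from the original post; it assumes a legacy OpenGL context where the GL_RED_BITS/GL_GREEN_BITS/GL_BLUE_BITS queries are available, and the function name is just for illustration) to confirm the framebuffer really has 8 bits per channel:

#include <GL/gl.h>
#include <stdio.h>

/* Print the color depth of the current framebuffer.
   Call with a valid OpenGL context bound. */
void print_framebuffer_depth(void)
{
    GLint red_bits = 0, green_bits = 0, blue_bits = 0;
    glGetIntegerv(GL_RED_BITS,   &red_bits);
    glGetIntegerv(GL_GREEN_BITS, &green_bits);
    glGetIntegerv(GL_BLUE_BITS,  &blue_bits);
    printf("framebuffer R/G/B bits: %d/%d/%d\n",
           red_bits, green_bits, blue_bits);
}

If this prints 8/8/8, the viewport itself is fine and the banding comes from the texture's internal storage rather than from the framebuffer.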

fuxiulian
11-02-2003, 03:54 AM
Try to use

glTexImage2D(GL_TEXTURE_2D, 0, 3, texture.width, texture.height, 0, GL_RGB, GL_UNSIGNED_BYTE, texture.data);

GL_RGB is just a hint. It could be GL_RGB8 or something else. If you are using 3 instead of GL_RGB, you are telling OpenGL that you want 3 bytes. Hopefully it will work now...

Bob
11-02-2003, 04:03 AM
Originally posted by fuxiulian:

GL_RGB is just a hint. It could be GL_RGB8 or something else. If you are using 3 instead of GL_RGB, you are telling OpenGL that you want 3 bytes. Hopefully it will work now...

3 does not mean 3 bytes. It has exactly the same meaning as GL_RGB: both mean 3 color components, and neither specifies any particular component order or bit depth.

zeckensack
11-02-2003, 07:39 AM
Bob's right.


glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8,
texture.width, texture.height, 0, GL_RGB, GL_UNSIGNED_BYTE,
texture.data);
This will correct the issue.
It's a driver regression (though technically not a spec violation). ATI are aware of it.
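If you want to verify what the driver actually allocated, here is a small sketch (assuming the texture is currently bound to GL_TEXTURE_2D; the function name is just for illustration) that queries the component sizes and internal format of the level-0 image:

#include <GL/gl.h>
#include <stdio.h>

/* Query the component sizes the driver actually chose for the
   level-0 image of the currently bound GL_TEXTURE_2D texture. */
void print_texture_depth(void)
{
    GLint r = 0, g = 0, b = 0, internal_format = 0;
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_RED_SIZE,   &r);
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_GREEN_SIZE, &g);
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_BLUE_SIZE,  &b);
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_INTERNAL_FORMAT,
                             &internal_format);
    printf("texture R/G/B bits: %d/%d/%d (internal format 0x%X)\n",
           r, g, b, internal_format);
}

With the unsized GL_RGB internal format you may see 5/6/5 on the Radeon; after switching to GL_RGB8 it should report 8/8/8.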

Platinum
11-03-2003, 06:09 AM
Thank you very much for your replies; I will try it.