Using 8bit textures instead of 24bit



07-05-2000, 09:19 AM
I'm using Win32 and not using GLUT or MFC.

I'm currently loading a 24bit bitmap for
my 2 colour fonts (a bit wasteful!).
The 24bit bitmap loads fine, but any image
of less bit depth appears blank.

This is the format I'm using for 24bit bitmaps:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, texture->Width, texture->Height,
             0, GL_RGB, GL_UNSIGNED_BYTE, texture->Image);

but it doesn't work for 8bit.
What should go in place of the GL_RGB?
What do you have to do differently with 8bit textures?

Can anyone point me in the right direction?
Any help much appreciated! :)

answer
07-05-2000, 10:09 AM
I think you need to specify indexed color when using 8 bits of depth (and thus, 8-bit textures).
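
Roughly like this, going through the old color-index pixel path (an untested sketch; `palette` is an assumed 256-entry RGB array from your own bitmap loader, and the `texture` fields are the same ones from your post):

GLfloat r[256], g[256], b[256];
int i;

/* Build the index-to-color pixel maps from the bitmap's palette
   (palette[] is a hypothetical name for your loader's palette data). */
for (i = 0; i < 256; i++) {
    r[i] = palette[i].red   / 255.0f;
    g[i] = palette[i].green / 255.0f;
    b[i] = palette[i].blue  / 255.0f;
}
glPixelMapfv(GL_PIXEL_MAP_I_TO_R, 256, r);
glPixelMapfv(GL_PIXEL_MAP_I_TO_G, 256, g);
glPixelMapfv(GL_PIXEL_MAP_I_TO_B, 256, b);

/* 8-bit rows aren't necessarily 4-byte aligned, so relax unpacking. */
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);

/* One byte per texel; each index is translated through the maps above
   and the texture is stored as RGB internally. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, texture->Width, texture->Height,
             0, GL_COLOR_INDEX, GL_UNSIGNED_BYTE, texture->Image);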

answer
07-05-2000, 10:14 AM
Check out http://nate.scuzzy.net/gltut/ for a demo which uses 8bit textures

07-06-2000, 06:25 AM
Thanks, it looks like that may do it!
:D

gtada
07-07-2000, 03:54 PM
Out of curiosity: is your bitmap 24-bit or 8-bit? Will OpenGL do the conversion for you if it is 8-bit?

Greg

blide
07-07-2000, 11:37 PM
I have been using NeHe's tutorials and have done the texture mapping one. Using Photoshop I was able to make 24bit textures that worked fine. Then my friend made an 8bit texture in Microsoft Paint and it also worked fine. I don't know much about OpenGL... but from what I saw it did not care what color depth the texture was; it still worked fine. Btw, since most of my code was straight off NeHe's, I was using bitmaps... and NeHe's bitmap loading function.

Also, my code looks exactly the same as yours except for the third parameter to glTexImage2D... instead of GL_RGB I am using the number 3. Maybe that will fix it... however, I have no idea!
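
For what it's worth, in old-style OpenGL the internal-format argument can also be a bare component count (1 through 4), so these two calls should request the same thing (w, h, and pixels here are just placeholder names):

/* "3" means three components, so both lines ask for RGB internal storage. */
glTexImage2D(GL_TEXTURE_2D, 0, 3,      w, h, 0, GL_RGB, GL_UNSIGNED_BYTE, pixels);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, w, h, 0, GL_RGB, GL_UNSIGNED_BYTE, pixels);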

:)

[This message has been edited by blide (edited 07-08-2000).]