support for bitmap < 8 bpp???

Help!!!

I have been trying to work with bitmaps of less than 8 bits per pixel, without success. I use glColorTable to implement the color lookup and glTexImage2D to specify the texture. 8 bit works fine with a texType of GL_UNSIGNED_BYTE. However, for 4 bit and 1 bit I have tried GL_BITMAP as the type with GL_COLOR_INDEX as the format, without success: the texture comes out completely black. (When either call generates an error it doesn't apply the texture to my objects at all, and they show up as white because I set glColor3f(1.0, 1.0, 1.0).)
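In case it matters, here is the unpack state I believe is in effect (these are the GL defaults, and as far as I can tell they match the BMP layout: rows padded to 4-byte boundaries, leftmost pixel in the most significant bit):

// Client unpack state that affects GL_BITMAP uploads; both of these
// are the GL defaults and match how BMP packs its scanlines.
glPixelStorei(GL_UNPACK_ALIGNMENT, 4);        // rows start on 4-byte boundaries
glPixelStorei(GL_UNPACK_LSB_FIRST, GL_FALSE); // leftmost pixel in the MSB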

I have searched the discussion groups for GL_BITMAP, but there was little directly relevant information. There was also a post in the beginner's section, but the three of us could come up with nothing and no one else volunteered anything.

So the question: is it possible to support < 8 bpp such that OpenGL uses a lookup table and recognizes that there is more than one pixel per byte? I would rather not expand the data into a larger buffer unless that is absolutely the only way. From the docs I have read in the extension registry, as well as the red book and blue book, it seems it should be possible.

Below are the functions I use to get the resource and load it into GL. Please let me know if you see a solution or know of one.

void CglTexture::LoadBitmapResource(UINT id)
{
    HGLOBAL Resource = LoadResource(AfxGetInstanceHandle(),
        FindResource(AfxGetInstanceHandle(), MAKEINTRESOURCE(id), RT_BITMAP));
    unsigned char* data = (unsigned char*)LockResource(Resource);

    bmpInfo = (BITMAPINFO*)data;
    colors = data + bmpInfo->bmiHeader.biSize; // palette follows the header
    // NB: biClrUsed can legally be 0, meaning a full 2^bpp palette
    pixels = colors + bmpInfo->bmiHeader.biClrUsed * sizeof(RGBQUAD);

    GLenum format = GL_RGBA;
    intFormat = GL_RGBA;

    switch (bmpInfo->bmiHeader.biBitCount)
    {
    case 0:
    case 1:
        intFormat = GL_COLOR_INDEX1_EXT;
        texFormat = GL_COLOR_INDEX;
        texType = GL_BITMAP;
        break;
    case 4:
        intFormat = GL_COLOR_INDEX4_EXT;
        texFormat = GL_COLOR_INDEX;
        texType = GL_BITMAP;
        break;
    case 8:
        intFormat = GL_COLOR_INDEX8_EXT;
        texFormat = GL_COLOR_INDEX;
        texType = GL_UNSIGNED_BYTE;
        break;
    case 16:
        switch (bmpInfo->bmiHeader.biCompression)
        {
        case BI_BITFIELDS:
            // bmiColors holds three DWORD masks here; a green mask of 0x07E0
            // means 5-6-5 (comparing single bytes could never match this)
            if (((DWORD*)colors)[1] == 0x07E0)
            {
                texFormat = GL_RGB;
                texType = GL_UNSIGNED_SHORT_5_6_5;
                break;
            }
            // otherwise fall through and treat it as 5-5-5
        case BI_RGB:
            // 16-bit BI_RGB is X1R5G5B5; GL_RGB5_A1 is an internal format,
            // not a client format, so use BGRA with the 1_5_5_5_REV packing
            texFormat = GL_BGRA;
            texType = GL_UNSIGNED_SHORT_1_5_5_5_REV;
            break;
        }
        break;
    case 24:
        texFormat = GL_BGR; // BMP stores 24-bit pixels as B,G,R
        texType = GL_UNSIGNED_BYTE;
        intFormat = GL_RGB;
        break;
    case 32:
        // 32-bit BMP data is B,G,R,X/A in memory for both BI_RGB and
        // BI_BITFIELDS, so use GL_BGRA rather than GL_RGBA
        texFormat = GL_BGRA;
        texType = GL_UNSIGNED_BYTE;
        break;
    default:
        texFormat = GL_RGB;
        texType = GL_UNSIGNED_BYTE;
        break;
    }

    if (texFormat == GL_COLOR_INDEX)
    {
        // Note: GL_COLOR_TABLE enables the imaging-pipeline lookup and has
        // no effect on EXT_paletted_texture; it is probably unnecessary here
        glEnable(GL_COLOR_TABLE);
        // Palette entries are plain bytes; passing texType here (GL_BITMAP
        // on the low-bpp paths) would itself be an error
        glColorTable(texTarget, format, bmpInfo->bmiHeader.biClrUsed,
            GL_BGRA, GL_UNSIGNED_BYTE, colors);
        err = glGetError();
        if (err)
            DoError();
    }
}

void CglTexture::TexImage(int level)
{
    glTexImage2D(texTarget, level, intFormat, bmpInfo->bmiHeader.biWidth,
        bmpInfo->bmiHeader.biHeight, 0, texFormat, texType, pixels);
    err = glGetError();
    if (err)
        DoError();
}
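DoError's body isn't shown here; purely as a hypothetical sketch, it could be as simple as looking up the error name, assuming err is a GLenum member and GLU is linked in:

// Hypothetical sketch only - the real DoError wasn't posted.
void CglTexture::DoError()
{
    const GLubyte* msg = gluErrorString(err); // from <GL/glu.h>
    TRACE("GL error 0x%04X: %s\n", err, msg);
}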

Thanks,
Patrick

Just thought I’d add my experiences while messing with this a bit. I used the following code for loading textures:

// Stickman figure: 8x16 pixels, 1 bpp, one byte per row
unsigned char BITMAP_DATA[16] =
{
    0x00, 0x00, 0x00, 0x82, 0x44, 0x28, 0x10, 0x10,
    0x7C, 0x10, 0x18, 0x3C, 0x18, 0x00, 0x00, 0x00
};

// Two-entry RGB palette: black and white
unsigned char PALLETTE[] =
{
    0, 0, 0,
    255, 255, 255
};

void InitTexture()
{
    unsigned char pBluePallette[256*3];
    unsigned char pBlueTextureIndices[256];

    // Build a 256-entry palette (random red/green, blue ramp) and an
    // identity index ramp to visualize it
    for (int c = 0; c < 256; c++)
    {
        pBluePallette[c*3]   = rand() % 256;
        pBluePallette[c*3+1] = rand() % 256;
        pBluePallette[c*3+2] = 255 - c;

        pBlueTextureIndices[c] = c;
    }

    glBindTexture(GL_TEXTURE_2D, 1);
    if (glColorTableEXT)
    {
        //glColorTableEXT(GL_TEXTURE_2D, GL_RGB8, 256, GL_RGB, GL_UNSIGNED_BYTE, pBluePallette);
        glColorTableEXT(GL_TEXTURE_2D, GL_RGB8, 2, GL_RGB, GL_UNSIGNED_BYTE, PALLETTE);
    }

    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
    // Texture 1: 1-bpp stickman uploaded with GL_BITMAP - this one fails
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, 8, 16, 0, GL_COLOR_INDEX,
        GL_BITMAP, BITMAP_DATA);

    glBindTexture(GL_TEXTURE_2D, 2);

    if (glColorTableEXT)
    {
        glColorTableEXT(GL_TEXTURE_2D, GL_RGB8, 256, GL_RGB, GL_UNSIGNED_BYTE, pBluePallette);
    }

    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
    // Texture 2: 8-bpp indices uploaded as GL_UNSIGNED_BYTE - this one works
    glTexImage2D(GL_TEXTURE_2D, 0, GL_COLOR_INDEX8_EXT, 256, 1, 0,
        GL_COLOR_INDEX, GL_UNSIGNED_BYTE, pBlueTextureIndices);
}

The first paletted texture didn't work; the second does.

From reading the spec for the GL_EXT_paletted_texture extension, it seemed like GL_BITMAP might be an invalid type there, but I may have misread it. My guess is that GL_BITMAP only works as a glTexImage2D type when you have a palettized display. Again, that assumption might be incorrect, though.
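For completeness, the if (glColorTableEXT) tests above assume the entry point was fetched at startup. A minimal sketch of that setup, assuming glext.h supplies the PFNGLCOLORTABLEEXTPROC typedef (the function name InitPalettedTextureExtension is just made up here):

#include <windows.h>
#include <string.h>
#include <GL/gl.h>
#include "glext.h" // for PFNGLCOLORTABLEEXTPROC

PFNGLCOLORTABLEEXTPROC glColorTableEXT = NULL;

// Fetch the entry point only if the extension is actually advertised;
// a GL context must be current when this runs.
void InitPalettedTextureExtension()
{
    const char* ext = (const char*)glGetString(GL_EXTENSIONS);
    if (ext && strstr(ext, "GL_EXT_paletted_texture"))
        glColorTableEXT = (PFNGLCOLORTABLEEXTPROC)wglGetProcAddress("glColorTableEXT");
}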

Edit: Removed all references to the topic subject from the edit URL to try to fix that code tag!

[This message has been edited by Deiussum (edited 03-14-2003).]

Weird. I tried to edit my post above to add the closing code tag and got this message:

We cannot post this because it appears that you are tring to hack the topic subject. Use your back button to try again.

What?!?

Edit: Figured it out. When you click the edit link, the URL includes this in the query string:

TopicSubject=support+for+bitmap+<+8+bpp|QUS| |QUS| |QUS|

Evidently, it wasn't converting that back into the appropriate topic subject and thought I was trying to hack it. Removing that part of the query string and refreshing the edit page before making my edits worked.

[This message has been edited by Deiussum (edited 03-14-2003).]

Sorry about that. Too many iterations of trying different things, and I failed to keep the code in its original form. The glColorTable call is supposed to have its type hard-coded to GL_UNSIGNED_BYTE (as shown above), and even then it still does not work with GL_BITMAP in the glTexImage2D call, regardless of what I use as the format.

However, I suppose it will be simpler to expand the 4 bpp or 1 bpp pixel data to one index per byte (GL_UNSIGNED_BYTE) and let glTexImage2D convert it back to 4 bpp or 1 bpp internally via the GL_COLOR_INDEX4_EXT or GL_COLOR_INDEX1_EXT internal formats; a sketch of that conversion follows.
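A minimal sketch for the 4 bpp case, assuming the usual BMP layout (rows padded to DWORD boundaries, leftmost pixel in the high nibble); 1 bpp would be analogous with eight pixels per byte. Expand4bpp is just a made-up name for illustration:

// Hypothetical helper (not part of the loader above): expands 4-bpp index
// data to one index per byte so it can be uploaded as GL_COLOR_INDEX /
// GL_UNSIGNED_BYTE, with GL_COLOR_INDEX4_EXT as the internal format so the
// driver can still store 4 bits per texel.
void Expand4bpp(const unsigned char* src, unsigned char* dst,
                int width, int height)
{
    int srcPitch = ((width * 4 + 31) / 32) * 4; // BMP rows are DWORD-aligned
    for (int y = 0; y < height; y++)
    {
        const unsigned char* row = src + y * srcPitch;
        for (int x = 0; x < width; x++)
        {
            unsigned char b = row[x / 2];
            // the leftmost pixel of each pair sits in the high nibble
            dst[y * width + x] = (x & 1) ? (b & 0x0F) : (b >> 4);
        }
    }
}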

Thanks,
Patrick