Monochrome textures

I have some large, monochrome textures for bitmap fonts. The most memory-efficient way to keep this data in video memory is obviously as a 1-bit-per-pixel texture (as opposed to 32-bit RGBA).

How do I instruct OpenGL to interpret the texture data in this way? I can only work out how to get one byte per pixel.

Use GL_BITMAP as the type in the glTexImage2D call.

You can use shaders:

  1. Use the 8-bit font texture (each byte holding 8 packed font pixels) as the S coordinate for the 2nd texture unit - color 0.0 is mapped to 0.5/256 and color 1.0 is mapped to 255.5/256.

  2. Use glTexGen for the T coordinate of the 2nd texture unit - it generates 0.5/8, 1.5/8 … 7.5/8 depending on the x coordinate on screen.

  3. Put a 256x8 texture in the 2nd texture unit (a sketch of building it follows below):
    - 1st row: 0 255 0 255 0 255 0 255 …
    - 2nd row: 0 0 255 255 0 0 255 255 …
    - 3rd row: 0 0 0 0 255 255 255 255 …
    - …
    If you use GL_NEAREST filtering on both textures then your font texture will work as a 1-bit texture.
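
Here’s a rough sketch of how that 256x8 lookup texture could be built and uploaded (identifiers like lutTex are mine, and I’m assuming bit 0 of each packed byte is the first pixel - adjust if your packing differs). The dependent lookup itself - feeding the sampled font texel back in as the S coordinate of this texture - is the part that needs the shader from step 1.

    /* Column = value of a font texel (one byte = 8 packed pixels),
       row    = which of the 8 pixels we want.
       Entry is 255 if that bit is set, 0 otherwise. */
    GLubyte lut[8][256];
    for (int row = 0; row < 8; ++row)
        for (int col = 0; col < 256; ++col)
            lut[row][col] = ((col >> row) & 1) ? 255 : 0;

    GLuint lutTex;
    glGenTextures(1, &lutTex);
    glActiveTexture(GL_TEXTURE1);   /* the 2nd texture unit */
    glBindTexture(GL_TEXTURE_2D, lutTex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, 256, 8, 0,
                 GL_LUMINANCE, GL_UNSIGNED_BYTE, lut);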

Thanks!

Hmm, it’s not quite working with GL_BITMAP.

    glBindTexture(GL_TEXTURE_2D, gltex->id);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, gltex->w, gltex->h,
                 0, GL_COLOR_INDEX, GL_BITMAP, gltex->pixels);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

gltex->pixels contains 1-bit monochrome data.

Instead of the expected texture, I just get
a black quad. The other textures (RGB, RGBA etc)
work correctly.

What am I doing wrong? glGetError() is silent
on the matter…

I gave up using monochrome bitmaps about 4 years ago. Instead I use grayscale textures, with 0 for 0 and 255 for 1. The conversion is fast, using a look-up table of 256 64-bit values (each source byte expands to 8 grayscale bytes, as sketched below). I doubt you can get HW acceleration with 1 bpp anyway.
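
For illustration, a minimal sketch of that kind of expansion table. Function and table names are mine, and the bit-order and endianness choices are assumptions noted in the comments.

    #include <stdint.h>
    #include <string.h>

    static uint64_t expand_lut[256];

    /* Build the 256-entry table: bit n of the source byte controls output
       byte n (an LSB-first convention - flip the shift if your data packs
       the leftmost pixel into the most significant bit). */
    static void build_expand_lut(void)
    {
        for (int b = 0; b < 256; ++b) {
            uint64_t v = 0;
            for (int bit = 0; bit < 8; ++bit)
                if (b & (1 << bit))
                    v |= (uint64_t)0xFF << (8 * bit);
            expand_lut[b] = v;
        }
    }

    /* Expand w*h 1-bit pixels (w assumed to be a multiple of 8) into
       one grayscale byte per pixel. */
    static void expand_1bpp(const unsigned char *src, unsigned char *dst,
                            int w, int h)
    {
        for (int i = 0; i < (w / 8) * h; ++i) {
            uint64_t v = expand_lut[src[i]];
            memcpy(dst + 8 * i, &v, 8);   /* byte order assumes little-endian */
        }
    }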

In glTexImage2D it’s the third parameter that defines the internal storage format of the texture. The 7th and 8th parameters only describe the format of the data you have in system memory; it gets converted to the texture’s internal format. I don’t think GL_BITMAP can be used as the internal format of a texture. Perhaps there is some extension that allows it, but I doubt it.
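
Concretely, once the data has been expanded to one byte per pixel, the upload could look something like this (expanded_pixels being a hypothetical buffer holding the expanded data):

    glBindTexture(GL_TEXTURE_2D, gltex->id);
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);   /* in case rows aren't 4-byte aligned */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, gltex->w, gltex->h,
                 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, expanded_pixels);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);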

I wouldn’t trust GL_BITMAP anyway. Perhaps it will work on your hardware, but not on many others. Using bitmaps with OpenGL isn’t a common case; textures are what’s favoured.

So your options are:
- use 8-bit-per-pixel textures (wastes memory)
- store the bitmaps in system memory and use glDrawPixels (slow; a related sketch follows below)
- use 8-bits/8-pixels textures (just as I described above) with shaders - gives good speed and compression, but requires shaders
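
As an aside on the second option: the classic way to draw 1-bit data straight from system memory is actually glBitmap (a sibling of glDrawPixels that takes 1-bit-per-pixel data directly and paints it in the current raster color). A minimal sketch, with x, y, glyph_w, glyph_h and glyph_bits standing in for your own variables:

    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);   /* bitmap rows are byte-aligned */
    glColor3f(1.0f, 1.0f, 1.0f);             /* color used for the set bits */
    glRasterPos2i(x, y);                     /* position depends on your projection */
    glBitmap(glyph_w, glyph_h, 0.0f, 0.0f, 0.0f, 0.0f, glyph_bits);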

When you use COLOR_INDEX, you’re saying that each 1-bit pixel is an index into a color map. You need to create a two-entry color map with glPixelMap prior to your teximage call.
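
Something along these lines (GL_LUMINANCE as the internal format and the default MSB-first bit order, controlled by GL_UNPACK_LSB_FIRST, are assumptions you may need to adjust). Note that the index-to-RGBA pixel maps default to a single zero entry, which is consistent with the all-black quad you’re seeing.

    /* Two-entry maps: index 0 -> 0.0, index 1 -> 1.0 */
    GLfloat map[2] = { 0.0f, 1.0f };
    glPixelMapfv(GL_PIXEL_MAP_I_TO_R, 2, map);
    glPixelMapfv(GL_PIXEL_MAP_I_TO_G, 2, map);
    glPixelMapfv(GL_PIXEL_MAP_I_TO_B, 2, map);
    glPixelMapfv(GL_PIXEL_MAP_I_TO_A, 2, map);

    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);   /* bitmap rows are byte-aligned */
    glBindTexture(GL_TEXTURE_2D, gltex->id);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, gltex->w, gltex->h,
                 0, GL_COLOR_INDEX, GL_BITMAP, gltex->pixels);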

COLOR_INDEX is not accelerated on modern Nvidia cards, dunno about ATI

I think all current cards have dropped palette support, and have for a long time now.

I’d personally do what I’ve always done: use a luma texture (used as luma/alpha). If you’re going for a full 64k UCS-2 font it may be worse, but for plain ASCII? We’re talking about at most 256*char_x*char_y bytes for a fixed-width font (non-fixed requires more work, but that’s also not very hard), for a total of a whopping 25 KB for a 10x10 font. I think you need not worry about storage requirements in that case. :slight_smile:
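
For what it’s worth, one way to get that “luma used as luma/alpha” behaviour at one byte per pixel (which is what keeps the 256*char_x*char_y figure above accurate) is a GL_INTENSITY internal format, which replicates the single channel into R, G, B and A. A rough sketch, with coverage, w and h as my stand-in names for the 8-bit glyph data and its size:

    /* One byte per pixel; GL_INTENSITY replicates that byte into R, G, B
       and A, so it acts as both luminance and alpha. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_INTENSITY, w, h, 0,
                 GL_LUMINANCE, GL_UNSIGNED_BYTE, coverage);

    /* Tint with the current color (GL_MODULATE is the default tex env)
       and blend the glyphs over the background. */
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);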