An *EVIL* Puzzle

Here’s an evil puzzle for you to solve. You’re a programmer trying to create an RTS game engine, and your project has the following restrictions:

  1. You are trying to target a very large number of platforms and system configurations, so you shouldn’t use any OpenGL extension unless you have an alternative for ALL cards that don’t support it.

  2. You have a great number of legacy 8-bit palettized textures you have to use, all potentially at the same time, and you can’t change the format in which they are stored on disk.

  3. Your code has to be able to change the palette of any 8-bit texture being rendered. The same texture may even be rendered several times per frame with a different palette each time.

  4. Memory usage must be kept to a minimum. Therefore, converting every 8-bit texture to 32 bits and keeping a separate 32-bit copy of each image for each palette is NOT an option.

  5. Your code must have a reasonable frame rate on a GeForce or better. (It can be slower on older-generation cards, but it must still be able to run on them.)

  6. You must include support for 32-bit RGBA textures.

Well, I think that’s it for the restrictions. Good luck!
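
P.S. To make restrictions 2 through 4 concrete: each legacy asset boils down to an array of 8-bit palette indices plus a separate 256-entry RGBA palette that can be swapped at draw time. Roughly speaking (the type and field names here are purely illustrative):

typedef struct {
  unsigned int m_Colors[256];    /* packed RGBA entries; may be replaced between draws */
} Palette;

typedef struct {
  int            m_Width;
  int            m_Height;
  unsigned char *m_Indices;      /* m_Width * m_Height bytes, one palette index per texel */
} Texture8;                      /* the index data is never expanded per palette */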

How many times do I have to tell you? You won’t ever pass your computer courses if you keep tricking people into doing your homework for you!

VAPORWARE. I see Microsoft has scored more fresh meat.

It’s been a while since I’ve seen the extension list of the base MS OpenGL implementation, but if memory serves, one of the extensions listed there was GL_EXT_paletted_texture. They even have that one defined in their header, so you can be reasonably certain that it will exist for all cards, and it should take care of many of the things you need.

About the only thing you can’t be sure of is point 5, because anything using purely MS’s OpenGL drivers is going to be slow for any large number of triangles.

> It’s been a while since I’ve seen the extension list of the base MS OpenGL implementation, but if memory serves, one of the extensions listed there was GL_EXT_paletted_texture.

I’m already using this extension, but it pretty much only works on Nvidia GeForce family cards.
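
For reference, the extension path looks something like this (a rough sketch rather than my exact code; glColorTableEXT has to be fetched through wglGetProcAddress, and PFNGLCOLORTABLEEXTPROC / GL_COLOR_INDEX8_EXT come from glext.h):

#include <windows.h>
#include <string.h>
#include <GL/gl.h>
#include <GL/glext.h>

static PFNGLCOLORTABLEEXTPROC pglColorTableEXT = NULL;

/* Returns nonzero if GL_EXT_paletted_texture is usable on this card/driver. */
int HasPalettedTextureSupport(void)
{
  const char *ext = (const char *)glGetString(GL_EXTENSIONS);
  if (ext == NULL || strstr(ext, "GL_EXT_paletted_texture") == NULL)
    return 0;
  pglColorTableEXT = (PFNGLCOLORTABLEEXTPROC)wglGetProcAddress("glColorTableEXT");
  return pglColorTableEXT != NULL;
}

/* Upload the 8-bit indices once; swapping the palette later is just another
   glColorTableEXT call on the bound texture, with no re-upload of the indices. */
void UploadIndexedTexture(GLuint tex, int iWidth, int iHeight,
                          const GLubyte *pIndices,     /* iWidth * iHeight bytes */
                          const GLubyte *pPaletteRGBA) /* 256 * 4 bytes          */
{
  glBindTexture(GL_TEXTURE_2D, tex);
  pglColorTableEXT(GL_TEXTURE_2D, GL_RGBA8, 256,
                   GL_RGBA, GL_UNSIGNED_BYTE, pPaletteRGBA);
  glTexImage2D(GL_TEXTURE_2D, 0, GL_COLOR_INDEX8_EXT, iWidth, iHeight, 0,
               GL_COLOR_INDEX, GL_UNSIGNED_BYTE, pIndices);
}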

> About the only thing you can’t be sure of is point 5, because anything using purely MS’s OpenGL drivers is going to be slow for any large number of triangles.

The solution I use for cards that don’t have 8-bit palettized texture support is to first set the palette with the following code:

GLfloat red[256];
GLfloat green[256];
GLfloat blue[256];
GLfloat alpha[256];

/* Convert the packed palette entries to normalized floats for glPixelMapfv.
   PalGetRValue() and friends unpack one channel from a packed palette entry. */
for (int i = 0; i < 256; i++) {
  red[i]   = PalGetRValue(Palette->m_Colors[i]) / 255.0f;
  green[i] = PalGetGValue(Palette->m_Colors[i]) / 255.0f;
  blue[i]  = PalGetBValue(Palette->m_Colors[i]) / 255.0f;
  alpha[i] = PalGetAValue(Palette->m_Colors[i]) / 255.0f;
}

glPixelMapfv(GL_PIXEL_MAP_I_TO_R, 256, red);
glPixelMapfv(GL_PIXEL_MAP_I_TO_G, 256, green);
glPixelMapfv(GL_PIXEL_MAP_I_TO_B, 256, blue);
glPixelMapfv(GL_PIXEL_MAP_I_TO_A, 256, alpha);

Then, I draw the 8-bit texture with this:

/* Route each color index through the I-to-R/G/B/A pixel maps set above. */
glPixelTransferi(GL_MAP_COLOR, GL_TRUE);
glDrawPixels(iWidth, iHeight, GL_COLOR_INDEX, GL_UNSIGNED_BYTE, pSurf);
glPixelTransferi(GL_MAP_COLOR, GL_FALSE);

The frame rate is horrific, though. When I intentionally disable the glColorTable support in my program, I only get about one to two frames per second on my GeForce2 GTS 64MB. However, it takes almost no time to load up, the memory usage is low, and the resulting rendering looks identical to what I get with glColorTable. I plan to use this as a fall-back rendering method and supplement it with card-specific solutions to improve performance on various systems.
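
Conceptually, the plan at draw time is just a dispatch between the two paths above (DrawWithColorTable and DrawWithPixelMap are placeholder names for them, not real functions from my engine):

void DrawWithColorTable(const Texture8 *pTex, const Palette *pPal); /* GL_EXT_paletted_texture path        */
void DrawWithPixelMap(const Texture8 *pTex, const Palette *pPal);   /* glPixelMapfv + glDrawPixels fallback */

void DrawPaletted(const Texture8 *pTex, const Palette *pPal)
{
  if (HasPalettedTextureSupport())
    DrawWithColorTable(pTex, pPal);   /* fast path on GeForce-class cards */
  else
    DrawWithPixelMap(pTex, pPal);     /* slow, but runs on everything */
}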

If there’s a better way to do this, I’d be very interested in hearing it. And if anyone has card-specific solutions for using 8-bit palettized textures that don’t involve GL_EXT_paletted_texture, I’d be most interested in those as well.