Color palettes on windows xp with an ATI Radeon 9800

hi, i’m currently writing a small game with opengl on windows xp, and i want to use a color table when rendering an image (the color table shifts every frame to create a neat effect with the static image). supposedly this should be easy to do with glColorTable() and glDrawPixels(), but i’ve come to find that glColorTable() is part of the GL_ARB_imaging extension, which apparently ATI does not support.

my question is, does anybody know how to render with color tables on an ATI Radeon card (aside from writing it myself)?

i searched through ATI’s opengl extension reference for anything that might support palettes. didn’t find anything, obviously. i have the feeling the solution is right in front of my face, so please excuse my ignorance if that’s the case :slight_smile:

thanks,
byron

i think i’m getting closer, but i’m still having problems.

so now i’m trying to use wglGetProcAddress() to get the glColorTable function. my code is like this:

PFNGLCOLORTABLEPROC glColorTable = 0;

/* ... */

glColorTable = (PFNGLCOLORTABLEPROC)wglGetProcAddress("glColorTable");
assert(glColorTable != NULL); // assert() fails here
glColorTable(GL_COLOR_TABLE, GL_RGB, 256, GL_RGB, GL_UNSIGNED_BYTE, table);
 

the assert() fails, and i don’t know why. any suggestions?

-byron

Radeons don’t support it. There’s no use trying to get the function, because it’s simply not there.
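One way to confirm that up front is to look for the token in the extension string before ever calling wglGetProcAddress(). A sketch (the helper name is mine); note that a plain strstr() isn’t enough, because searching for "GL_ARB_imaging" would also match inside a longer name like "GL_ARB_imaging_subset":

```c
#include <string.h>

/* Returns 1 if `name` appears as a complete space-separated token in
 * `ext_list` (the string returned by glGetString(GL_EXTENSIONS)). */
static int has_extension(const char *ext_list, const char *name)
{
    size_t len = strlen(name);
    const char *p = ext_list;

    while ((p = strstr(p, name)) != NULL)
    {
        int starts = (p == ext_list) || (p[-1] == ' ');
        int ends   = (p[len] == ' ') || (p[len] == '\0');
        if (starts && ends)
            return 1;
        p += len;   /* partial match; keep scanning */
    }
    return 0;
}
```

Only if that returns 1 is it worth asking wglGetProcAddress() for the entry point; on the Radeon it returns 0, so your failing assert is the expected result.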

Why don’t you just …

unsigned int  palette[256];
unsigned int  rgba_buffer[NUM_PIXELS];      /* NUM_PIXELS = width * height */
unsigned char paletted_buffer[NUM_PIXELS];

for (int i = 0; i < NUM_PIXELS; ++i)
{
   rgba_buffer[i] = palette[paletted_buffer[i]];
}

glDrawPixels(<...>, GL_RGBA, GL_UNSIGNED_BYTE, rgba_buffer);
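And if the table shifts every frame, the rotation can be folded into that same loop as an offset, so neither the palette nor the indexed image ever has to be rewritten. A sketch (the function and parameter names are mine):

```c
#include <stdint.h>

/* Expand an 8-bit indexed image to RGBA, rotating the palette by
 * `shift` entries on the fly. Changing the shift each frame costs
 * nothing extra: only the offset changes, never the data. */
static void expand_shifted(const uint32_t palette[256],
                           const uint8_t *indices,
                           uint32_t *rgba, int num_pixels, int shift)
{
    for (int i = 0; i < num_pixels; ++i)
        rgba[i] = palette[(indices[i] + shift) & 0xFF];
}
```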

well, this is actually a game i wrote about a year ago, which i’m updating in order to work on my opengl skills. at the moment it does pretty much what you wrote in your post (for the color table stuff), but it runs at only around 10 or 15 fps, and i’m trying to optimize it. since the color table shifts every frame, i’m looking up every pixel of the image in the table every frame, then rendering with glDrawPixels(). i’m fairly sure now that the lookup isn’t a big bottleneck, but i got off on a tangent trying to get the hardware to do the color table and the shifting. perhaps it’s just not possible on a radeon?
it just seems stupid that ATI, one of the leading graphics card manufacturers, doesn’t support something as simple and useful as color palettes.

-byron

I kind of have the same problem you were having (at that time, at least). Did you find a solution? I had a strange idea… i don’t know if it is applicable…

Can we change the window DC’s PIXELFORMATDESCRIPTOR so that the pixel type is color index instead of RGBA? We could then easily change the DC’s palette.

Palette animation is so yesterday. :wink:
You can do this with a dependent texture read.
Upload your color index data as a 2D texture and the color table as a 1D texture, then write a fragment shader that reads the index texture (which returns a value between 0.0 and 1.0) and uses that value as the s-coordinate for a lookup into the 1D palette texture.
Now you can either re-upload the 1D texture every frame if you want complete control over the color change, or, if you only want to rotate the colors around, add a programmable offset to the fragment program that is added to the s-coordinate, and update that every frame. Use GL_REPEAT wrap mode for the 1D texture in the latter case.
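A sketch of what that fragment shader could look like in GLSL (the uniform names are made up for illustration; the same thing can be written as an ARB fragment program):

```glsl
uniform sampler2D index_tex;   // 2D texture holding the 8-bit indices
uniform sampler1D palette_tex; // 256-entry color table, GL_REPEAT wrap,
                               // GL_NEAREST filtering so indices don't blend
uniform float     shift;       // palette offset in [0,1), updated per frame

void main()
{
    // the index texture returns a value in [0,1]; use it, plus the
    // rotating offset, as the s-coordinate into the palette
    float index = texture2D(index_tex, gl_TexCoord[0].st).r;
    gl_FragColor = texture1D(palette_tex, index + shift);
}
```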

Set up three maps using glPixelMapfv() for GL_PIXEL_MAP_I_TO_R, GL_PIXEL_MAP_I_TO_G, and GL_PIXEL_MAP_I_TO_B, then use glDrawPixels() with format = GL_COLOR_INDEX.
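In case it helps: the maps take normalized floats, so an 8-bit RGB palette has to be converted first. A minimal sketch, with the GL calls shown in a comment since the surrounding setup is omitted (function and parameter names are mine):

```c
#include <stdint.h>

/* Split an 8-bit RGB palette into the three [0,1] float maps that
 * glPixelMapfv() expects. With the maps installed, the paletted image
 * draws directly:
 *
 *   glPixelMapfv(GL_PIXEL_MAP_I_TO_R, 256, map_r);
 *   glPixelMapfv(GL_PIXEL_MAP_I_TO_G, 256, map_g);
 *   glPixelMapfv(GL_PIXEL_MAP_I_TO_B, 256, map_b);
 *   glDrawPixels(width, height, GL_COLOR_INDEX, GL_UNSIGNED_BYTE, indices);
 */
static void build_pixel_maps(const uint8_t rgb[256][3],
                             float map_r[256], float map_g[256],
                             float map_b[256])
{
    for (int i = 0; i < 256; ++i)
    {
        map_r[i] = rgb[i][0] / 255.0f;
        map_g[i] = rgb[i][1] / 255.0f;
        map_b[i] = rgb[i][2] / 255.0f;
    }
}
```

Shifting the palette then just means rotating the three map arrays (or the source palette) before re-uploading them.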

I think that’s what you want.

Don.

Performance normally sucks as soon as you add any mappings, scales or biases to the pixel path.

So you don’t think that changing the control’s dc palette can work?

Correct, I’m of the opinion that Windows palettes are for 256 color displays and nobody should use that anymore.
Windows XP doesn’t even offer that in the control panel and starts with 800*600 HighColor after installation.
The performance comment was about glDrawPixels with index-to-color mappings in OpenGL. Anything that needs to touch the pixels during the glDrawPixels operation can fall off the fast paths.

Can this 1D texture method work if I have multiple 2D textures that give a 3D effect?
