indexed color textures



Radish
04-01-2004, 11:08 AM
How do I use indexed color textures? What do I send to glTexImage2D(), and how do I give it the color table?

mikael_aronsson
04-01-2004, 10:11 PM
Hi !

If you run OpenGL in color-index mode (huhhh!) then you have already set up a color table, and you have to use that one.

If you run OpenGL in RGBA mode and your texture is color-indexed, you have to convert the image to a non-indexed texture when you load it.

Mikael

Radish
04-02-2004, 08:10 AM
Seems like having to convert on the fly for every texture would be a significant bottleneck, but then lots of games use 256-color textures with OpenGL, so I guess I have no idea what I'm talking about. *shrugs*

chowe6685
04-02-2004, 08:53 AM
You just do all the conversion at load time; once you have created the texture, there is no more conversion that needs to be done.
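Roughly like this, for example (just a sketch; indices and palette are hypothetical names for whatever your image loader hands you, and error checking is omitted):

#include <GL/gl.h>     /* on Windows, include <windows.h> before this */
#include <stdlib.h>

/* Expand an 8-bit indexed image through its 256-entry RGBA palette once
   at load time, then upload the result as an ordinary RGBA texture. */
GLuint upload_indexed_as_rgba(const unsigned char *indices,
                              const unsigned char palette[256][4],
                              int width, int height)
{
    unsigned char *rgba = malloc((size_t)width * height * 4);
    for (int i = 0; i < width * height; ++i) {
        const unsigned char *entry = palette[indices[i]];
        rgba[i * 4 + 0] = entry[0];   /* R */
        rgba[i * 4 + 1] = entry[1];   /* G */
        rgba[i * 4 + 2] = entry[2];   /* B */
        rgba[i * 4 + 3] = entry[3];   /* A */
    }

    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, rgba);
    free(rgba);
    return tex;
}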

Radish
04-02-2004, 10:27 AM
Okay, so there's no way of sending glTexImage2D() the value GL_COLOR_INDEX for the format parameter and have it work? Because the Redbook seems to suggest otherwise here:

Recall that when you specify your texture map with glTexImage*D(), the third argument is the number of R, G, B, A components to be selected for each texel. A single selected component is interpreted as a luminance value (L); if there are two, the first is luminance, and the second is an alpha value (A). Three components form an RGB color triple (C), and four components provide an RGB triple and a value for alpha. Note that this selection is performed after the pixel-transfer function has been applied. Therefore, it makes sense, for example, to specify a texture with a GL_COLOR_INDEX image because the indices are converted to RGBA values by table lookup before they're used to form the texture image.

For the complete text: http://fly.cc.fer.hr/~unreal/theredbook/chapter09.html

It only suggests it, though; it doesn't really say anything clear or useful. Some time before that, he says texture mapping only works in RGB mode, too, so he's not talking about OpenGL in indexed-color mode. I'm confused.

Radish
04-05-2004, 11:07 AM
*cough* Sorry.

zeckensack
04-05-2004, 04:04 PM
What the Red Book says in that paragraph is that the texture image is converted from indexed to non-indexed during texture image specification, i.e. it will be stored as a regular, non-indexed texture in video memory.

The table is specified via glColorTable. You can look that one up in the spec. Might look like this:

struct Thingy
{
    GLubyte r, g, b, a;
};

Thingy table[256];
// fill table entries
<...>

glColorTable(GL_COLOR_TABLE, GL_RGBA8, 256, GL_RGBA, GL_UNSIGNED_BYTE, table);

But there's a better way for this kind of thing. Certain hardware (http://www.delphi3d.net/hardware/extsupport.php?extension=GL_EXT_paletted_texture) supports the EXT_paletted_texture (http://oss.sgi.com/projects/ogl-sample/registry/EXT/paletted_texture.txt) and EXT_shared_texture_palette (http://oss.sgi.com/projects/ogl-sample/registry/EXT/shared_texture_palette.txt) extensions.

This will keep the texture in indexed format in video memory. Example usage:
glEnable(GL_SHARED_TEXTURE_PALETTE_EXT);
glColorTableEXT(GL_SHARED_TEXTURE_PALETTE_EXT,
GL_RGBA8,256,GL_RGBA,GL_UNSIGNED_BYTE,table);
glTexImage2D(GL_TEXTURE_2D,level,
GL_COLOR_INDEX8_EXT,
width,height,0,
GL_COLOR_INDEX,GL_UNSIGNED_BYTE,
indexed_texture_data);
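
One caveat: glColorTableEXT isn't in the standard headers or import library, so check the extension string and fetch the entry point at runtime before relying on it. Something along these lines (a rough sketch, Windows-style; the typedef follows the prototype in the extension spec):

#include <windows.h>
#include <GL/gl.h>
#include <string.h>

/* Function pointer type matching glColorTableEXT from the extension spec. */
typedef void (APIENTRY *ColorTableEXT_fn)(GLenum target, GLenum internalFormat,
                                          GLsizei width, GLenum format,
                                          GLenum type, const void *table);
static ColorTableEXT_fn my_glColorTableEXT = NULL;

/* Call with a current GL context. Returns nonzero if both extensions are
   advertised and the entry point could be fetched. The strstr check is a
   naive substring test, but it's the usual idiom. */
int init_paletted_textures(void)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    if (!ext ||
        !strstr(ext, "GL_EXT_paletted_texture") ||
        !strstr(ext, "GL_EXT_shared_texture_palette"))
        return 0;

    my_glColorTableEXT = (ColorTableEXT_fn)wglGetProcAddress("glColorTableEXT");
    return my_glColorTableEXT != NULL;
}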

ioquan
04-05-2004, 11:19 PM
The paletted texture extension would be nice, except that, up to this point and as far as I know, it is not supported on any ATI cards. Most of the newer NVidia cards support it.

Radish
04-06-2004, 08:32 AM
Damn, I use an ATI (and I'm certainly not the only one).

The reason I want to use indexed color is so I can use palette changes to reflect changes in armor and status and such for PC sprites. I assume since the texture in video memory is converted to RGBA and no longer subject to any palette, I'll have to reconstruct the texture (call glTexImage2D) every time I change the palette. Correct?

ZbuffeR
04-06-2004, 08:51 AM
Correct.

But if you have a small number of shades, just precompute them all, and only bind the correct one when needed.
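
Something like this, sketched against the hypothetical upload_indexed_as_rgba helper from earlier in the thread:

#include <GL/gl.h>

/* Hypothetical helper sketched earlier in the thread. */
GLuint upload_indexed_as_rgba(const unsigned char *indices,
                              const unsigned char palette[256][4],
                              int width, int height);

#define NUM_VARIANTS 4   /* e.g. normal, armored, poisoned, frozen */

static GLuint sprite_variants[NUM_VARIANTS];

/* Build one ordinary RGBA texture per palette variant, once at load time. */
void build_sprite_variants(const unsigned char *indices, int w, int h,
                           const unsigned char palettes[NUM_VARIANTS][256][4])
{
    for (int v = 0; v < NUM_VARIANTS; ++v)
        sprite_variants[v] = upload_indexed_as_rgba(indices, palettes[v], w, h);
}

/* At draw time, just bind whichever variant matches the sprite's state. */
void bind_sprite_variant(int variant)
{
    glBindTexture(GL_TEXTURE_2D, sprite_variants[variant]);
}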

gdewan
04-06-2004, 09:03 AM
If your hardware can use the results from one texture as coordinates for another texture, you can use that to fake it.
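
In GLSL terms (assuming hardware and drivers new enough to expose GLSL fragment shaders; the register-combiner and ATI fragment shader paths are the same idea expressed differently), the dependent-read palette lookup might look something like this:

/* GLSL fragment shader sketch of the trick: sample an index from one
   texture and use it as the coordinate into a 256-texel 1D palette
   texture (a dependent texture read). The index texture must use
   GL_NEAREST filtering so indices aren't blended together. */
static const char *palette_lookup_fs =
    "uniform sampler2D indexTex;   /* 8-bit indices in the red channel */\n"
    "uniform sampler1D paletteTex; /* 256-texel RGBA palette */\n"
    "void main()\n"
    "{\n"
    "    float index = texture2D(indexTex, gl_TexCoord[0].st).r; /* 0..1 */\n"
    "    /* remap so each index lands on the center of its palette texel */\n"
    "    float coord = index * (255.0 / 256.0) + (0.5 / 256.0);\n"
    "    gl_FragColor = texture1D(paletteTex, coord);\n"
    "}\n";

Changing the palette then just means updating the small 1D texture, not re-uploading the sprite.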

Radish
04-16-2004, 04:43 PM
New question.


Originally posted by zeckensack:

glColorTable(GL_COLOR_TABLE, GL_RGBA8, 256, GL_RGBA, GL_UNSIGNED_BYTE, table);

What would be the difference between sending GL_RGBA8 and GL_RGBA for the second parameter, internalformat, if any?

Radish
04-16-2004, 05:17 PM
And now there's another problem: VC++ can't seem to find glColorTable, even though it is prominently displayed in the specification. I suspect I need a newer OpenGL library or some such, though I'll be damned if I know how to get it. Or I'm just forgetting to #include something, but that would be too easy.

Radish
04-16-2004, 06:26 PM
Fixed it. Found glext.h and all that. Then I had to #define GL_GLEXT_PROTOTYPES myself to get glext.h to do its job. Odd.

zeckensack
04-17-2004, 12:00 AM
Originally posted by Radish:
What would be the difference between sending GL_RGBA8 and GL_RGBA for the second parameter, internalformat, if any?

You're right, it wouldn't make much of a difference. This was just a copy&paste artifact ;)

Internalformat for glColorTable behaves in exactly the same way as it does for glTexImage2D. You can ask for a specific storage resolution, if you don't want to have the "default" resolution, or if you're unsure what that default is.

It's just a hint, though. You may not get exactly what you want, most notably if the hardware doesn't support the format natively. E.g. requesting RGBA16 on a GeForce2 MX will silently be ignored. The hardware just can't do it, so the driver silently drops down to a supported format.
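
If you want to know what you actually got, you can ask the driver; for textures it's something like this (and I believe glGetColorTableParameteriv with GL_COLOR_TABLE_FORMAT is the analogous query for color tables):

/* Ask which internal format the driver actually chose for level 0 of the
   currently bound 2D texture, e.g. GL_RGBA8 or whatever it fell back to. */
GLint chosen_format = 0;
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0,
                         GL_TEXTURE_INTERNAL_FORMAT, &chosen_format);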

charliejay
04-18-2004, 02:04 AM
If your hardware will do fragment shaders, even if it's only ATI_TEXT_FRAGMENT_SHADER, and you have enough texture units, you might be able to achieve a paletted texture effect with just two textures and a parameter...

Radish
04-19-2004, 04:48 PM
Okay, nevermind, problem not solved. Even with glext.h and everything compiling fine I get "unresolved external symbol _glColorTable@24" when it tries to build. Goddammit!

ZbuffeR
04-21-2004, 12:24 AM
Are you compiling under Windows?

By the way, new entry points defined in glext.h should be initialized at runtime; see:
(above OpenGL 1.1 functionality, extensions)
http://opengl.org/resources/faq/getting_started.html

http://glew.sourceforge.net/
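
Under Windows, the manual version looks roughly like this (a sketch; GLEW wraps all of this up for you):

#include <windows.h>
#include <GL/gl.h>
#include <GL/glext.h>   /* provides PFNGLCOLORTABLEPROC */

static PFNGLCOLORTABLEPROC my_glColorTable = NULL;

/* Fetch the entry point after a GL context has been made current.
   wglGetProcAddress returns NULL if the driver doesn't export it
   (glColorTable belongs to the OpenGL 1.2 imaging subset / ARB_imaging). */
int load_glColorTable(void)
{
    my_glColorTable = (PFNGLCOLORTABLEPROC)wglGetProcAddress("glColorTable");
    return my_glColorTable != NULL;
}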