8-bit textures

Does using 256-colour textures increase rendering speed (maybe the cards are designed for them instead of 32-bit textures)? And how can I set the palette of the texture? I know how to get it, but I can't manage to give the palette to OpenGL. I saw a glGetPaletteEXT or something like that, but I don't know how to use extensions. Can anyone help me?
Thanks,
Antoche

You can only load RGB/RGBA formatted pictures with OpenGL.
Thus if you have an 8-bit picture with a palette, a pixel with value 13 has colour 13 in your palette; that entry gives you three values in the range 0…63 (probably), which you scale up to 0…255:
red   = pal[13].red   * 255 / 63;
green = pal[13].green * 255 / 63;
blue  = pal[13].blue  * 255 / 63;

Hope this helps, (and that I got it correct too)
John

[This message has been edited by Sjonny (edited 03-22-2000).]

Another good way to upload 8-bit textures is to expand the texture to 24 bits (RGB triplets) when you read the texture file.
You'll want to read the palette data from the file and store it in a lookup table ( char [256][3] ), then for each pixel read from the texture, use the lookup table to get the RGB values and store those in the texture buffer. Then just upload the texture buffer as usual to OpenGL.
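Something like this, as a minimal sketch (the function name, parameters and the GL_RGB8 internal format are just assumptions; it expects the palette already scaled to 0…255 and a texture object already bound):

#include <stdlib.h>
#include <GL/gl.h>

/* Expand an 8-bit paletted image to 24-bit RGB and upload it.
   'palette' holds 256 RGB triplets, 'indices' holds width*height pixel values. */
void upload_paletted(const unsigned char palette[256][3],
                     const unsigned char *indices, int width, int height)
{
    unsigned char *rgb = malloc((size_t)width * height * 3);

    for (int i = 0; i < width * height; ++i) {
        rgb[i*3 + 0] = palette[indices[i]][0];  /* red   */
        rgb[i*3 + 1] = palette[indices[i]][1];  /* green */
        rgb[i*3 + 2] = palette[indices[i]][2];  /* blue  */
    }

    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);      /* rows are tightly packed RGB bytes */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, width, height, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, rgb);
    free(rgb);
}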

So 8-bit textures are just a waste of time and color depth. I thought they were faster for 3D cards than 24bpp textures…

My perception was that 8-bit textures were good for saving memory and weren't that much slower to display, probably a negligible difference, because the colortable is loaded into the GPU's memory. So I guess it depends on how the card handles that.

There is some bad information above.

OpenGL does support 8-bit textures natively: you can use GL_ALPHA or GL_LUMINANCE for greyscale textures (masks or height maps, for example).
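For example (a sketch; 'data' is assumed to point to width*height bytes of greyscale values):

glPixelStorei(GL_UNPACK_ALIGNMENT, 1);   /* 8-bit rows aren't 4-byte aligned in general */
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE8, width, height, 0,
             GL_LUMINANCE, GL_UNSIGNED_BYTE, data);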

But you are asking about 8-bit paletted color textures. These are (inexplicably) not supported by native OpenGL. You can convert your 8-bit texture to 24-bit RGB or 32-bit RGBA when you upload it to the GPU during glTexImage2D() by specifying the appropriate format and internal format. Go read the man page.

Now, since they are a very useful idea, there are extensions for real paletted textures: GL_EXT_paletted_texture and GL_EXT_shared_texture_palette. However, they only seem to be implemented by nvidia, not ATI, so they aren't of much use in a real-world application.

You can also use glPixelMap for some on-the-fly palette conversion, but this is terribly slow. Or, on very recent cards, you can write a fragment program to do texture lookups with colorization.
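The glPixelMap path looks roughly like this (a sketch, assuming the palette has been split into per-channel float arrays in the 0.0..1.0 range and 'indices' holds the 8-bit pixels):

/* slow legacy path: let the pixel-transfer pipeline expand the indices */
GLfloat r[256], g[256], b[256];   /* palette converted to 0.0..1.0 floats */

glPixelMapfv(GL_PIXEL_MAP_I_TO_R, 256, r);
glPixelMapfv(GL_PIXEL_MAP_I_TO_G, 256, g);
glPixelMapfv(GL_PIXEL_MAP_I_TO_B, 256, b);

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, width, height, 0,
             GL_COLOR_INDEX, GL_UNSIGNED_BYTE, indices);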

I have an nvidia card - how do I use these extensions in my code?

As for every extension, read the official docs at the OpenGL Extension Registry:
http://oss.sgi.com/projects/ogl-sample/registry/EXT/paletted_texture.txt
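In rough outline it goes like this (a sketch only; you still need to check the extension string, and on Windows glColorTableEXT has to be fetched with wglGetProcAddress; the variable names are made up):

if (strstr((const char *)glGetString(GL_EXTENSIONS), "GL_EXT_paletted_texture")) {
    /* upload the 256-entry RGB palette for the currently bound texture */
    glColorTableEXT(GL_TEXTURE_2D, GL_RGB8, 256, GL_RGB, GL_UNSIGNED_BYTE, palette);

    /* upload the 8-bit indices themselves; they stay 8 bits per texel on the card */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_COLOR_INDEX8_EXT, width, height, 0,
                 GL_COLOR_INDEX, GL_UNSIGNED_BYTE, indices);
}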

8-bit textures used to be commonly supported with the palette extension to save bandwidth and memory, but this has been deprecated. DO NOT USE IT. Some of the best new hardware does not support it.

To save bandwidth and space today the best approach is to use the compressed texture formats supported in hardware.

Read this for info on how to do this:
http://developer.nvidia.com/attach/1506
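As a rough sketch of that approach (using the generic ARB_texture_compression internal formats; the S3TC-specific enums work the same way where that extension is present):

/* let the driver compress the RGB data on upload */
glTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RGB_ARB, width, height, 0,
             GL_RGB, GL_UNSIGNED_BYTE, rgb);

/* optionally verify that the driver really compressed it */
GLint compressed = GL_FALSE;
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_COMPRESSED_ARB, &compressed);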

You're forgetting another good reason for paletted textures… The age-old art of palette animation… That is… animating an image simply by changing the color of an entry in the palette.
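With the EXT_paletted_texture extension above, that boils down to rewriting palette entries each frame instead of re-uploading any texel data (a sketch; the entry index and new color are arbitrary placeholders):

/* palette animation: change one entry, then re-upload the (tiny) 256-entry palette */
palette[glow_index][0] = new_red;
palette[glow_index][1] = new_green;
palette[glow_index][2] = new_blue;

glColorTableEXT(GL_TEXTURE_2D, GL_RGB8, 256, GL_RGB, GL_UNSIGNED_BYTE, palette);
/* every texel that uses glow_index now shows the new color */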

I wasn't forgetting it; however, something that isn't supported in the drivers cannot be done on those cards.

There is a more important loss of functionality from this missing feature IMHO, and that's easily implementable post-LUT color interpolation for things like material classification in scientific visualization.

I think this is all a bit moot since the discussion implies performance is the motivation.

If it ever gets implemented on current hardware that doesn't support the feature, it will require a dependent texture read (or actually several of them for filtering) and be slower than the alternatives. And looking forward, it is likely that this feature will be implemented as a dependent texture read on all hardware, if it exists at all.
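That dependent-read emulation is essentially the fragment-program approach mentioned earlier; a minimal ARB_fragment_program sketch (the texture unit assignments are assumptions, and the palette texture should use GL_NEAREST filtering):

/* indices live in a GL_LUMINANCE8 texture on unit 0,
   the palette in a 256x1 RGB(A) texture on unit 1 */
static const char *palette_fp =
    "!!ARBfp1.0\n"
    "TEMP index;\n"
    "TEX index, fragment.texcoord[0], texture[0], 2D;\n"   /* fetch the 8-bit index       */
    "TEX result.color, index, texture[1], 1D;\n"           /* dependent read into palette */
    "END\n";

/* after glGenProgramsARB/glBindProgramARB and glEnable(GL_FRAGMENT_PROGRAM_ARB): */
glProgramStringARB(GL_FRAGMENT_PROGRAM_ARB, GL_PROGRAM_FORMAT_ASCII_ARB,
                   (GLsizei)strlen(palette_fp), palette_fp);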

Like I said, it’s deprecated, so don’t use it. If you do use it then you need a fallback path on new hardware.

OTOH, if you want to squeeze more texture data onto legacy cards, I suppose this is the way. Just be aware that the better way to do this on newer hardware is the compressed texture extensions I pointed at; texture compression is not that new and probably also exists on lots of older hardware.

[This message has been edited by dorbie (edited 12-29-2003).]

I was going to ask where you got your information about EXT_paletted_texture being deprecated, but then I found this thread on the advanced forums. It's too bad that they don't support it in the GeForceFX cards. I found it to be a fun little extension to play with.

Anyway, the point I was trying to get across before is just that performance isn’t the only reason you might want to use paletted textures. Granted, that is what the original post was intended for, but I was simply pointing out another reason someone might choose to use paletted textures. Oh well…

Oh man, Toadman is bummed

Could have had some cool effects