
Implementing a "palette" using pixel shader?



Robbo
08-07-2003, 07:04 AM
Hi there,

Is it possible to implement "palettes" for rendering a greyscale image with any set of colours you want, using pixel shaders?

I have a texture map that basically consists of luminosity data, and I want to define various user-selectable palettes to view the dataset with. I figured that if I were just able to upload a palette texture, upload the luminosity data and fire off a pixel shader program, I might be able to do this. But how? I'm new to pixel shaders.

Any hints would be much appreciated.

Won
08-07-2003, 07:38 AM
You can simply use a dependent texture read. You read the "luminosity" texture and use that value to index a 1-D (usually) palette texture. Something like:

!!ARBfp1.0

TEMP indexSample;

OUTPUT out = result.color;

# Fetch the luminosity value (the palette index) from texture unit 0
TEX indexSample, fragment.texcoord[0], texture[0], 2D;
# Dependent read: use that value as the coordinate into the 1-D palette
TEX out, indexSample, texture[1], 1D;

END;

Assuming luminosity is bound to texunit0 and palette to texunit1, and texcoords are coming from texcoord0.

The thing to be careful about is texture filtering. For palettized textures, you want filtering to happen after the palette lookup, not on the index values before it. As written, this example filters the index texture first (with whatever filter mode texture[0] has set), which might be OK depending on how you use your texture.
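The dependent lookup above amounts to using one texture's sample as the coordinate into another. Here is a CPU-side sketch of the same idea in Python (the arrays are hypothetical stand-ins for the two textures, not GL code):

```python
# CPU-side illustration of a dependent texture read (not GL code):
# the value sampled from the luminosity "texture" is used as the
# index into a 1-D palette "texture".

luminosity = [0, 64, 128, 255]                   # stand-in for texture[0] (index data)
palette = [(i, 0, 255 - i) for i in range(256)]  # stand-in for texture[1] (an RGB ramp)

def shade(texel):
    index = luminosity[texel]   # TEX indexSample, ..., texture[0], 2D;
    return palette[index]       # TEX out, indexSample, texture[1], 1D;

print(shade(2))  # luminosity value 128 selects palette entry 128
```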

-Won

Robbo
08-07-2003, 07:41 AM
Ok, but I'm having trouble understanding how dependent texture reads work - I thought you just had access to the currently interpolated texels for the currently bound textures. If I can do an indexed read from another texture unit, I guess I'm in luck. Thanks for the tip.

Won
08-07-2003, 08:03 AM
Note that the fragment program decouples texture units from texture coordinates. The TEX instruction forces you to specify the output register, the texture coordinate (which can be computed rather than passed in), and the indexing type. The filtering parameters are still set externally.

This means you can do things like sample the same texture multiple times without having to bind it to multiple units, which can be useful for custom filtering. You can also compute texture coordinates in the vertex or fragment program any way you see fit.
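As a rough illustration of that custom-filtering idea, here is a CPU-side Python sketch that "samples" the same hypothetical 1-D texture twice at computed coordinates and blends the results manually, the way a fragment program could with two TEX lookups into one texture:

```python
# CPU-side sketch: fetch the same 1-D "texture" twice at computed
# coordinates and blend manually - the kind of custom filtering a
# fragment program can do by reusing one texture in several lookups.

texture = [0.0, 10.0, 20.0, 30.0]   # hypothetical 1-D texture data

def sample_nearest(coord):
    # nearest-neighbour fetch, coord in texel units
    return texture[int(coord) % len(texture)]

def sample_linear(coord):
    # two fetches of the *same* texture, blended by the fractional part
    lo = sample_nearest(coord)
    hi = sample_nearest(coord + 1)
    frac = coord - int(coord)
    return lo * (1 - frac) + hi * frac

print(sample_linear(1.5))  # halfway between texel 1 and texel 2 -> 15.0
```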

-Won

Robbo
08-07-2003, 08:05 AM
Ok, I've got the hang of this now. So I can use, say, the texture bound to unit 1 even though my vertex format has only one set of texture coordinates - I'm using the value from the first texture just to look up into the second texture. I don't need a texture coord for the second texture, right?

Robbo
08-07-2003, 08:45 AM
It's beautiful and elegant - currently our software does a CPU palette lookup for each pixel in the image before copying the surface to a texture, and we're limited to 256 colours! Now I can announce that the new version has a different colour for each possible value (our luminance data is 16-bit) and 0% CPU usage ;)

Thanks for the help.

Won
08-07-2003, 09:26 AM
No prob. You wouldn't happen to be doing medical viz, would you?

-Won

Robbo
08-07-2003, 09:37 AM
Not medical, no - Infrared Thermography. It does have some medical uses, though.

Robbo
08-07-2003, 10:20 AM
I need to ask: for graphics cards that don't support a 16-bit luminance texture type, is it possible to somehow combine an RGB value into a single value for the dependent texture read? I.e. if we have 0x0F and 0xAC in the red and green components of the texture, can that index the palette texture with a value of 0x0FAC?
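One way to picture the trick the replies below suggest: split the 16-bit index into a high byte and a low byte, store them in the red and green channels, and use the pair as 2-D coordinates into a 256x256 palette texture. A CPU-side Python sketch (the palette entries are hypothetical, chosen so the round trip is easy to check):

```python
# CPU-side sketch: a 16-bit luminance index split across the red and
# green channels, then used as 2-D coordinates into a 256x256 palette.

def split_index(value16):
    # high byte -> red channel, low byte -> green channel
    return (value16 >> 8) & 0xFF, value16 & 0xFF

def palette_lookup(red, green):
    # hypothetical 256x256 palette: each entry encodes its own flat
    # index, so lookup simply reassembles the original 16-bit value
    return red * 256 + green

r, g = split_index(0x0FAC)
print(hex(r), hex(g))             # 0xf 0xac
print(hex(palette_lookup(r, g)))  # 0xfac
```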

vember
08-07-2003, 12:15 PM
robbo:
just fetch from the palette with a 2D lookup instead, e.g. in Cg: tex2D(blah, float2(i.r, i.g))

Robbo
08-07-2003, 12:17 PM
Yes, I've got it - just use a 2D texture for the palette.

Won
08-07-2003, 02:39 PM
Once again, you need to be careful about texture filtering, this time on the palette texture itself.

-Won

jwatte
08-08-2003, 02:37 AM
The easiest way to get filtering similar to that of paletted textures is to set the palette texture to GL_NEAREST filtering mode. Then you know you won't get weird smears or blends.
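A related gotcha with GL_NEAREST palette lookups: you want each index to land on the center of its texel, i.e. normalized coordinate (i + 0.5) / N for entry i, so rounding never picks a neighbouring entry. A small Python sketch of that mapping (the texture size and indices are hypothetical):

```python
# CPU-side sketch: map a palette index to the normalized texture
# coordinate that hits the center of that texel under GL_NEAREST.

def texel_center(index, size):
    return (index + 0.5) / size

def nearest_fetch(coord, size):
    # what GL_NEAREST does: pick the texel whose cell contains coord
    return min(int(coord * size), size - 1)

size = 256
for i in (0, 1, 255):
    coord = texel_center(i, size)
    assert nearest_fetch(coord, size) == i  # center always maps back

print(texel_center(0, 256))  # 0.001953125
```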

soda
08-12-2003, 08:31 AM
Originally posted by Robbo:

It's beautiful and elegant - currently our software does a CPU palette lookup for each pixel in the image before copying the surface to a texture, and we're limited to 256 colours! Now I can announce that the new version has a different colour for each possible value (our luminance data is 16-bit) and 0% CPU usage ;)

Thanks for the help.

Could you post some code showing how you got this trick working?
I'd like to implement the same thing, but I don't understand all the steps!

Thx