HELP! glColorTableEXT does not work for me

Hi everybody,

Well, I have a problem using a color palette with a texture.

I’m working with gl4java, a Java wrapper for OpenGL.

I’m trying to update the color palette of a texture dynamically. I read all the posts on this subject in this forum, but it does not work for me!

My texture is a raw RGB file; its size is 1024 * 1024.

Here is the code I’m using to create the texture:

//---------------------------------

gl.glEnable(GL_TEXTURE_2D);

colors = new byte[256*3];

for (int k=0; k<256 * 3; k+=3)
{
  colors[k]     = (byte) k;
  colors[k + 1] = (byte) k;
  colors[k + 2] = (byte) k;
}

loadTiles();

gl.glGenTextures(_nbTilesOneDim * _nbTilesOneDim, _texName);

for (int i=0; i < _nbTilesOneDim; i++)
  for (int j=0; j < _nbTilesOneDim; j++)
  {
    _buffer = (byte[]) _tilesArray[i][j];

    gl.glBindTexture(GL_TEXTURE_2D, _texName[i * _nbTilesOneDim + j]);
    gl.glColorTableEXT(GL_TEXTURE_2D, GL_RGB, 256, GL_RGB, GL_UNSIGNED_BYTE, colors);
    gl.glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);

    gl.glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    gl.glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);

    gl.glTexImage2D(GL_TEXTURE_2D,
                    0,
                    GL_COLOR_INDEX8_EXT,
                    _tileWidth,
                    _tileHeight,
                    0,
                    GL_COLOR_INDEX,
                    GL_UNSIGNED_BYTE,
                    _buffer);
  }

//--------------------------------------

Well, the color is changed, but the texture looks wrong: it seems the data are “wrong”.

When I display the texture using RGB mode it is OK!

Can someone help me?

thx

The computation of your palette looks weird. Is the overflow intended?

If you want a linear ramp, your code should look like this:

colors = new byte[256*3];

for (int k=0; k<256; k++)
{
  colors[k*3 + 0] = (byte) k;
  colors[k*3 + 1] = (byte) k;
  colors[k*3 + 2] = (byte) k;
}


Don’t forget to enable the color table:
glEnable(GL_COLOR_TABLE);
Everything else looks okay…

Well, thanks for your help, but it does not work ;(

I made the following test:

I drew 4 squares in my source image with a picture editor and looked at it in my app:

I see only 2 squares… I’m wondering if all the data from my image is really used by OpenGL!?

I’m certainly doing something wrong, but I don’t know what!

So, do you know what GL_COLOR_INDEX8_EXT means?

Any other ideas?

I found my error!!!

It seems that my source image has to be a one-channel image, not an RGB image!

As I only need one channel, it’s OK now!!!

I’d like to know if this is a “good answer”?

GL_COLOR_INDEX8_EXT is the internal texture format that is required for paletted textures.
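
For illustration, a minimal upload then looks roughly like this (a sketch only; width, height, texName, indexPixels and palette are placeholder names, and the image data must be one index byte per texel rather than RGB triplets):

// Sketch: one index byte per texel, each byte selecting one of the 256 palette entries.
byte[] indexPixels = new byte[width * height];   // placeholder index image
byte[] palette     = new byte[256 * 3];          // 256 RGB palette entries

gl.glBindTexture(GL_TEXTURE_2D, texName[0]);

// Attach the palette to the bound texture object.
gl.glColorTableEXT(GL_TEXTURE_2D, GL_RGB, 256, GL_RGB, GL_UNSIGNED_BYTE, palette);

// Upload the indices: internal format GL_COLOR_INDEX8_EXT, external format
// GL_COLOR_INDEX, one unsigned byte per texel.
gl.glTexImage2D(GL_TEXTURE_2D, 0, GL_COLOR_INDEX8_EXT, width, height, 0,
                GL_COLOR_INDEX, GL_UNSIGNED_BYTE, indexPixels);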

So now I have another problem:

I tested my app on WinXP with a GeForce2 and on Win2000 with an onboard graphics chip, the Intel 82845G (with the latest drivers for it).

It works with the GeForce2 but not with the Intel chipset: glColorTableEXT and glColorTable are not available!

This is a recent chip, so I don’t understand why it does not work.

Any idea?

Color tables are not supported any longer by recent cards (the GeForce FX also doesn’t support them). Sorry.

Originally posted by Zengar:
Color tables are not supported any longer by recent cards (the GeForce FX also doesn’t support them). Sorry.

>>> So what can I use to do the same thing?

You can use dependent reads out of a look-up texture (1D) using a luminance texture as input.

Originally posted by jwatte:
You can use dependent reads out of a look-up texture (1D) using a luminance texture as input.

Could you explain this a little bit more?
I don’t see what to use ;(

I read the specs for the Intel chipset at http://www.intel.com/support/graphics/intel845g/feature.htm , and if I understand what is written there, paletted textures are supported. So what’s wrong in my case?

Hmm, you seem to be right. Color tables really should be supported by Intel. Maybe they support other extensions (SGI)? Or try another driver.
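
A quick way to check at runtime is to query the extension string (a minimal sketch; it assumes the binding returns the string as a Java String and uses a simple substring test):

// Query the extension string and look for paletted-texture support.
String extensions = gl.glGetString(GL_EXTENSIONS);
boolean hasPalettedTextures =
    extensions != null && extensions.indexOf("GL_EXT_paletted_texture") >= 0;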

Originally posted by Zengar:
Hmm, you seem to be right. Color tables really should be supported by Intel. Maybe they support other extensions (SGI)? Or try another driver.

>>> I tested the supported extensions with a little tool, glview.exe, and it reports that GL_EXT_PALETTED is not supported.
I have the latest drivers, so I don’t know what to do.

I could test it on a PC with an i815 chipset, and there the extension exists.

So how can I handle paletted textures with hardware acceleration?

Have you tried older drivers?

Originally posted by Zengar:
Have you tried older drivers?

>>> Well, no, and I’d rather not.

I’d like my app to work with any driver!
Or at least with the latest.

Originally posted by jwatte:
You can use dependent reads out of a look-up texture (1D) using a luminance texture as input.

>>> Could you explain your idea a little bit more, please?

Thanks

Do a search on this forum. We already went into excruciating detail about three weeks ago.

In brief, you’d stick your palette in a GL_NEAREST 1D texture, and you’d then use a grayscale 8-bit texture to get a value out, which you pass in as the S coordinate of the 1D texture. This in effect implements palette look-up.
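
As a rough sketch of the setup (gl4java-style; paletteTex, indexTex, palette, indexPixels, width and height are placeholder names):

// Palette as a 256-texel 1D RGB texture; GL_NEAREST so entries are not blended.
gl.glBindTexture(GL_TEXTURE_1D, paletteTex[0]);
gl.glTexImage1D(GL_TEXTURE_1D, 0, GL_RGB, 256, 0,
                GL_RGB, GL_UNSIGNED_BYTE, palette);
gl.glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
gl.glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
gl.glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);

// The image itself as an 8-bit grayscale (luminance) 2D texture; each texel's
// value (0..1) is what gets passed on as the S coordinate of the 1D texture.
gl.glBindTexture(GL_TEXTURE_2D, indexTex[0]);
gl.glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE8, width, height, 0,
                GL_LUMINANCE, GL_UNSIGNED_BYTE, indexPixels);

Feeding the fetched luminance value back in as the 1D texture’s S coordinate is the dependent-read step; that part cannot be done with plain fixed-function texturing.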

Originally posted by soda:
>>> Well, no, and I’d rather not.

I’d like my app to work with any driver!
Or at least with the latest.

I don’t think that we’ve ever supported the paletted texture extension on the i845G and there aren’t any plans to. Quite frankly, the extension wasn’t very popular, and newer Intel chipsets (i855GM, i865G) don’t have hardware paletted texture support. From what I hear other vendors are dropping support for the extension as well. This means that your best bet is probably to find an alternative rendering technique.

That’s probably not the answer you’d like to hear, but I hope it clears things up.

– Ben

Originally posted by jwatte:
Do a search on this forum. We already went into excruciating detail about three weeks ago.

In brief, you’d stick your palette in a GL_NEAREST 1D texture, and you’d then use a grayscale 8-bit texture to get a value out, which you pass in as the S coordinate of the 1D texture. This in effect implements palette look-up.

=========================================

Well, I tried it without success ;(

Here is what I do:

(init)

_bufferFloat = new float[256 * 3];

for (int i = 0; i < 256 * 3; i++)
  _bufferFloat[i] = 0.5f;

gl.glGenTextures(1, _texName);
gl.glBindTexture(GL_TEXTURE_1D, _texName[0]);
gl.glTexImage1D(GL_TEXTURE_1D, 0, GL_RGB, 256, 0, GL_RGB, GL_FLOAT, _bufferFloat);
gl.glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);
gl.glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
gl.glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
gl.glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);

And here is the code in the display method:

float[] tf = new float[512];

// create a height map…
for (int l = 0; l < 512; l++)
  tf[l] = (float) Math.random();

gl.glEnable(GL_TEXTURE_1D);
gl.glBindTexture(GL_TEXTURE_1D, _texName[0]);
gl.glEnableClientState(GL_TEXTURE_COORD_ARRAY);
gl.glTexCoordPointer(1, GL_FLOAT, 0, tf);

gl.glBegin(GL_QUADS);
gl.glTexCoord2f(0.0f, 0.0f); gl.glVertex2f(-512.0f, -512.0f);
gl.glTexCoord2f(0.0f, 1.0f); gl.glVertex2f(-512.0f,  512.0f);
gl.glTexCoord2f(1.0f, 1.0f); gl.glVertex2f( 512.0f,  512.0f);
gl.glTexCoord2f(1.0f, 0.0f); gl.glVertex2f( 512.0f, -512.0f);
gl.glEnd();

gl.glDisableClientState(GL_TEXTURE_COORD_ARRAY);

gl.glDisable(GL_TEXTURE_1D);

I see the result of the 1D texture on the quad, but not the data from my height map!

Can someone help me?

How is your code intended to forward the output of the 1D look-up into the texture coordinates of the 2D texture?

Hint: ARB_fragment_program, the nVIDIA GeForce3 texture shaders, or the ATI Radeon 8500 shaders are necessary.
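
For illustration, a minimal ARB_fragment_program version of the look-up might look like the sketch below. Whether gl4java wraps the glGenProgramsARB / glBindProgramARB / glProgramStringARB entry points (and with which parameter types) is an assumption here; the calls themselves are the standard ARB_fragment_program API.

// Texture unit 0: the 8-bit luminance index image (2D).
// Texture unit 1: the 256-entry palette (1D, GL_NEAREST).
String fp =
    "!!ARBfp1.0\n" +
    "TEMP index;\n" +
    "TEX index, fragment.texcoord[0], texture[0], 2D;\n" +  // fetch grey value (0..1)
    "TEX result.color, index, texture[1], 1D;\n" +          // use it as S into the palette
    "END\n";

int[] prog = new int[1];
gl.glGenProgramsARB(1, prog);
gl.glBindProgramARB(GL_FRAGMENT_PROGRAM_ARB, prog[0]);
gl.glProgramStringARB(GL_FRAGMENT_PROGRAM_ARB, GL_PROGRAM_FORMAT_ASCII_ARB,
                      fp.length(), fp);
gl.glEnable(GL_FRAGMENT_PROGRAM_ARB);

With that bound, the quad only needs ordinary 2D texture coordinates; the palette look-up happens per fragment.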