HELP! glColorTableEXT does not work for me



soda
05-24-2003, 04:26 AM
Hi everybody,

Well, I have a problem using a color palette with a texture.

I'm working with gl4java, a Java wrapper for OpenGL.

I'm trying to update the color palette for a texture dynamically. I read all the posts on this subject in this forum, but it does not work for me!!!

My texture is a raw RGB file; its size is 1024 * 1024.

Here is the code I'm using to create the texture:

//---------------------------------

gl.glEnable(GL_TEXTURE_2D);

colors = new byte[256 * 3];

for (int k = 0; k < 256 * 3; k += 3)
{
    colors[k]     = (byte) k;
    colors[k + 1] = (byte) k;
    colors[k + 2] = (byte) k;
}

loadTiles();

gl.glGenTextures(_nbTilesOneDim * _nbTilesOneDim, _texName);

for (int i = 0; i < _nbTilesOneDim; i++)
    for (int j = 0; j < _nbTilesOneDim; j++)
    {
        _buffer = (byte[]) _tilesArray[i][j];

        gl.glBindTexture(GL_TEXTURE_2D, _texName[i * _nbTilesOneDim + j]);
        gl.glColorTableEXT(GL_TEXTURE_2D, GL_RGB, 256, GL_RGB, GL_UNSIGNED_BYTE, colors);
        gl.glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);

        gl.glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
        gl.glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);

        gl.glTexImage2D(GL_TEXTURE_2D,
                        0,
                        GL_COLOR_INDEX8_EXT,
                        _tileWidth,
                        _tileHeight,
                        0,
                        GL_COLOR_INDEX,
                        GL_UNSIGNED_BYTE,
                        _buffer);
    }

//--------------------------------------

Well, the colors change, but the texture looks wrong: it seems the data are "wrong".

When I display the texture using RGB mode, it is OK!

Can someone help me?

thx

KlausE
05-24-2003, 06:02 AM
The computation of your palette looks weird. Is the overflow intended? (k runs up to 765, so the cast to byte wraps around three times.)

If you want a linear ramp, your code should look like this:
colors = new byte[256*3];

for (int k = 0; k < 256; k++)
{
    colors[k*3 + 0] = (byte) k;
    colors[k*3 + 1] = (byte) k;
    colors[k*3 + 2] = (byte) k;
}

[This message has been edited by KlausE (edited 05-24-2003).]

shinpaughp
05-24-2003, 06:02 PM
Don't forget to enable the color table:
glEnable(GL_COLOR_TABLE);
Everything else looks okay...

soda
05-25-2003, 12:34 PM
Well, thx for your help, but it does not work ;(

I made the following test:

I drew 4 squares in my source image with a picture editor and looked at it in my app:

I see only 2 squares... I'm wondering if all the data from my image is really used by OpenGL!?

I'm certainly doing something wrong, but I don't know what!

So, do you know what GL_COLOR_INDEX8_EXT means?

Other ideas?

soda
05-25-2003, 01:05 PM
I found my error!!!

It seems that my source image has to be a one-channel image, not an RGB image!

As I need only one channel, it's OK now!!!

I'd like to know if this is the "right" answer?

KlausE
05-25-2003, 01:19 PM
GL_COLOR_INDEX8_EXT is the internal texture format that is required for paletted textures.
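
In other words, with the extension present, the upload pairs GL_COLOR_INDEX8_EXT with a one-byte-per-texel GL_COLOR_INDEX source. Something like this (a sketch; "palette", "indices", w and h are placeholders):

glColorTableEXT(GL_TEXTURE_2D, GL_RGB, 256,
                GL_RGB, GL_UNSIGNED_BYTE, palette);      /* 256*3 bytes */
glTexImage2D(GL_TEXTURE_2D, 0, GL_COLOR_INDEX8_EXT, w, h, 0,
             GL_COLOR_INDEX, GL_UNSIGNED_BYTE, indices); /* 1 byte per texel */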

soda
05-26-2003, 12:51 AM
So now I have another problem:

I tested my app on WinXP with a GeForce2, and on Win2000 with an on-board graphics chip: the Intel 82845G (with the latest drivers for it).

It works with the GeForce2 but not with the Intel chipset: glColorTableEXT and glColorTable are not available!!!

This is a recent chip, so I don't understand why it does not work.

Any idea?

Zengar
05-26-2003, 07:20 AM
Color tables are no longer supported by recent cards (the GeForce FX doesn't support them either). Sorry.

soda
05-26-2003, 08:49 AM
Originally posted by Zengar:
Color tables are no longer supported by recent cards (the GeForce FX doesn't support them either). Sorry.

>>> So what can I use to do the same thing?

jwatte
05-26-2003, 08:57 AM
You can use dependent reads out of a look-up texture (1D) using a luminance texture as input.

soda
05-26-2003, 10:11 AM
Originally posted by jwatte:
You can use dependent reads out of a look-up texture (1D) using a luminance texture as input.

Could you explain this a little bit more?
I don't see what to use ;(

I read the specs for the Intel chipset at http://www.intel.com/support/graphics/intel845g/feature.htm and, if I understand what is written there, paletted textures are supported. So what's wrong in my case?

Zengar
05-26-2003, 02:57 PM
Hmm, you seem to be right. Color tables really should be supported by Intel. Maybe they support other extensions (SGI)? Or try another driver.

soda
05-27-2003, 02:47 AM
Originally posted by Zengar:
Hmm, you seem to be right. Color tables really should be supported by Intel. Maybe they support other extensions (SGI)? Or try another driver.

>>> I tested the supported extensions with a little tool, glview.exe, and it reports that GL_EXT_paletted_texture is not supported.
I have the latest drivers, so I don't know what to do.

I was able to test it on a PC with an i815 chipset, and the extension exists there.

So how can I handle paletted textures with hardware acceleration?

Zengar
05-27-2003, 04:28 PM
Have you tried older drivers?

soda
05-28-2003, 04:45 AM
Originally posted by Zengar:
Have you tried older drivers?


>>> Well, no, and I'd rather not.

I'd like my app to work with any driver!
Or at least with the latest.

soda
05-28-2003, 03:22 PM
Originally posted by jwatte:
You can use dependent reads out of a look-up texture (1D) using a luminance texture as input.

>>> Could you explain your idea a little bit more, please?

Thx

jwatte
05-28-2003, 06:14 PM
Do a search on this forum. We already went into excruciating detail about three weeks ago.

In brief, you'd stick your palette in a GL_NEAREST 1D texture, and you'd then use a grayscale 8-bit texture to get a value out, which you pass in as the S coordinate of the 1D texture. This in effect implements palette look-up.
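
To make that concrete, the setup might look something like this (an untested C sketch; "palette", "indices", width and height are placeholders, and you still need a texture shader or fragment program to do the dependent read itself):

GLuint tex[2];
glGenTextures(2, tex);

/* palette: 256 RGB entries in a GL_NEAREST 1D texture */
glBindTexture(GL_TEXTURE_1D, tex[0]);
glTexImage1D(GL_TEXTURE_1D, 0, GL_RGB8, 256, 0,
             GL_RGB, GL_UNSIGNED_BYTE, palette);
glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);

/* indices: the 8-bit grayscale image */
glBindTexture(GL_TEXTURE_2D, tex[1]);
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE8, width, height, 0,
             GL_LUMINANCE, GL_UNSIGNED_BYTE, indices);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

Changing the palette is then just a glTexSubImage1D of 256 texels instead of re-uploading the whole image.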

bashbaug
05-28-2003, 09:28 PM
Originally posted by soda:

>>> Well, no, and I'd rather not.

I'd like my app to work with any driver!
Or at least with the latest.

I don't think that we've ever supported the paletted texture extension on the i845G and there aren't any plans to. Quite frankly, the extension wasn't very popular, and newer Intel chipsets (i855GM, i865G) don't have hardware paletted texture support. From what I hear other vendors are dropping support for the extension as well. This means that your best bet is probably to find an alternative rendering technique.

That's probably not the answer you'd like to hear, but I hope it clears things up.

-- Ben

soda
06-02-2003, 12:17 PM
Originally posted by jwatte:
Do a search on this forum. We already went into excruciating detail about three weeks ago.

In brief, you'd stick your palette in a GL_NEAREST 1D texture, and you'd then use a grayscale 8-bit texture to get a value out, which you pass in as the S coordinate of the 1D texture. This in effect implements palette look-up.

=========================================

Well, I tried it without success ;(

Here is what I do:

(init)

_bufferFloat = new float[256 * 3];

for (int i = 0; i < 256 * 3; i++)
    _bufferFloat[i] = 0.5f;

gl.glGenTextures(1, _texName);
gl.glBindTexture(GL_TEXTURE_1D, _texName[0]);
gl.glTexImage1D(GL_TEXTURE_1D, 0, GL_RGB, 256, 0, GL_RGB, GL_FLOAT, _bufferFloat);
gl.glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);
gl.glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
gl.glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
gl.glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);

And here is the code in the display method:

float[] tf = new float[512];

// create a height map...
for (int l = 0; l < 512; l++)
    tf[l] = (float) Math.random();

gl.glEnable(GL_TEXTURE_1D);
gl.glBindTexture(GL_TEXTURE_1D, _texName[0]);
gl.glEnableClientState(GL_TEXTURE_COORD_ARRAY);
gl.glTexCoordPointer(1, GL_FLOAT, 0, tf);

gl.glBegin(GL_QUADS);
gl.glTexCoord2f(0.0f, 0.0f); gl.glVertex2f(-512.0f, -512.0f);
gl.glTexCoord2f(0.0f, 1.0f); gl.glVertex2f(-512.0f,  512.0f);
gl.glTexCoord2f(1.0f, 1.0f); gl.glVertex2f( 512.0f,  512.0f);
gl.glTexCoord2f(1.0f, 0.0f); gl.glVertex2f( 512.0f, -512.0f);
gl.glEnd();

gl.glDisableClientState(GL_TEXTURE_COORD_ARRAY);

gl.glDisable(GL_TEXTURE_1D);

I see the result of the 1D texture on the quad, but no data from my height map!

Can someone help me?

jwatte
06-02-2003, 08:44 PM
How is your code intended to forward the output of the 1D look-up into the texture coordinates of the 2D texture?

Hint: ARB_fragment_program, or the nVIDIA GeForce3 texture shaders, or the ATI Radeon 8500 shaders, are necessary.

soda
06-03-2003, 04:09 AM
Originally posted by jwatte:
How is your code intended to forward the output of the 1D look-up into the texture coordinates of the 2D texture?

Hint: ARB_fragment_program, or the nVIDIA GeForce3 texture shaders, or the ATI Radeon 8500 shaders, are necessary.

>>> Well, my 2D texture is a heightmap, and I'd like to use a color table to change the pixel colors quickly.

So my 1D texture is my color table and my 2D texture is the heightmap.

But you wrote that I should use an NVIDIA or ATI extension... and I'd like the trick to work on most recent graphics cards.

So is this method only usable with NVIDIA or ATI graphics cards?

Thx for your help !

[This message has been edited by soda (edited 06-03-2003).]

Mihail121
06-03-2003, 11:26 AM
The question about paletted textures has been widely discussed here, and I think indexed images have proved to be space-friendly all the way since DOS. If NVIDIA wants to manipulate users and teach them what's good and what's not just because UT2K3 doesn't use EXT_paletted_texture, that won't work here! There is always ATI to turn to. I've noticed Mark Kilgard often comes to this forum, so I want to invite him to give his opinion on paletted textures and to say (if he's allowed and knows the answer, of course) what NV's future plans are for indexed textures. 10x!

Sancho
06-03-2003, 10:46 PM
Originally posted by Mihail121:
The question about paletted textures has been widely discussed here, and I think indexed images have proved to be space-friendly all the way since DOS. If NVIDIA wants to manipulate users and teach them what's good and what's not just because UT2K3 doesn't use EXT_paletted_texture, that won't work here! There is always ATI to turn to. I've noticed Mark Kilgard often comes to this forum, so I want to invite him to give his opinion on paletted textures and to say (if he's allowed and knows the answer, of course) what NV's future plans are for indexed textures. 10x!

You are completely right!

soda
06-04-2003, 09:26 AM
Well, how can I do it then?

At this time I use RGB textures, update the color of each pixel on the fly, and copy the new image into my texture...

And it's a little bit slow, even on a P4 @ 2GHz!

Sancho
06-04-2003, 10:09 PM
Originally posted by soda:
Well, how can I do it then?

At this time I use RGB textures, update the color of each pixel on the fly, and copy the new image into my texture...

And it's a little bit slow, even on a P4 @ 2GHz!

Perhaps with 2 texture units and a vertex program. The first unit will contain the color table (as a 1D or 2D texture) and the second unit will contain the indices (as 2D or 3D texture in luminance format).

David

soda
06-07-2003, 08:21 AM
Well, could someone help me a little bit more? I'm trying to use a GL_TEXTURE_1D for my color palette, but even with all the advice from this topic I'm still lost!

Is there a sample available?

Thx

m2
06-07-2003, 11:03 AM
This is not exactly what you asked for, but almost. It uses the shared texture palette extension. Look it up in the OpenGL extension registry.




void apply_palette(GLubyte data[4][256])
{
    GLubyte Table[4*256];

    for (int i = 0; i < 256; ++i)
    {
        Table[i*4 + 0] = data[0][i];
        Table[i*4 + 1] = data[1][i];
        Table[i*4 + 2] = data[2][i];
        Table[i*4 + 3] = data[3][i];
    }

    glColorTable(GL_SHARED_TEXTURE_PALETTE_EXT,
                 GL_RGBA8,
                 256,
                 GL_RGBA,
                 GL_UNSIGNED_BYTE,
                 (const GLvoid *) Table);
    glEnable(GL_SHARED_TEXTURE_PALETTE_EXT);
}
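
One nice property of the shared palette: every texture uploaded with GL_COLOR_INDEX8_EXT uses this single table, so a palette animation is just another call to apply_palette(). Note, though, that this needs GL_EXT_shared_texture_palette, and a chipset that lacks EXT_paletted_texture will almost certainly not expose it either.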


[This message has been edited by m2 (edited 06-07-2003).]

soda
06-13-2003, 12:27 PM
Originally posted by m2:
This is not exactly what you asked for, but almost. It uses the shared texture palette extension. Look it up in the OpenGL extension registry.




void apply_palette(GLubyte data[4][256])
{
    GLubyte Table[4*256];

    for (int i = 0; i < 256; ++i)
    {
        Table[i*4 + 0] = data[0][i];
        Table[i*4 + 1] = data[1][i];
        Table[i*4 + 2] = data[2][i];
        Table[i*4 + 3] = data[3][i];
    }

    glColorTable(GL_SHARED_TEXTURE_PALETTE_EXT,
                 GL_RGBA8,
                 256,
                 GL_RGBA,
                 GL_UNSIGNED_BYTE,
                 (const GLvoid *) Table);
    glEnable(GL_SHARED_TEXTURE_PALETTE_EXT);
}


[This message has been edited by m2 (edited 06-07-2003).]

==========================

Thx for your trick, but glColorTable is not supported by the Intel chipset ;(

I don't know what to do ;(((

KlausE
06-13-2003, 01:57 PM
Believe us ... there is no platform-independent way to implement hardware-accelerated palettes.

Use a dependent texture fetch to look up into your palette. All reasonable current and future graphics chips provide, and will provide, dependent texture fetches.

soda
06-13-2003, 03:18 PM
Originally posted by KlausE:
Believe us ... there is no platform-independent way to implement hardware-accelerated palettes.

Use a dependent texture fetch to look up into your palette. All reasonable current and future graphics chips provide, and will provide, dependent texture fetches.

========================================

OK, but I'd like to understand some things:
- How can I use a 1D texture together with a 2D texture to do the job? (Sample code would be cool.)
- How have other people who posted similar questions solved this problem?

Thx

soda
06-13-2003, 03:28 PM
Originally posted by jwatte:
Do a search on this forum. We already went into excruciating detail about three weeks ago.

In brief, you'd stick your palette in a GL_NEAREST 1D texture, and you'd then use a grayscale 8-bit texture to get a value out, which you pass in as the S coordinate of the 1D texture. This in effect implements palette look-up.

>>> I searched the forum for "palette texture" and "1D texture" and found nothing useful (I read all the posts!).
I saw how a 1D texture can be used as a lookup table with texture shaders, but it was applied to triangles. So how do I get from the 2D texture's values to the S coordinate of the 1D texture?

soda
06-21-2003, 01:58 AM
up

Mihail121
06-21-2003, 02:59 AM
Hey you, the Soda guy! Better have a soda, because what you've just done isn't very fair to us. If there are no more replies to your post, that means people don't care about it and it should go into history! Man, this place really needs a moderator......

zeckensack
06-21-2003, 03:21 AM
Well, while it's up ...
It puzzles me that this thread has 32 replies but so far nobody has asked this question:
soda,
Why do you want to use paletted textures? If you told us what exactly you need, it would be much easier to recommend a proper replacement.

Take your pick:
a) Paletted textures save texture memory, and you need a whole lot of that, so you really must make savings somewhere.
b) You want to do palette tricks for animation.
c) Paletted textures save texture bandwidth, and you want it to be fast.


What is it?

soda
06-23-2003, 11:03 AM
Originally posted by Mihail121:
Hey you, the Soda guy! Better have a soda, because what you've just done isn't very fair to us. If there are no more replies to your post, that means people don't care about it and it should go into history! Man, this place really needs a moderator......

Well, sorry to be so painful, but I'd like to learn OpenGL and it is not as easy as it seems!

I read all the docs I found, I bought a book, I tried lots of things without success ;)
Well, I would like to get some explicit tricks; I'm not a guru!!!!

You are not fair writing that people don't care about this problem; if you search this forum you can see that lots of people want to do the same thing!!!

However, this is exactly what I want to do:

I have some data stored in an array (A1, 16 bits).

I display these data as a texture, using some rules.

Here is some pseudocode:

for each element i in A1 do
    if (A1[i] < xyz)
    {
        t[i*3]     = a;  // red
        t[i*3 + 1] = b;  // green
        t[i*3 + 2] = c;  // blue
    }
    else if (A1[i] < zyx)
    {
        t[i*3]     = a2; // red
        t[i*3 + 1] = b2; // green
        t[i*3 + 2] = c2; // blue
    }
    ....

Well, you can see that if A1 is big, my texture is big too.

So I cut the big texture into small textures and update them each frame.

So the problem is that it is slow when I have more than 1024 * 1024 pixels to update.

That's why I thought of using a paletted texture as a first approach.
But as new gfx cards do not support this extension, I'm searching for a new and fast way to do the job!!

So if really nobody can give me advice, I'll stop updating this post ;(

Thx to all the people who have given me advice!

Korval
06-23-2003, 12:07 PM
Well, sorry to be so painful, but I'd like to learn OpenGL and it is not as easy as it seems!

The point he was making is that you shouldn't bump threads because people aren't responding. If they aren't responding, you should generally accept it as people's unwillingness to help you, or (considering the prior responses) as a sign that they have helped as far as they would like to.

[This message has been edited by Korval (edited 06-23-2003).]

m2
06-23-2003, 11:38 PM
Originally posted by soda:

OK, but I'd like to understand some things:
- How can I use a 1D texture together with a 2D texture to do the job? (Sample code would be cool.)
- How have other people who posted similar questions solved this problem?


Either what I posted before (with "slight" modifications if the shared texture palette extension is not available), or use dependent texture lookups like KlausE suggested. That means you use either the texture shader extension on older NVIDIA cards (GeForce 3 and 4, IIRC) or ARB fragment programs. If you use the texture shader extension, beware: that thing is rather fscked up in some respects.

Example code:




glActiveTexture(GL_TEXTURE0_ARB);
glBindTexture(GL_TEXTURE_2D, name);
glTexImage2D(GL_TEXTURE_2D, 0,
             GL_INTENSITY, tx, ty, 0,
             GL_LUMINANCE, GL_UNSIGNED_BYTE,
             data);
glActiveTexture(GL_TEXTURE1_ARB);
glBindTexture(GL_TEXTURE_2D, palette);
glTexImage2D(GL_TEXTURE_2D, 0,
             GL_RGBA, dx, dy, 0,
             GL_RGBA, GL_UNSIGNED_BYTE,
             table);
glTexEnvi(GL_TEXTURE_SHADER_NV,
          GL_SHADER_OPERATION_NV,
          GL_DEPENDENT_AR_TEXTURE_2D_NV);
glTexEnvi(GL_TEXTURE_SHADER_NV,
          GL_PREVIOUS_TEXTURE_INPUT_NV,
          GL_TEXTURE0_ARB);
glTexEnvi(GL_TEXTURE_ENV,
          GL_TEXTURE_ENV_MODE,
          GL_REPLACE);


"table" is a N x 2 texture with table[N*4*0+i] = table[N*4*1+i]. For fragment programs just replace the TexEnv stuff.

Not tested.
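
For reference, an ARB_fragment_program version of the same lookup might look roughly like this (also untested; it assumes the index texture is on unit 0, the palette is a 1D texture on unit 1, and the ARB entry points have been resolved; needs <string.h> for strlen):

static const char *fp =
    "!!ARBfp1.0\n"
    "TEMP index;\n"
    "TEX index, fragment.texcoord[0], texture[0], 2D;\n" /* fetch the 8-bit index */
    "TEX result.color, index, texture[1], 1D;\n"         /* use it as S into the palette */
    "END\n";

GLuint prog;
glGenProgramsARB(1, &prog);
glBindProgramARB(GL_FRAGMENT_PROGRAM_ARB, prog);
glProgramStringARB(GL_FRAGMENT_PROGRAM_ARB, GL_PROGRAM_FORMAT_ASCII_ARB,
                   (GLsizei) strlen(fp), fp);
glEnable(GL_FRAGMENT_PROGRAM_ARB);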