
glColorTable works for 8bit indexed texture but not 16 bit??



analogic
08-02-2004, 08:59 AM
When I do the following for an 8-bit indexed texture lookup table, it works fine. However, when I change it to 16 bits, the texture does not display correctly; all I get is a black screen. Here is my 16-bit code. What am I doing wrong? Thanks.

// 65536-entry RGB palette, 3 bytes per entry
unsigned char colormap[65536*3];

// upload the shared palette: 65536 RGB entries, 8 bits per component
glColorTable(GL_SHARED_TEXTURE_PALETTE_EXT, GL_RGB, 65536, GL_RGB, GL_UNSIGNED_BYTE, colormap);

glEnable(GL_SHARED_TEXTURE_PALETTE_EXT);

// texture of 16-bit color indices that should be resolved through the palette
glTexImage2D(GL_TEXTURE_2D, 0, GL_COLOR_INDEX16_EXT, x, y, 0, GL_COLOR_INDEX, GL_UNSIGNED_SHORT, index);

Klaus
08-03-2004, 06:44 AM
NVidia cards only support 8-bit paletted textures. For larger palettes you have to use dependent texture operations, which are available on GeForce3-6, ATI R2xx-4xx, 3Dlabs Wildcat, etc.
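
One way to implement the dependent lookup is sketched below (a rough, untested sketch, not from the original posts): split each 16-bit index into two 8-bit channels and upload them as a GL_LUMINANCE_ALPHA texture, store the 65536-entry palette as a 256x256 RGB texture, and recombine the two channels in a fragment shader. This version uses GLSL, so it needs ARB_fragment_shader support (GeForce FX/6, Radeon 9500 and up); GeForce3/4-class hardware would have to do the same thing through NV_texture_shader. All names here (indexTex, paletteTex, indices16, colormap, width, height) are placeholders, and the shader compile/link and glUniform1i calls for the samplers are omitted.

/* Sketch of emulating a 16-bit palette with a dependent texture lookup.
   Requires <stdlib.h> for malloc; indices16 (unsigned short*), colormap,
   width, height, indexTex and paletteTex (GLuint texture objects) are
   assumed to exist already. */

/* split each 16-bit index: low byte -> luminance, high byte -> alpha */
unsigned char *la = (unsigned char *) malloc(width * height * 2);
for (int i = 0; i < width * height; ++i) {
    la[2*i + 0] = indices16[i] & 0xFF;          /* low byte  */
    la[2*i + 1] = (indices16[i] >> 8) & 0xFF;   /* high byte */
}

glBindTexture(GL_TEXTURE_2D, indexTex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE8_ALPHA8, width, height, 0,
             GL_LUMINANCE_ALPHA, GL_UNSIGNED_BYTE, la);

/* the 65536-entry RGB palette laid out as a 256x256 texture */
glBindTexture(GL_TEXTURE_2D, paletteTex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, 256, 256, 0,
             GL_RGB, GL_UNSIGNED_BYTE, colormap);

/* fragment shader performing the dependent lookup; compile and link it,
   then set the samplers to the units the two textures are bound to */
const char *fs =
    "uniform sampler2D indexTex;   /* luminance = low byte, alpha = high byte */\n"
    "uniform sampler2D paletteTex; /* 256x256 RGB palette */\n"
    "void main() {\n"
    "    vec4 idx = texture2D(indexTex, gl_TexCoord[0].st);\n"
    "    /* map the two bytes to texel centers of the 256x256 palette */\n"
    "    vec2 coord = (vec2(idx.r, idx.a) * 255.0 + 0.5) / 256.0;\n"
    "    gl_FragColor = texture2D(paletteTex, coord);\n"
    "}\n";

Note that both textures use GL_NEAREST filtering; linear filtering would interpolate the index values themselves and blend unrelated palette entries.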

- Klaus

soda
08-18-2004, 09:40 AM
Originally posted by Klaus:
NVidia cards only support 8-bit paletted textures. For larger palettes you have to use dependent texture operations, which are available on GeForce3-6, ATI R2xx-4xx, 3Dlabs Wildcat, etc.
- Klaus

Is there a way to do something compatible with all cards? Some time ago I used color tables for my textures, but since they are no longer supported, I'm wondering how we should do this now.