glColorTable works for 8bit but not 16 bit??



analogic
08-02-2004, 08:53 AM
When I do the following for an 8-bit indexed texture lookup table, it works fine. However, when I change it to 16 bits, the texture does not display correctly; all I get is a black screen. Here is my code for 16-bit. What am I doing wrong? Thanks.

/* 65536-entry RGB palette, 3 bytes per entry */
unsigned char colormap[65536*3];

/* upload the shared palette and enable it */
glColorTable(GL_SHARED_TEXTURE_PALETTE_EXT, GL_RGB, 65536, GL_RGB, GL_UNSIGNED_BYTE, colormap);
glEnable(GL_SHARED_TEXTURE_PALETTE_EXT);

/* upload the texture as 16-bit indices into that palette */
glTexImage2D(GL_TEXTURE_2D, 0, GL_COLOR_INDEX16_EXT, x, y, 0, GL_COLOR_INDEX, GL_UNSIGNED_SHORT, index);

Relic
08-02-2004, 09:04 AM
Check glGetError().
The most likely explanation is that the hardware or implementation does not support 16-bit color index tables, and the glColorTable call sets a GL error when the width parameter exceeds the hardware limit.
With all the 8-bits-per-channel hardware around, 256 is likely to be that limit.
Newer hardware doesn't even support this extension anymore.
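Something along these lines will show what the driver actually kept. It's only a sketch, and it assumes the glGetColorTableParameterivEXT entry point and the enums from GL_EXT_paletted_texture are already resolved (glext.h plus wglGetProcAddress, or an extension loader):

#include <stdio.h>
#include <GL/gl.h>

/* Call this immediately after glColorTable(GL_SHARED_TEXTURE_PALETTE_EXT, ...). */
void check_palette_upload(void)
{
    GLenum err = glGetError();
    if (err == GL_TABLE_TOO_LARGE_EXT)
        printf("palette rejected: 65536 entries exceed the hardware limit\n");
    else if (err != GL_NO_ERROR)
        printf("palette upload set GL error 0x%04X\n", (unsigned)err);

    /* Ask how wide the stored table really is and what format it was kept in. */
    GLint width = 0, format = 0;
    glGetColorTableParameterivEXT(GL_SHARED_TEXTURE_PALETTE_EXT, GL_COLOR_TABLE_WIDTH_EXT, &width);
    glGetColorTableParameterivEXT(GL_SHARED_TEXTURE_PALETTE_EXT, GL_COLOR_TABLE_FORMAT_EXT, &format);
    printf("stored palette: width=%d, internal format=0x%04X\n", width, (unsigned)format);
}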

analogic
08-02-2004, 09:48 AM
glGetError() did not return an error. Is my code correct if I want to use 16-bit paletted textures?
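To rule out checking in the wrong place, the error queue can be drained right after each call, since a single glGetError() later in the frame can miss the failing call. A minimal sketch, assuming the extension enums and entry points are resolved the same way as in the code above:

#include <stdio.h>
#include <string.h>
#include <GL/gl.h>

/* Print and clear every pending GL error, tagged with the call it follows. */
static void check_gl(const char *where)
{
    GLenum err;
    while ((err = glGetError()) != GL_NO_ERROR)
        printf("%s: GL error 0x%04X\n", where, (unsigned)err);
}

/* Naive substring test; good enough for a quick sanity check. */
static int has_extension(const char *name)
{
    return strstr((const char *)glGetString(GL_EXTENSIONS), name) != NULL;
}

/* Re-run the calls from the first post with an error check after each one
 * (colormap, x, y, index as in the first post). */
static void upload_with_checks(const unsigned char *colormap, int x, int y, const unsigned short *index)
{
    if (!has_extension("GL_EXT_shared_texture_palette"))
        printf("shared texture palettes are not supported at all\n");

    glColorTable(GL_SHARED_TEXTURE_PALETTE_EXT, GL_RGB, 65536, GL_RGB, GL_UNSIGNED_BYTE, colormap);
    check_gl("glColorTable");

    glEnable(GL_SHARED_TEXTURE_PALETTE_EXT);
    check_gl("glEnable");

    glTexImage2D(GL_TEXTURE_2D, 0, GL_COLOR_INDEX16_EXT, x, y, 0, GL_COLOR_INDEX, GL_UNSIGNED_SHORT, index);
    check_gl("glTexImage2D");
}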