glColorTable works for 8-bit indexed texture but not 16-bit??

When I do the following with an 8-bit indexed texture lookup table, it works fine. However, when I change it to 16 bits, the texture does not display correctly; all I get is a black screen. Here is my code for the 16-bit case. What am I doing wrong? Thanks.

unsigned char colormap[65536*3];   /* 65536 RGB palette entries */

/* upload the shared palette */
glColorTable(GL_SHARED_TEXTURE_PALETTE_EXT, GL_RGB, 65536, GL_RGB, GL_UNSIGNED_BYTE, colormap);

glEnable(GL_SHARED_TEXTURE_PALETTE_EXT);

/* x, y are the image dimensions; index holds the 16-bit index data */
glTexImage2D(GL_TEXTURE_2D, 0, GL_COLOR_INDEX16_EXT, x, y, 0, GL_COLOR_INDEX, GL_UNSIGNED_SHORT, index);
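
For comparison, the working 8-bit version is essentially the same calls with a 256-entry table and GL_COLOR_INDEX8_EXT. A minimal sketch (the actual 8-bit code was not posted, so colormap8 and index8 are placeholder names):

unsigned char colormap8[256*3];    /* 256 RGB palette entries */

glColorTable(GL_SHARED_TEXTURE_PALETTE_EXT, GL_RGB, 256, GL_RGB, GL_UNSIGNED_BYTE, colormap8);

glEnable(GL_SHARED_TEXTURE_PALETTE_EXT);

/* one unsigned byte of index data per texel */
glTexImage2D(GL_TEXTURE_2D, 0, GL_COLOR_INDEX8_EXT, x, y, 0, GL_COLOR_INDEX, GL_UNSIGNED_BYTE, index8);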

NVIDIA cards only support 8-bit paletted textures. For larger palettes you have to employ dependent texture operations, available on GeForce 3-6, ATI R2xx-R4xx, 3Dlabs Wildcat, etc.

  • Klaus
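
A dependent texture operation means using the value fetched from one texture as the coordinate for a lookup into a second texture. As an illustration, here is a minimal sketch of such a 16-bit palette lookup written as a GLSL fragment shader rather than the vendor-specific texture-shader extensions of that era. It assumes a GLSL-capable driver and an extension loader such as GLEW; the function, texture and uniform names are made up for this example.

/* Sketch: 16-bit palette lookup via a dependent texture fetch in GLSL.
 * Assumes GL 2.0 / GLSL support and GLEW; all names are illustrative. */
#include <stdlib.h>
#include <GL/glew.h>

static const char *frag_src =
    "uniform sampler2D indexTex;   /* L = high byte, A = low byte of the index */\n"
    "uniform sampler2D paletteTex; /* 65536 RGB entries laid out as 256x256    */\n"
    "void main() {\n"
    "    vec4 idx = texture2D(indexTex, gl_TexCoord[0].st);\n"
    "    /* map each byte onto the center of the matching palette texel */\n"
    "    vec2 uv = vec2(idx.a, idx.r) * (255.0 / 256.0) + (0.5 / 256.0);\n"
    "    gl_FragColor = texture2D(paletteTex, uv);\n"
    "}\n";

void setup_palette_lookup(const unsigned short *index, int w, int h,
                          const unsigned char *colormap /* 65536*3 RGB */)
{
    GLuint tex[2];

    /* Split each 16-bit index into explicit (high, low) byte pairs so the
     * shader does not depend on the host byte order. */
    unsigned char *split = malloc((size_t)w * h * 2);
    for (int i = 0; i < w * h; ++i) {
        split[2 * i + 0] = (unsigned char)(index[i] >> 8);   /* luminance */
        split[2 * i + 1] = (unsigned char)(index[i] & 0xFF); /* alpha     */
    }

    glGenTextures(2, tex);
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);  /* rows may not be 4-byte aligned */

    /* Texture unit 0: the index data. GL_NEAREST so indices are not blended. */
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, tex[0]);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE8_ALPHA8, w, h, 0,
                 GL_LUMINANCE_ALPHA, GL_UNSIGNED_BYTE, split);
    free(split);

    /* Texture unit 1: the palette as a 256x256 RGB texture. */
    glActiveTexture(GL_TEXTURE1);
    glBindTexture(GL_TEXTURE_2D, tex[1]);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, 256, 256, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, colormap);
    glActiveTexture(GL_TEXTURE0);

    /* Compile the shader and bind the two samplers to their texture units. */
    GLuint fs = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(fs, 1, &frag_src, NULL);
    glCompileShader(fs);
    GLuint prog = glCreateProgram();
    glAttachShader(prog, fs);
    glLinkProgram(prog);
    glUseProgram(prog);
    glUniform1i(glGetUniformLocation(prog, "indexTex"), 0);
    glUniform1i(glGetUniformLocation(prog, "paletteTex"), 1);
}

The index is split into high/low bytes so the 65536-entry palette can be addressed as a 256x256 texture, and the scale and bias in the shader map each byte onto the center of the corresponding palette texel.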

Originally posted by Klaus:
NVIDIA cards only support 8-bit paletted textures. For larger palettes you have to employ dependent texture operations, available on GeForce 3-6, ATI R2xx-R4xx, 3Dlabs Wildcat, etc.

  • Klaus

Is there a way to do something that is compatible with all cards? Some time ago I used color tables for my textures, but since they are no longer supported, I am wondering how we should do this now.
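
One approach that works on any card is to skip the palette on the GPU entirely: expand the indices through the color table on the CPU and upload an ordinary RGB texture. A minimal sketch (function and variable names are illustrative):

#include <stdlib.h>
#include <GL/gl.h>

/* Expand 16-bit indices through a 65536-entry RGB colormap and upload
 * the result as a plain RGB texture. Needs no extension at all. */
void upload_expanded_rgb(const unsigned short *index, int w, int h,
                         const unsigned char *colormap /* 65536*3 RGB */)
{
    unsigned char *rgb = malloc((size_t)w * h * 3);
    for (int i = 0; i < w * h; ++i) {
        const unsigned char *entry = colormap + 3 * index[i];
        rgb[3 * i + 0] = entry[0];
        rgb[3 * i + 1] = entry[1];
        rgb[3 * i + 2] = entry[2];
    }
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);  /* RGB rows are not 4-byte aligned in general */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, w, h, 0, GL_RGB, GL_UNSIGNED_BYTE, rgb);
    free(rgb);
}

The drawback is that the whole texture has to be re-expanded and re-uploaded whenever the color table changes, which is exactly the cost paletted textures were meant to avoid.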