GL_EXT_paletted_texture on NVIDIA GeForce

My application uses a tremendous amount of texture, so I would like to use the
GL extension GL_EXT_paletted_texture. It works fine with GL_COLOR_INDEX8_EXT, but it is
much slower than using GL_RGB. Is that normal? Here are both scenarios; maybe I'm doing something wrong!

Scenario 1: Very Slow.

GLuint texId, w, h;
GLubyte *image;                 /* 8-bit color indices, allocated by the loader */
GLfloat pal[256 * 4];
Load8BitsImag("image.bmp", &w, &h, pal, &image);
glGenTextures(1, &texId);
glBindTexture(GL_TEXTURE_2D, texId);
glColorTableEXT(GL_TEXTURE_2D, GL_RGBA8, 256, GL_RGBA, GL_FLOAT, pal);
glTexImage2D(GL_TEXTURE_2D, 0, GL_COLOR_INDEX8_EXT, w, h, 0, GL_COLOR_INDEX, GL_UNSIGNED_BYTE, image);
free(image);

Scenario 2: Very Fast.

GLuint texId, w, h, *image;     /* RGBA pixels, allocated by the loader */
LoadRGBAImage("image.rgb", &w, &h, &image);
glGenTextures(1, &texId);
glBindTexture(GL_TEXTURE_2D, texId);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, w, h, 0, GL_RGBA, GL_UNSIGNED_BYTE, image);
free(image);

This is your texture-management initialisation code, right? You are not calling it every frame inside the rendering loop? (Just in case.)
Anyway, everything looks correct in your initialisation code. The only funky thing (which is not wrong) is using floating-point values for your palette. 768 bytes, i.e. 256 RGB entries at 8 bits per channel, are enough to describe a 256-color palette to OpenGL, and of course 16*3 entries for 4-bit indexed pictures. A byte-palette version is sketched below.
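
For example, here is a minimal sketch of that byte-palette variant, in the same style as the scenarios above. The loader Load8BitsImageBytePal() is hypothetical (assumed to fill a 768-byte RGB palette and allocate the 8-bit index data); only the GL calls are the point, and on Windows the glColorTableEXT entry point usually has to be fetched with wglGetProcAddress first.

/* hypothetical loader: fills a 768-byte RGB palette and allocates 8-bit indices */
void Load8BitsImageBytePal(const char *file, GLuint *w, GLuint *h, GLubyte *pal, GLubyte **image);

GLuint texId, w, h;
GLubyte pal[256 * 3];   /* 768 bytes: 256 RGB entries, one byte per channel */
GLubyte *image;         /* 8-bit color indices */

Load8BitsImageBytePal("image.bmp", &w, &h, pal, &image);

glGenTextures(1, &texId);
glBindTexture(GL_TEXTURE_2D, texId);

/* upload the palette as plain bytes instead of GL_FLOAT */
glColorTableEXT(GL_TEXTURE_2D, GL_RGB8, 256, GL_RGB, GL_UNSIGNED_BYTE, pal);

/* the texel data stays 8-bit indices, exactly as in Scenario 1 */
glTexImage2D(GL_TEXTURE_2D, 0, GL_COLOR_INDEX8_EXT, w, h, 0, GL_COLOR_INDEX, GL_UNSIGNED_BYTE, image);

free(image);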