Paletted texture support on Nvidia/Linux?

Hi,

I’m looking at using paletted textures, but it seems a number of implementations either don’t support them or do so badly.

In particular, I just get a black rectangle when I try to use a paletted texture on an Nvidia GTS 2 using the current Linux OpenGL drivers, which claim to support the paletted texture extension. The same code works with the 3dfx DRI drivers.

Is this a driver bug, or does the Nvidia hardware not support paletted textures?

Thanks,
J

We do support paletted textures. We wouldn’t expose the extension if we didn’t.

Quake, Quake 2, and various Quake licensees all use that extension, so it does work. It’s possible you’ve encountered a bug, but first, are you absolutely certain your application is correct? (Working on one driver and not on another means at least one of them is buggy, though it doesn’t necessarily say which.) Definitely also check for GL errors being reported.
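
For example, a small helper along these lines will surface any queued errors (a quick sketch; checkGLError is just an illustrative name, and the loop drains every pending error flag):

	/* Illustrative helper: report every pending GL error. */
	#include <GL/gl.h>
	#include <stdio.h>

	static void checkGLError(const char *where)
	{
		GLenum err;
		while ((err = glGetError()) != GL_NO_ERROR)
			fprintf(stderr, "GL error 0x%04x after %s\n",
			        (unsigned)err, where);
	}

	/* e.g. checkGLError("glTexImage2D"); */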

  • Matt

Thanks for the quick reply.

I’m getting error 1282 (invalid operation) after glColorTable(GL_TEXTURE_2D, GL_RGBA, 256, GL_RGBA, GL_FLOAT, palette);

The basic setup is:

	/* dist is a 256x256 unsigned byte array of values */
	glGenTextures(1, &distTex);
	glBindTexture(GL_TEXTURE_2D, distTex);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
	glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);
	glTexImage2D(GL_TEXTURE_2D, 0, GL_COLOR_INDEX8_EXT, 256, 256, 0, GL_COLOR_INDEX, GL_UNSIGNED_BYTE, dist);
	glEnable(GL_TEXTURE_2D);

	glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

	/* …and to draw */
	glEnable(GL_BLEND);
	glBindTexture(GL_TEXTURE_2D, distTex);
	glColorTable(GL_TEXTURE_2D, GL_RGBA, 256, GL_RGBA, GL_FLOAT, palette);

	glBegin(GL_QUADS);
	glTexCoord2f(0, 0);
	glVertex2i(0, 0);

	glTexCoord2f(1, 0);
	glVertex2i(width, 0);

	glTexCoord2f(1, 1);
	glVertex2i(width, height);

	glTexCoord2f(0, 1);
	glVertex2i(0, height);

	glEnd();

As an aside, I’m also having problems with GL_INTENSITY format textures: the glTexImage2D call fails with 1280 (invalid enumerant).

Thanks,
J

Are you calling glColorTable or glColorTableEXT?

It makes a big difference.

glColorTableEXT is for EXT_paletted_texture, while glColorTable is for ARB_imaging. Since the 6.xx drivers don’t support ARB_imaging, we generate an INVALID_OPERATION error on glColorTable. We will be supporting ARB_imaging in the near future, at which point the two will be aliased to the same function.
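
To be safe on any driver, you can check the extension string and fetch the EXT entry point at run time instead of linking against a static glColorTable. A minimal sketch, assuming GLX with the ARB_get_proc_address extension (the typedef and function names here are illustrative, not from any header):

	/* Sketch: verify EXT_paletted_texture is advertised, then fetch
	   glColorTableEXT through GLX. */
	#include <GL/gl.h>
	#include <GL/glx.h>
	#include <string.h>

	typedef void (*ColorTableEXTFunc)(GLenum target, GLenum internalFormat,
	                                  GLsizei width, GLenum format,
	                                  GLenum type, const GLvoid *table);
	static ColorTableEXTFunc myColorTableEXT;

	static int initPalettedTexture(void)
	{
		const char *ext = (const char *)glGetString(GL_EXTENSIONS);
		if (!ext || !strstr(ext, "GL_EXT_paletted_texture"))
			return 0;	/* extension not advertised */
		myColorTableEXT = (ColorTableEXTFunc)
			glXGetProcAddressARB((const GLubyte *)"glColorTableEXT");
		return myColorTableEXT != NULL;
	}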

  • Matt

Oh, and for intensity textures, remember to use GL_INTENSITY for the internalformat/components parameter but GL_LUMINANCE for the format. GL_INTENSITY is not a valid format.
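
In other words, an upload along these lines should work (a sketch; intensityData stands in for a 256x256 unsigned byte array):

	/* GL_INTENSITY as the internalformat, GL_LUMINANCE as the client
	   data format. */
	glTexImage2D(GL_TEXTURE_2D, 0, GL_INTENSITY, 256, 256, 0,
	             GL_LUMINANCE, GL_UNSIGNED_BYTE, intensityData);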

  • Matt

OK, that fixed it. I guess I’m confused about how the OGL extension process works. I thought glColorTable and glColorTableEXT were the same function at different stages in its extension life-cycle. You’re saying they’re different functions with very similar names and the same prototype?

Thanks for the GL_INTENSITY hint; I hadn’t noticed it’s only an internal format.

Thanks again,
J

glColorTable vs. glColorTableEXT: oh, I just reread your reply - they will get aliased. So I guess the portable thing is to always use glColorTableEXT (that will always be present, even after the alias, right?).

Thanks,
J

Use glColorTableEXT for the most portable paletted texture code.

glColorTable shouldn’t even exist in drivers that don’t support OpenGL 1.2.
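
For the code you posted, that just means changing the palette upload to the EXT entry point (same prototype):

	glColorTableEXT(GL_TEXTURE_2D, GL_RGBA, 256, GL_RGBA, GL_FLOAT, palette);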

  • Matt