View Full Version : 16 bit Texture

10-15-2005, 02:02 AM
Hi All..

I am new to 16-bit textures. I am writing a simple program that uses a 16-bit 2D texture, and a 1D texture lookup in a fragment shader to apply a palette to the 2D texture.

glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE16, 256, 256, 0, GL_LUMINANCE, GL_UNSIGNED_SHORT, pData);

glTexImage1D(GL_TEXTURE_1D, 0, GL_RGBA, 65356, 0, GL_RGBA, GL_UNSIGNED_BYTE, pPalette);
My fragment shader is:

uniform sampler2D oTexture2d;
uniform sampler1D oPalette2d;

void main()
{
    gl_FragColor = texture1D(oPalette2d, texture2D(oTexture2d, gl_TexCoord[0].xy).r);
}
The result is always a white image; it seems the shader is not doing anything at all, since the palette does not contain any white.

What did I do wrong? I am using a GeForce 6800 Ultra.

10-15-2005, 07:23 AM
I suppose 1D textures on the GeForce 6800 are limited to the same maximum size of 4069 pixels as 2D textures. You should be getting a GL error, and the 1D texture will be left undefined, which gives you white pixels.
Anyway, you were probably thinking of 65536 and not 65356...

10-16-2005, 01:12 AM
Thank you Def for your reply. I had missed the texture size limitation. Anyway, you were probably thinking of 4096 and not 4069 :)