16-bit Texture

Hi All…

I am new to 16-bit textures… I am writing a simple program that uses a 16-bit 2D texture, and I am using a 1D texture lookup to apply a palette to the 2D texture.

 
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE16, 256, 256, 0, GL_LUMINANCE, GL_UNSIGNED_SHORT, pData); // the format parameter takes GL_LUMINANCE; the sized GL_LUMINANCE16 is only valid as the internal format
 
glTexImage1D(GL_TEXTURE_1D, 0, GL_RGBA, 65356, 0, GL_RGBA, GL_UNSIGNED_BYTE, pPalette);
    

My fragment shader is:

uniform sampler2D oTexture2d;
uniform sampler1D oPalette2d;

void main()
{
    // texture2D returns a vec4; a 1D lookup needs a single float coordinate
    gl_FragColor = texture1D(oPalette2d, texture2D(oTexture2d, gl_TexCoord[0].xy).r);
}
  

The result is always a white image; it seems the shader is not working at all, since the palette does not contain any white…

What did I do wrong? I am using a GeForce 6800 Ultra.

I suppose 1D textures are limited to the same maximum size of 4069 pixels on the GeForce 6800 as 2D textures are. You should get a GL error, and the 1D texture will be left undefined, which gives you white pixels.
Anyway, you were probably thinking of 65536 and not 65356…
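
To confirm, you can query the driver limit and check the error right after the upload. A minimal sketch (checkPaletteUpload is just an illustrative name; it assumes a GL context is current):

#include <stdio.h>
#include <GL/gl.h>

/* Query the largest texture dimension the driver allows and verify
 * that the oversized 1D upload was rejected. Call this immediately
 * after the glTexImage1D call above. */
void checkPaletteUpload(void)
{
    GLint maxSize = 0;
    glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxSize);
    printf("GL_MAX_TEXTURE_SIZE = %d\n", maxSize);

    GLenum err = glGetError();
    if (err == GL_INVALID_VALUE)
        printf("glTexImage1D rejected: width exceeds the driver limit\n");
}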

Thank you, Def, for your reply… I had missed the texture size limitation. Anyway, you were probably thinking of 4096 and not 4069 :slight_smile:
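
For anyone who hits the same wall: since a 65536-entry palette exceeds the 1D limit, one common workaround is to pack it into a 256x256 2D texture and derive the row and column from the 16-bit index in the shader. A rough sketch of that idea (untested; oPalette and the constants are my own names, and it assumes the driver actually keeps the luminance value at 16-bit precision):

uniform sampler2D oTexture2d;   // 16-bit luminance image
uniform sampler2D oPalette;     // 65536-entry palette packed as 256x256 RGBA

void main()
{
    // Luminance arrives normalized to [0,1]; rescale it to a 16-bit index.
    float index = texture2D(oTexture2d, gl_TexCoord[0].xy).r * 65535.0;

    // Low byte selects the column, high byte the row; sample at texel centers.
    vec2 uv = (vec2(mod(index, 256.0), floor(index / 256.0)) + 0.5) / 256.0;

    gl_FragColor = texture2D(oPalette, uv);
}

The palette texture should use GL_NEAREST filtering so neighbouring entries don't get blended together.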
