Texture Format Problem

I use a texture to store the position of a cube.
If I define the texture like this:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 2, 2, 0, GL_RGBA, GL_UNSIGNED_BYTE, edge);

then I have to multiply by 255 to get the correct position in the vertex shader:

pos.xy = tex2D(tex, float2(0.0, 1.0)).rg * 255;

However, if I define the texture like this:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F, 2, 2, 0, GL_RGBA, GL_FLOAT, edgefloat);

pos.xy = tex2D(tex, float2(0.0, 1.0)).rg;

then I do not have to multiply by 255 to get the correct position.
Why?

Because the former uses an unsigned normalized texture internal format (the unsized GL_RGBA typically resolves to GL_RGBA8), so the unsigned byte data you pass in is normalized to the [0, 1] range when the texture is sampled. This is not the case with GL_RGBA32F, which stores and returns the float values you upload unchanged.
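
To make the scaling concrete, here is a minimal sketch of the normalized path (the contents of edge and the values 200/100 are only illustrative):

GLubyte edge[2 * 2 * 4] = {0};
edge[0] = 200;  /* x coordinate stored as a raw byte */
edge[1] = 100;  /* y coordinate stored as a raw byte */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 2, 2, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, edge);
/* When sampled, the shader sees 200/255 = 0.784... and 100/255 = 0.392...,
   so multiplying the fetched value by 255 recovers the original bytes. */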

You only get non-normalized integer texture formats if you use one of the GL_RGBA*I or GL_RGBA*UI internal formats (in your case that would be GL_RGBA8UI).
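
Roughly, the upload for that non-normalized integer case would look like this (just a sketch; note that integer internal formats require the GL_RGBA_INTEGER pixel transfer format and nearest filtering):

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8UI, 2, 2, 0,
             GL_RGBA_INTEGER, GL_UNSIGNED_BYTE, edge);  /* raw bytes, no normalization */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);  /* integer textures cannot be filtered */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

In the shader the sampler then has to be declared as a usampler2D, and a fetch (e.g. with texelFetch) returns the raw 0-255 integer values directly, so again no multiplication by 255 is needed.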
