GLSL texture data format



shapeare
01-12-2012, 06:02 PM
I defined the texture like the following:


glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB32UI, 512, 312,
0, GL_RGB_INTEGER, GL_UNSIGNED_INT, triIndex);

so each color channel of a texel is of type unsigned int. However, when I sample the texture in the shader using the following statement:


uvec3 value;
value = texture(myTexture, index).rgb;

an error is generated saying there is an implicit cast from "vec3" to "uvec3". What's the problem here?

Ludde
01-13-2012, 01:37 AM
Declare the sampler as "uniform usampler2D myTexture"

thokra
01-13-2012, 02:17 PM
The problem is that you are trying to convert a floating-point type to an unsigned integer type. With a plain sampler2D, texture() returns a vec4; swizzling .rgb gives you a vec3, and GLSL has no implicit conversion from vec3 to uvec3.

If you want unsigned values, use a sampler like proposed by Ludde.
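To tie the two answers together, a minimal fragment-shader sketch of the fix might look like this (the output variable name and the use of the poster's `myTexture`/`index` names are assumptions; GL_RGB32UI textures also require GL_NEAREST filtering):

```glsl
#version 330 core

// A GL_RGB32UI texture must be sampled through an unsigned-integer sampler.
uniform usampler2D myTexture;

in vec2 index;
out uvec3 fragValue;

void main()
{
    // texture() on a usampler2D returns a uvec4, so the .rgb swizzle
    // yields a uvec3 and no implicit cast is needed.
    uvec3 value = texture(myTexture, index).rgb;

    // Alternatively, fetch an exact texel by integer coordinate:
    // uvec3 value = texelFetch(myTexture, ivec2(x, y), 0).rgb;

    fragValue = value;
}
```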