using an integer texture

02-09-2013, 09:03 AM
I need to use an RGBA texture where each channel stores an unsigned integer value in the range <0,7>. I set up the texture this way:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32UI, screenWidth, screenHeight, 0, GL_RGBA_INTEGER, GL_UNSIGNED_INT, NULL);

First I fill the texture with some data. Then, when I read the texture, I use a sampler2D with a vec2 texture coordinate (<0,1>, <0,1>).
The texture function with a sampler2D returns a float:

float data = texture(mySampler, texCoord).x;

but I can use that value this way:

if(data == 0)
//do something

and it works. I realize that I could improve my code. I tried changing GL_RGBA32UI to GL_RGBA16UI, but then the program doesn't work
(GL_RGBA12UI seems to be the minimum internal format that can store values in the range <0,7> in each channel). Can I use a usampler2D with a vec2
texture coordinate? What should I change in the above code?
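
For context, a sketch of the setup being described, assuming a GL 3.0+ context and using the question's own screenWidth/screenHeight variables. One detail worth noting: integer textures cannot be linearly filtered, so GL_NEAREST filtering is required, or sampling them gives undefined results:

```c
GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);

/* Integer textures must not be linearly filtered; use GL_NEAREST. */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

/* Unsigned integer internal format, matching integer pixel transfer format. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32UI, screenWidth, screenHeight, 0,
             GL_RGBA_INTEGER, GL_UNSIGNED_INT, NULL);
```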

02-09-2013, 05:18 PM
Can I use a usampler2D with vec2

You should be using a usampler2D to read your texture, or you will confuse things. To get the data into float format, try:

uvec4 int_val = texture2D(Texture0, TexCoord0);
vec4 float_val = vec4(int_val);


Alfonse Reinheart
02-09-2013, 07:44 PM
You can't use a usampler2D with `texture2D`. That's a pre-GL 3.0 texture accessing function.

02-09-2013, 09:17 PM
Sorry, I meant "texture", not "texture2D".

02-11-2013, 10:02 AM
You can't use a usampler2D with `texture2D`.
Pedantically, yes you can, with EXT_gpu_shader4.
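
Putting the thread's conclusion together: a sketch of a GLSL 3.30 fragment shader that reads the GL_RGBA32UI texture through a usampler2D with an ordinary vec2 coordinate. The uniform and variable names here are illustrative, not from the original code:

```glsl
#version 330 core

uniform usampler2D mySampler;  // sampler type must match the integer texture
in vec2 texCoord;              // normalized <0,1> coordinates work as usual
out vec4 fragColor;

void main()
{
    // texture() is overloaded for usampler2D and returns uvec4
    uvec4 data = texture(mySampler, texCoord);

    if (data.x == 0u)                   // compare against an unsigned literal
        fragColor = vec4(1.0);
    else
        fragColor = vec4(data) / 7.0;   // convert to float if a color is needed
}
```

With core-profile GLSL 1.30+, texture() replaces the sampler-specific texture2D()-style names; only with EXT_gpu_shader4 can texture2D() itself be used with integer samplers.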