Creating texture with GL_UNSIGNED_INT

I am creating a texture to hold unsigned int values as follows:

GLuint *simplearray = new GLuint[W*H*4];
for(GLuint i = 0; i < W*H*4; i++)
    simplearray[i] = i;

glGenTextures(1, &textureid);
glActiveTextureARB( GL_TEXTURE3_ARB );
glBindTexture(GL_TEXTURE_RECTANGLE_NV, textureid);
glTexImage2D(GL_TEXTURE_RECTANGLE_NV, 0, GL_RGBA, W, H, 0, GL_RGBA, GL_UNSIGNED_INT, simplearray);

But this does not load the values into the texture; all the values in the texture are 0. How do we create a texture that will hold unsigned integer values?

http://www.opengl.org/sdk/docs/man/
Check out glTexImage2D.
The internal format should be GL_RGBA16, and if the graphics card supports it, the pixels you supply are uploaded directly.

If you use GL_RGBA, the pixels you supply are probably converted to GL_RGBA8.

To know if the GPU supports it, you would need to check out the documents from NVIDIA and ATI. Other companies don't publish this info AFAIK.
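
For illustration, a minimal sketch of that suggestion: the same upload call as in the question, only asking for the sized GL_RGBA16 internal format instead of the generic GL_RGBA (W, H and simplearray as in the original post):

// Sketch: request a sized internal format so the driver does not silently fall back to GL_RGBA8.
glTexImage2D(GL_TEXTURE_RECTANGLE_NV, 0, GL_RGBA16, W, H, 0,
             GL_RGBA, GL_UNSIGNED_INT, simplearray);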

If you really need 32-bit unsigned integer values in a texture, you need hardware that supports EXT_texture_integer and EXT_gpu_shader4.
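
A rough way to check for those two extensions at runtime, as a sketch (assumes a current GL context; a plain strstr match is good enough for a quick test, though it can in principle match a longer extension name):

#include <GL/gl.h>
#include <cstdio>
#include <cstring>

// Returns true if 'name' appears in the driver's extension string.
static bool hasExtension(const char* name)
{
    const char* ext = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    return ext && std::strstr(ext, name) != 0;
}

// Usage, once the GL context exists:
//   if (!hasExtension("GL_EXT_texture_integer") || !hasExtension("GL_EXT_gpu_shader4"))
//       std::printf("integer textures are not supported by this GPU/driver\n");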

I work on an NVIDIA GeForce 8800 with Linux. It does support the EXT_texture_integer and EXT_gpu_shader4 extensions, but I still cannot get the texture to load unsigned integer values. I tried GL_RGBA16 as the internal format, but it gave me weird results: some values were loaded into the texture, seemingly at random, and all the other values were still zero.

I feel there should be a format like GL_RGBA32 to make this work. Any clues?

If you had taken a look at the EXT_texture_integer spec, you would have found new tokens like RGBA32UI_EXT.

Check gluErrorString to see if you have a problem in your texture declaration. However, all values on the GPU are float values unless you work with EXT_texture_integer (internal format = GL_RGBA32UI_EXT). You can also explicitly avoid clamping for textures using the ARB_color_buffer_float extension.

So you can try with the “right” internal format and check gluErrorString if it doesn’t work!
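
For example, a small sketch of that check right after the upload call (gluErrorString comes from GLU, so GL/glu.h and cstdio are assumed):

// Sketch: check the GL error state immediately after glTexImage2D so an invalid
// internalformat/format/type combination is reported instead of failing silently.
GLenum err = glGetError();
if (err != GL_NO_ERROR)
    std::printf("glTexImage2D error: %s\n", gluErrorString(err));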

Thanks guys, it works with internal format = GL_RGBA32UI_EXT and format = GL_RGBA_INTEGER_EXT for type = GL_UNSIGNED_INT.
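
For reference, a sketch of the upload that combination describes (the GL_RGBA32UI_EXT and GL_RGBA_INTEGER_EXT tokens come from EXT_texture_integer, so a recent glext.h is assumed; W, H, textureid and simplearray as in the original post):

// Sketch of the working combination reported above: a 32-bit unsigned integer RGBA rectangle texture.
// Integer textures generally need GL_NEAREST filtering to be complete when sampled.
glBindTexture(GL_TEXTURE_RECTANGLE_NV, textureid);
glTexParameteri(GL_TEXTURE_RECTANGLE_NV, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_RECTANGLE_NV, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_RECTANGLE_NV, 0, GL_RGBA32UI_EXT, W, H, 0,
             GL_RGBA_INTEGER_EXT, GL_UNSIGNED_INT, simplearray);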