View Full Version : Creating texture with GL_UNSIGNED_INT



damma
06-11-2008, 02:44 PM
I am creating a texture to hold unsigned int values as follows:

GLuint *simplearray = new GLuint [W*H*4];
for(GLuint i = 0; i < W*H*4; i++)
simplearray[i] = i;

glGenTextures(1, &textureid);
glActiveTextureARB( GL_TEXTURE3_ARB );
glBindTexture(GL_TEXTURE_RECTANGLE_NV, textureid);
glTexImage2D(GL_TEXTURE_RECTANGLE_NV, 0, GL_RGBA, W, H, 0, GL_RGBA, GL_UNSIGNED_INT, simplearray);

But this does not load the values into the texture; all the values read back as 0. How do we create a texture that will hold unsigned integer values?

V-man
06-11-2008, 02:53 PM
http://www.opengl.org/sdk/docs/man/
check out glTexImage2D
The internal format should be GL_RGBA16 and, if the graphics card supports it, the pixels you supply are uploaded directly.

If you use GL_RGBA, the pixels you supply are probably converted to GL_RGBA8.

To know if the GPU supports it, you would need to check the documents from NVIDIA and ATI. Other companies don't publish this info AFAIK.

arekkusu
06-11-2008, 04:59 PM
If you really need 32 bit unsigned integer values in a texture, you need hardware that supports EXT_texture_integer, and EXT_gpu_shader4.
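A minimal sketch of that check against the GL 2.x extension string (this assumes a current GL context; the extension names are the ones from the specs mentioned above):

```cpp
// Requires <cstring> and the GL headers; call only with a current context.
const char *ext = (const char *) glGetString(GL_EXTENSIONS);
bool hasIntegerTextures =
    ext && strstr(ext, "GL_EXT_texture_integer")
        && strstr(ext, "GL_EXT_gpu_shader4");
```

Note strstr is a loose match; production code usually tokenizes the string to avoid prefix collisions.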

damma
06-13-2008, 08:14 AM
I work on an NVIDIA GeForce 8800 under Linux. It does support the EXT_texture_integer and EXT_gpu_shader4 extensions, but I still cannot get the texture to load unsigned integer values. I tried GL_RGBA16 as the internal format, but it gave me weird results: some values were loaded into the texture, seemingly at random, and all the others were still zero.

I feel there should be a format like GL_RGBA32 to make this work. Any clues ?

dletozeun
06-13-2008, 08:22 AM
If you take a look at the EXT_texture_integer (http://developer.download.nvidia.com/opengl/specs/GL_EXT_texture_integer.txt) spec, you will find new tokens like GL_RGBA32UI_EXT.

djedge@ogl
06-13-2008, 08:31 AM
Check glGetError (and gluErrorString, to make the code readable) to see if your texture declaration raises an error. Keep in mind that all values on the GPU are treated as floats unless you work with EXT_texture_integer (internal format = GL_RGBA32UI_EXT). You can also explicitly avoid clamping for float textures using the ARB_color_buffer_float extension.

So try the "right" internal format, and check the GL error if it still doesn't work!
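Something like this right after the glTexImage2D call (a sketch, assuming GLU is linked for gluErrorString):

```cpp
// Requires <cstdio> and the GL/GLU headers; checks the error flag
// set by the most recent GL call.
GLenum err = glGetError();
if (err != GL_NO_ERROR)
    fprintf(stderr, "GL error: %s\n", gluErrorString(err));
```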

damma
06-16-2008, 07:40 AM
Thanks guys, it works with internal format = GL_RGBA32UI_EXT and format = GL_RGBA_INTEGER_EXT for type = GL_UNSIGNED_INT.
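For anyone landing here later, a minimal sketch of the working upload, putting the pieces from this thread together (assumes a current GL context exposing EXT_texture_integer, with W, H, and textureid as in the first post):

```cpp
// Fill a buffer of 32-bit unsigned integer texels.
GLuint *simplearray = new GLuint[W * H * 4];
for (GLuint i = 0; i < W * H * 4; i++)
    simplearray[i] = i;

glGenTextures(1, &textureid);
glActiveTextureARB(GL_TEXTURE3_ARB);
glBindTexture(GL_TEXTURE_RECTANGLE_NV, textureid);

// Integer internal format + integer client format: the values are
// uploaded as-is, with no float conversion or clamping.
glTexImage2D(GL_TEXTURE_RECTANGLE_NV, 0, GL_RGBA32UI_EXT, W, H, 0,
             GL_RGBA_INTEGER_EXT, GL_UNSIGNED_INT, simplearray);

delete [] simplearray;
```

In the shader the texture then has to be sampled with an unsigned integer sampler (usampler2DRect from EXT_gpu_shader4) rather than a float sampler.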