View Full Version : 32-bit texture component: invalid operation

Lost Lont
12-03-2009, 03:15 PM

I've tried to create a 1-dimensional texture with the GL_R32UI internal format, but glTexImage1D fails with an "invalid operation" error.

vector<GLuint> faces;
glTexImage1D(GL_TEXTURE_1D, 0, GL_R32UI, faces.size(), 0, GL_RED, GL_UNSIGNED_INT, &faces[0]);

My SDL-OpenGL header file doesn't originally contain the GL3 GL_R32UI define, so I declared it myself, with the value the OpenGL registry specifies:

#define GL_R32UI 0x8236

I've tried different formats (like GL_R32UI instead of GL_RED, which of course failed) and different types, but I couldn't solve the problem; gluErrorString still reports "invalid operation".
The glTexImage1D documentation doesn't mention any GL_R32UI-specific errors, since the format is relatively new and the documentation doesn't cover it yet.

My question is: what could cause the problem, and what should I do? I want to use integer textures in my fragment shader with components wider than 16 bits, and I don't want to spread the bits across three or four 8-bit components (as in RGBA).

Brolingstanz
12-03-2009, 03:26 PM
I think you need to use one of the GL_*_INTEGER tokens for format, but you might check the spec on that to be sure.

12-03-2009, 04:13 PM
I think Brolingstanz is right. I found the following in the OpenGL 3.0 spec, page 177, last paragraph:

"The error INVALID_OPERATION is generated if the internal format is integer and format is not one of the integer formats listed in table 3.6..."

Table 3.6, page 152:

Format Name | Element Meaning and Order | Target Buffer
RED_INTEGER | iR | Color

"Components are floating-point unless prefixed with the letter 'i', which indicates they are integer."

ref: http://www.opengl.org/registry/doc/glspec30.20080811.withchanges.pdf

Lost Lont
12-04-2009, 06:06 AM
GL_RED_INTEGER solved the problem, thank you very much!
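For later readers, here is a minimal sketch of the working setup. It assumes a GL 3.0+ context is current and that the headers define GL_R32UI (0x8236) and GL_RED_INTEGER (0x8D94); the texture-parameter lines and shader comment are additions beyond what the thread shows, included because integer textures are not filterable:

```cpp
GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_1D, tex);

// Integer textures are not filterable; LINEAR filtering would make the
// texture incomplete, so use NEAREST for both filters.
glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

std::vector<GLuint> faces; // filled elsewhere

// An integer internal format (GL_R32UI) must be paired with an integer
// client format (GL_RED_INTEGER), not plain GL_RED -- this was the fix.
glTexImage1D(GL_TEXTURE_1D, 0, GL_R32UI, (GLsizei)faces.size(), 0,
             GL_RED_INTEGER, GL_UNSIGNED_INT, &faces[0]);

// In the fragment shader, sample it through a usampler1D, e.g.:
//   uint v = texelFetch(facesTex, index, 0).r;
```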