NV: Problems using integer attributes



Jan
01-13-2011, 08:06 AM
Hi

I want to pass an integer vertex attribute into a shader and use it to index a uniform. However, on my nVidia card this seems to fail and garbage is returned.

But once I switch to a float attribute and cast it to an int in the shader, everything works fine.

Doing the same thing in D3D11 works with both integers and floats.


Has anyone encountered similar problems? Are integer attributes problematic in any way? Or could I have forgotten to enable some state that is not necessary when using floats?
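
The shader side looks roughly like this (a sketch with placeholder names, not my actual code):

// vertex shader
#version 330 core
in vec4 inPosition;
in int inMatrixIndex;        // integer attribute, used as an index
uniform mat4 matrices[64];   // uniform array indexed by the attribute

void main ()
{
    gl_Position = matrices[inMatrixIndex] * inPosition;
}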

Thanks,
Jan.

mobeen
01-13-2011, 08:39 AM
Hi,
Are you using glVertexAttribIPointer to pass the attribute?

Jan
01-13-2011, 09:15 AM
Yep:

int iArray = getVertexAttribBindPoint (Program, "somename");
glEnableVertexAttribArray (iArray);
glVertexAttribPointer (iArray, 1, GL_INT, GL_FALSE, 0, BUFFER_OFFSET (offset));

Replacing GL_INT with GL_FLOAT makes it work.

Dan Bartlett
01-13-2011, 09:43 AM
mobeen asked whether you were using "glVertexAttribIPointer", not "glVertexAttribPointer" (note the extra I).

With glVertexAttribPointer, the integers will be converted to floats, whereas with glVertexAttribIPointer they will be left as integers.

glVertexAttribIPointer is described at http://www.opengl.org/sdk/docs/man4/xhtml/glVertexAttribPointer.xml
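
Applied to your snippet, it would look something like this (a sketch, keeping your helper function and BUFFER_OFFSET macro as they are):

int iArray = getVertexAttribBindPoint (Program, "somename");
glEnableVertexAttribArray (iArray);
// glVertexAttribIPointer has no "normalized" parameter; the values stay integers
glVertexAttribIPointer (iArray, 1, GL_INT, 0, BUFFER_OFFSET (offset));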

Jan
01-13-2011, 04:09 PM
Damn! I didn't know that, and I didn't notice the extra "I" either!

I will try that immediately.

Jan.

Jan
01-13-2011, 04:16 PM
Yes, it works now.

Thank you both very much, that was very helpful.

Cheers,
Jan.