
integer vertex shader input varyings



Bodlay
05-03-2011, 03:05 AM
Hello!

From what I have read, from GLSL 1.30 onward the vertex shader supports true integer input varyings...

I would like to render something using glDrawArrays.

And now I have a problem: how do I configure the GL_ARRAY_BUFFER buffer and glVertexAttribPointer with the GL_UNSIGNED_INT type so that "in uint var;" works inside the vertex shader? I do not need or want an automatic conversion to float. "in float var;" works fine, but then I have to cast the input varying back to uint. (For context: I use this varying to access a RECTANGLE texture, which is addressed with integer coordinates.)

Doing it this way is ugly, does not cover the full integer range, and is probably inefficient as well. I think there has to be a better way to do this.
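The float-cast workaround described above might look like this minimal GLSL 1.30 sketch (the attribute name and output are illustrative, not from the original post):

```glsl
#version 130

in float var;   // the buffer holds integers, but they arrive converted to float

void main()
{
    // ugly workaround: cast back to uint; only exact up to 2^24,
    // the integer precision limit of a 32-bit float
    uint index = uint(var);
    gl_Position = vec4(float(index), 0.0, 0.0, 1.0);
}
```

Values above 2^24 can no longer be represented exactly as floats, which is why this does not cover the full integer range.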

Do you have any ideas how to specify the mapping between buffers and vertex shader input varyings (without everything being converted to float automatically), or perhaps a hint on how to approach the problem differently?

Thank you, regards,
Nejc

Alfonse Reinheart
05-03-2011, 03:11 AM
You must use glVertexAttribIPointer. It is used for signed and unsigned integral inputs. The version without the I always converts to floats and cannot be used with integral vertex shader inputs.
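A minimal sketch of the setup, assuming the buffer and program objects already exist (the names vbo, prog, and the attribute "var" are illustrative):

```c
#include <GL/glew.h>  /* or whichever OpenGL loader you use */

/* Feed a vertex shader input declared as "in uint var;" without
 * any conversion to float. */
void setup_integer_attrib(GLuint prog, GLuint vbo)
{
    GLint loc = glGetAttribLocation(prog, "var");

    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glEnableVertexAttribArray(loc);

    /* Note the I: the data stays integral all the way to the shader.
     * glVertexAttribPointer (no I) with GL_UNSIGNED_INT would instead
     * convert each value to float before the shader sees it. */
    glVertexAttribIPointer(loc, 1, GL_UNSIGNED_INT, 0, (const void *)0);
}
```

After this, a plain glDrawArrays call delivers the raw uint values to "in uint var;" with no casting needed in the shader.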

Bodlay
05-03-2011, 04:19 AM
I looked for the answer in at least three books from 2009 onward and on countless websites, and not a single one mentions the function above.

It is an obvious solution :), but that missing "I" cost me at least two whole days before I eventually came here, asked for help, and got a solution. It works now.

Thank you again!