I can’t get glVertexAttribIPointer to work with shader inputs of type ivec3.
If I call glVertexAttribIPointer(0, 1, GL_BYTE, stride, offset), it works fine with a shader input declared as “int”. But with glVertexAttribIPointer(0, 3, GL_BYTE, stride, offset), I can’t get it to work with a shader input of type “ivec3”.
What am I missing?
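For reference, here is a minimal sketch of the failing setup (the buffer contents and variable names are illustrative, not my actual code):

GLbyte data[] = { 1, 2, 3,   4, 5, 6 };   /* two vertices, three bytes each */

GLuint vbo;
glGenBuffers(1, &vbo);
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glBufferData(GL_ARRAY_BUFFER, sizeof(data), data, GL_STATIC_DRAW);

glEnableVertexAttribArray(0);
/* Integer path: no "normalized" parameter, values reach the shader as integers */
glVertexAttribIPointer(0, 3, GL_BYTE, 3 * sizeof(GLbyte), (const void*)0);

/* Matching shader input, which is the combination that fails for me:
     layout(location = 0) in ivec3 value;
   With size 1 and "in int value;" the same setup works. */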
A related question concerns the following two calls, used with shader inputs of type “vec3” and “float”, respectively:
glVertexAttribPointer(0, 3, GL_BYTE, GL_FALSE, stride, offs);
glVertexAttribPointer(3, 1, GL_BYTE, GL_FALSE, stride, offs);
The second call generates a warning, but the first does not:
glDrawArrays uses input attribute 'VERTEX_ATTRIB[3]' which is specified as 'type = GL_BYTE size = 1'; this combination is not a natively supported input attribute type.
That is, if I change GL_BYTE in the second call to GL_FLOAT, the warning goes away. The problem can’t be GL_BYTE itself, can it? The first call accepts it without complaint. The problematic byte sits on a 4-byte boundary, and if I move it off that boundary I instead get a warning about non-optimal alignment. I am using an AMD GPU with OpenGL 4.2.
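To make the alignment situation concrete, here is an illustrative interleaved layout (the struct and field names are made up, but the offsets match what I described above):

#include <stddef.h>   /* offsetof */

struct Vertex {
    GLbyte pos[3];    /* attribute 0: three bytes, read as vec3            */
    GLbyte pad;       /* padding so the next field starts at offset 4      */
    GLbyte w;         /* attribute 3: a single byte on a 4-byte boundary   */
    GLbyte pad2[3];   /* keep the stride a multiple of 4                   */
};

GLsizei stride = sizeof(struct Vertex);   /* 8 bytes */
glVertexAttribPointer(0, 3, GL_BYTE, GL_FALSE, stride,
                      (const void*)offsetof(struct Vertex, pos));  /* no warning */
glVertexAttribPointer(3, 1, GL_BYTE, GL_FALSE, stride,
                      (const void*)offsetof(struct Vertex, w));    /* warning    */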