Hi,
I’m having trouble with a transform feedback algorithm.
Here’s my (ultra) simplified code:
////////////////////////////////
// main.cpp
////////////////////////////////
glBindBuffer(GL_ARRAY_BUFFER, id);                          // core names (IPointer is core GL 3.0 anyway)
glEnableVertexAttribArray(0);
glVertexAttribIPointer(0, 2, GL_INT, 0, BUFFER_OFFSET(0));  // integer attribute: no float conversion
// set the transform feedback mode
glDrawArrays(GL_POINTS, 0, count);
// end the transform feedback
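The two commented-out steps expand to roughly this in my real code (a simplified sketch; program is my shader program and feedbackId stands for the buffer that captures the output):

// at init time, before linking the program:
const char* varyings[] = { "listi" };                       // the geometry shader output to capture
glTransformFeedbackVaryings(program, 1, varyings, GL_INTERLEAVED_ATTRIBS);
glLinkProgram(program);

// around the draw call:
glEnable(GL_RASTERIZER_DISCARD);                            // capture only, no rasterization
glBindBufferBase(GL_TRANSFORM_FEEDBACK_BUFFER, 0, feedbackId);
glBeginTransformFeedback(GL_POINTS);
glDrawArrays(GL_POINTS, 0, count);
glEndTransformFeedback();
glDisable(GL_RASTERIZER_DISCARD);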
////////////////////////////////
// Vertex Shader
////////////////////////////////
#version 150

in ivec2 culled;      // .x = id, .y = keep flag
flat out int id;
flat out int keep;

void main(void)
{
    id   = culled.x;
    keep = (culled.y == 1) ? 1 : 0;
}
////////////////////////////////
// Geometry Shader
////////////////////////////////
#version 150

layout(points) in;
layout(points, max_vertices = 1) out;

flat in int id[];     // single input point
flat in int keep[];
flat out int listi;   // the captured varying

void main()
{
    if (keep[0] == 1)
    {
        listi = id[0];
        EmitVertex();
        EndPrimitive();
    }
}
When I check the data in the buffer, I get crazy values such as: 0 1065353216 1073741824 1077936128 1082130432 1084227584 1086324736 1088421888 (read back with glMapBuffer/glUnmapBuffer).
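For reference, here’s roughly how I read them back (a sketch; feedbackId again stands for the capture buffer):

glBindBuffer(GL_TRANSFORM_FEEDBACK_BUFFER, feedbackId);
GLint* results = (GLint*)glMapBuffer(GL_TRANSFORM_FEEDBACK_BUFFER, GL_READ_ONLY);
for (int i = 0; i < 8; ++i)
    printf("%d ", results[i]);                              // prints the values quoted above
glUnmapBuffer(GL_TRANSFORM_FEEDBACK_BUFFER);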
I’m fairly sure it has to do with the attribute pointer I’m setting: I know the data in the input buffer is valid (checked with map/unmap as well), and if I hard-code values in the shader, everything works fine.
Has anyone had trouble with glVertexAttribIPointer before?
Thanks for your help!
B