So I’m trying to turn my fixed-function OpenGL application into something that uses VBOs, for improved performance. However, I’m having a bit of trouble: my application ends up rendering things in a bizarre pattern. I’m 90% sure this is because it’s using normal data for vertices and vice versa, but I don’t completely understand what I’m actually telling OpenGL versus what I think I’m telling it, or the exact use of the various functions, and I’m not sure how to deal with my array.
Currently, the array I use to create the VBO is a straight vector of floats, in the form {x, y, z, nx, ny, nz, x, y, z, nx, ny, nz, …}, and it represents a simple cube. If I remove the normal data from the array and comment out the code relating to normal data in the loop, I get a perfectly fine white cube; but if I add normal data, things mess up terribly, as though my cube exploded while a black hole was trying to eat it from the inside out.
Here is how I have been using the VBO, normals included (this is in the game loop):
glBindBuffer(GL_ARRAY_BUFFER, WorldBuffer);
glEnableClientState(GL_VERTEX_ARRAY);
glVertexPointer(3, GL_FLOAT, 0, 0);       // stride 0, offset 0
glEnableClientState(GL_NORMAL_ARRAY);
glNormalPointer(GL_FLOAT, 0, (void*)12);  // stride 0, offset 12 bytes
glDrawArrays(GL_TRIANGLES, 0, 36);        // 36 vertices = 12 triangles
glDisableClientState(GL_VERTEX_ARRAY);
glDisableClientState(GL_NORMAL_ARRAY);
glBindBuffer(GL_ARRAY_BUFFER, 0);
Interestingly, if I use the above code but leave the normal floats out of the array, I still get a cube, just one with ridiculously disjointed shading.
So can someone tell me what I’m doing wrong, and why? I have a feeling it’s because I’m not using the stride and offset parameters properly in glVertexPointer() and glNormalPointer(), but I can’t find any tutorials that clear things up for me.