Feeling like a total noob thanks to OpenGL’s VBOs. I’ve been in DirectX land for the last 8 years, and I’m in the middle of writing an OpenGL implementation for my rendering abstraction layer.
Everything has gone fairly smoothly except for implementing VBOs. For whatever reason, my shader isn’t operating on the vertex attributes the way it should. I’ve tested the shaders in RenderMonkey and inspected the VBOs with gDEBugger to make sure they’re all valid. To top it off, I’ve tried rendering some fake data in immediate mode and compared the results against a comparable fake-data VBO. The immediate-mode path works, while the VBO path fails to render anything.
Ignoring the fact that this code creates and deletes a VBO each frame, this encapsulates my problem:
static bool bImmediate = false;
if (bImmediate)
{
    glBegin(GL_TRIANGLE_STRIP);
    glColor4f(1.f, 0.f, 0.f, 1.f);
    glVertex3f(-.5f, -.5f, 1.f);
    glColor4f(0.f, 1.f, 0.f, 1.f);
    glVertex3f(.5f, -.5f, 1.f);
    glColor4f(0.f, 0.f, 1.f, 1.f);
    glVertex3f(-.5f, .5f, 1.f);
    glColor4f(0.f, 1.f, 1.f, 1.f);
    glVertex3f(.5f, .5f, 1.f);
    glEnd();
}
else
{
    // x, y, z, r, g, b, a
    float vertexData[28] =
    {
        -.5f,  .5f, 1.f,   1.f, 0.f, 0.f, 1.f,
         .5f,  .5f, 1.f,   0.f, 1.f, 0.f, 1.f,
        -.5f, -.5f, 1.f,   0.f, 0.f, 1.f, 1.f,
         .5f, -.5f, 1.f,   0.f, 1.f, 1.f, 1.f,
    };

    uint32 buffer;
    glGenBuffers(1, &buffer);

    int stride = 7 * sizeof(float);
    glBindBuffer(GL_ARRAY_BUFFER, buffer);
    glBufferData(GL_ARRAY_BUFFER, 4 * stride, vertexData, GL_STATIC_DRAW);

    int32 location = glGetAttribLocation(program, "POSITION0");
    glEnableVertexAttribArray(location);
    glVertexAttribPointer(location, 3, GL_FLOAT, GL_FALSE, stride, 0);

    location = glGetAttribLocation(program, "COLOR0");
    glEnableVertexAttribArray(location);
    glVertexAttribPointer(location, 4, GL_FLOAT, GL_FALSE, stride, (void*) (3 * sizeof(GLfloat)));

    glDrawArrays(GL_TRIANGLE_STRIP, 0, 2);

    glBindBuffer(GL_ARRAY_BUFFER, 0);
    glDeleteBuffers(1, &buffer);
}
The layout of my vertex data is 3 floats for position followed by 4 floats for color — 7 floats per vertex, matching the stride above. The shader declares the attributes POSITION0 as a vec3 and COLOR0 as a vec4, and simply transforms the position by a matrix uniform representing a combined model -> world -> projection transform.
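In case it helps, the vertex shader is essentially the following (paraphrased; the uniform and varying names here are stand-ins, not the exact ones in my code):

    uniform mat4 matWorldViewProj; // combined model -> world -> projection (placeholder name)

    attribute vec3 POSITION0;
    attribute vec4 COLOR0;

    varying vec4 vertColor;

    void main()
    {
        vertColor = COLOR0;
        gl_Position = matWorldViewProj * vec4(POSITION0, 1.0);
    }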
I really have no idea what I’m doing wrong. Any pointers?