Minimal 1.5 VBO test just won't work?

Hi.

I'm trying to get a minimal VBO to work, but it won't display anything (everything seems fine, but all I get is a black screen). Does anyone see the (probably stupid) mistake I've made?

(code edited)

GLuint bufferVObject;	// VBO handle

void initScene(){
	// four corners of a unit quad in the z=0 plane
	GLfloat vertexdata[]={0,0,0, 1,0,0, 1,1,0, 0,1,0};
	glGenBuffers(1, &bufferVObject);
	glBindBuffer(GL_ARRAY_BUFFER, bufferVObject);
	glBufferData(GL_ARRAY_BUFFER, sizeof(vertexdata), &vertexdata, GL_STATIC_DRAW);
}

void renderScene(void) {
	glClear(GL_COLOR_BUFFER_BIT);
	glTranslatef(-0.5,-0.5,-1);
	glColor3f(1.0,0.0,0.0);

	// source vertex positions from the bound VBO
	glEnableClientState(GL_VERTEX_ARRAY);
	glBindBuffer(GL_ARRAY_BUFFER, bufferVObject);
	glVertexPointer(3, GL_FLOAT, 0, 0);
	glBegin(GL_QUADS);
		glArrayElement(0);
		glArrayElement(1);
		glArrayElement(2);
		glArrayElement(3);
	glEnd();
	glBindBuffer(GL_ARRAY_BUFFER,0);
	glDisableClientState(GL_VERTEX_ARRAY);
	glFlush();
}

glArrayElement() submits only one vertex, so you need to submit the other three vertices as well:

glBegin(GL_QUADS);
  glArrayElement(0);
  glArrayElement(1);
  glArrayElement(2);
  glArrayElement(3);
glEnd();

Yep, that was my first stupid mistake :slight_smile:

I corrected it, but there is still nothing on screen.

Does anyone have a small VBO test done in 1.5 style (not EXT) that I could look at?

Anyway, I'm using the typical viewport setup from every GLUT example, so the error should be in the way I'm using the VBO, not in the positioning, right? (As far as I can see, this should draw the quad in the center of the screen):

void changeSize(int w, int h) {
	// avoid a divide by zero when the window is minimized
	if(h == 0)
		h = 1;
	float ratio = 1.0 * w / h;

	glMatrixMode(GL_PROJECTION);
	glLoadIdentity();

	glViewport(0, 0, w, h);

	gluPerspective(45, ratio, 1, 1000);

	// camera at z=5 looking down the negative z axis
	glMatrixMode(GL_MODELVIEW);
	glLoadIdentity();
	gluLookAt(0.0, 0.0, 5.0,
		0.0, 0.0, -1.0,
		0.0, 1.0, 0.0);
}

In this line:
glBufferData(GL_ARRAY_BUFFER, sizeof(vertexdata), &vertexdata, GL_STATIC_DRAW);
you shouldn’t use the ‘address-of’ operator on ‘vertexdata’.
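That is, pass the array itself (it already decays to a pointer to the first element); a quick sketch of that change:

glBufferData(GL_ARRAY_BUFFER, sizeof(vertexdata), vertexdata, GL_STATIC_DRAW);
// or, equivalently:
glBufferData(GL_ARRAY_BUFFER, sizeof(vertexdata), &vertexdata[0], GL_STATIC_DRAW);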

Are you sure about this? I thought it should be a pointer, not the data itself.

Anyway, I tried it, but it doesn't work now either.

Actually, I tried the code on another computer/card: a normal NVIDIA 6800, compared to an NVIDIA 6800 GL on the other computer (both using driver 61.77).

On the computer with the normal 6800 card, this code actually crashes the machine (both when using a pointer and when using the data), so I guess the error is something else (or could there be a bug in the driver?).

Your stride parameter in glVertexPointer is wrong. It should be 0.

If you have face culling enabled, then note that you have defined the vertices clockwise.
Disable culling (glDisable(GL_CULL_FACE)), change the vertex order in your array, or set glFrontFace(GL_CW).
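
For example, either of these should do it (just a sketch of the first and last options):

glDisable(GL_CULL_FACE);   // simplest while debugging: cull nothing
// or keep culling but treat clockwise-wound polygons as front faces:
glFrontFace(GL_CW);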

Hope this helps

Thanks for the suggestions.

I fixed the stride parameter and made sure face culling is off.

But still nothing is drawn in the GLUT window (other than the black background), and it still crashes my computer. Shortly after starting the program, the monitor goes black for a second every few seconds (it seems to try to change resolution), and then the computer reboots.

Any suggestions?

In your render loop, you start by calling glClear with the color buffer flag, but not the depth buffer flag!
This means that your geometry is (probably, unless there are other errors) drawn once and z-culled the rest of the time.
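
Something like this at the top of renderScene (assuming the window was created with a depth buffer, e.g. GLUT_DEPTH in glutInitDisplayMode):

glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);  // clear color and depth every frame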

As for the vertexdata pointer, Now is right: vertexdata is an array, which already decays to a pointer to its first element, so you should use ‘&vertexdata[0]’ or just ‘vertexdata’. With ‘&vertexdata’ you get a pointer to the whole array instead (the same address, but a different type), so prefer the plain form.
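
To illustrate (a sketch; all three expressions evaluate to the same address, just with different types):

GLfloat vertexdata[12];
// vertexdata      decays to GLfloat*        (pointer to the first element)
// &vertexdata[0]  is        GLfloat*        (pointer to the first element)
// &vertexdata     is        GLfloat (*)[12] (pointer to the whole array)
glBufferData(GL_ARRAY_BUFFER, sizeof(vertexdata), vertexdata, GL_STATIC_DRAW);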

Enjoy,

Anders

Yay, it works…

The final stupid mistake was that I forgot to clear the z-buffer, as Abrodersen suggested.

Thanks all.