glDrawArrays not seeing VBOs

Hi guys,

I seem to have the same error as in this post:
http://www.opengl.org/discussion_boards/ubbthreads.php?ubb=showflat&Number=240776

However, I am not using display lists. My graphics card is an ATI HD 4330 (mobile) with OpenGL 3.0 drivers (though I should be in a 2.1 context). I can’t even get this far on my secondary Intel card, but that was to be expected.


// Create and bind the VBO.
glGenBuffers( 1, &vboID );
glBindBuffer( GL_ARRAY_BUFFER, vboID );

// Quad corners as a triangle strip (x, y pairs).
GLint guiVertexArray[] = { realX,     realY,
                           realX,     realY + h,
                           realX + w, realY,
                           realX + w, realY + h };

// Upload the data.
glBufferData( GL_ARRAY_BUFFER, sizeof(guiVertexArray),
              &guiVertexArray, GL_STATIC_DRAW );

glBindBuffer( GL_ARRAY_BUFFER, vboID );  // (re)bind before drawing

glVertexPointer( 2, GL_INT, 0, NULL );   // two GL_INT coords per vertex

glDrawArrays( GL_TRIANGLE_STRIP, 0, 4 ); // access violation happens here

I’m getting an access violation for attempting to access 0x00000000 on the glDrawArrays call. The same code works fine on the nVidia card in my PC, so I don’t think it’s an error on my part. I use GLEW 1.5.1 to load any extensions.

Any ideas guys?

Many thanks

I don’t see “glEnableClientState(GL_VERTEX_ARRAY);” in that code. Did you omit it?

Sorry, yes, I did omit it, but it is in there.

I had a strangely similar bug on a GeForce 3 with a 95-something driver, where using display lists with non-VBO-resident geometry crashed unless I disabled vertex buffers first (glBindBuffer(GL_ARRAY_BUFFER, 0)). It shouldn’t have crashed, but it did anyway. My guess is that perhaps you’ve left some state enabled that you shouldn’t have? In general it is very easy to get access violations and strange results with VBOs if you’re not careful.

As for the GL 3.0 context: yeah, I noticed it myself. When using wglChoosePixelFormatARB (NOT wglCreateContextAttribsARB), it always created a 3.0 context on my nVidia 8400 (an uber-crappy card, BTW; even a 6600 outperforms it).
Not a problem though, since it is backwards compatible anyway.
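If you ever do need to pin the version down, wglCreateContextAttribsARB takes an attribute list. A minimal sketch, assuming GLEW’s wglew header is available and the driver exposes WGL_ARB_create_context (createGL21Context is just a made-up name):

#include <windows.h>
#include <GL/glew.h>
#include <GL/wglew.h>

// Hypothetical helper: explicitly ask for a 2.1 context instead of
// taking whatever version the driver hands back by default.
HGLRC createGL21Context( HDC hDC )
{
    const int attribs[] = {
        WGL_CONTEXT_MAJOR_VERSION_ARB, 2,
        WGL_CONTEXT_MINOR_VERSION_ARB, 1,
        0  // attribute list terminator
    };

    if ( !wglCreateContextAttribsARB )  // extension not loaded/available
        return NULL;

    return wglCreateContextAttribsARB( hDC, NULL, attribs );
}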

glBufferData( GL_ARRAY_BUFFER, sizeof(guiVertexArray), &guiVertexArray, GL_STATIC_DRAW );

Are you really, really sure you want a pointer to the pointer? I’m not. Drop the ‘&’.

Good point, though the compiler seems to understand that I want the address of the array. I’ll see if it helps.

I’ve tried all the above advice and I’m afraid that the error hasn’t changed.

Have you tried using floats instead of ints, and three coords instead of two? Just to see if there’s another nice bug in the ATI drivers :)

Also, is this (in essence) the only code you’ve got? If not, what if you reduce it to only draw this?

Why do you draw the buffer right after creating it, anyway? Don’t you have some sort of loading stage and rendering stage in your app?
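Something like this is what I mean (just a sketch; initQuad/renderQuad are made-up names, and I’ve kept your GLint data as-is):

#include <GL/glew.h>

GLuint vboID = 0;

// Load stage: create and fill the buffer once.
void initQuad( GLint realX, GLint realY, GLint w, GLint h )
{
    GLint guiVertexArray[] = { realX,     realY,
                               realX,     realY + h,
                               realX + w, realY,
                               realX + w, realY + h };

    glGenBuffers( 1, &vboID );
    glBindBuffer( GL_ARRAY_BUFFER, vboID );
    glBufferData( GL_ARRAY_BUFFER, sizeof(guiVertexArray),
                  guiVertexArray, GL_STATIC_DRAW );
    glBindBuffer( GL_ARRAY_BUFFER, 0 );  // unbind until draw time
}

// Render stage: runs every frame.
void renderQuad()
{
    glBindBuffer( GL_ARRAY_BUFFER, vboID );
    glEnableClientState( GL_VERTEX_ARRAY );
    glVertexPointer( 2, GL_INT, 0, NULL );
    glDrawArrays( GL_TRIANGLE_STRIP, 0, 4 );
    glDisableClientState( GL_VERTEX_ARRAY );
    glBindBuffer( GL_ARRAY_BUFFER, 0 );
}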

“glBufferData( GL_ARRAY_BUFFER, sizeof(guiVertexArray), &guiVertexArray, GL_STATIC_DRAW );” is definitely wrong. Omit the &.

As for “glEnableClientState(GL_VERTEX_ARRAY);”: make sure no one calls glDisableClientState(GL_VERTEX_ARRAY), especially if you use third-party OpenGL classes for anything.

Also make sure no other array (color, normal, etc.) is enabled before rendering; glDisableClientState everything you don’t use before the draw call.
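Roughly like this before each draw (a sketch, reusing the vboID from your code):

// Make sure only the vertex array is enabled before the draw call.
glDisableClientState( GL_COLOR_ARRAY );
glDisableClientState( GL_NORMAL_ARRAY );
glDisableClientState( GL_TEXTURE_COORD_ARRAY );
glEnableClientState( GL_VERTEX_ARRAY );

glBindBuffer( GL_ARRAY_BUFFER, vboID );
glVertexPointer( 2, GL_INT, 0, NULL );
glDrawArrays( GL_TRIANGLE_STRIP, 0, 4 );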

Hoho! Thank you! It’s an ATI bug (Catalyst 9.3; sadly I can’t go to 9.5 due to mobile gfx): it doesn’t like GL_INT. After changing to GL_FLOAT (and the associated array) it worked fine. It’s a tad annoying, as I wanted a clear absolute coordinate system, but I guess that can just be hidden from the end user.
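For anyone finding this later, the change that made it work looks roughly like this (sketch only; the casts assume realX/realY/w/h stay as ints):

// Store the quad as floats and point glVertexPointer at GL_FLOAT.
GLfloat guiVertexArray[] = { (GLfloat)realX,       (GLfloat)realY,
                             (GLfloat)realX,       (GLfloat)(realY + h),
                             (GLfloat)(realX + w), (GLfloat)realY,
                             (GLfloat)(realX + w), (GLfloat)(realY + h) };

glBufferData( GL_ARRAY_BUFFER, sizeof(guiVertexArray),
              guiVertexArray, GL_STATIC_DRAW );

glVertexPointer( 2, GL_FLOAT, 0, NULL );  // GL_FLOAT instead of GL_INT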

Regarding passing an array as &array: I’ve looked it up, and it is in fact well-defined, not just a compiler quirk. &array is a pointer to the whole array rather than to its first element; it has the same address, just a different type. That’s also why forms like (&array) let you pass an array around while keeping its dimensions.
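A tiny standalone snippet shows what I mean (plain C++, no GL required):

#include <cstdio>

int main()
{
    int arr[8] = { 0 };

    printf( "%p\n", (void*)arr );      // decays to &arr[0]: type int*
    printf( "%p\n", (void*)&arr[0] );  // same address
    printf( "%p\n", (void*)&arr );     // same address, but type int (*)[8]

    // Through a pointer-to-array, sizeof still sees the whole array:
    int (*whole)[8] = &arr;
    printf( "%u\n", (unsigned)sizeof(*whole) ); // 8 * sizeof(int)

    return 0;
}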

Excellent, well, at least you figured it out :D

I suspect that internally the driver would have converted to float anyway, but perhaps I’m wrong.