Handled exception: access violation on first call of glDrawElements

Hello,
I’m using VAOs with GLEW to render. I didn’t notice it at first, but I get a handled (!) non-interrupting exception on (only) my first call of glDrawElements:

First-chance exception at 0x027AFD84 (ig7icd32.dll) in MyApp.exe: 0xC0000005: Access violation reading location 0x00000000.

It’s listed under “Win32 Memory Violation Exception” in Visual Studio. As I said, it doesn’t crash or stop my app, but I wonder why it gets thrown in the first place.

Thanks!

Could you post your code here?

What part exactly? Here are my init, update (data upload) and draw functions:

void Vao::initGL() {
	if(vaoHandle == 0) {
		glGenVertexArrays(1, &vaoHandle);
		glGenBuffers(1, &vboHandle);
		glGenBuffers(1, &iboHandle);
	}
	glBindVertexArray(vaoHandle);
	bindVboGL();
	bindIboGL();
	glBindVertexArray(0);
}

void Vao::drawGL() {
	if(vaoHandle == 0) initGL();
	glBindVertexArray(vaoHandle);
	for(std::vector<VaoEntry>::size_type i = 0; i < entries.size(); i++) {
		if(!entries.at(i).isVisible()) continue; 
		checkShaderProgramGL(entries.at(i).getShaderId());
		checkEntryUpdatesGL();
		entries.at(i).getShaderProgram()->bindGL();
		entries.at(i).getShaderProgram()->uploadUniformsGL(entries.at(i).getUniformKey());
		// First call throws handled exception
		glDrawElements(mode, entries.at(i).getCount(), GL_UNSIGNED_SHORT, 
			reinterpret_cast<GLvoid *>(entries.at(i).getOffset() * sizeof(GLushort)));
		entries.at(i).getShaderProgram()->unbindGL();
	}
	glBindVertexArray(0);
}

void Vao::updateGL() {
	int vertexSize = 0;
	int indexSize = 0;
	for(std::vector<VaoEntry>::size_type i = 0; i < entries.size(); i++) {
		vertexSize += entries.at(i).getSize();
		indexSize += entries.at(i).getCount();
	}
	GLfloat* vertexBuffer = new GLfloat[vertexSize];
	GLushort* indexBuffer = new GLushort[indexSize];
	vertexSize = 0;
	indexSize = 0;
	for(std::vector<VaoEntry>::size_type i = 0; i < entries.size(); i++) {
		entries.at(i).collectVertexData(vertexBuffer, &vertexSize);
		entries.at(i).collectIndexData(indexBuffer, &indexSize);
	}

	glBindBuffer(GL_ARRAY_BUFFER, vboHandle);
	glBufferData(GL_ARRAY_BUFFER, vertexSize * sizeof(GLfloat), vertexBuffer, usage);
	glBindBuffer(GL_ARRAY_BUFFER, 0);

	glBindVertexArray(vaoHandle);
	glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, iboHandle);
	glBufferData(GL_ELEMENT_ARRAY_BUFFER, indexSize * sizeof(GLushort), indexBuffer, usage);
	glBindVertexArray(0);

	delete[] vertexBuffer;
	delete[] indexBuffer;
}

Just a few checks:
Are you binding the vertex array before uploading data into the array buffer? Also, where are your glVertexAttribPointer and glEnableVertexAttribArray calls?

void Vao::bindVboGL() {
	glBindBuffer(GL_ARRAY_BUFFER, vboHandle);
	for(std::vector<VertexAttribute*>::size_type i = 0; i < attribs.size(); i++) {
		attribs.at(i)->bindVertexPointerGL();
	}
}

void VertexAttribute::bindVertexPointerGL() {
	glEnableVertexAttribArray(index);
	glVertexAttribPointer(index, size, type, normalized, stride, reinterpret_cast<GLvoid*>(index * sizeof(type)));
}

are you binding vertexarray

Not quite sure what you mean by that, but all the GL calls that upload data are in Vao::updateGL().

Binding the vertex array means:

	glBindVertexArray(vaoHandle);
	glBindBuffer(GL_ARRAY_BUFFER, vboHandle);
	glBufferData(GL_ARRAY_BUFFER, vertexSize * sizeof(GLfloat), vertexBuffer, usage);
	glBindBuffer(GL_ARRAY_BUFFER, 0);

	glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, iboHandle);
	glBufferData(GL_ELEMENT_ARRAY_BUFFER, indexSize * sizeof(GLushort), indexBuffer, usage);
	glBindVertexArray(0);

A few things to mention:
If you are not passing any data other than indices into the index buffer, the last parameter should be a 0 offset; it states the starting point. The same applies to glVertexAttribPointer: the last parameter should be 0, and the stride as well, since I don’t see you passing texture coordinates, normals or any other data in the array buffer.


	glEnableVertexAttribArray(index);
	glVertexAttribPointer(index, size, type, normalized, stride, reinterpret_cast<GLvoid*>(index * sizeof(type)));

Here you should be passing an attribute of the array buffer, not the index buffer; the data described by glVertexAttribPointer comes from the array buffer. In addition, make sure you call glBindBuffer and glEnableVertexAttribArray in sequence, so that they are matched to the intended buffer.

[QUOTE=debonair;1254724]Binding the vertex array means:
A few things to mention:
If you are not passing any data other than indices into the index buffer, the last parameter should be a 0 offset; it states the starting point.
[/QUOTE]
I use offsets into the index buffer to switch shader programs. E.g. I have two triangles in my index buffer; I draw one with offset 0 and the other with offset 3 * sizeof(GLushort), changing the shader in between.

The same applies to glVertexAttribPointer: the last parameter should be 0, and the stride as well, since I don’t see you passing texture coordinates, normals or any other data in the array buffer.

I interleave the vertex attributes in one VBO; that’s where the offset and stride come from.

Here you should be passing an attribute of the array buffer, not the index buffer; the data described by glVertexAttribPointer comes from the array buffer. In addition, make sure you call glBindBuffer and glEnableVertexAttribArray in sequence, so that they are matched to the intended buffer.

	glBindBuffer(GL_ARRAY_BUFFER, vboHandle);
	glEnableVertexAttribArray(index);
	glVertexAttribPointer(index, size, type, normalized, stride, reinterpret_cast<GLvoid*>(index * sizeof(type)));

These calls immediately follow each other, so there should be nothing wrong with that. If there is, please elaborate.

Well, I just want to point out a few odd things:

> The exception is handled and doesn’t crash the program
> The exception names the Intel driver DLL (ig7icd32.dll) (I use an Intel HD 4000)
> The exception only occurs on the very first call of glDrawElements
> I can’t detect any errors in my OpenGL state at that moment

If someone could help I’d be very thankful!