Debugging glDrawElements: any tips?

Hello all,

I am trying to use VBOs in conjunction with glDrawElements. It was working fine up till now with my own models. But now I wrote a very simple .3ds loader which fills up my VBOs, and now glDrawElements throws a segmentation fault. Here is the stack trace:

#0 76EFE23E ntdll!LdrWx86FormatVirtualImage() (C:\Windows\system32\ntdll.dll:??)
#1 00000000 ??() (??:??)

As you can see my OS is Windows. I guess that the problem lies in the indices of the arrays. But how can I figure it out? I tried moving only, say, the first 50 triangles of the model into the VBO and then drawing them with glDrawElements, but I still get the same error.

I am trying to think of other ways to figure out the problem. Do you have any tips as to how to debug such a problem?

This is how I transfer the data into the VBOs:


    glBufferSubData(GL_ARRAY_BUFFER, m->vboOffset, m->getVerticesN() * sizeof(smVertex), m->getVertices());
    glBufferSubData(GL_ELEMENT_ARRAY_BUFFER, m->iboOffset, m->getTrianglesN() * sizeof(unsigned short) * 3, m->getTriangles());

And this is how I use glDrawElements:

 glDrawElements(GL_TRIANGLES, getModel(0).getTrianglesN(), GL_UNSIGNED_SHORT, BUFFER_OFFSET(0));

I know that code won’t help a lot in determining the problem, so that’s why I am asking for things to look at so I can find it myself.
Thanks in advance people!

Do you really mean to be loading vertex attributes and DrawElements indices into the same buffer?

And this is how I use glDrawElements:

 glDrawElements(GL_TRIANGLES, getModel(0).getTrianglesN(), GL_UNSIGNED_SHORT, BUFFER_OFFSET(0));

So why are you subloading indices to offset m->iboOffset but telling DrawElements they’re at offset 0?
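To make the two ends agree, the byte offset you subload at and the byte offset you hand DrawElements have to be the same value. A minimal sketch of the arithmetic (the BUFFER_OFFSET definition below is my assumption, since yours wasn't posted):

```c
#include <stddef.h>

/* Typical definition of this macro -- an assumption, since yours wasn't posted. */
#define BUFFER_OFFSET(i) ((char *)NULL + (i))

/* If the indices were subloaded at byte offset iboOffset, the draw call must
   start reading at that same byte offset, i.e.
   glDrawElements(GL_TRIANGLES, count, GL_UNSIGNED_SHORT, index_draw_offset(m->iboOffset)); */
static const void *index_draw_offset(size_t iboOffset)
{
    return BUFFER_OFFSET(iboOffset);   /* not BUFFER_OFFSET(0) */
}
```

In other words, BUFFER_OFFSET(m->iboOffset) rather than BUFFER_OFFSET(0), unless m->iboOffset really is 0.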

[quote]I know that code won’t help a lot in determining the problem, so that’s why I am asking for things to look at so I can find it myself.[/quote]

The above code of course assumes that an appropriately allocated and sized VBO is already created and bound before all this.

Also you didn’t reveal your vertex attribute setup and enables. When I’ve had a core dump like this due to batch setup, it’s usually because one of these is bogus.

Do you really mean to be loading vertex attributes and DrawElements indices into the same buffer?

Thanks for the reply!
Hmm… what do you mean by that? I needed to make sure the buffers are bound before using glBufferSubData? Strange that my own shapes worked with VBOs so far, then. So it should be something like this?


    glBindBuffer(GL_ARRAY_BUFFER, vboID[VERTICES_BUFFER_OBJECT]);
    glBufferSubData(GL_ARRAY_BUFFER, m->vboOffset, m->getVerticesN() * sizeof(smVertex), m->getVertices());
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, vboID[INDICES_BUFFER_OBJECT]);
    glBufferSubData(GL_ELEMENT_ARRAY_BUFFER, m->iboOffset, m->getTrianglesN() * sizeof(unsigned short) * 3, m->getTriangles());

By reading the glBindBuffer documentation once more, that seems to be the case. Unfortunately this does not solve my problem.

As for my vertex attrib setup and enables, along with VBO generation, here it is:

    //Generate buffer object identifiers
    glGenBuffers(BUFFER_OBJECTS_NUM, vboID);

    //Bind the first buffer, saying it will be used for vertices (GL_ARRAY_BUFFER)
    glBindBuffer(GL_ARRAY_BUFFER, vboID[VERTICES_BUFFER_OBJECT]);
    //pointer is NULL, which means I want GL to allocate memory but not initialize it (glBufferSubData does that later)
    glBufferData(GL_ARRAY_BUFFER, VERTEX_BUFFER_SIZE, NULL, GL_STATIC_DRAW);
    //specifying how data are stored in the VBO

    //If using shaders:
    //will submit vertex coords on index 0
    glEnableVertexAttribArray(smATTRIB_I_VERTEX_COORDS);
    glVertexAttribPointer(smATTRIB_I_VERTEX_COORDS, 3, GL_FLOAT, GL_FALSE, sizeof(smVertex), BUFFER_OFFSET(0));
    //will submit normal coords on index 1
    glEnableVertexAttribArray(smATTRIB_I_VERTEX_NORMAL);
    glVertexAttribPointer(smATTRIB_I_VERTEX_NORMAL, 3, GL_FLOAT, GL_FALSE, sizeof(smVertex), BUFFER_OFFSET(12));

    //Bind the second buffer, saying it will be used for indices (GL_ELEMENT_ARRAY_BUFFER)
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, vboID[INDICES_BUFFER_OBJECT]);
    //pointer is NULL, which means we want GL to allocate memory but not initialize it.
    glBufferData(GL_ELEMENT_ARRAY_BUFFER, INDICES_BUFFER_SIZE, NULL, GL_STATIC_DRAW);

I made this code by reading various guides on VBO usage, so I may well have misunderstood something about their usage. So any help is appreciated :slight_smile:

I needed to make sure the buffers are bound before using glBufferSubData?

Yes. Each bind target can have a different buffer bound to it. Since you’re only using one array buffer and one element buffer, you could get away without it, but if you tried to change buffers (for different objects, etc), you’d have problems.

Typically you load vertex attributes and index lists into separate buffers. Though I don’t think that’s strictly necessary. Your example in the last message (with a different bind before each subload) is more what I was expecting to see.

I made this code by reading various guides on VBO usage, so I may well have misunderstood something about their usage. So any help is appreciated :slight_smile:

Every parameter is parameterized or defined in terms of something else that’s not included above, so it’s pretty hard to check for errors.

I suggest you first get this right with hard-coded constants in a little GLUT test program. If you have problems with it, post it in its entirety.

Then once you get it working go crazy parameterizing everything again.
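For concreteness, here is the kind of contrived hard-coded batch I mean, together with the byte sizes you'd feed glBufferData/glBufferSubData. The smVertex layout is my guess, reconstructed from your attrib setup (3 floats position at offset 0, 3 floats normal at offset 12); also note that glDrawElements wants the number of indices (triangles * 3), not the number of triangles:

```c
#include <stddef.h>

/* One hard-coded, contrived batch: a single triangle.  The smVertex layout
   here is an assumption, reconstructed from the attrib setup above. */
typedef struct { float pos[3]; float nrm[3]; } smVertex;

static const smVertex verts[3] = {
    { {0.0f, 0.0f, 0.0f}, {0.0f, 0.0f, 1.0f} },
    { {1.0f, 0.0f, 0.0f}, {0.0f, 0.0f, 1.0f} },
    { {0.0f, 1.0f, 0.0f}, {0.0f, 0.0f, 1.0f} },
};
static const unsigned short tris[1][3] = { {0, 1, 2} };

/* The byte sizes you'd hand to glBufferData / glBufferSubData, and the
   index count (triangles * 3) you'd hand to glDrawElements: */
enum { N_TRIANGLES = 1, N_VERTS = 3 };
#define VERTEX_BYTES (N_VERTS * sizeof(smVertex))
#define INDEX_BYTES  (N_TRIANGLES * 3 * sizeof(unsigned short))
#define DRAW_COUNT   (N_TRIANGLES * 3)
```

With data this small, any mismatch between what you uploaded and what you told GL about it becomes obvious.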

Just off-the-cuff, you’re “still” loading your vertex attribute buffer at offset m->vboOffset, but then setting up your vertex attributes as if m->vboOffset == 0. Does it?

And you load your index list at buffer offset m->iboOffset, but then draw as if m->iboOffset == 0. Does it?

Also try disabling all 16 vertex attributes first, just to make sure you haven’t left some enabled you forgot about.

Hello again. I have been trying to find this damn problem for some days now and still no luck. I have reduced it to its most basic form, but I can’t post it as an example here since it is scattered over lots of source files which use my own data structures. And I am building this on top of another program I had, so … well… you get the picture.

DarkPhoton, thank you for all your input up till now. Yes, vboOffset and iboOffset are zero … and since I only draw one model they are always zero. It’s the same as if I gave 0 as a constant there.
I do not mess with vertex arrays or glVertexAttribPointer in any other parts of my code except the one I posted above, which is in the initialization section, so I can’t have left any enabled that I forgot about.

I really can’t understand how to find this error. Does anyone know of any good example of OpenGL 3.0 code with VBOs, shaders and VAOs which loads a .3ds model? Because when I return to my primitive shape loading and rendering function it works just fine. The problem is when I try to load the .3ds model, no matter which model I tried.

And the funny thing is that it can’t even draw 1 triangle of it.
So if I do this:


glDrawElements(GL_TRIANGLES, 1, GL_UNSIGNED_SHORT, BUFFER_OFFSET(0));

It still crashes. Or sometimes it draws the triangles of the model’s mesh (I can see them in my glCanvas) and then immediately crashes. The fact that it is not consistent in the way it crashes should be telling me something, but I don’t have enough OpenGL experience to know what that something is. Any pointers?

That is not its most basic form. Its most basic form would be a simple GLUT test program with just one batch of data (possibly completely contrived) hard-coded into arrays. This you could post.

I really can’t understand how to find this error. …It still [crashes].

You have memory problems, either in your OpenGL code or elsewhere in your program. If you can’t see the error, you need a memory debugger or an OpenGL debugger to point out the problem. For the former, I use Valgrind. For the latter, gDEBugger (Linux, but available for MSWindows too); in the free world I have also read good things about GLIntercept. I suggest you run your app with an OpenGL debugger and at least capture the OpenGL calls you’re making and the order you’re making them in. You may spot the bug immediately. If not, post the call list and the GL code.
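One more cheap check before reaching for those tools: since your loader fills the index array, walk it on the CPU before uploading and verify every index is in range for the vertex buffer. An out-of-range index in a DrawElements batch is a classic cause of exactly this kind of intermittent crash. A sketch (the function name is mine, not from your code):

```c
#include <stddef.h>

/* Cheap CPU-side sanity check to run on what the .3ds loader produced,
   before it ever reaches the GL: every index must be smaller than the
   number of vertices actually uploaded.  Returns the position of the
   first out-of-range index, or -1 if they are all valid. */
static long first_bad_index(const unsigned short *indices, size_t nIndices,
                            size_t nVertices)
{
    size_t i;
    for (i = 0; i < nIndices; ++i)
        if (indices[i] >= nVertices)
            return (long)i;
    return -1;
}
```

If this fires, the bug is in the loader, not in your GL setup. Also remember that GL_UNSIGNED_SHORT indices top out at 65535, so a big model cannot be indexed with them at all.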

The fact that it is not consistent in the way it crashes should be telling me something, but I don’t have enough OpenGL experience to know what that something is. Any pointers?

Besides the above, start with a working test program and gradually make changes to make it like your current (broken) code. See what breaks it. You’re off in the weeds and aren’t sure which direction is back to the road. Or just post a short GLUT test program so we can give you some concrete help.