How to use glMultiDrawElements and glDrawElements

One thing that bothers me with glDrawElements is the way it’s used with VAOs that have IBOs. This is how I would normally use it to draw a bunch of triangles, for example:


glDrawElements(GL_TRIANGLES, numIndices, GL_UNSIGNED_INT, (void*)startingIndex);

where numIndices is the number of indices to fetch from the IBO (a multiple of 3 in this case) and startingIndex acts as a byte offset into the IBO.
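For instance, here’s a sketch (my own, assuming a bound VAO whose IBO holds GLuint indices) of drawing two consecutive ranges; since the last argument is a byte offset, the index offset gets multiplied by sizeof(GLuint):


// Sketch: two draws from the same GLuint IBO; the offset is in bytes.
glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_INT, (void*)(0 * sizeof(GLuint)));
glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_INT, (void*)(6 * sizeof(GLuint)));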

Having to cast an integer like startingIndex to a pointer is awkward, unintuitive and reeks of dirty reuse of legacy function names. It’s so obscure that if you check the glDrawElements reference pages or the OpenGL Wiki for this function’s description, it doesn’t even mention this usage mode and instead only mentions an array as a possible parameter.

So my first question is: have I been using the wrong function all this time, and is there a more suitable one for drawing “elements” from an IBO-based VAO? I assume the answer is no, but it’s worth asking.

My second question is in a similar vein. Look at the prototype of glMultiDrawElements:



void glMultiDrawElements(GLenum mode, const GLsizei *count, GLenum type, const GLvoid * const *indices, GLsizei drawcount);

count is clear enough: you give it an array of GLsizei, which happens to be defined as int, which is 32 bits. Good enough.

But now look at indices. This is defined as const GLvoid * const * (effectively a void **). However, to use this function with VAOs you don’t need to pass a pointer to a pointer; you simply pass a single array (thus a single pointer). And because it’s asking for void * elements, you need to pass an array whose entries are the size of a void *, which obviously varies depending on the platform, and for which you have no GL-defined type except GLvoid *.
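For context, that signature is a holdover from client-side vertex arrays, where each element of indices really was a pointer into application memory. A quick sketch of that legacy usage (my own example; compatibility profile, no IBO bound):


// Legacy client-side usage: every element of "indices" is a real pointer.
GLuint tri0[] = { 0, 1, 2 };
GLuint tri1[] = { 3, 4, 5 };
const GLvoid* indexArrays[] = { tri0, tri1 };
GLsizei triCounts[] = { 3, 3 };
glMultiDrawElements(GL_TRIANGLES, triCounts, GL_UNSIGNED_INT, indexArrays, 2);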

So with an IBO bound, in other words, this doesn’t work (in fact, it crashed the NVIDIA driver on my test machine, unsurprisingly):


GLsizei counts[] = { 3, 3, 3 };
unsigned int starts[] = { 0, 6, 11 };

glMultiDrawElements(GL_TRIANGLES, counts, GL_UNSIGNED_INT, (const void **)starts, 3);

because, obviously, unsigned ints are not 64-bit. So to fix this, one must do this:


GLsizei counts[] = { 3, 3, 3 };
int64_t starts[] = { 0, 6, 11 };

glMultiDrawElements(GL_TRIANGLES, counts, GL_UNSIGNED_INT, (const void **)starts, 3);

Which works, because int64_t is exactly the same size as a pointer in a 64-bit build.
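That equivalence is worth checking rather than assuming; a one-line sketch:


// Holds in a 64-bit build, fails in a 32-bit one:
static_assert(sizeof(int64_t) == sizeof(void*), "int64_t is not pointer-sized");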

But what happens when you compile as 32-bit? Well, obviously void* is now going to be 32 bits in size, so int64_t is no longer a good choice for the size of the “starts”.

The obvious solution would be to use the GL-defined type, but because this type is GLvoid *, you must declare things like this:


GLsizei counts[] = { 3, 3, 3 };
GLvoid* starts[] = { (GLvoid*)0, (GLvoid*)6, (GLvoid*)11 };

glMultiDrawElements(GL_TRIANGLES, counts, GL_UNSIGNED_INT, (const void **)starts, 3);

What kind of backwards madness is this?

I must be doing something wrong, because I honestly cannot see why this was designed this way. With the number of functions OpenGL has, including some with names of biblical proportions like glDrawElementsInstancedBaseVertexBaseInstance or glDrawTransformFeedbackStreamInstanced, there’s no way there isn’t a more suitable function for rendering VAOs with IBOs.

So, is there anything I’m missing here? Perhaps I’m doing something wrong.

Thank you for reading.

It’s so obscure that if you check the glDrawElements reference pages or the OpenGL Wiki for this function’s description, it doesn’t even mention this usage mode and instead only mentions an array as a possible parameter.

Nonsense. It says so right there: “starting at indices (interpreted as a byte count)”. And I didn’t just add that to the Wiki. I’m not kidding; check the edit history; someone else did that last year.

Granted, it’s not on the man pages. And it’s not very prominent. But the information is still there.

The obvious solution would be to use the GL-defined type, but because this type is GLvoid *, you must declare things like this:

No, the obvious solution would be to use an integer that is the size of a pointer. Which is exactly how GLintptr is defined.
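Applied to the earlier example, that looks something like this (a sketch; note the starts are also converted to byte counts, per the “byte count” interpretation quoted above):


GLsizei counts[] = { 3, 3, 3 };
// GLintptr is an integer type defined to be the size of a pointer.
GLintptr starts[] = { 0 * sizeof(GLuint), 6 * sizeof(GLuint), 11 * sizeof(GLuint) };

glMultiDrawElements(GL_TRIANGLES, counts, GL_UNSIGNED_INT, (const void **)starts, 3);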

What kind of backwards madness is this?

The OpenGL kind. Best to get used to it.

The OpenGL kind. Best to get used to it.

Using GL is like riding a roller coaster. I love some of it and can’t help but hate the rest.

Thanks for answering!