glDrawElementsBaseVertex problems with base value

Hi!

I’m importing some meshes using Assimp and packing them into 3 buffers (positions, normals and indices), queueing each mesh’s data after the previous one and building a set of draw ranges (start index, index count, etc.) to use with glDrawElements and glDrawElementsBaseVertex. The first mesh uses glDrawElements because its indices start at the beginning of the buffer, but the following ones need an offset added to their index values, so I use glDrawElementsBaseVertex for those.
The problem is that the meshes drawn with glDrawElementsBaseVertex produce garbage on screen. So I tried changing the indices while building their buffer, manually adding the offset to their values. With that everything is fine (both with glDrawElements and with glDrawElementsBaseVertex with a base value of 0).
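The bookkeeping I’m doing while appending meshes looks roughly like this (names are illustrative, not my actual code): each mesh records where its data lands in the shared buffers, counted separately for the index buffer and the vertex buffers.

```cpp
#include <cstddef>

// Hypothetical per-mesh record built while queueing meshes into the
// shared position/normal/index buffers.
struct DrawRange {
    size_t firstIndex;   // offset into the index buffer, in indices
    size_t indexCount;   // how many indices this mesh uses
    size_t vertexOffset; // offset into the position/normal buffers, in vertices
};

// Records where a mesh's data lands when appended after everything
// queued so far, then advances the running totals.
DrawRange appendMesh(size_t& totalVertices, size_t& totalIndices,
                     size_t vertexCount, size_t indexCount)
{
    DrawRange r{ totalIndices, indexCount, totalVertices };
    totalVertices += vertexCount;
    totalIndices  += indexCount;
    return r;
}
```

Note that the two offsets grow at different rates: for two 8-vertex/36-index cubes, the second mesh ends up with firstIndex 36 but vertexOffset 8.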

Am I calculating the base value wrong?

For example, I have 2 cubes, one next to the other. Each has 12 triangles and therefore 3 * 12 = 36 indices (drawing with GL_TRIANGLES). I give the second mesh a base value of 36, so that when one of its indices is, say, 0, it should grab the vertex at position 36 in the VBO. Instead I see something like a distorted triangle, only inside the first cube…
I’ve also tried a more complex scene (a building) and the meshes come out as garbage, though you can make out the shape a little because some triangles (not many) seem fine.

I’ve created a test program with static data. The positions and normals of two cubes next to each other (12 triangles, 8 vertices/normals each) are:


GLfloat vertices[] = {
    1.00000f, -1.00000f, -1.00000f,
    1.00000f, -1.00000f, 1.00000f,
    -1.00000f, -1.00000f, 1.00000f,
    -1.00000f, -1.00000f, -1.00000f,
    1.00000f, 1.00000f, -1.00000f,
    -1.00000f, 1.00000f, -1.00000f,
    -1.00000f, 1.00000f, 1.00000f,
    1.00000f, 1.00000f, 1.00000f,
    // second cube vertices
    -0.60000f, -0.60000f, 1.99787f,
    -0.60000f, -0.60000f, 3.19787f,
    -0.60000f, 0.60000f, 3.19787f,
    -0.60000f, 0.60000f, 1.99787f,
    0.60000f, 0.60000f, 1.99787f,
    0.60000f, -0.60000f, 1.99787f,
    0.60000f, 0.60000f, 3.19787f,
    0.60000f, -0.60000f, 3.19787f
};

GLfloat normals[] = {
    0.66667f, -0.66667f, -0.33333f,
    0.40825f, -0.40825f, 0.81650f,
    -0.66667f, -0.66667f, 0.33333f,
    -0.40825f, -0.40825f, -0.81650f,
    0.33333f, 0.66667f, -0.66667f,
    -0.81650f, 0.40825f, -0.40825f,
    -0.33333f, 0.66667f, 0.66667f,
    0.81650f, 0.40825f, 0.40825f,
    // second cube normals
    -0.81650f, -0.40825f, -0.40825f,
    -0.33333f, -0.66667f, 0.66667f,
    -0.81650f, 0.40825f, 0.40825f,
    -0.33333f, 0.66667f, -0.66667f,
    0.81650f, 0.40825f, -0.40825f,
    0.33333f, -0.66667f, -0.66667f,
    0.33333f, 0.66667f, 0.66667f,
    0.81650f, -0.40825f, 0.40825f
};

The indices array is:


GLuint indices[] = {
    0, 1, 2,
    0, 2, 3,
    4, 5, 6,
    4, 6, 7,
    0, 4, 7,
    0, 7, 1,
    1, 7, 6,
    1, 6, 2,
    2, 6, 5,
    2, 5, 3,
    4, 0, 3,
    4, 3, 5,
    // second cube indices

    0, 1, 2,
    0, 2, 3,
    3, 4, 5,
    3, 5, 0,
    4, 6, 7,
    4, 7, 5,
    1, 7, 6,
    1, 6, 2,
    1, 0, 5,
    1, 5, 7,
    6, 4, 3,
    6, 3, 2
};

Then when I draw I use:

    

    #define BUFFER_OFFSET(i) ((char *)NULL + (i))

    // first cube
    glDrawElements(GL_TRIANGLES, 36, GL_UNSIGNED_INT, 0);

    // second smaller cube in front of it
    glDrawElementsBaseVertex(GL_TRIANGLES, 36, GL_UNSIGNED_INT, BUFFER_OFFSET(sizeof(GLuint) * 36), 36);
    

and the second cube is rendered as a single triangle inside the first cube, not even matching the size/shape of one of the second cube’s triangles…

Then I tried changing the indices so that the array already has the second set of indices with the base value added:


GLuint indices[] = {
    0, 1, 2,
    0, 2, 3,
    4, 5, 6,
    4, 6, 7,
    0, 4, 7,
    0, 7, 1,
    1, 7, 6,
    1, 6, 2,
    2, 6, 5,
    2, 5, 3,
    4, 0, 3,
    4, 3, 5,
    // second cube indices with added base value
    36 + 0, 36 + 1, 36 + 2,
    36 + 0, 36 + 2, 36 + 3,
    36 + 3, 36 + 4, 36 + 5,
    36 + 3, 36 + 5, 36 + 0,
    36 + 4, 36 + 6, 36 + 7,
    36 + 4, 36 + 7, 36 + 5,
    36 + 1, 36 + 7, 36 + 6,
    36 + 1, 36 + 6, 36 + 2,
    36 + 1, 36 + 0, 36 + 5,
    36 + 1, 36 + 5, 36 + 7,
    36 + 6, 36 + 4, 36 + 3,
    36 + 6, 36 + 3, 36 + 2
};

Then I changed the draw calls to:


    // first cube
    glDrawElements(GL_TRIANGLES, 36, GL_UNSIGNED_INT, 0);

    // second smaller cube in front of it
    glDrawElements(GL_TRIANGLES, 36, GL_UNSIGNED_INT, BUFFER_OFFSET(sizeof(GLuint) * 36));

but this test application still renders that wrong triangle…
If I remove the first cube’s data and just use one glDrawElements, the second cube renders fine, so the data itself isn’t wrong :tired:

I can provide the full source code if needed (I’ve simplified everything to the bare minimum). I’m using GLEW + a Radeon card with 14.4 drivers, OpenGL 3.3.

I’ve found the issue: the base value I was assigning was the offset into the indices buffer, not the offset into the positions/normals buffers. For the two-cube test that means the second cube’s base vertex should be 8 (the number of vertices stored before it), not 36. I was even (un)lucky that it sometimes rendered something even with such wrong index values :doh:
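For anyone hitting the same thing, a minimal sketch of the corrected bookkeeping (helper names are illustrative): the basevertex argument of glDrawElementsBaseVertex is added to each fetched index before the vertex lookup, so it counts vertices in the position/normal buffers, while the indices pointer argument counts bytes into the index buffer.

```cpp
#include <cstddef>

// basevertex counts *vertices* stored before this mesh...
size_t baseVertexFor(size_t verticesBefore) { return verticesBefore; }

// ...while the index pointer offset counts *bytes* of indices before it.
size_t indexByteOffsetFor(size_t indicesBefore)
{
    return indicesBefore * sizeof(unsigned int); // GL_UNSIGNED_INT indices
}

// With two 8-vertex / 36-index cubes the second draw call becomes:
//   glDrawElementsBaseVertex(GL_TRIANGLES, 36, GL_UNSIGNED_INT,
//                            (void*)indexByteOffsetFor(36), // 36 indices in
//                            (GLint)baseVertexFor(8));      // 8 vertices in, NOT 36
```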