Moving Index Buffers from Client-side to IBOs

I have been rendering my height maps using client-side index buffers for each triangle strip in each patch (the height map is divided into smaller blocks, à la geomipmapping). I am now trying to move these index buffers over to IBOs, but have come across a really puzzling problem.

For simplicity, I’m only rendering a single 32x32 patch.

The picture below shows the patch rendered with client-side index buffers:

… and here is what happens if I try to move these buffers to an IBO:

What this picture doesn’t show clearly is that the vertices in the last column seem to connect back to the first vertex in the patch.

The confusing part to me is that these same client-side index buffers (which work fine) are exactly the data being passed to glBufferData, shown below:


// "indexBuffers" are the client-side buffers
protected void initIBOs(GL gl)
{
   for(int i = 0; i < this.indexBuffers.length; i++)
   {
      gl.glGenBuffers(1, this.iboIDs[i]);
      gl.glBindBuffer(GL.GL_ELEMENT_ARRAY_BUFFER, this.iboIDs[i].get(0));
      gl.glBufferData(GL.GL_ELEMENT_ARRAY_BUFFER, this.indexBuffers[i].capacity(), this.indexBuffers[i], GL.GL_STATIC_DRAW);
   }
}

Here is how the IBOs are being rendered:


gl.glBindBuffer(GL.GL_ARRAY_BUFFER, this.vboManager.getBufferID());
gl.glEnableClientState(GL.GL_VERTEX_ARRAY);
gl.glVertexPointer(3, GL.GL_FLOAT, 0, 0);

// just one block as a test
this.blocks[0][0].renderWithIBOs(gl);

gl.glDisableClientState(GL.GL_VERTEX_ARRAY);

. . .

protected void renderWithIBOs(GL gl)
{
   // number of indices in each triangle strip
   int numVertices = indexBuffers[0].capacity();
      
   for(int i = 0; i < this.iboIDs.length; i++)
   {
      gl.glBindBuffer(GL.GL_ELEMENT_ARRAY_BUFFER, this.iboIDs[i].get(0));
      gl.glDrawRangeElements(GL.GL_TRIANGLE_STRIP, 0, numVertices-1, numVertices, GL.GL_UNSIGNED_INT, 0);
   }
}
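
(As an aside, my understanding is that the start and end arguments to glDrawRangeElements are supposed to be the minimum and maximum index values actually referenced, and the index values in a strip aren’t necessarily within [0, numVertices-1]. Something like the following, with hypothetical per-strip minIndex/maxIndex values, would match the spec, though drivers seem to tolerate the loose range and I doubt it explains what I’m seeing:)


// Hypothetical: if each strip tracked the smallest and largest vertex
// index it references, the range arguments would match the spec
gl.glDrawRangeElements(GL.GL_TRIANGLE_STRIP, minIndex, maxIndex, numVertices, GL.GL_UNSIGNED_INT, 0);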

Has anyone experienced such a problem?

That’s not a huge amount of data, and I notice that you’re using 32-bit indices. 32-bit indices aren’t supported on all hardware; have you tried converting to 16-bit?

“32-bit indices aren’t supported on all hardware”

They most certainly are. They may not be as fast as 16-bit indices, but they certainly are supported.
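
For what it’s worth, dropping to 16-bit indices is mechanical as long as every index value fits in an unsigned short; an untested sketch (toShortIndices is just an illustrative name, using JOGL’s BufferUtil):


// Untested sketch: repack 32-bit indices as 16-bit ones; only valid
// while every index value is below 65536
ShortBuffer toShortIndices(IntBuffer src)
{
   src.rewind();
   ShortBuffer dst = BufferUtil.newShortBuffer(src.capacity());
   while(src.hasRemaining())
      dst.put((short) src.get());
   dst.rewind();
   return dst;
}

You would then pass GL.GL_UNSIGNED_SHORT instead of GL.GL_UNSIGNED_INT to glDrawRangeElements.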

I have tried converting to 16-bit, but it didn’t fix anything. Besides, the height map I’m rendering is 1025x1025, so most of the blocks’ indices wouldn’t fit in 16 bits anyway.

I should mention that there’s only one VBO for the entire height map, so each block’s index buffers reference this “master” list of vertices.
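
To make that layout concrete, here is a simplified sketch (not my actual code) of how one strip’s indices are built against the master list, assuming a row-major grid that is W vertices wide:


// Sketch: indices for the triangle strip between rows z and z+1 of a
// block that is blockSize quads wide and starts at column x0, in a
// row-major master grid W vertices wide
IntBuffer makeStripIndices(int x0, int z, int blockSize, int W)
{
   IntBuffer buf = BufferUtil.newIntBuffer((blockSize + 1) * 2);
   for(int x = x0; x <= x0 + blockSize; x++)
   {
      buf.put(z * W + x);         // vertex on the upper row
      buf.put((z + 1) * W + x);   // vertex on the lower row
   }
   buf.rewind();
   return buf;
}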

Found the issue. Careless mistake:


gl.glBufferData(GL.GL_ELEMENT_ARRAY_BUFFER, this.indexBuffers[i].capacity(), this.indexBuffers[i], GL.GL_STATIC_DRAW);

should be


gl.glBufferData(GL.GL_ELEMENT_ARRAY_BUFFER, this.indexBuffers[i].capacity() * BufferUtil.SIZEOF_INT, this.indexBuffers[i], GL.GL_STATIC_DRAW);

/wrists

The glBufferData command takes the size of the block being passed, while you are passing the output of the ‘capacity()’ method. I guess your ‘capacity()’ returns the number of elements, not the total size in bytes.
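
In other words, the size argument is always a byte count, so whatever the element type, multiply the element count by the element size. With placeholder buffer names:


// glBufferData's size parameter is in bytes, so scale the element
// count by the size of the element type in question
gl.glBufferData(GL.GL_ELEMENT_ARRAY_BUFFER, intIndices.capacity() * BufferUtil.SIZEOF_INT, intIndices, GL.GL_STATIC_DRAW);
gl.glBufferData(GL.GL_ELEMENT_ARRAY_BUFFER, shortIndices.capacity() * BufferUtil.SIZEOF_SHORT, shortIndices, GL.GL_STATIC_DRAW);
gl.glBufferData(GL.GL_ARRAY_BUFFER, vertices.capacity() * BufferUtil.SIZEOF_FLOAT, vertices, GL.GL_STATIC_DRAW);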

Sorry, I didn’t refresh the page before posting :)

I would be a millionaire if I received a buck for every time I made that mistake…

/wrists…