glDrawRangeElements and _nv001100gl segfault



Brian C. Dilley
06-22-2005, 09:22 PM
Hey guys, newcomer here... please forgive me if I don't post my stuff the way you'd like to see it :).

Anyway, I'm working on a 3D engine and I'm responsible for the Linux and OpenGL portions. I'm implementing vertex and index buffers at the moment and I've run into a problem that has me stumped. Here's what I do to set up the buffers:

Index Buffer

GLExtensions::glGenBuffersARB(1, &m_buffer);

// make active
GLExtensions::glBindBufferARB(GL_ELEMENT_ARRAY_BUFFER_ARB, m_buffer);

// allocate it
GLenum bufferType = (isDynamic() ? GL_DYNAMIC_DRAW_ARB : GL_STATIC_DRAW_ARB);
GLExtensions::glBufferDataARB(
    GL_ELEMENT_ARRAY_BUFFER_ARB,
    m_size,
    NULL,
    bufferType
);
Vertex Buffer

GLExtensions::glGenBuffersARB(1, &m_buffer);

// make active
GLExtensions::glBindBufferARB(GL_ARRAY_BUFFER_ARB, m_buffer);

// allocate it
GLenum bufferType = (isDynamic() ? GL_DYNAMIC_DRAW_ARB : GL_STATIC_DRAW_ARB);
GLExtensions::glBufferDataARB(
    GL_ARRAY_BUFFER_ARB,
    m_size,
    NULL,
    bufferType
);
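
The activation code below passes byte offsets through a BUFFER_OFFSET macro that isn't shown in the post; a common definition, assumed here, is:

#define BUFFER_OFFSET(i) ((char *)NULL + (i))

With a VBO bound, the gl*Pointer calls interpret that pointer argument as a byte offset into the buffer rather than a client-memory address.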
Activation:

// position?
if (pos>0) {
    int oset = m_pStreamSetDefinition->getElementOffset(index, IStreamDefinition::POSITION);
    glVertexPointer(
        3,
        GL_FLOAT,
        vertexSize,
        BUFFER_OFFSET(oset)
    );

    glEnableClientState(GL_VERTEX_ARRAY);

} else {
    glDisableClientState(GL_VERTEX_ARRAY);
}

// normal?
if (normal>0) {
    int oset = m_pStreamSetDefinition->getElementOffset(index, IStreamDefinition::NORMAL);
    glNormalPointer(
        GL_FLOAT,
        vertexSize,
        BUFFER_OFFSET(oset)
    );

    glEnableClientState(GL_NORMAL_ARRAY);

} else {
    glDisableClientState(GL_NORMAL_ARRAY);
}

// texcoords?
if (texcoords>0) {
    int oset = m_pStreamSetDefinition->getElementOffset(index, IStreamDefinition::TEXTURE0);
    glTexCoordPointer(
        2,
        GL_FLOAT,
        vertexSize,
        BUFFER_OFFSET(oset)
    );

    glEnableClientState(GL_TEXTURE_COORD_ARRAY);

} else {
    glDisableClientState(GL_TEXTURE_COORD_ARRAY);
}

Rendering:

int indexCount = (triangleCount*3);

GLExtensions::glDrawRangeElements(
    GL_TRIANGLES,
    firstIndex,
    firstIndex+indexCount,
    indexCount,
    GL_UNSIGNED_SHORT,
    NULL
);

OK, so I'm running Linux (kernel 2.6) with the latest stable NVIDIA drivers on a 6800 OC. The same engine code runs on Windows with OpenGL, but on an ATI card. When I use the immediate-mode path (glBegin()...glEnd()) it works fine. With the buffers, I get an error in what looks like the NVIDIA driver, in a symbol named "_nv001100gl()". What happens is that 4 iterations of the main loop render everything fine; on the 5th iteration I get a segfault in that symbol. I've verified that the values passed to glDrawRangeElements are correct (they are the same, desired values every time).
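
For reference, a sanity check like the following could go in front of the draw call, sketched under the assumption that the index data is still available in system memory as a GLushort array named indices and that vertexCount holds the number of vertices in the bound vertex buffer (neither name appears in the engine code above):

// Hypothetical pre-draw sanity check: with a NULL offset, the draw call
// reads indices[0..indexCount-1], so every one of those values must be
// a valid vertex in the bound vertex buffer.
bool indicesValid = true;
for (int i = 0; i < indexCount; ++i) {
    if (indices[i] >= vertexCount) {   // indices, vertexCount: assumed names
        indicesValid = false;
        break;
    }
}
if (indicesValid) {
    GLExtensions::glDrawRangeElements(
        GL_TRIANGLES,
        firstIndex,
        firstIndex+indexCount,
        indexCount,
        GL_UNSIGNED_SHORT,
        NULL
    );
}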

Thanks in advance for any help!

sqrt[-1]
06-22-2005, 10:23 PM
int indexCount = (triangleCount*3);

GLExtensions::glDrawRangeElements(
    GL_TRIANGLES,
    firstIndex,
    firstIndex+indexCount,
    indexCount,
    GL_UNSIGNED_SHORT,
    NULL
);

The firstIndex+indexCount range looks a bit strange. Is every vertex unique? (i.e. is no vertex referenced more than once by an index?)

As an experiment, try using standard glDrawElements and see what happens.
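
Something along these lines, reusing the variables from the original call (glDrawElements is core GL, so no extension pointer is needed):

// Same draw, without the start/end range hints that glDrawRangeElements takes.
glDrawElements(
    GL_TRIANGLES,
    indexCount,
    GL_UNSIGNED_SHORT,
    NULL            // offset 0 into the bound element array buffer
);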

Brian C. Dilley
06-23-2005, 08:14 AM
I tried using glDrawElements and I get the exact same result. I've also verified that the arguments passed to glDrawElements and glDrawRangeElements are the same on every iteration... and the first 4 iterations work perfectly.

sqrt[-1]
06-23-2005, 02:50 PM
Not sure what else it could be.

I am sure you have already checked:
- ensure the index buffer references only valid vertices.
- ensure the number of indices is correct.
- ensure the vertex buffer pointers are set up correctly.
- try not using VBOs and render directly from system memory.
- ensure you are not getting any GL errors (a quick check helper is sketched after this list).
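
A minimal helper for that last point, sketched as one possible way to wire it in after each suspect GL call:

#include <GL/gl.h>
#include <cstdio>

// Drain and report any pending GL errors; call after suspect GL calls.
void checkGLError(const char *where)
{
    for (GLenum err = glGetError(); err != GL_NO_ERROR; err = glGetError()) {
        fprintf(stderr, "GL error 0x%x at %s\n", err, where);
    }
}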

As a test, try creating the VBOs (index and vertex) twice as big as they need to be.
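
For that test, the allocation call could simply pass a padded size (the 2x factor is arbitrary):

// Over-allocate the VBO to rule out a read just past the end of the buffer.
GLExtensions::glBufferDataARB(
    GL_ARRAY_BUFFER_ARB,
    m_size * 2,     // twice the required size, purely as an experiment
    NULL,
    bufferType
);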

You may have to create a demo that reproduces this.