glDrawElements

Having trouble getting triangles with glDrawElements.

Question 1) What is the maximum element count I can pass it? Even with GL_UNSIGNED_INT I can’t go much higher than 32k indices without it crashing. I hit no such limit with glDrawArrays.

Question 2) When I call it with GL_POINTS the points are in the right places, but GL_TRIANGLES gives me bizarre triangles. It looks like the indices aren’t correct. I have verified the index data type and tried different kinds of arrays; nothing works. I get random-looking triangles connecting otherwise correct points.

Here is the code that correctly renders the first 3000 points and proves my indexing is right (BTW I’m using std::vector for my arrays, so the memory is contiguous):

        glBegin(GL_TRIANGLES);      
        for(int i=0; i<3000; i++)
        {
            int idx  =  mTriIdxGrid[i];       // index taken straight from the table
            float* p = &mPointGrid [idx];     // used directly as a float offset
            float* n = &mNormalGrid[idx];

            glNormal3fv(n);
            glVertex3fv(p);
        }
        glEnd();

I thought the following should do the same thing, but it doesn’t; I get messed-up triangles. Why?

            glEnableClientState( GL_NORMAL_ARRAY);
            glEnableClientState( GL_VERTEX_ARRAY);

            glVertexPointer(  3, GL_FLOAT,  0, &mPointGrid[0]  ); 
            glNormalPointer(     GL_FLOAT,  0, &mNormalGrid[0] ); 
                        
            glDrawElements ( GL_TRIANGLES, 3000, GL_UNSIGNED_INT, &mTriIdxGrid[0] ); 

            glDisableClientState( GL_NORMAL_ARRAY);			GL_ASSERT;
            glDisableClientState( GL_VERTEX_ARRAY);

Since I’ve seen people make mistakes with the index data type, I also tried this sort of thing; the result is exactly the same:

            GLushort shorts[3000];
            for(int i=0; i<3000; i++)
            {
                assert( mTriIdxGrid[i] < 0xFFFF );
                shorts[i] = mTriIdxGrid[i];
            }

            glEnableClientState( GL_NORMAL_ARRAY);
            glEnableClientState( GL_VERTEX_ARRAY);

            glVertexPointer(  3, GL_FLOAT,  0, &mPointGrid[0]  ); 
            glNormalPointer(     GL_FLOAT,  0, &mNormalGrid[0] ); 
                        
            glDrawElements ( GL_TRIANGLES, 3000, GL_UNSIGNED_SHORT, shorts ); 

            glDisableClientState( GL_NORMAL_ARRAY);			GL_ASSERT;
            glDisableClientState( GL_VERTEX_ARRAY);

I know I’m missing something obvious, I just can’t see what…

It seems that even with GL_UNSIGNED_INT I can’t go much higher than 32k without it crashing.

What drivers/hardware are you using?

QuadroFX 3450 11.6996 on XP 32bit

Question 1)
GLint maxi = 0, maxv = 0;
glGetIntegerv(GL_MAX_ELEMENTS_INDICES,  &maxi);
glGetIntegerv(GL_MAX_ELEMENTS_VERTICES, &maxv);
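
Then compare against what you’re drawing, something like the sketch below (names taken from the snippets above; printf assumes <cstdio>; note these values are hints for glDrawRangeElements rather than hard limits):

            // Sketch: warn if the intended per-call index count exceeds the hint.
            if ( (GLint)mTriIdxGrid.size() > maxi )
                printf( "index count %u exceeds GL_MAX_ELEMENTS_INDICES (%d)\n",
                        (unsigned)mTriIdxGrid.size(), maxi );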

Question 2)
That is consistent with the indices being organized wrong; it depends on how you generate them. If you read them out of, say, a .3ds file, they would display correctly. The primitive type also has to match: if you render data built for a triangle list as a triangle fan, you get weird-looking results.
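
To make the primitive-type point concrete, here is a hypothetical quad with vertices 0..3; the two index layouts are not interchangeable:

            GLuint listIdx[6] = { 0, 1, 2,   0, 2, 3 };   // layout for GL_TRIANGLES
            GLuint fanIdx [4] = { 0, 1, 2, 3 };           // layout for GL_TRIANGLE_FAN

            glDrawElements( GL_TRIANGLES,    6, GL_UNSIGNED_INT, listIdx );  // correct pairing
            glDrawElements( GL_TRIANGLE_FAN, 4, GL_UNSIGNED_INT, fanIdx  );  // correct pairing
            // mixing them, e.g. GL_TRIANGLE_FAN with listIdx, produces garbage triangles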

I remember that on a GeForce 6, using GL_UNSIGNED_INT indices forced a software fallback.

You need to split your object into multiple sub-objects, each with a maximum of 2^15 (16-bit?) indices.
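
Roughly something like the sketch below (a sketch only, not production code; all names are illustrative, it assumes <vector> and <map>, client arrays already enabled, and mTriIdxGrid holding per-corner vertex indices): walk the triangle list and start a new batch, with its own compacted vertex/normal arrays and local 16-bit indices, whenever the current batch would exceed 2^15 vertices.

            // Each batch gets its own compacted vertex/normal arrays and local
            // 16-bit indices, so no single draw references more than 2^15 vertices.
            struct Batch {
                std::vector<float>    points;   // xyz triplets
                std::vector<float>    normals;  // xyz triplets
                std::vector<GLushort> indices;  // local indices, always < 2^15
            };

            std::vector<Batch> batches(1);
            std::map<GLuint, GLushort> remap;   // global vertex index -> local index

            for (size_t i = 0; i + 2 < mTriIdxGrid.size(); i += 3)
            {
                // start a new batch if this triangle could push us past the limit
                if (batches.back().points.size() / 3 + 3 > (1u << 15))
                {
                    batches.push_back(Batch());
                    remap.clear();
                }
                Batch& b = batches.back();
                for (int c = 0; c < 3; ++c)
                {
                    GLuint g = mTriIdxGrid[i + c];          // global vertex index
                    std::map<GLuint, GLushort>::iterator it = remap.find(g);
                    if (it == remap.end())
                    {
                        GLushort local = (GLushort)(b.points.size() / 3);
                        b.points .insert(b.points .end(), &mPointGrid [g*3], &mPointGrid [g*3] + 3);
                        b.normals.insert(b.normals.end(), &mNormalGrid[g*3], &mNormalGrid[g*3] + 3);
                        it = remap.insert(std::make_pair(g, local)).first;
                    }
                    b.indices.push_back(it->second);
                }
            }

            // draw each batch with 16-bit indices (client arrays already enabled)
            for (size_t bi = 0; bi < batches.size(); ++bi)
            {
                if (batches[bi].indices.empty()) continue;
                glVertexPointer( 3, GL_FLOAT, 0, &batches[bi].points [0] );
                glNormalPointer(    GL_FLOAT, 0, &batches[bi].normals[0] );
                glDrawElements( GL_TRIANGLES, (GLsizei)batches[bi].indices.size(),
                                GL_UNSIGNED_SHORT, &batches[bi].indices[0] );
            }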

Yeah, I had the same problem on a GeForce 7 (I think the limit was 18 bits there), but NVIDIA told me they would fix it.

However, that doesn’t explain why his triangles are messed up. I cannot see any errors in his code.

Jan.

OK, question 1 is resolved: the range returned by GL_MAX_ELEMENTS_INDICES is the same as GL_MAX_ELEMENTS_VERTICES; in my case both are 2^20, and I’m well below that. I changed my indices to repeat the same triangle over and over and it doesn’t crash, so clearly the crash has something to do with the index values.

So question 2 is the real problem. My indices are NOT messed up; if they were, I’d get junk from the first code snippet in the first post, and I don’t. There, every triangle looks fine. With glDrawElements every triangle is messed up.

For completeness, this also fails:

            glEnableClientState( GL_NORMAL_ARRAY);			GL_ASSERT;
            glEnableClientState( GL_VERTEX_ARRAY);

            glVertexPointer(  3, GL_FLOAT,  0, &mPointGrid[0]  ); 
            glNormalPointer(     GL_FLOAT,  0, &mNormalGrid[0] ); 

            glBegin(GL_TRIANGLES);
            for (size_t i = 0; i < mTriIdxGrid.size(); i++)
                glArrayElement( mTriIdxGrid[i] );
            glEnd();

            glDisableClientState( GL_NORMAL_ARRAY);			GL_ASSERT;
            glDisableClientState( GL_VERTEX_ARRAY);  

It crashes around 40k indices. I looked at the index value at that point and it’s legitimate. Since you aren’t telling me that these functions don’t work, it must be something in my state that is messing this up, but what could it be? I’ve disabled all the other possible client states; is there anything else that would influence this behavior?
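
To rule that out I can add something like this right before the draw call (a sketch; assumes <cassert> and a compatibility context where glIsEnabled accepts the client-array enums):

            // Verify that only the two expected client arrays are enabled and that
            // no GL error is pending just before the glDrawElements call.
            assert( !glIsEnabled(GL_COLOR_ARRAY) );
            assert( !glIsEnabled(GL_TEXTURE_COORD_ARRAY) );
            assert( !glIsEnabled(GL_INDEX_ARRAY) );
            assert( !glIsEnabled(GL_EDGE_FLAG_ARRAY) );
            assert(  glIsEnabled(GL_VERTEX_ARRAY) );
            assert(  glIsEnabled(GL_NORMAL_ARRAY) );
            assert(  glGetError() == GL_NO_ERROR );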

Thx,

Vincent

Can’t see the problem.
It should be glDrawElements(GL_TRIANGLES, mTriIdxGrid.size(), GL_UNSIGNED_INT, &mTriIdxGrid[0]);

Are your indices in mTriIdxGrid all unsigned int (4 bytes each)?
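
A quick way to check (one-line sketch, assumes <cassert>):

            // GL_UNSIGNED_INT requires 4-byte elements (GLuint / unsigned int here).
            assert( sizeof(mTriIdxGrid[0]) == sizeof(GLuint) );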

Try glDrawRangeElements(), although I don’t think it will help.
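
For reference, the call would look roughly like this (the start/end values here are assumptions based on the flat xyz layout above and must bracket every index actually used):

            // glDrawRangeElements additionally tells the driver the lowest and
            // highest index used, so it can prefetch just that vertex range.
            glDrawRangeElements( GL_TRIANGLES,
                                 0,                                   // lowest index used
                                 (GLuint)(mPointGrid.size()/3 - 1),   // highest index used
                                 (GLsizei)mTriIdxGrid.size(),
                                 GL_UNSIGNED_INT,
                                 &mTriIdxGrid[0] );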

Ah! The problem is that I’ve been missing something obvious all along. There was never really a question, only the matter of where I went wrong, and my first code snippet holds the key:

            int idx  =  mTriIdxGrid[i];
            float* p = &mPointGrid [idx];
            float* n = &mNormalGrid[idx];

If mTriIdxGrid held valid indices, then the code that worked should have been:

            int idx  =  mTriIdxGrid[i] * 3;
            float* p = &mPointGrid [idx];
            float* n = &mNormalGrid[idx];

So indeed I was computing my indices incorrectly and verifying them incorrectly, and the two errors compensated each other: the array actually held float offsets (vertex index * 3), which the raw pointer arithmetic in the first snippet happened to expect, while glDrawElements multiplies each index by the vertex size itself and therefore read from the wrong places.
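
For anyone finding this later, the fixed path is just the original glDrawElements code with properly generated indices; a minimal sketch, assuming mTriIdxGrid now holds true vertex indices (not float offsets) and the grids are flat xyz float arrays:

            // mTriIdxGrid holds vertex indices; GL multiplies each index by the
            // vertex stride itself, so no *3 is needed anywhere here.
            glEnableClientState( GL_NORMAL_ARRAY);
            glEnableClientState( GL_VERTEX_ARRAY);

            glVertexPointer( 3, GL_FLOAT, 0, &mPointGrid[0]  );
            glNormalPointer(    GL_FLOAT, 0, &mNormalGrid[0] );

            glDrawElements( GL_TRIANGLES, (GLsizei)mTriIdxGrid.size(),
                            GL_UNSIGNED_INT, &mTriIdxGrid[0] );

            glDisableClientState( GL_NORMAL_ARRAY);
            glDisableClientState( GL_VERTEX_ARRAY);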

It’s very reassuring that the world makes sense again!

Sometimes even 10 pairs of eyes don’t see the error :wink: Good that you found it, I didn’t see it either.

Well then, I hope you get it fixed soon.

Jan.