View Full Version : Optimising triangle meshes

Thomas Harte
08-30-2002, 07:25 AM
First of all, I have to admit to being a bit confused about the different remits of this forum and its beginners' analogue, so I apologise if this question is in the wrong place!

My 3D artists are using the 3ds file format, and my program is using OpenGL. It is easy to load a 3ds file and display it using GL_TRIANGLES, but all the information I have suggests you should share as much vertex data implicitly as you can - e.g. using fans and strips and so on where possible.

Therefore, my question is: are there any clever algorithms for computing the optimal GL primitive description from a set of triangles? At the moment I am contemplating a brute-force search through all the possibilities (which would be an offline process, so it isn't so terrible...), but it strikes me that there is almost certainly a smarter method. Is there?

08-30-2002, 07:29 AM
This is pretty good:- http://developer.nvidia.com/view.asp?IO=nvtristrip_library

It generates tristrips from triangles.

08-30-2002, 08:07 AM
The most important thing is to use glDrawElements() (or, even better, glDrawRangeElements()) to get out of immediate mode.

Once that's done, you often don't get a whole lot of improvement going from GL_TRIANGLES to GL_TRIANGLE_STRIP draw mode, assuming that your triangles are sorted in a logical order.

If you are not transform bound, and not AGP bound, you're unlikely to see any difference at all -- most apps are actually fill bound. I believe the things triangle strips have over triangle lists are that they need less bandwidth for the index list, and that they may avoid a one-cycle stall when looking up a hit in the vertex transform cache -- but I've never seen either matter in practice.

08-30-2002, 08:07 AM
Pretty good, but *not* optimal, as the bottleneck is at the primitive level.
The fastest = 1 prim = 1 strip ;)
(forget about fans)