gluNurbsSurface - Nvidia vs SGI ...

I’ve written some code to do skinned animation using polys and NURBS and have been doing some testing…

On Win32 with a GeForce2 GTS, the frame rate for the polys will be somewhere around 250-300 fps, compared with 30-40 fps on an SGI.

When switching to NURBS, however, on the same PC the frame rate will be about 10 fps, compared with about 40 fps on the SGI.

My question is this: to be fair, the SGI O2 I use goes about as fast as a very lazy snail that’s had his one foot amputated; however, it beats Nvidia GeForce cards hands down when drawing NURBS (significantly). I’m using the gluNurbsSurface stuff in the hope that it will be done in hardware, but from my results I presume that GeForce cards don’t support any NURBS stuff in hardware. Am I right?

If so, is Nvidia planning to support NURBS at all in the future?

The GLU NURBS code uses evaluators. We do not support evaluators in HW. I don’t know that anything other than InfiniteReality ever did support them in HW. There are a lot of things that are broken about standard OpenGL evaluators.

Now, NV_evaluators is a different matter.

  • Matt