Hardware change did not improve GLUT program perf

Hi

I have a simple GLUT animation program.
I replaced my game-oriented card with a professional one.
SpecViewPerf 10 shows a 3 to 6 times improvement,
but my application does not.
Is it because of GLUT?

Thanks

Possible, but unlikely. You need to profile it and see. It could be that your GL code is very non-optimal and you're basically CPU bottlenecked, not GPU bottlenecked. If you are, the fastest GPU out there obviously isn't going to make a hill of beans' worth of difference.

Give us more details on what/how you’re doing (basic rendering strategy, number of batches, number of state changes, etc.), current GPU, current draw times (e.g. ms, fps, etc.), and what your draw time target is. With that we can make some suggestions on how you might restructure things to get better perf.
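
For the draw-time numbers, here is a rough sketch of per-frame timing you could drop into your GLUT display callback (the counters and the printing interval are just illustrative, not anything from your code):

#include <GL/glut.h>
#include <stdio.h>

/* Prints average ms/frame and FPS roughly once per second.
   glFinish() makes the GPU drain its queue before we read the clock,
   so the number includes GPU work, not just command submission. */
static int frameCount = 0;
static int lastReport = 0;   /* milliseconds, from glutGet(GLUT_ELAPSED_TIME) */

static void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    /* ... your drawing goes here ... */

    glFinish();                 /* wait for the GPU before timing */
    glutSwapBuffers();

    frameCount++;
    int now = glutGet(GLUT_ELAPSED_TIME);
    if (now - lastReport >= 1000) {
        double ms = (double)(now - lastReport) / frameCount;
        printf("%.2f ms/frame (%.1f FPS)\n", ms, 1000.0 / ms);
        frameCount = 0;
        lastReport = now;
    }
    glutPostRedisplay();        /* keep the animation running */
}

If the ms/frame numbers barely change between the two cards, that points at a CPU or submission bottleneck rather than the GPU.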

Thanks for the quick response.
Actually, things are even worse.
I am comparing an ATI HD3650 AGP with an NVIDIA Quadro FX380 PCIe.
Based on SpecViewPerf 10, the Quadro beats the gaming card.
But my program, all the GLUT examples, and the NeHe tutorials are SLOWER
on the Quadro.
The only explanation I can find is that the gaming drivers detect the SPEC benchmark and deliberately reduce their performance.
What a rip-off!

That happens. What if you change your executable name to Doom3.exe? Any improvement? I've read a few articles about how drivers disable this and that to make a game run faster and then claim that something in the pipeline is really optimized. I guess it can work the other way around, too.

I think it has something to do with FSAA.
According to http://www.nvidia.com/object/quadro_geforce.html
professional cards have hardware FSAA.
And according to http://www.spec.org/gwpg/publish/fsaa_in_vp10.html
SpecViewPerf makes use of FSAA.
I should have known…

Doubtful. That stuff’s on the GPU on GeForce too.
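
If you want to rule FSAA out, you can request multisampling explicitly on both cards and query what you actually got, so the comparison uses the same AA settings. A rough sketch (GLUT only lets you ask for a multisampled visual; the sample count is up to the driver):

#include <GL/glut.h>
#include <stdio.h>

/* Older gl.h headers (GL 1.1) may not define these; the values are standard. */
#ifndef GL_MULTISAMPLE
#define GL_MULTISAMPLE    0x809D
#define GL_SAMPLE_BUFFERS 0x80A8
#define GL_SAMPLES        0x80A9
#endif

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    /* Ask for a multisampled framebuffer on top of the usual bits. */
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH | GLUT_MULTISAMPLE);
    glutCreateWindow("FSAA check");

    /* Report what the driver actually handed us. */
    GLint sampleBuffers = 0, samples = 0;
    glGetIntegerv(GL_SAMPLE_BUFFERS, &sampleBuffers);
    glGetIntegerv(GL_SAMPLES, &samples);
    printf("sample buffers: %d, samples per pixel: %d\n", sampleBuffers, samples);

    glEnable(GL_MULTISAMPLE);   /* actually use it while rendering */

    /* ... register your callbacks and call glutMainLoop() as usual ... */
    return 0;
}

If both cards report the same sample count and one is still slower on plain GLUT/NeHe code, FSAA isn't the explanation.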