Workstation vs. Consumer Video Card

I am developing a realtime 3D application and I am trying to find the best video card for the job.

I have tried out a few cards so far: the ATI FireGL X2 256T and the Nvidia GeForce FX 5950 Ultra. So far, at least in my tests, the consumer-level card (the 5950) outperforms the workstation card (the FireGL) by about 20% as far as CPU load goes. The ATI also had a problem with the system locking up when we used very large display lists built from vertex arrays. Are workstation-level cards not made for realtime applications like mine, or am I just not taking advantage of the card's more powerful features (shaders etc.)?
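To make the setup concrete, here is a simplified sketch of the pattern that triggers the lockup (the names, sizes, and the chunk-splitting workaround are illustrative, not our actual code):

    #include <GL/gl.h>

    #define NUM_CHUNKS 16          /* illustrative chunk count */

    extern GLfloat *vertices;      /* x,y,z per vertex                   */
    extern GLuint  *indices;       /* numIndices entries, multiple of 3  */
    extern GLsizei  numIndices;

    /* Compile the vertex-array geometry into several smaller display
       lists instead of one huge one; glDrawElements is dereferenced at
       compile time, so the vertex data gets baked into each list. */
    void buildLists(GLuint lists[NUM_CHUNKS])
    {
        GLsizei chunk = numIndices / NUM_CHUNKS;   /* leftovers ignored */
        int i;

        glEnableClientState(GL_VERTEX_ARRAY);
        glVertexPointer(3, GL_FLOAT, 0, vertices);

        for (i = 0; i < NUM_CHUNKS; ++i) {
            lists[i] = glGenLists(1);
            glNewList(lists[i], GL_COMPILE);
            glDrawElements(GL_TRIANGLES, chunk, GL_UNSIGNED_INT,
                           indices + (size_t)i * chunk);
            glEndList();
        }
    }

With one monolithic list the machine hangs; with smaller chunks like this it seems more stable, though we haven't pinned down the threshold.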

I will also be trying out the Quadro FX 1100 in case it is an Nvidia vs. ATI issue. I have read on this board that ATI does not seem to have the best OpenGL drivers (at least in the past).

I may be biased (because I use nVidia cards), but I would definitely go with the 5950 any day. :wink:

Yeah, ATI's OpenGL drivers still have a bit of catching up to do to reach Nvidia's level. The main difference between professional and consumer cards (if we're talking about ATI and Nvidia cards) is that wireframe rendering is typically faster on the pro boards, and the drivers tend to be tuned both for specific programs (like Maya) and for typical DCC and CAD workloads rather than games. Pro boards can also have genlocking capabilities and better signal quality, but that probably depends on who the board vendor is.
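By "wireframe rendering" I just mean the usual line-mode polygon path that CAD packages hammer constantly. A minimal sketch in C, where drawScene() is a hypothetical stand-in for your own geometry submission:

    #include <GL/gl.h>

    extern void drawScene(void);   /* hypothetical scene-draw call */

    /* One wireframe pass: workstation drivers are tuned for exactly
       this line-mode rendering that CAD/DCC packages lean on. */
    void drawWireframePass(void)
    {
        glPolygonMode(GL_FRONT_AND_BACK, GL_LINE);
        drawScene();
        glPolygonMode(GL_FRONT_AND_BACK, GL_FILL);  /* back to filled */
    }

On a consumer card that line path often falls off the fast path; on a pro board it's a first-class workload.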

ATI and Nvidia basically use the same chips for their pro boards as for their consumer ones, though, so the actual hardware doesn't differ much. If you buy a card from 3DLabs you can expect superior AA and sub-pixel rasterisation, but poor performance in bandwidth-limited, texture-heavy workloads (e.g. games).

What OS are you using and how much RAM do you have?

If you're using Win 98 or ME and have more than 256 MB of RAM, there's a fix I can tell you about.
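(For anyone else reading: the fix in question is most likely the well-known Win9x VCache problem, where the file cache mismanages large amounts of RAM; the usual cure is capping it in the [vcache] section of system.ini, something like:

    [vcache]
    MaxFileCache=393216    ; cap the file cache at 384 MB

The exact value varies by how much RAM you have.)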

Unfortunately, I am running Windows 2K/XP with a gig of RAM.

Thanks for the help guys.