Programming on a Quadro?

I recently bought a Quadro FX 4000 card and tested some video effect programs I wrote. To my surprise, the frame rate on the Quadro is only slightly better than on a GeForce 5900GT, and lower than on a GeForce 6800GT. My video effect programs are written with OpenGL and use NVIDIA Cg for the effects. I developed them on a GF5900 card, and since the GF5900 is not a professional GPU, I bought the Quadro to test my program, hoping it would show higher performance because it is a professional GPU. But the result is a letdown: the Quadro shows no advantage over the GF5900 and is slower than the GF6800. I have heard that many non-linear editing applications use the Quadro as their effect generator, such as Adobe's After Effects, so I wonder: is it a bug in my program? Does programming for the Quadro require some particular techniques? I think the Quadro should be better than the GF5900 and GF6800, so if anyone has advice for me, please tell me. Thanks!

Just a question: before the 6800 series existed, what were NVIDIA's professional GPUs? The 5900? Why can't it be "pro" now? Just because another generation exists?

I know the 6800 is the state of the art, and I wonder whether the Quadro series is specially geared toward developing computer graphics applications with the classical APIs, because the latest special effects we see are coming from the 6800… :confused:

I have never worked with professional hardware, but if I'm not mistaken they are developed for special needs (like CAD applications with very high polygon counts). Actually, Quadros should be a bit slower than 6800s on gaming stuff (read: cool effects and such). So I think it is a normal result.

Yes, I agree with you Zengar, but check it out:

Performance:
Highest Workstation Application Performance

Next-generation architecture enables over 2x improvement in geometry and fill rates with the industry’s highest performance for professional CAD, DCC, and scientific applications.
Precision:
NVIDIA High-Precision Dynamic-Range (HPDR) Technology

HPDR sets new standards for image clarity and quality through floating point capabilities in shading, filtering, texturing, and blending. Enables unprecedented rendered image quality for visual effects processing.
Programmability:
Next-Generation Vertex & Pixel Programmability

NVIDIA Quadro FX 4000 GPUs introduce infinite length vertex programs and dynamic flow control, removing the previous limits on complexity and structure of shader programs. With full support for Vertex and Shader Model 3.0, NVIDIA Quadro FX 4000 GPUs deliver sophisticated effects never before imagined for real-time graphics systems.
Quality:
Rotated-Grid Full-Scene Antialiasing (FSAA)

The rotated-grid FSAA sampling algorithm introduces far greater sophistication in the sampling pattern, significantly increasing color accuracy and visual quality for edges and lines, reducing "jaggies" while maintaining performance.
These features are the hallmark of high performance, and they are all there for programming high-definition effects or games: HPDR, FSAA, unlimited vertex program length, Shader Model 3.0…

And they also say:

The NVIDIA Quadro® FX 4400 and NVIDIA Quadro FX 4000 set a new bar for workstation graphics, shattering the limits of performance, programmability, precision, and quality for professional CAD, DCC, and scientific applications.
not for gamedev =/
So maybe this is not the best card for topping Doom 3 high scores, but you can render billions of vertices per second.

AFAIK, Quadro cards have some features that gamer cards don't have. They are "pro" class only because of these few features:

  • Faster wireframe rendering (good for modelers)
  • Improved VP/FP instruction sets, or extended program length and program parameter limits
  • Faster texture upload and faster backbuffer readback (see the readback sketch after this list)
  • Some Quadros have genlock and SDI key & fill
  • Unified back buffer
  • Overlays in OpenGL
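
On the texture upload/readback point, here is a minimal sketch of asynchronous backbuffer readback through a pixel buffer object, the kind of GPU-to-CPU transfer path where the professional drivers are supposed to be faster. It assumes GL_ARB_pixel_buffer_object is available and the entry points are loaded (e.g. via GLEW); width, height and the consume callback are placeholders.

```c
/* Sketch: asynchronous backbuffer readback via GL_ARB_pixel_buffer_object.
 * Assumes the extension is supported and loaded (e.g. through GLEW). */
#include <GL/glew.h>

static GLuint readback_pbo;

void init_readback(int width, int height)
{
    glGenBuffersARB(1, &readback_pbo);
    glBindBufferARB(GL_PIXEL_PACK_BUFFER_ARB, readback_pbo);
    /* Space for one BGRA frame; GL_STREAM_READ hints at GPU-to-CPU traffic. */
    glBufferDataARB(GL_PIXEL_PACK_BUFFER_ARB, width * height * 4, NULL, GL_STREAM_READ_ARB);
    glBindBufferARB(GL_PIXEL_PACK_BUFFER_ARB, 0);
}

void read_frame(int width, int height, void (*consume)(const void *pixels))
{
    glBindBufferARB(GL_PIXEL_PACK_BUFFER_ARB, readback_pbo);
    /* With a pack buffer bound, the pointer argument is a buffer offset, so
     * glReadPixels can return without copying into client memory right away. */
    glReadPixels(0, 0, width, height, GL_BGRA, GL_UNSIGNED_BYTE, 0);

    /* Mapping is where the wait happens; doing it a frame later lets the
     * transfer overlap with other work. */
    const void *pixels = glMapBufferARB(GL_PIXEL_PACK_BUFFER_ARB, GL_READ_ONLY_ARB);
    if (pixels) {
        consume(pixels);
        glUnmapBufferARB(GL_PIXEL_PACK_BUFFER_ARB);
    }
    glBindBufferARB(GL_PIXEL_PACK_BUFFER_ARB, 0);
}
```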

If you find some of these features useful for your job, then you are a lucky guy! If I had one, I would know what to do with it.

yooyo

If I had one, it would be bad for my code-optimisation skills, wouldn't it?
So I guess it's only a bonus for people who are gurus and who have money to spend: the pros? :smiley:

The main thing you're paying for with Quadros is quad-buffered stereo and improved anti-aliasing. Better accumulation buffer performance and more of the imaging subset are other bonuses. But they're usually behind GeForces on per-pixel features.
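
For anyone who hasn't used it: quad-buffered stereo simply means the context exposes separate left and right back buffers. Here is a minimal sketch, assuming a stereo-capable pixel format (GLUT_STEREO) and a hypothetical render_scene() callback.

```c
/* Sketch: quad-buffered stereo rendering with GLUT. Requires a driver that
 * exposes stereo pixel formats (typically Quadro-class hardware). */
#include <GL/glut.h>

extern void render_scene(float eye_offset);   /* hypothetical scene callback */

void display(void)
{
    glDrawBuffer(GL_BACK_LEFT);                /* left eye */
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    render_scene(-0.03f);

    glDrawBuffer(GL_BACK_RIGHT);               /* right eye */
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    render_scene(+0.03f);

    glutSwapBuffers();                         /* swaps both back buffers */
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH | GLUT_STEREO);
    glutCreateWindow("quad-buffered stereo test");
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}
```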

Originally posted by knackered:
The main thing you're paying for with Quadros is quad-buffered stereo and improved anti-aliasing. Better accumulation buffer performance and more of the imaging subset are other bonuses. But they're usually behind GeForces on per-pixel features.
In my experience the Quadro lines generally come with more features that are planned to go mainstream with the next round of consumer cards.

They do better for modeling environments and scientific apps. The consumer (game) cards are stripped down (and maybe beefed up) for whatever is popular in gaming at the time, I figure.

They tend to have a lot of extra buffer space, I think, which you can use for overlays and such… but I prefer to spread it out over multiple monitors.

Edit: good to hear they are good at line rasterization… which comes in handy for debugging and modeling.

I've been thinking about going for a consumer card on my next purchase, because I can't in good conscience pay the price tags on the higher-end Quadro cards.

I'm not really sure which is the better move. I figure the Quadro cards just come with a bloated price tag to some extent, so I'm interested in hearing anything people can weigh in with here.

I'm personally holding out for a 64-bit PCIe system… hardly OpenGL, but neither is this thread, so while I'm at it: I'm curious whether Intel processors will ever support 3DNow! and whether AMD will ever do hyperthreading. I also get the impression there are alternatives to IDE ports these days; is there any way to speed up hard drive throughput, or is that still limited by motor speeds? I also noticed today that LCD monitors are becoming more affordable compared to CRTs, but they seem to take a fair hit in resolution and look worse (hard pixelation).

Why shell out for antialiasing when a CRT can do it for free? Ghosting as well, I will miss ghosting, it looks cool. I plan to keep my CRT setup as long as it will hold together, but I'm thinking about a flat screen for the next machine; a widescreen LCD would be nice… there is not enough room around here for another CRT.


Thanks, everyone. But I think I should describe my requirements more exactly:

- My program is not a game; it is a video editing app that generates video effects on the GPU. The effects include rotation and page peel/roll.

- My program has some vertex shaders, with a mesh size of about 100x100, but the vertex shader speed on the Quadro is consistently lower than on the GF6800.

- My program does some heavy pixel operations, such as multiple pixel shader passes [each pass takes the previous pass' result as its input; see the sketch after this list]. So is the pixel shader performance of the Quadro better than the GF6800's?

- Some of my video effects are very simple; they only draw a quad or a cube. I wrote them using the vertex array method, and they run well on the GF6800 and 5900 [FPS > 1000], but to my surprise these effects are very slow on the Quadro, slower than the effects with complex vertex and pixel shaders. Why? Does that mean the Quadro is not suited to operations with few vertices?

- What I want to get from the Quadro is: higher pixel and vertex shader performance, faster texture upload and buffer readback, and better image clarity and quality [I found on the GeForce that higher FSAA makes textures somewhat blurred; is this improved on the Quadro?]. Which of these can I get?
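
On the multi-pass pixel shader point above, here is a minimal sketch of one way to chain passes by copying the backbuffer into a texture between passes. bind_effect_pass() and draw_fullscreen_quad() are hypothetical placeholders for the Cg program setup and the quad draw; a true render-to-texture path (pbuffers, or framebuffer objects where available) would avoid the copy entirely.

```c
/* Sketch: chaining pixel-shader passes, where each pass samples the result
 * of the previous one. The helper functions are hypothetical placeholders. */
#include <GL/gl.h>

extern void bind_effect_pass(int pass, GLuint input_texture); /* hypothetical */
extern void draw_fullscreen_quad(void);                       /* hypothetical */

void run_effect_chain(GLuint scratch_tex, int num_passes, int width, int height)
{
    for (int pass = 0; pass < num_passes; ++pass) {
        /* The current pass reads the previous pass' output from scratch_tex. */
        bind_effect_pass(pass, scratch_tex);
        draw_fullscreen_quad();

        /* Copy the freshly rendered backbuffer into scratch_tex so that the
         * next pass can sample it. */
        glBindTexture(GL_TEXTURE_2D, scratch_tex);
        glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, width, height);
    }
}
```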

The main differences between a GeForce 6800 and a Quadro FX 4000 are drivers, quality, and added functionality.
I think Quadro drivers are optimized for OpenGL, so Direct3D stuff might be slower.
On the Quadro everything is high quality; if you achieve higher speeds on a 6800 because of low-quality settings in the driver, you will not be able to do that on the Quadro card. Again, a driver issue.
The added functionality mentioned, such as antialiased lines and overlay bitplanes, is targeted at professional CAD programs. Quadros also handle multiple OpenGL windows better, including overlaps.
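
As a concrete example of one of those CAD-oriented features, antialiased wireframe drawing in plain OpenGL looks roughly like the sketch below; it is this kind of smooth-line path that the Quadro drivers are meant to speed up.

```c
/* Sketch: antialiased wireframe pass using only OpenGL 1.x calls. */
#include <GL/gl.h>

void draw_wireframe_pass(void)
{
    glPolygonMode(GL_FRONT_AND_BACK, GL_LINE);  /* render triangles as lines */
    glEnable(GL_LINE_SMOOTH);                   /* antialiased lines */
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    glLineWidth(1.0f);

    /* ... issue the geometry here (e.g. via vertex arrays) ... */

    glDisable(GL_BLEND);
    glDisable(GL_LINE_SMOOTH);
    glPolygonMode(GL_FRONT_AND_BACK, GL_FILL);
}
```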

My experience, performance-wise, with OpenGL on a Quadro FX 4000 has been more or less identical to a 6800 Ultra, including shading.

The consumer cards are where the money is, so that's where the new chipsets make their debut. The Quadros usually lag behind by a month or two. It's not the other way round.

Originally posted by knackered:
The consumer cards are where the money is, so that's where the new chipsets make their debut. The Quadros usually lag behind by a month or two. It's not the other way round.
When I picked up my last card, I looked around and eventually settled on a Quadro because it had NV30, a double-sided stencil buffer, and better dual-monitor features than gaming cards at comparable prices. The fill rate and geometry processing numbers were slightly lower, though.

Edit: the card I settled on was actually a pretty quirky card. It looks like it was designed for low-budget developers with CRT monitors who wanted to play with cutting-edge features at the time. The retail price was also around $100 below the suggested value, presumably because no one really wanted it; it wasn't an awesome game card, nor an awesome developer card, but bang for buck it was awesome.

Hi, does that mean the Quadro is not suited to my app, and that a GeForce 6800 has enough power for it?

Originally posted by pango:
Hi, does that mean the Quadro is not suited to my app, and that a GeForce 6800 has enough power for it?
If you are looking for 'pixel shader' power, from what I gather from the information here… I would deduce the answer is YES. But I would feel better if someone offered a second opinion.

Originally posted by Gollum:
Yes, I agree with you Zengar, but check it out:
Yes, twice the performance of the previous Quadro hardware, just as the GeForce 6 offered twice the performance of the GeForce FX.
So I see no conflict here…

Quadros are better quality, though. Still, if you are developing games or demo apps, a plain GeForce 6 will do. I'm very happy with my 6600GT :wink:

Better quality? In what sense? I've used both consumer GeForces and Quadros for years, and there's no difference at similar clock rates other than those already mentioned (quad-buffered stereo, imaging subset, unified depth buffer, faster wireframe, etc.).
If you don't need those features, then buy a GeForce 6.

Oh, so I was mistaken. Sorry, everyone.

As I said, I have never worked with Quadros. I simply read somewhere that Quadros provide better image quality.