NVIDIA Quadro - VBO streaming performance

Hi folks,

I’m streaming positions/normals/UV coords to VBOs every frame. On my GeForce GTX 285 (and on an old ATI FireGL) it runs nicely, but customers report heavy stuttering on a Quadro FX 3700.

Since I wasn’t able to reproduce the problem on my GTX 285, I bought a test setup with that Quadro. And it’s true: every 50-100 frames, performance drops and the app stalls for several milliseconds.

Since I’m bound to OpenGL 2.1, I’ve simply been using glBufferSubData so far. Now I tried orphaning via glBufferData() with a NULL pointer, and the stuttering got worse!! Same with glMapBufferRange() and GL_MAP_INVALIDATE_BUFFER_BIT.
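For reference, the three per-frame update paths I mean look roughly like this. This is only a sketch: it assumes an extension loader such as GLEW provides the GL 1.5/3.0 entry points, and `vbo`, `data` and `size` are placeholder names for an existing buffer object and the frame’s vertex data.

```c
#include <GL/glew.h>   /* assumed extension loader; provides glMapBufferRange etc. */
#include <string.h>

/* Path 1: plain glBufferSubData. The driver may stall here if the GPU
   is still reading the buffer from a previous frame. */
static void update_subdata(GLuint vbo, const void *data, GLsizeiptr size)
{
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferSubData(GL_ARRAY_BUFFER, 0, size, data);
}

/* Path 2: orphaning. glBufferData with NULL hands the driver a fresh
   allocation; the old storage is released once the GPU is done with it. */
static void update_orphan(GLuint vbo, const void *data, GLsizeiptr size)
{
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, size, NULL, GL_STREAM_DRAW);
    glBufferSubData(GL_ARRAY_BUFFER, 0, size, data);
}

/* Path 3: mapping with the invalidate flag, which tells the driver the
   previous contents can be discarded (no readback, no implicit sync). */
static void update_map_invalidate(GLuint vbo, const void *data, GLsizeiptr size)
{
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    void *ptr = glMapBufferRange(GL_ARRAY_BUFFER, 0, size,
                                 GL_MAP_WRITE_BIT | GL_MAP_INVALIDATE_BUFFER_BIT);
    if (ptr) {
        memcpy(ptr, data, (size_t)size);
        glUnmapBuffer(GL_ARRAY_BUFFER);
    }
}
```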

After days of trial and error, I discovered a driver setting in the NVIDIA Quadro control panel:
“Manage 3D Settings -> Global Presets -> Workstation app - Dynamic streaming”
(default is: “3D app - Default global settings”).

Setting it to “Dynamic streaming” seems to solve all the problems. Oh boy, my GeForce hasn’t got this switch. So that expensive Quadro seems not smart enough to recognise an app that uses streaming, eh?

Now I’m interested in the underlying techniques. To me, it’s not obvious what this setting changes. And how can I achieve these changes from within my app when default driver settings are used?

CatDog

I don’t run Quadro but was still curious what this does, so I did a little searching. No guts-level answers yet, but this:

confirms this is definitely a setting you want if you’re aiming for a consistent frame rate. Another one you want (that I’ve personally tripped over) is PowerMizer – disable that so the driver doesn’t dynamically down-clock the card every so often, killing your frame rate.

[QUOTE=Dark Photon;1238311]I don’t run Quadro but was still curious what this does, so I did a little searching. No guts-level answers yet, but this:

confirms this is definitely a setting you want if you’re aiming for a consistent frame rate.[/QUOTE]

Whoa, thanks, how did you find this? Also very interesting is the description for the setting “3D App – Game Development”: turns the card into a GeForce card. Oh yes, please! Just tried it, and it works. :)

“3D App – Visual Simulation” works, too.

PowerMizer: Maybe I missed something, but PowerMizer doesn’t seem to be available for my desktop card. It seems to be aimed at the mobile series only.

CatDog

Just a quick google search for: Dynamic streaming nvidia

[QUOTE=CatDog]“3D App – Visual Simulation” works, too.[/QUOTE]

Makes sense. In this domain, a consistent frame rate typically supersedes all other requirements.

[QUOTE=CatDog]PowerMizer doesn’t seem to be available for my desktop card. It seems to be aimed at the mobile series only.[/QUOTE]

It exists on a number of GeForce desktop cards too, including the highest-end cards of each generation.

If you ask me, the internal difference between the default and the streaming/simulation settings is the way GPU memory is organized. Stalling every 50-100 frames smells like a buffer overrun and cache reorganization. As said above, it gets worse with orphaning. So the default setting seems to eat up internal memory until it needs garbage collection, while the streaming setting uses some kind of dynamic memory management.

But that’s just me guessing wildly.

CatDog

Could be. But I have to say that the behavior you describe sounds a lot like what happened to me when PowerMizer kicked in periodically while the app was running, with a constant graphics load no less, causing it to break frame every few seconds – like clockwork.

Don’t know what automated tools you have for this on Windows, but on Linux:


> nvidia-settings -t -q GPUCurrentClockFreqsString
nvclock=772, memclock=2004, processorclock=1544

You can run this in a loop every 0.25 seconds or so and see whether the moments your app “flips down” in performance correspond to the GPU dynamically flipping its clocks down. You can also watch this live in the nvidia-settings GUI (driver control panel on Windows, probably).
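Concretely, such a polling loop could look like this (a sketch; it assumes `nvidia-settings` is on the PATH and the query string above is supported by your driver):

```shell
# Poll the GPU clocks every 0.25 s for ~10 s, timestamping each sample
# so stalls in the app can be matched against clock changes.
for i in $(seq 40); do
    printf '%s  ' "$(date +%H:%M:%S)"
    nvidia-settings -t -q GPUCurrentClockFreqsString
    sleep 0.25
done
```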

In Linux at least, you can change the PowerMizer setting (via nvidia-settings) from “Adaptive” (the driver moves the clocks up and down at its discretion) to “Prefer Maximum Performance” (stop screwing with my clocks, dude!). Wouldn’t be a bit surprised if, after resetting your current settings and nailing PowerMizer to “Prefer Maximum Performance”, your problem went away. …but just like your assertion, this is a wild (albeit semi-educated) guess.

Thanks for your input, Dark Photon. After googling PowerMizer I’ve got the impression that this feature has been removed in Win7 drivers, or it’s not accessible from the control panel anymore. I’ll look into that.

For now, my “solution” to the problem is to remind all customers using Quadros to check their driver settings. sigh

CatDog

This topic was automatically closed 183 days after the last reply. New replies are no longer allowed.