NVIDIA's new policy on optimizations

This is moderately encouraging. It gives me hope that, despite the past PR bluster, serious people inside NVIDIA want to approach benchmarks with some integrity in the future.
http://www.tomshardware.com/graphic/20030818/index.html

All that checking and double-checking makes me worry a bit about development efficiency, but if they can do it without grinding their driver development to a halt and pissing off their smart people, it's a good thing.

The bit about no pre-canned state, and about optimizations having to accelerate more than an individual title, is especially welcome. No more Trojan shaders.

To me, it looks like just another PR “clean-up”.

>> * An optimization must produce the correct image

The definition of “correct” is unclear. No GPU can exactly match the reference renderer, so the interpretation is left to whoever judges the optimizations.
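
In practice, any judge of “correctness” ends up doing a tolerance compare against the reference rasterizer. A minimal sketch of what that looks like, with the tolerance value invented for illustration:

[code]
/* Sketch only: compare a driver-rendered frame against refrast output,
 * allowing a small per-channel difference. Picking the tolerance IS the
 * judgement call. */
#include <stdlib.h>

int images_match(const unsigned char *frame, const unsigned char *refrast,
                 int n_bytes, int tolerance)
{
    int i;
    for (i = 0; i < n_bytes; i++)
        if (abs((int)frame[i] - (int)refrast[i]) > tolerance)
            return 0;   /* differs by more than the judge allows */
    return 1;
}
[/code]

Whoever picks the tolerance decides what “correct” means.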

Recent discussions of the trilinear filtering issues in UT2K3 with newer drivers (HardOCP, for example) show how “correct” can and will be bent.

>> * An optimization must accelerate more than just a benchmark

Most, if not all, games used as benchmarking tools are being optimized for, and will continue to be; it's surely at the top of their priority list. And being back in the Futuremark beta program without optimizing for 3DMark03 is very unlikely.

>> * An optimization must not contain a pre-computed state

I don't know whether shaders with reordered instructions count as pre-computed state, but if they do, they're dismissing some perfectly valid optimizations, IMO.
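
To show what I mean by reordering, a toy C sketch (real compilers do this on GPU instruction streams, not C, but the idea is the same):

[code]
/* Same math, different schedule: the independent multiply can be hoisted
 * above the one that waits on the texture fetch result. */
float shade_naive(float tex, float light, float fog)
{
    float lit  = tex * light;   /* waits on the fetched texel */
    float haze = fog * 0.5f;    /* independent of the fetch */
    return lit + haze;
}

float shade_reordered(float tex, float light, float fog)
{
    float haze = fog * 0.5f;    /* issued while the fetch is in flight */
    float lit  = tex * light;
    return lit + haze;          /* identical result */
}
[/code]

Nothing is baked ahead of time there; it's the same work in a better order.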

Anti-flame note: I’m no ATi fanboy.

First: the Mafia has a policy to work with, too…

Second: it's fun to see that the first drivers that went through that checking mechanism show big performance boosts in 3DMark03, and no real boosts in other apps. It's also interesting that they behave much like the old, well-known cheating drivers; only this time, patch 330 doesn't detect them.

I don't think this is more than a PR clean-up either.

Though I'm possibly biased; I lost my trust in NVIDIA over all the mess they made.

Such cynicism.

I want to know how they can score so high in 3DMark03 (did they finally get the shader source in hand to optimize for?). And I'd like to see a checkbox in their drivers like ‘All f**n optimizations off’, though I think it would be checked all the time, as there is almost no visible difference.
BTW, we could start a new flame war about Unreal Tournament texture filtering on NV/ATI. Who's the cheater this time?

Originally posted by M/\dm/:
BTW, we could start a new flame war about Unreal Tournament texture filtering on NV/ATI. Who's the cheater this time?

Last I saw, the only filtering optimization ATI was using was trilinear for texture layer 0 but bilinear for all other layers when anisotropic filtering was forced in the drivers. When set to application preference, it obeys the application. Did I miss something?
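
In GL terms, the reported behavior amounts to something like the following. This is a sketch assuming a current context and ARB_multitexture, and obviously not ATI's actual driver code:

[code]
#include <GL/gl.h>
#include <GL/glext.h>

/* Pass in glActiveTextureARB fetched via the usual extension mechanism.
 * Assumes the relevant textures are already bound to each unit. */
void force_reported_filtering(PFNGLACTIVETEXTUREARBPROC activeTexture)
{
    activeTexture(GL_TEXTURE0_ARB);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
                    GL_LINEAR_MIPMAP_LINEAR);    /* trilinear, layer 0 */
    activeTexture(GL_TEXTURE1_ARB);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
                    GL_LINEAR_MIPMAP_NEAREST);   /* bilinear, the rest */
}
[/code]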

You can bet that there have been plenty of meetings and discussions before that was released!

But what about this:
“The Futuremark optimizations, which are deactivated through the 3D Mark 2003 Patch v330, seem to be active again in the new driver.”

That is inside the conclusion section.

Where did you guys read about the optimizations being done? What's this about texture layer 0 and trilinear filtering on ATI?

What I want to see is a control panel listing every single optimization, bugfix or workaround in which the driver has to check the application it is running under. Then, have a checkbox beside each one so I can turn them off and on.

If there are any other optimizations in which there is a quality/speed tradeoff, but which are active for all applications, I would like to be able to see those as well (many are already listed in the control panel).

It would be perfectly valid for a video card review to list scores for both the unoptimized and optimized versions of the driver.

I think it is perfectly legitimate for nVidia or ATI to optimize for popular games. But this needs to be highly visible, and I want to be able to opt out.
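
Purely as a hypothetical illustration of what could back such a panel: a table of app-keyed tweaks, each individually switchable. All names and entries below are invented:

[code]
/* Invented sketch: every app-detected tweak gets a row and a checkbox. */
typedef struct {
    const char *exe_name;     /* executable the driver looks for */
    const char *description;  /* what the tweak actually does */
    int         enabled;      /* the opt-out I'm asking for */
} AppTweak;

static AppTweak tweak_table[] = {
    { "ut2003.exe",   "reduced trilinear on upper texture stages", 1 },
    { "3dmark03.exe", "benchmark-specific shader replacement",     0 },
};
[/code]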

From what I've heard, the 45.23 drivers still don't do trilinear filtering correctly with aniso enabled in UT2K3. And it's app-dependent: renaming other .exes to the UT2K3 name stops their trilinear from working correctly too. So to me these drivers still have app-specific IQ degradation for the sake of performance. Not the best start, really.
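
The renaming trick works because detecting the executable is trivial. Purely as a hypothetical sketch (the Win32 call is real; everything else here is invented), a driver DLL can do something like:

[code]
/* Hypothetical app detection; real drivers may key on other things too. */
#include <windows.h>
#include <string.h>

static int looks_like_ut2003(void)
{
    char path[MAX_PATH];
    GetModuleFileNameA(NULL, path, sizeof(path));
    _strlwr(path);   /* MSVC lowercase-in-place, for a crude compare */
    return strstr(path, "ut2003.exe") != NULL;
}
[/code]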

For me, the Doom3 E3 demo doesn't work with the new nVidia driver. That's bad, since it worked with earlier drivers. I have a GeForce 4 Ti; I didn't check a GeForce FX.

Doom3 E3 used Fragment Combiner Programs, which have been removed from our drivers.

See http://www.opengl.org/discussion_boards/ubb/Forum3/HTML/010297.html

What I want to see is a control panel listing every single optimization, bugfix or workaround in which the driver has to check the application it is running under. Then, have a checkbox beside each one so I can turn them off and on.

Remember, normal people have to use this dialog too; you can’t flood them with 1001 useless options. And detailing what these optimizations are can get very lengthy.

For example, take “dead code removal” in a shader compiler. Makes good sense. Would you ever, even in a debug build, turn it off? No. Should it, therefore, be listed? What about “VBOs always use AGP memory”? Are you ever really going to want to turn that off? Is it really even an optimization, or simply the fastest implementation of the extension?
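
To make the dead-code case concrete, here's a toy sketch of the liveness walk such a compiler does, over invented three-operand instructions rather than a real GPU ISA:

[code]
/* Toy dead-code elimination: keep an instruction only if it writes the
 * final output or feeds a later instruction that is kept. */
#include <stdio.h>
#include <string.h>

typedef struct { const char *dst, *src1, *src2; } Instr;

int main(void)
{
    Instr prog[] = {
        { "r0",  "tex0", "color" },   /* read below: live */
        { "r1",  "tex1", "color" },   /* never read: dead */
        { "out", "r0",   "fog"   },   /* final output: live */
    };
    int n = 3, live[3] = { 0, 0, 0 }, i, j;

    for (i = n - 1; i >= 0; i--) {
        if (strcmp(prog[i].dst, "out") == 0) live[i] = 1;
        for (j = i + 1; j < n; j++)
            if (live[j] && (strcmp(prog[j].src1, prog[i].dst) == 0 ||
                            strcmp(prog[j].src2, prog[i].dst) == 0))
                live[i] = 1;
    }
    for (i = 0; i < n; i++)
        printf("%s: MUL %s, %s, %s\n", live[i] ? "keep" : "drop",
               prog[i].dst, prog[i].src1, prog[i].src2);
    return 0;
}
[/code]

Nobody sane wants a checkbox for that.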

Really, the kinds of optimizations you're looking for are quality-vs-performance tradeoffs: tuning the filtering level of textures regardless of what the application asked for, that sort of thing.

However, since these are application-specific, the description has to tell you which application each one is for. And therefore the dialog has to mention a non-trivial number of trademarked names. Not to mention that such a listing essentially serves as an endorsement of the products; I'd be pretty honked off if my game wasn't on that list. Lawsuits start from this kind of thing.

For al_bob.

But I do not think it even gets to NV_fragment_program.

Does the Doom3 menu or console use NV_fragment_program?

When I start Doom3, a white screen blinks just before the console appears. Is that really NV_fragment_program?
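
An easy way to check what the installed driver still exposes is to query the extension string from a current context. Standard GL, nothing vendor-specific:

[code]
/* Prints whether a named extension appears in the driver's string.
 * Needs a current GL context; crude substring match, good enough here. */
#include <GL/gl.h>
#include <stdio.h>
#include <string.h>

void report_extension(const char *name)
{
    const char *exts = (const char *)glGetString(GL_EXTENSIONS);
    printf("%s: %s\n", name,
           (exts && strstr(exts, name)) ? "exposed" : "missing");
}

/* e.g. report_extension("GL_NV_fragment_program"); */
[/code]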

Regarding the Doom3 leaked alpha: the reason it no longer works is that nVidia prevented it from working, at the request of id.

It's not because any features have been removed.

To work around it, try the 43.30 drivers with the NV30 emulator turned on (not Force Software Rasterizer, though!).

Obviously, it must be a conspiracy between iD, nVidia and space aliens. Damn those aliens.

Edit: spelling.


That's not a problem; Doom3 E3 is not a game, only a demo. I understand that nVidia would not have allowed such a mistake if it were a game.

Hmmmm, I suspect NVIDIA has many employees who were more disappointed than most of us that this happened. Some of the safeguards suggest to me that they are trying to stop misguided individuals there from doing this in the future, and that they have seriously reviewed how some of this crap got into their drivers.

Maybe I’m just being naive, but this is a good sign IMHO.

3DMark03 being the only piece of software that actually runs faster with the new driver raises the following question: are nVidia's ‘optimizations’ back? Or does this prove that 3DMark03 consists of unoptimized code and can't be considered a decent benchmark?

I agree, Dorbie. I don't care what they do under the hood, as long as IQ is not changed or degraded.

3DMark03 is a valid benchmark, but it's not indicative of actual gaming performance; their Doom3-engine clone is horrendously inefficient.

…yeah, it's truly amazing how they managed to convince people to trust those synthetic B*TCHmarks, while all that consumer video hardware is designed mainly for games these days…