nVidia drivers & 3DMark03

It seems FutureMark has confirmed the cheating
in nVidia’s new drivers, which increases performance by 24%…
http://www.theregister.co.uk/content/54/30860.html

Why does nVidia have to cheat in synthetic benchmarks and deceive the gaming community?

Well, it’s not like they are the only ones. Every other graphics card maker out there has been shown to have done the same kind of thing; nVidia was just the last company to do it, as far as we know. But I couldn’t care less about 3DMark. What I look at is performance in games, which is where it really counts. I’m not going to sit there and “play” 3DMark. As long as they fix this cheat, I’m OK.

-SirKnight

[This message has been edited by SirKnight (edited 05-25-2003).]

Well FutureMark has filed this report:
http://www.futuremark.com/companyinfo/3dmark03_audit_report.pdf

and they have released a new patch as well.

Why would you care about 3DMark scores, when every review clearly showed the 5900 Ultra ahead of the 9800 in games… What are you going to play more? Games, or some bench program?

jubei_GL,
I’m just pointing out that it is wrong to cheat just to increase the scores.

I don’t care whether the GeForce FX 5900 is the king or not.

And FYI, I don’t play games.

ATI still has cheats in its drivers in one form or another, and this has been shown by running the nVidia fairy demo on Radeon cards. Renaming the demo’s executable to 3DMark’s filename, or to quake3.exe, produces results that differ from each other, and both differ from the results under the demo’s normal filename.
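The renaming trick works because drivers of that era picked per-application code paths by matching on the executable’s file name. A minimal Python sketch of that kind of name-based detection — the file names and profile labels here are purely hypothetical, for illustration:

```python
import os

# Hypothetical detection list; real drivers keyed on exe names like these.
SUSPECT_NAMES = {"3dmark03.exe", "quake3.exe"}

def driver_profile_for(exe_path: str) -> str:
    """Pick a driver code path purely from the executable's file name.

    Renaming the same binary is enough to flip the result, which is
    exactly what the fairy-demo experiment above exploits.
    """
    name = os.path.basename(exe_path).lower()
    return "app-specific" if name in SUSPECT_NAMES else "default"

print(driver_profile_for("C:/demos/FairyDemo.exe"))  # default path
print(driver_profile_for("C:/demos/quake3.exe"))     # app-specific path
```

Renaming the same binary between runs and comparing frame rates (or the rendered output) is then a cheap way for anyone to probe for such profiles.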

Can we at least keep this discussion in ONE thread? It’s been all over the NV35 thread for the last week.

I hate to fuel this thread further, but… Dopefish, you haven’t actually run the fairy demo, have you? Sure, renaming the exe causes “different results”, but not of the kind you’re talking about…

Originally posted by harsman:
I hate to fuel this thread further, but… Dopefish, you haven’t actually run the fairy demo, have you? Sure, renaming the exe causes “different results”, but not of the kind you’re talking about…

No? They’re ignoring certain rendering calls in order to improve performance, aren’t they?

– Tom

Nah, I think it’s meant to distract you so you don’t pay attention to the fps at all, hehe.
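As for what “ignoring certain rendering calls” can mean in practice: FutureMark’s audit described things like static clip planes and skipped buffer clears, which only stay invisible along the benchmark’s fixed camera path. A hypothetical toy sketch of that pattern (the class, method names, and exe name are made up for illustration):

```python
class CallDroppingDriver:
    """Toy model of a driver that silently drops work for a known benchmark."""

    def __init__(self, detected_app: str):
        self.detected_app = detected_app
        self.calls_executed = []

    def clear_framebuffer(self):
        # For a scripted benchmark the fixed camera path guarantees full
        # overdraw, so the clear can be skipped with no visible artifact --
        # but only there. Move the camera freely and the cheat shows up.
        if self.detected_app == "3dmark03.exe":
            return  # dropped call: "free" performance in this one app
        self.calls_executed.append("clear")

    def draw_scene(self):
        self.calls_executed.append("draw")

honest = CallDroppingDriver("somegame.exe")
honest.clear_framebuffer(); honest.draw_scene()
cheater = CallDroppingDriver("3dmark03.exe")
cheater.clear_framebuffer(); cheater.draw_scene()
print(honest.calls_executed, cheater.calls_executed)
```

This is also why moving the camera off the scripted rail (as the patched 3DMark03 build does) exposes the cheat immediately.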

It’s bad that because of s**t like 3DMark, companies are trying to cheat, so driver sizes are getting bigger and real problems are forgotten.
In 44.03 (WHQL!!!) the GFFX 5200 has texture-related bugs in Comanche 4 and IGI 2, and polygon bugs in GTA3: Vice City. And that’s what matters, not freakin’ 3DMarks (they could just as well hack the final score ). It’s a real pain that the FX line’s drivers are optimized for benchmarks, because the card itself is simply super.
And things like Cg help a lot.
So for now I’m waiting for the next 22MB driver pack :’(

Originally posted by M/\dm/:
… companies are trying to cheat, so driver sizes are getting bigger and real problems are forgotten.

This is THE problem of this!

Personally, I wouldn’t care much about which bench is being used.
The point is that companies cheating MAY actually make developers’ work harder.
I have mixed feelings about the last generations of video cards, and the fact that the drivers were ‘optimized’ simply confuses me even more.

I personally don’t like benchmarks, however. Performance is not everything, and I really wish customers understood that. No matter what company you like, it applies to all.

EDIT: added a line about the quote.

[This message has been edited by Obli (edited 05-26-2003).]

I’m very disappointed. A benchmark is like a sporting event at the Olympic Games, and there, cheating would lead to disqualification. Fair sport!
The same should happen with benchmarks, but no: companies, this time nVidia, have shown their capitalistic thinking, so they need to cheat to look cool in the benches.

That’s so primitive and disappointing.

And by the way, nVidia, your 22MB drivers simply SUCK. I mean, HEY, 22MB!! What’s in there? Cheats for about every program, or what? This driver is bigger than some full server software. I don’t get what you put in there, but it’s definitely not just sweet, optimized driver code.

Anyway, nVidia has shown itself to be not trustworthy, reliable, or actually “good” (as in good or evil) at all anymore.

nVidia, you should really focus on earning what people call you: best drivers in the business. You haven’t provided them for a long, long time, yet tons of people still believe you do. Now quickly get back to the standards you used to have.

By the way, Dawn on the Radeon looks great

Originally posted by Obli:
This is THE problem of this!

Personally, I wouldn’t care much about which bench is being used.
The point is that companies cheating MAY actually make developers’ work harder.

That’s the problem: you don’t care about benchmarks, I don’t care about benchmarks, no developer would care much about benchmarks. The GFFX for developers is simply super! But gamers, who actually buy such cards, do care about benchmarks, and 1% more performance would be enough to make the decision for them. And nVidia must sell some cards or… hmm… the NV40 will never appear. What can they do but cheat?

Originally posted by davepermen:
And by the way, nVidia, your 22MB drivers simply SUCK. I mean, HEY, 22MB!! What’s in there? Cheats for about every program, or what? This driver is bigger than some full server software. I don’t get what you put in there, but it’s definitely not just sweet, optimized driver code.

The majority of the size comes from all the language pack files. Removing them generally brings the drivers down to about 9 MB, and some sites release these cut-down versions.

It’s funny, but I was just reading this article, and the guy says:

If you’re a 3DMark freak (it’s ok, I am one too) and must have the absolute highest score among your circle of friends, …

from
http://www.pcstats.com/articleview.cfm?articleID=1392

How very smart of him, being a “3DMark freak”.

As for the 22MB of driver files: I doubt that the “language packs” double the file size. And even if they did, it would waste bandwidth. Actually, it was 18MB when I downloaded it. Weird…

Originally posted by davepermen:
And by the way, nVidia, your 22MB drivers simply SUCK. I mean, HEY, 22MB!! What’s in there? Cheats for about every program, or what? This driver is bigger than some full server software. I don’t get what you put in there, but it’s definitely not just sweet, optimized driver code.

Perhaps it’s that nView thingy and all the other extra stuff that’s making the file so large?
The NVIDIA Linux drivers are just 7 or 8 MB AFAIK, and they don’t include the things mentioned before.
(I’m not sure, because I don’t download drivers anymore… ‘nvidia-installer --update’ rulez )

This is, once again, another reason for switching to Linux

Originally posted by davepermen:
Anyway, nVidia has shown itself to be not trustworthy, reliable, or actually “good” (as in good or evil) at all anymore.

Just like ATi and nearly every other company on this planet.

btw. If someone could make Dawn run on Linux, I’ll buy a GeForceFX

Originally posted by jubei_GL:
Why would you care about 3DMark scores, when every review clearly showed the 5900 Ultra ahead of the 9800 in games… What are you going to play more? Games, or some bench program?

Well… you’ll have to note that every one of those games is a DirectX 8 (or OpenGL equivalent) or older game. The result is that if those games use pixel shaders, they’re integer-precision shaders; if not, then it’s hardware T&L. The GeForce FX has a dedicated T&L unit and performs integer instructions at at least twice the speed of floating-point instructions. These conditions, combined with the high clock speed, make it obvious that it’ll win against the Radeon 9800 Pro in such games. If a game were to use floating-point precision in its shaders, however… the 9800 Pro would most likely be well in the lead.

Basically, the GeForce FX rocks for DirectX 8-generation and older games. ATI’s cards, with their very good floating-point performance, are a bit more future-proof, though.
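The integer-versus-float precision gap described above is easy to see numerically: the FX’s fast paths were 12-bit fixed point and fp16 “half”, versus fp32 on the slow path. A small stdlib-Python illustration — the exact fixed-point layout assumed here (s1.10, covering roughly [-2, 2)) is a guess for demonstration, not the documented NV30 format:

```python
import struct

def to_fp16(x: float) -> float:
    # Round-trip through IEEE 754 half precision (the "half" shader type).
    return struct.unpack('e', struct.pack('e', x))[0]

def to_fx12(x: float) -> float:
    # Assumed 12-bit fixed point, s1.10: 1 sign bit, 1 integer bit, 10
    # fraction bits, covering roughly [-2, 2). Real hardware may differ.
    scale = 1 << 10
    q = max(-2 * scale, min(2 * scale - 1, round(x * scale)))
    return q / scale

x = 0.123456789  # e.g. one component of a normalized vector in a shader
print(abs(to_fp16(x) - x))  # small quantization error
print(abs(to_fx12(x) - x))  # coarser still; fp32 error would be ~1e-8
```

Per-pixel lighting math accumulates these quantization errors across many operations, which is why the integer path is fast but visibly less accurate, and why fp32 shaders shift the comparison toward the Radeon.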

Originally posted by richardve:
Just like ATi and nearly every other company on this planet.

btw. If someone could make Dawn run on Linux, I’ll buy a GeForceFX

Dawn runs on the Radeon as well, hehe (see the main page)

Well, at least ATI officially admits “we did those things and we’ll remove them”. nVidia instead flames other companies for being evil and directly attacks them. FutureMark can’t help it that the GeForce FX is simply crap hardware for DX9…

Oh, and yes: seeing the GeForce FX perform well in benches that are runnable on my GF2 MX as well doesn’t tell me much about future-proofing; same for GF3 benches… Where I can currently see whether a card is suitable for the future is stuff like 3DMark, ShaderMark, and others.

And if nVidia cheats in 3DMark, who knows how they “optimize” for other benches. I mean, no 24% performance gain… but a little “tweaking” when some UT bench, for example, runs would not really be noticeable, yet could give some 5% advantage without bigger problems.

And those 22MB are not only language packs, are they? Well… even if they are, they should find some workaround… That’s the “benefit” of unified drivers: all in one. Even though you only need the part for your GPU, in your language, you get it all…

And the nView and all that could be optional, too. Not everyone needs or wants it; I, for example, have no use for it…

Well, anyway… I have to install the Radeon now in this work PC, to test whether it’s my mobo at home that sucks… Wish me luck (at least Humus did not have the bug I have…)

Is there no way of enabling floating-point precision for older games on these new cards?

I would really like to get the last bit of IQ out of games like Quake 3.
The newest cards have some performance to spare for IQ, so it would be a good idea, IMO.