Thoughts on the NV35?

what do you think?

9 months later, nVidia is finally on par. I wonder how long it will hold.

Anand, I think, mentioned something about ATI releasing a faster 9800 soon-ish. Digit-Life’s news section mentions the 256 MB 9800 being a limited production run. NV35 does well so far. I’ll have to check the [H] review for ShaderMark scores, since last time the GeForce FX was up to 6x slower in them than the R300. My only problem is ATI’s lack of GL support. It seems ATI is going with D3D, and GL might suffer. NVIDIA, on the other hand, favors GL much more than D3D, understandably so, as they’ve said NV35 is built to run Doom3 (super shadows), and it shows.

My only problem is ATI’s lack of GL support. It seems ATI is going with D3D, and GL might suffer.

First of all, if it weren’t for ATI, you’d have no choice but to use NV_fragment_program/vertex_program_2, and glslang would, almost certainly, never be well-supported by nVidia (compared to proprietary nVidia program extensions coupled with Cg), and therefore, by the gaming graphics community at large. ATI pushes for ARB extensions, while nVidia is perfectly willing to go for proprietary ones.

Secondly, it’s about time nVidia actually gave ATI some real competition.

Thirdly, I’m not sure that it matters terribly much that the $500 card from nVidia beats the $500 card from ATI. The $200 cards will sell more, and the $100-or-less cards will sell even more than that. ATI has the edge in the mid-range market with their 9500 Pro (get ’em fast before they’re gone; the best $175 you can spend on a graphics card) and 9600 Pro. nVidia’s 5200 FX, at the very least, offers DX9 support, which ATI’s 9200 does not. Sure, Doom3 will kill both of these cards put together without batting an eyelash, but you get what you pay for.

[This message has been edited by Korval (edited 05-12-2003).]

Originally posted by Lurking:
what do you think?

I think we need moderators if an advanced OpenGL forum is discussing whether a $500 product that 99% of the market will ignore from company N is better or worse than a similar product from company A.

It’s a nice card. Congratulations nVidia.

More importantly, thank you for the 5200! Making a sub-$100 card that supports fragment/vertex programs is awesome. HUGE congratulations on that!

According to Anandtech, in Doom3 at 1024x768 the 5200 scores 37fps. Quite playable. Switch to 1280x1024, enable 4x AA/8x anisotropic filtering, and its performance drops to 9fps, while the brand new 5900 drops to 38fps.

So you can spend $80 on a 5200 or $499 on a 5900 and get 35-40fps. That extra $400 gets you the same framerate as the 5200, but with anti-aliasing and anisotropic filtering turned on. yay.

I can’t justify spending $400 for better filtering. I think I would get more enjoyment out of $400 by getting a GameCube, Mario Party, a case of beer, and having some friends over. Hell $400 will get you a weekend holiday in Mexico.

Sorry, I’m ranting. It’s nice. I’d love to have one. I never will though.

Originally posted by Korval (aka ATi fanboy):
First of all, if it weren’t for ATI, you’d have no choice but to use NV_fragment_program/vertex_program_2, and glslang would, almost certainly, never be well-supported by nVidia (compared to proprietary nVidia program extensions coupled with Cg), and therefore, by the gaming graphics community at large. ATI pushes for ARB extensions, while nVidia is perfectly willing to go for proprietary ones.
Are you on the ARB, or are you talking out of your *ss?

Secondly, it’s about time nVidia actually gave ATI some real competition.
They took a year off after pounding them for three years in a row. Despite ATi’s recent success, their stock still stagnates.

Thirdly, I’m not sure that it matters terribly much that the $500 card from nVidia beats the $500 card from ATI.
Bragging rights are the best PR you can get.

The $200 cards will sell more, and the $100-or-less cards will sell even more than that. ATI has the edge in the mid-range market with their 9500 Pro (get ’em fast before they’re gone; the best $175 you can spend on a graphics card) and 9600 Pro. nVidia’s 5200 FX, at the very least, offers DX9 support, which ATI’s 9200 does not. Sure, Doom3 will kill both of these cards put together without batting an eyelash, but you get what you pay for.
I agree.

i think it’s at least a good card by nvidia, something not seen in a long, long time. good job nvidia.

now i want a .13 micron high-end card from ati

9 months later, nVidia is finally on par. I wonder how long it will hold.

On par in terms of what?

Features? NV30 had a similar feature set.

Performance? The 5900 is faster, sometimes significantly (Doom 3).

Driver support? Now that’s where ATI is not up to par, as my experience with Radeon 8500 shows.

My opinion on the 5900? Totally overpriced for what it is. For development, the best card for the money is the FX5200, and for gaming it is the Radeon 9500 Pro (if you can find one).

[This message has been edited by JackM (edited 05-12-2003).]

I agree with JackM. Even if ATI has the better card (when you sit down for a couple of days with both of them, you have to admit that ATI’s product is better overall), NVIDIA has better drivers. ATI needs to bore this into their skulls: they have to match NVIDIA’s driver offering, point for point. Even if NVIDIA’s solution is not perfect, it’s quite acceptable. ATI’s, at the present time, is laughable at best. Until they fix that, ATI poses no relevant competition to NVIDIA. Get this: if someone comes to me asking for advice, I’ll recommend they buy NVIDIA hardware, even if I think ATI’s is better.

I hear the summer might hold a surprise in this department, but summer is too far away, and maybe even too late.

i just think it’s funny to see everyone still hail and praise the nvidia drivers, after seeing that they needed about half a year to develop a detonatorFX driver that actually works. i have seen tons of people buy FX cards only to find they crashed all around, had all sorts of graphic errors, and were buggy, slow, and ugly.

hey dudes! they started with the “nv30 rocks” last august and NOW there is a good driver out for them!!

not that they don’t build good drivers. but for one, they are not perfect (crashes at home and at work), and second, especially for the FX cards, they really took a very, very long time until they got it working!

during all this time, i had about 3 full crashes while running on the radeon9700pro, and, in the first drivers, some small image glitches. quite okay…

both companies now have good drivers. cat3.4 was not ready yet for the benchmarks, which was bad…

i just somehow dislike how nvidia really had a blackout for over 8 months now, with hw (nv30), with drivers (no detFX, no WHQL), and with cg (sorry… that thing simply doesn’t…).

now it looks all quite okay. still, the list of proprietary nv-extensions is too long for me.

Originally posted by davepermen:
the list of proprietary nv-extensions is too long for me.

What’s wrong with this? ATI feel some of them are so good they adopt them too (e.g. NV_occlusion_query). I wish they would adopt more.

So far every project I have done has been made faster because NV have an extension to help (VAR, point sprite, occlusion query, PDR).

The main reason I use NVidia cards is the extensions.

The fact that NVidia are pioneering these extensions and developers are using them will hopefully get them more widely supported. The functionality they have added had been very useful, to me at least.

[This message has been edited by Adrian (edited 05-13-2003).]

Indeed. Most ARB extensions are generalized versions of existing vendor extensions, and I would say that you can’t have one without the other. Before there was VBO, there were VAR and VAO. Before there was ARB_vertex_program, there were NV_vertex_program and ATI_vertex_shader. Before there was ARB_fragment_program, there were reg combiners/tex shaders and ATI_fragment_shader. And so on.

Any newly released card will have more features than can be covered by the current set of ARB extensions or core GL functionality. Would you rather have it that new cards didn’t expose their new functionality at all until new ARB extensions are ratified that cover it? Both vendors already support the exact same set of ARB extensions (with the exception of ARB_shadow_ambient which is not supported by NVIDIA, and ARB_imaging which is not supported by ATI). If you don’t like vendor extensions, don’t use 'em.
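
To make the “don’t use ’em” point concrete, here’s a minimal sketch of the usual runtime selection between the ARB path and the older vendor paths. HasExtension() and the VertexPath enum are just illustrative helpers, not from any real header, and the substring test is deliberately crude:

[code]
#include <string.h>
#include <GL/gl.h>

/* Crude substring check against the extension string; a robust version
   should match whole space-delimited tokens. Needs a current GL context. */
static int HasExtension(const char *name)
{
    const char *ext = (const char *) glGetString(GL_EXTENSIONS);
    return ext != NULL && strstr(ext, name) != NULL;
}

enum VertexPath { PATH_VBO, PATH_VAR, PATH_VAO, PATH_PLAIN };

/* Prefer the ARB path, fall back to vendor paths only if you care to. */
static enum VertexPath ChooseVertexPath(void)
{
    if (HasExtension("GL_ARB_vertex_buffer_object"))
        return PATH_VBO;   /* portable path, both vendors expose it */
    if (HasExtension("GL_NV_vertex_array_range"))
        return PATH_VAR;   /* NVIDIA-only fast path */
    if (HasExtension("GL_ATI_vertex_array_object"))
        return PATH_VAO;   /* ATI-only path */
    return PATH_PLAIN;     /* plain vertex arrays */
}
[/code]

The same pattern applies to the program extensions: try ARB_vertex_program first, then the vendor-specific ones.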

– Tom

[This message has been edited by Tom Nuydens (edited 05-13-2003).]

So how’s this new UltraShadow enabled then?

Nice card, but the new doom3 trailer gets me more excited!

Originally posted by Nutty:
So how’s this new UltraShadow enabled then?

GL_NV_depth_bounds_test. Interestingly, it’s in the NV30 emulator but not on NV30 itself: http://www.delphi3d.net/hardware/extsupport.php?extension=GL_NV_depth_bounds_test
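
For the curious, the extension boils down to one new entry point that clamps rendering to a window-space depth range. A minimal sketch of how it would wrap stencil shadow volume rendering, using the names from the published EXT_depth_bounds_test spec (the NV-named string in the emulator may expose it slightly differently, and the 0.4-0.6 range below is just a placeholder):

[code]
/* Depth bounds test around stencil shadow volume rendering, using the
   EXT_depth_bounds_test names; glDepthBoundsEXT is assumed to have been
   loaded via wglGetProcAddress. The 0.4-0.6 range is a placeholder for
   the window-space depth extent a given light can actually affect. */
glEnable(GL_DEPTH_BOUNDS_TEST_EXT);
glDepthBoundsEXT(0.4, 0.6);

/* ... render the light's shadow volumes into the stencil buffer here;
   fragments whose stored depth falls outside the bounds are rejected
   before any stencil work is done ... */

glDisable(GL_DEPTH_BOUNDS_TEST_EXT);
[/code]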

– Tom

I have to say, I agree with adrian and tom about proprietary extensions. These extensions are the main reason for me sticking with opengl. If I used d3d, then I would have to wait for M$, nvidia, ATI, matrox and whoever else to agree on a new feature to be added to the next release of dx - there’s no such thing as an extension in d3d.
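
And picking up a new extension at runtime really is just a function-pointer query away, no waiting for a new API revision. A minimal sketch for Windows, assuming glext.h provides the PFNGL… typedef; the occlusion query entry point is used purely as an example:

[code]
#include <windows.h>
#include <GL/gl.h>
#include "glext.h"   /* for the PFNGL...NVPROC typedef */

PFNGLBEGINOCCLUSIONQUERYNVPROC glBeginOcclusionQueryNV = NULL;

void LoadOcclusionQueryEntryPoint(void)
{
    /* Query the driver for the entry point by name; if the extension
       isn't exported you simply get NULL back and take a fallback path. */
    glBeginOcclusionQueryNV = (PFNGLBEGINOCCLUSIONQUERYNVPROC)
        wglGetProcAddress("glBeginOcclusionQueryNV");

    if (glBeginOcclusionQueryNV == NULL)
    {
        /* extension not available on this driver: fall back */
    }
}
[/code]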

i’m not talking about vendor-specific extensions, but about proprietary extensions. close to the hw, but not at all nice to use. and often just “bugfixing” some tiny feature… nvidia even releases extensions to their own extensions, for about every new card… 1, 1_1, 2, 3, etc…

their extension spec is longer than the gl spec (i don’t remember which version, i think they beat the opengl1.3 spec) or something like this. currently i can’t see clearly, today was just much too much work. brain can’t concentrate…

Originally posted by davepermen:
i’m not talking about vendor-specific extensions, but about proprietary extensions.

I don’t understand the difference - and I’m English!
They mean the same thing.

Originally posted by JackM:
Driver support? Now that’s where ATI is not up to par, as my experience with Radeon 8500 shows.

True. And that’s even worse under Linux, where ATI drivers just sux: verrryyy slow and unstable.

[OT] a new doom3 trailer: http://www.gamershell.com/download_filehell.pl?hellID=2124&mirror=2&a=3137774211260.61&b=init
This is a screener, so don’t expect very high quality. Still interesting; it shows some new monsters and moves.

Originally posted by tfpsly:
And that’s even worse under Linux, where ATI drivers just sux : verrryyy slow and unstable.

Huh? They’ve been VERY stable and VERY fast for me.

Originally posted by JackM:
[b]On par in terms of what?

Features? NV30 had a similar feature set.

Performance? The 5900 is faster, sometimes significantly (Doom 3).

Driver support? Now that’s where ATI is not up to par, as my experience with Radeon 8500 shows.[/b]

Features. Well, NV30 was on par already, though the D3D drivers lack loads of features, making them useless for developers. NV35 AA still sucks: only 4x MS and not gamma-correct.

Performance. NV30 was slow; NV35 seems better. Doom III performance is very questionable. The game is far from done, and the test was arranged by nVidia, who had access to the build, while ATi did not. Far from being a meaningful test. Also, it seems nVidia’s drivers are doing some ****ty stuff. The xbitlabs review highlighted some oddities going on in OpenGL with AF, though not in D3D. AA in OpenGL seems to be nothing but blur, and not real AA. In fact, in some games enabling AA improved performance.
Also, while shader performance has improved, it still does not seem to match that of the R9800 Pro. Shader performance is the single most important performance indicator for me as a developer.

Driver support. I keep hearing how badly ATi drivers suck, but seldom do I hear any specifics. Interestingly though, lately I’ve heard more complaints about NV30 drivers, despite their market penetration being far lower than the 9x00’s.