OT: Noisy GeForce FX

I hate noise (the constant cooling-fan type). I was planning to custom-build a super-quiet PC, considering every component, to make the most silent machine one can build. Now how does the NV30 fit into that picture?
Has anyone picked up any comments from Nvidia about this? It is a very serious thing. Will this early criticism just die off (similar to the naming issue on the GeForce4 MX)? Cinematic computing and noise? I do love machines; I would even be happy to build a separate, air-conditioned machine room myself, but what about the end users playing our games, will they do the same? How about water cooling? Nitrogen?

It’s the blower that makes the noise. I expect you could hook up some alternative cooling if you are prepared to mod the card. Why are you set on the NV30?

If you don’t mind paying a bit more you could get the Gainward version, which supposedly has a maximum noise output of 7 dB. http://www.rivastation.com/go_e.htm?http://www.rivastation.com/news/news_de.htm#1043924326

Thanks for a great link (Gainward). Seems like a good choice, but I must sort out some kind of shutter-glasses output for my stereoscopic sessions.

Some highly brilliant images from that page:
http://www.rivastation.com/cgi-bin/show/show.pl?1384
http://www.rivastation.com/cgi-bin/show/show.pl?1379

;-)

Originally posted by Adrian:
If you don’t mind paying a bit more you could get the Gainward version, which supposedly has a maximum noise output of 7 dB. http://www.rivastation.com/go_e.htm?http://www.rivastation.com/news/news_de.htm#1043924326

7 dB?! I’d like to see this cooling setup.

Heh, check out the FlowFX parodies on Rivastation.
This one should be called the GeForce FX Type R:

edit: hmmm. UBB code doesn’t like me.

Some highly brilliant images from that page:

Sweet!
This inspired me to invent a new brilliant name: JetForceFX

Look at that:
http://www.forum-3dcenter.org/vbulletin/attachment.php?s=&postid=666091

Pretty weird results for the FX.
Tested in ShaderMark 1.6 from ToMMTi-Systems, a tool to measure DX9 PS 2.0 shader performance.
Only in three fixed-function tests is the FX able to beat the Radeon 9700 Pro.
Hard stuff …

Diapolo

That’s because the drivers suck.

After its release, surely they will have official/revised drivers.

When the R300 was fresh out of the lab, it appeared to have problems too: sucky drivers, required too much power (110 million transistors) so you needed an extra power cable, overheated, …

Those are the engineering samples.

Unfortunately, it seems like the FX needs that weird fan and will be sold like that.

When the R300 was fresh out of the lab, it appeared to have problems too: sucky drivers, required too much power (110 million transistors) so you needed an extra power cable, overheated, …

Actually, the R300 drivers were relatively good out of the box. Certainly, they weren’t impairing the performance of the card, as it had as much as a 50% margin over the Ti4600 (granted, it wasn’t a close comparison, considering the hardware differences).

Only the FX Ultra will have the huge fan. The regular FX (memory clocked at 400DDR rather than 500 on the Ultra) uses a regular fan.

The really sad thing is that not even the FX Ultra gets a truly significant performance boost over the 9700. Maybe 10%, sometimes. Indeed, if you turn on anisotropic filtering/antialiasing, the 9700 comes out ahead consistently. What happened to nVidia? Did they just get too comfortable with their lead, letting a four-month-old card beat them like this?

A large part of the performance difference, as Carmack says, is probably due to the fact that the NV30 is running the PS 2.0 tests with 32-bit precision, compared to the R300’s 24-bit precision. It’s still kinda strange, though, since the NV30’s performance is half that of the R300 with only an 8-bit-per-channel difference.

There’s a precision hint available in PS 2.0, though not used in those tests, that allows data to be stored at lower precision. ARB_fragment_program has something similar, but there it applies to the entire program, while in PS 2.0 it can be applied to individual instructions. With ARB_fragment_program the driver lowers the precision automatically when the hint is enabled (and the video card supports it), but I’m not sure how PS 2.0 decides when a lower precision should be used.
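
For what it’s worth, here’s a minimal sketch of the program-wide hint on the ARB_fragment_program side (a made-up example in C; it assumes a context where your extension loader has already fetched the glBindProgramARB/glProgramStringARB entry points via wglGetProcAddress or glXGetProcAddressARB):

#include <GL/gl.h>
#include <GL/glext.h>  /* ARB_fragment_program tokens */
#include <string.h>

/* The OPTION line is the whole-program precision hint: with it the
 * driver may evaluate the program at reduced precision if the card
 * supports it. D3D’s PS 2.0 instead puts the _pp modifier on
 * individual instructions (e.g. mul_pp). */
static const char *fp =
    "!!ARBfp1.0\n"
    "OPTION ARB_precision_hint_fastest;\n"
    "TEMP c;\n"
    "TEX c, fragment.texcoord[0], texture[0], 2D;\n"
    "MUL result.color, c, fragment.color;\n"
    "END\n";

void load_program(void)
{
    glBindProgramARB(GL_FRAGMENT_PROGRAM_ARB, 1);
    glProgramStringARB(GL_FRAGMENT_PROGRAM_ARB,
                       GL_PROGRAM_FORMAT_ASCII_ARB,
                       (GLsizei)strlen(fp), fp);
}

Swapping in ARB_precision_hint_nicest requests full precision instead.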

If you look at the NVIDIA cooling setup, they are obviously not just concerned with cooling the chip. They want to expel the hot air from the case and draw cool air from the outside. This suggests to me that they are also concerned with the thermal load the card adds to the system case, and with ensuring that the cooling isn’t hampered by the temperature inside the case.

If you go with the Gainward, you may want to ensure your case is well ventilated. Otherwise you may see some instability, not necessarily with the graphics card but with other components like the CPU and chipset, due to the increased temperature.

Originally posted by dorbie:
If you look at the NVIDIA cooling setup, they are obviously not just concerned with cooling the chip. They want to expel the hot air from the case and draw cool air from the outside. This suggests to me that they are also concerned with the thermal load the card adds to the system case, and with ensuring that the cooling isn’t hampered by the temperature inside the case.

Yeah… just… um… the other side of the card, directly under your processor, reaches 140°C… so you need a good case with good airflow anyway…

Well, it’s a funny card; we’ll see what the future brings. We will never forget that card; too many pics are already around the web.

Originally posted by V-man:
required too much power (110 million transistors) so you needed an extra power cable

Um… That power cable isn’t needed if you run the card on AGP 8x.

Originally posted by Ostsol:
A large part of the performance difference, as Carmack says, is probably due to the fact that the NV30 is running the PS 2.0 tests with 32-bit precision, compared to the R300’s 24-bit precision.

I’m just wondering whether the additional 8 bits really make a difference. I mean, the color values should be normalized between 0.0 and 1.0 anyway.

It sounds like the old discussion about whether a 32-bit Z-buffer is any better than a 24-bit one (which is not the case, so everybody uses the additional 8 bits as stencil).
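
To put rough numbers on the color question: the step size a format can represent near 1.0 is set by its mantissa width, and the commonly quoted layouts here are fp16 = s10e5, the R300’s fp24 = s16e7, and the NV30’s fp32 = s23e8. A quick back-of-the-envelope sketch (plain C, nothing card-specific):

#include <math.h>
#include <stdio.h>

int main(void)
{
    /* Commonly quoted mantissa widths; the ulp (smallest
     * representable step) in [1,2) is 2^-mantissa_bits. */
    const struct { const char *name; int mbits; } fmt[] = {
        { "fp16 (s10e5)", 10 },
        { "fp24 (s16e7)", 16 },
        { "fp32 (s23e8)", 23 },
    };
    int i;

    printf("8-bit framebuffer step: %.9f\n", 1.0 / 255.0);
    for (i = 0; i < 3; i++)
        printf("%-13s ulp near 1.0: %.9f\n",
               fmt[i].name, pow(2.0, -fmt[i].mbits));
    return 0;
}

For a single write to an 8-bit framebuffer (step 1/255, about 0.0039) even fp16 has headroom to spare. Where the extra bits plausibly matter is in intermediate values, like texture coordinates into large textures and long dependent calculations where rounding errors accumulate, which is much the same argument as in the Z-buffer debate.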

Any thoughts?

The card cooler is strange in that the intake and exhaust ports are very close together, and in the pics I saw there is no separator between them. So wouldn’t that mean it would suck the exhaust air back in?

I wouldn’t mind having either a GF FX or a 9700, though. They both seem like good products. The biggest issue for most of us is driver support, and history favors Nvidia on this. I read someplace that the GF FX should have good support for OpenGL 2.0 (when it comes out), so that also plays a big role in the decision process.

Nvidia has said that the version you’re reading the reviews of is just their reference version. As usual, they’re going to sell licenses for the chip so other companies can make their own versions. I bet someone will slap a water cooler on one; I’m waiting for that. The Quadro FX is the only card I think is less noisy; I’m considering that as well.

Go to www.tomshardware.com and you can hear the FX.