Geforce MX good for dev?

I am doing a bit of 3D programming. I currently have a Viper TNT with 16 MB (I know it's old).

I can't afford the GeForce2 GTS, but the MX is in my range. Will this be good for programming 3D games, and how much of a performance increase will I get over my TNT?

Go read some of the benchmarks on it.

Essentially, it's a GeForce2 GTS core, stripped down. So it has T&L and all the wonderful GeForce features, including FSAA (albeit slow), etc., etc.

As far as it goes, yeah, it'll be decent for dev, but then again, so is your TNT. My suggestion is that if you're looking at developing modern 3D apps that REQUIRE T&L, the NSR (NVIDIA Shading Rasterizer), etc., then yeah, the MX is a good idea.
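As a rough illustration of what building against those features looks like, here is a minimal C sketch (an assumption-laden example, not anything from NVIDIA's docs) that checks the GL_EXTENSIONS string for a couple of the NV extensions a GeForce-class driver of this era typically advertises. The extension names are real, but whether your driver reports them depends on the card, and the naive strstr() match can be fooled by extensions whose names share a prefix.

/* Sketch: query which NV-specific features the current GL context exposes. */
#include <GL/gl.h>
#include <stdio.h>
#include <string.h>

static int has_extension(const char *name)
{
    /* Requires a current GL context; simple substring match. */
    const char *ext = (const char *) glGetString(GL_EXTENSIONS);
    return ext != NULL && strstr(ext, name) != NULL;
}

void report_nv_features(void)
{
    printf("GL_RENDERER: %s\n", (const char *) glGetString(GL_RENDERER));
    printf("register combiners (NSR path): %s\n",
           has_extension("GL_NV_register_combiners") ? "yes" : "no");
    printf("vertex array range (fast T&L feeding): %s\n",
           has_extension("GL_NV_vertex_array_range") ? "yes" : "no");
}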

And finally, if you’re planning on developing REAL (commercial quality) 3D apps, my suggestion is to hold out until you can afford a real 3D card, like the 3DLabs Oxygen series or an E&S Tornado, etc. Of course you’re talking $1000+ on average here.

Siwko

I have often wondered: what do these workstation cards have over a GeForce2 GTS?

Very little these days.

So, er, why are they 3x the price?

Who knows.

Actually, the new Wildcat cards, which cost $3000+ and which my company should be getting very soon, allow for the development of software that just isn't possible with consumer-based cards. The cards we'll be getting will have 256 MB of texture memory and a separate 256 MB of frame buffer memory.

I'd be the first to admit these cards probably wouldn't play games very well, but they do volume-based rendering much better than the pseudo volume textures that can be generated on current consumer cards. NVIDIA will probably have true 3D textures in their next card, and I see that ATI claims to have them on their current Radeon (or whatever) card. So maybe true view-aligned volume rendering will soon be possible with consumer cards, but then there will be another major feature that they don't have. Also, these workstation cards have drivers developed with a different mentality than gaming video cards: quality over speed.
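For concreteness, here is a hedged sketch of the consumer-card technique being compared against: uploading a volume as a true 3D texture for view-aligned slicing. It assumes headers and a driver that expose OpenGL 1.2 / EXT_texture3D (on Windows of this era you would fetch glTexImage3D via wglGetProcAddress); upload_volume and its parameters are just illustrative names.

/* Sketch: load a volume as a 3D texture, assuming GL 1.2 / EXT_texture3D. */
#include <GL/gl.h>

void upload_volume(const unsigned char *voxels, int w, int h, int d)
{
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_3D, tex);
    glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    /* One byte per voxel; how big w*h*d can get is exactly where the
       256 MB of texture memory on a workstation board pays off. */
    glTexImage3D(GL_TEXTURE_3D, 0, GL_LUMINANCE, w, h, d, 0,
                 GL_LUMINANCE, GL_UNSIGNED_BYTE, voxels);
    /* To render: draw view-aligned quads back to front, sampling this
       texture with 3D texture coordinates and blending enabled. */
}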

So basically what I want to say is that there are some differences, but for most people these differences don't matter much, because these cards aren't meant for gaming; they're meant for sci-vis.

For most OpenGL stuff, an NVIDIA card with a GPU suits me just fine.

I know that these $3000 cards are not meant for gaming, but I would actually like to know how they perform in benchmarks (gaming and non-gaming ones).

Anyone have any link where I could find this info?

Chris

SPEC (Standard Performance Evaluation Corporation) has a number of OpenGL-related benchmarks for those interested in workstations.
Here are the links. You will find an array of impressive systems there.
http://www.spec.org/gpc/opc.data/summary.html

There is also an application-focused group. See results here:
http://www.spec.org/gpc/apc.data/apc_proe2000summary.html
http://www.spec.org/gpc/apc.data/apc_solid99sum.html
http://www.spec.org/gpc/ug.data/apc_ug15sum.html

I agree with ribblem. I'm developing CAD software, and we get calls from customers who have problems with the visualization… in 90% of cases they have an ATI card, in 9.999% some other consumer card, and in 0.001% a high-end workstation card.

Kilam.


A few years back, the high-end cards did almost everything in hardware (maybe not HW T&L, but quite a lot). For example, SGI's workstations performed stencil buffering in HW long before nVidia implemented it on their TNTs.

But now consumer cards are catching up quite fast in performance. There are still a few big differences, though.

Memory: High-end cards have WAY more memory. This is needed for high-res images with loads of enormous textures.

Type of memory: High-end cards can have a different type of memory. I've seen memory banks that support reading and writing at the same time, but they cost more.

Number of pipelines: A high-end card can have several pipelines, which means it can push even more pixels through.

Features: A GeForce can handle a certain number of lights in hardware before falling back to software (I don't remember how many, but it wasn't enough for high-end CAD programs in my opinion), while high-end cards may support loads of lights.
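To make the lights point concrete, here is a tiny sketch that asks the driver for its fixed-function light limit. Note that GL_MAX_LIGHTS only says how many lights the API exposes (at least 8 per the spec); it does not tell you whether they run on the hardware T&L path or fall back to the driver's software path.

/* Sketch: query the fixed-function light limit of the current GL context. */
#include <GL/gl.h>
#include <stdio.h>

void report_light_limit(void)
{
    GLint max_lights = 0;
    glGetIntegerv(GL_MAX_LIGHTS, &max_lights);
    printf("GL_MAX_LIGHTS = %d\n", (int) max_lights);
}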

And remember that different manufacturers' cards (based on the same chip and with almost the same test results) can differ quite a lot in price. Cards from more respected companies can cost more. This can also be a reason why high-end cards cost so dang much.