
View Full Version : Stable OpenGL card for Development?



Nocturnal
04-13-2001, 10:03 PM
I am looking for an OpenGL card that is stable under adverse conditions; specifically when an OpenGL program in development crashes repeatedly to a debugger, which is then halted and restarted without proper shutdown of the program under test.

My Matrox G400 is not cutting it under these conditions in Windows 2000. If I crash my program, sometimes I must restart the computer or else any further OpenGL program will cause the driver (ICD) to crash repeatably.
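
(For reference, the "proper shutdown" that gets skipped in that scenario is just the usual WGL teardown. A minimal sketch in C, where hwnd, hdc and hglrc stand for whatever handles the test program created; the names are illustrative only.)

#include <windows.h>

/* Orderly teardown -- exactly the code that never runs when the debugger
   kills the process mid-session. */
void ShutdownGL(HWND hwnd, HDC hdc, HGLRC hglrc)
{
    wglMakeCurrent(NULL, NULL);  /* detach the rendering context from this thread */
    wglDeleteContext(hglrc);     /* free the rendering context */
    ReleaseDC(hwnd, hdc);        /* return the window's device context */
}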

Glossifah
04-13-2001, 10:17 PM
Nocturnal,

As many folks on this board will attest, the development card of choice is an nVidia GeForce2-based card. Depending on how much you are willing to spend, look at:
1. GeForce2 MX
2. GeForce2 GTS
3. GeForce2 Ultra
4. Quadro2 Pro

Option 1 is the least expensive, while option 4 is the most expensive (but offers hardware-accelerated line antialiasing and other neat features). If you are willing to continue using the G400 for a bit longer, GeForce3 will be available in retail outlets, and my recommendation swings to that card without hesitation.

Why GeForce? I, like many other folks on this board, spend time developing OpenGL code that can cause egregious errors at times, and having a card that is capable of handling those errors and weathering the storm is essential to efficiency. Some cards simply don't provide the kind of stability that an A-rev nVidia driver thread does.

I was skeptical of nVidia for a long while, especially coming from Wildcat and FireGL-based cards. I have since seen the light. Simply put, nVidia provides some of the best OpenGL development solutions on the market. Period.

Regards,

Glossifah

Nocturnal
04-13-2001, 10:34 PM
Thanks for the reply. The only problem that I can see is that nVidia has a poor reputation for 2D quality (or perhaps that is simply certain board manufacturers). This is important to me because I use a 21" monitor at 1600x1200 with small fonts (90 dpi?).

I will seriously consider a Geforce 2 GTS since they are down to $132.

Glossifah
04-13-2001, 10:48 PM
Well, I'm a 1280x1024 man myself, so I can't testify to the 2D quality at the resolution you are looking at.

Most retail vendors don't stray too far from the reference boards, which IIRC have DAC speeds around 300 MHz on the GTS cards. DAC speed relates more to the resolutions that can be attained at 85+ Hz refresh, and I'm not sure a direct correlation can be drawn between higher DAC speeds and quality at 16x12 or greater resolution. More margin should be "A Good Thing", but I've never toyed with that aspect of the GF2 cards.
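
To put rough numbers on it: 1600x1200 at 85 Hz is about 1600 x 1200 x 85 = 163 million visible pixels per second, and once you add the roughly 30-40% blanking overhead of the standard timings, the pixel clock comes out near 230 MHz, so a 300 MHz DAC should have headroom to spare at that mode (ballpark figures only).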

Glossifah

Nocturnal
04-13-2001, 11:18 PM
Argh! GeForce 2 does not do EMBM?

LordKronos
04-14-2001, 03:52 AM
Unfortunately, I don't think anyone can quite match the 2D quality of Matrox (maybe a GeForce 3 can, but I've never seen one or heard anyone compare its 2D quality, so I wouldn't know). I wouldn't say my GeForce's 2D is poor (it's actually quite good), but it's not quite king of the hill.

And no, GeForce doesn't support EMBM... it supports a lot of other things, but not that. If you have cash to spare, a GeForce 3 will support more features than anything else out there.
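
Whichever card you end up with, it's worth probing at runtime for the extensions you plan to lean on rather than assuming them. A rough sketch in C (the extension name in the usage comment is just an example):

#include <string.h>
#include <GL/gl.h>

/* Whole-token search of the space-separated extension string.
   Call with an OpenGL context already current. */
int HasExtension(const char *name)
{
    const char *all = (const char *)glGetString(GL_EXTENSIONS);
    const char *p   = all;
    size_t      len = strlen(name);

    while (p && (p = strstr(p, name)) != NULL) {
        /* accept only whole names, not substrings of longer names */
        if ((p == all || p[-1] == ' ') && (p[len] == ' ' || p[len] == '\0'))
            return 1;
        p += len;
    }
    return 0;
}

/* e.g. if (!HasExtension("GL_ARB_multitexture")) { use a fallback path } */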

As for quality, yes, I can confirm that nVidia cards/drivers under Windows 2000 are rock solid. I had a lot of problems under 98 where (as you said) after crashing or being terminated by the debugger several times, resources would begin to disappear and the OpenGL driver would eventually fail to initialize. After switching to Win2000, I haven't had a single problem of that sort. I have used various 6.* and 7.* drivers, and am now using the 11.01 drivers, all with no problems.


[This message has been edited by LordKronos (edited 04-14-2001).]

jwatte
04-14-2001, 07:03 AM
> DAC speed relates more to the resolutions that can be attained at 85+ Hz refresh

Ahem.

You can stick a Porsche engine in a Yugo, but it will still be a Yugo.

In this case, the "Yugo" might be the board layout (especially the analog section) of the graphics card, and the connectors/cable going from your machine to the monitor. Analog design is hard, and while any geek can pull it off at 100 MHz (well, almost any geek :-), 300 MHz is another matter. Using a tool that routes your traces badly, not building the appropriate ground planes, not managing clock jitter properly, or using (cheaper) below-spec connectors are all favourite ways of screwing up a high-resolution display card, even though the DAC may be a good part in itself.

I've used Matrox cards, and while they were pretty good in 2D, the driver, compatibility and performance issues just made them not workable in the end. Meanwhile, my Elsa GF2 GTS has enough to drive an analog flat panel at 1280x1024, which only needs 75 Hz to stay flicker free. What I'm waiting for in the next card is digital out, and 1600x1024 support.

Umm... what's this soapbox, and why am I standing on it?

Michael Steinberg
04-14-2001, 07:07 AM
Hmmm... Wasn't there a similar discussion about audio amplifiers? :)

mcraighead
04-14-2001, 07:20 AM
Originally posted by LordKronos:
I had a lot of problems under 98 where (as you said) after crashing or being terminated by the debugger several times, resources would begin to disappear and the OpenGL driver would eventually fail to initialize.

This problem has been fixed in newer drivers. I can't remember the exact version where it was fixed, but it is fixed (to the extent that is possible given the OS's brokenness; it's not our fault they don't give us a process detach!).

Win9x is still far less stable than NT, but that has a whole lot more to do with the OS than with the drivers.

- Matt

HFAFiend
04-14-2001, 08:48 AM
It's probably not the best, but I love my Radeon... the dev relations aren't perfect, but they're decent, and I have not had any problems developing on it.

duckman
04-14-2001, 09:12 PM
GeForce 2 GTS 2D is good enough.
I run my monitor at 1600x1200 @ 85 Hz, no problems.

Its not the speend that kills geforce cards on 3d. its when you get to1920 x 11400
at true colour, and the 32 meg of memory will not allow much more than a single on-screen buffer.

OGL stability under any card is normally pretty good.

I suggest that you look seriously at more pressing development issues:
- a fast card lets you code without having to optimise (it's normally best to get code working, then optimise it)
- support of extensions that you wish to use
- developer relations support (nVidia is very good here)
- OpenGL conformance test results
- multiple monitor debugging (there is a dual-head GF MX out now)
- Quality of drivers (lack of bugs)

All in all, the GeForce stuff shines as a good choice for real-time development if you can't afford an SGI (and who can?).

duckman
04-14-2001, 09:14 PM
>let me spell corect my 3rd line

ts not the speed that kills geforce cards on 2d. its when you get to1920 x 11400

Nocturnal
04-15-2001, 01:32 AM
duckman, which manufacturer?

I am considering an MX for $79-85 shipped from newegg.com. That way I could replace the card at any time (even the next day) with a reasonably priced GeForce 3 or card-after-the-Geforce-3. The GTS for $132 is tempting, but $80 is low enough not to have to play the eternal wait-for-a-lower-computer-component-price game. I could also talk myself into "correcting" the RLC filters on an $80 card.

Also does anyone have info about setting up remote debugging to another computer? (Visual C++) Since I have tons of extra computer components, including an extra 19" monitor, I could keep the compiler, the matrox, and 21" on one system while having the program run remotely on another.

It would be nice to have a truly stable OpenGL driver. I remember under Win98 that my G400 would literally eat WM_DESTROY messages (the generic Microsoft OpenGL driver did not). This was one of many reasons for moving to Windows 2000 (which was relatively painless and entirely worthwhile).

Enough of my rambling :) Time for sleep.

[This message has been edited by Nocturnal (edited 04-15-2001).]

LordKronos
04-15-2001, 01:50 AM
Originally posted by duckman:
>let me spell corect my 3rd line

ts not the speed that kills geforce cards on 2d. its when you get to1920 x 11400

Well, you still missed your "1920x11400" error. I doubt any card is going to let you have even a single buffer at that size, given that it would need about 83.5MB per buffer :)
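
(That figure is just 1920 x 11400 pixels x 4 bytes at 32-bit colour, about 87.5 million bytes, roughly 83.5MB, and that's before you add a back buffer or a depth buffer.)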

And you also introduced a new spelling error... "Its" turned into "ts" :)

[This message has been edited by LordKronos (edited 04-15-2001).]

duckman
04-15-2001, 02:44 AM
Ye L.K. I make many typos and spelling errors.
I ment 1920 x 1440 which is 19mb of mem
(try to find any one sentance i have written that does not contain a spelling error)

Anyway, I've never bothered with network debugging. But I might suggest this:
there is a 32MB GeForce 2 MX card out there that has support for two monitors,
and it costs about 5 dollars more than an ordinary MX card.

I think http://www.asus.com.tw/products/Addon/Vga/agpv7100/index.html
has TwinView information (if you like ASUS cards).

The problem is that the memory is shared between the two displays...

mcraighead
04-15-2001, 09:19 AM
Nocturnal,

It's very easy to do remote debugging, far easier than I would have thought. I do virtually all of my OpenGL driver debugging using remote debugging. (It falls apart once you have to enter the kernel...)

Install MSVC6 (I'm assuming version 6) on both computers.

Run msvcmon.exe on the slave computer (i.e. your test system). Configure msvcmon.exe to know the name of the master computer.

Go to Build/Debugger Remote Connection on the master computer, select Network, do Settings and type in the name of the other computer.

Under Project/Settings, Debug tab, there are four blanks to fill in. The following is the general idea:

\\test-machine\c\blah.exe (executable)
\\test-machine\c\ (working directory)
-xyz (command line arguments)
c:\blah.exe (remote filename)

You may need to fill in some Additional DLL's in some cases.

- Matt

duckman
04-15-2001, 05:50 PM
Well, OK, since we're on about graphics cards...
Has anyone got a recommendation for a graphics card to use as a second display adapter?

i.e. do the thing where you have 2 video cards in the machine.
My first card is a GF2 GTS and I want the second card to be PCI-based and not detrimental to the system's performance,
i.e. no #$%@ drivers that stuff up all the time.

Of course, not all cards work well when set up as a secondary video adapter.

What cards have people had success with?
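
For what it's worth, when juggling two adapters it helps to dump what Windows actually reports for each one. A small sketch using the stock Win32 call, nothing card-specific:

#include <windows.h>
#include <stdio.h>

/* List every display adapter Windows knows about and flag the primary one. */
int main(void)
{
    DISPLAY_DEVICE dd;
    DWORD i;

    dd.cb = sizeof(dd);
    for (i = 0; EnumDisplayDevices(NULL, i, &dd, 0); ++i) {
        printf("%lu: %s%s\n", (unsigned long)i, dd.DeviceString,
               (dd.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE) ? "  (primary)" : "");
        dd.cb = sizeof(dd);  /* reset the struct size before the next call */
    }
    return 0;
}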

rts
04-15-2001, 09:21 PM
Originally posted by duckman:
What cards have people had success with?

At work I use a Quadro Pro as my primary and a PCI Voodoo 4 as my secondary under Win 2000. No problems so far.

Hope this helps.

Nocturnal
04-16-2001, 02:14 AM
Under Win98, using 2 video cards dropped you into software OpenGL. Is this fixed in Win2K somehow (by MS or video card makers)?

How about the PCI Geforce 2 MX cards? Do they show any significant speed drops?

Also, has any company fixed the problem of dropping to software OpenGL with a dual-monitor-port card with different resolutions on each monitor? As far as I am aware, only Matrox allows different resolutions in Win2000 at the present time.
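
One quick way to tell whether a given setup has dropped you to the software path: once a context is current, check the renderer string. Microsoft's generic implementation reports itself as "GDI Generic". A minimal sketch in C (the string is what the generic implementation is commonly reported to return):

#include <string.h>
#include <GL/gl.h>

/* Call with an OpenGL context already current. Returns 1 if we ended up
   on Microsoft's software renderer instead of the card's ICD. */
int OnSoftwareOpenGL(void)
{
    const char *renderer = (const char *)glGetString(GL_RENDERER);
    if (!renderer)
        return 1;  /* no usable context at all */
    return strstr(renderer, "GDI Generic") != NULL;
}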

I am beginning to think that setting up a second computer for debugging will rapidly become expensive since my current second computer is a P2 266 without an AGP slot.

[This message has been edited by Nocturnal (edited 04-16-2001).]

paddy
04-16-2001, 03:11 AM
The RadeON is rock stable on my 2k system.
And its image quality can be compared to Matrox's.
If you need dual monitor, just get a RadeON VE :)

[This message has been edited by paddy (edited 04-16-2001).]

JoeMac
04-16-2001, 05:14 AM
I just picked up the Radeon VE last week. The dual-monitor support (monitor and TV in my case) is flawless, and the performance isn't much worse than my GeForce2 at work.
Joe

MarcusL
04-16-2001, 06:13 AM
I'm using a voodoo2 as secondary adapter, and a GeForce DDR as primary. You have to do a bit of searching to find those 3dfx beta drivers that were written to use the v2 as a normal card (and not as a multimedia device, as in the later driver revs).

Anyway, it does work, although the colors are a bit botched up on the v2. (But that could be me using a Mac monitor and a homegrown adapter... :) )

My second computer is a Linux box, so there's no way to do remote debug from that currently.

mcraighead
04-16-2001, 07:09 AM
Originally posted by Nocturnal:
Also, has any company fixed the problem of dropping to software OpenGL with a dual-monitor-port card with different resolutions on each monitor? As far as I am aware, only Matrox allows different resolutions in Win2000 at the present time.

This is an issue for Microsoft. You'll have to wait for WinXP (which will have its own set of issues, just wait).

- Matt

Zak McKrakem
04-16-2001, 07:21 AM
Will XP have an OGL 1.2 DLL?

NeoTuri
04-16-2001, 12:22 PM
What's a good dual-monitor video card I could get for OpenGL dev? Right now, I have my FireGL, G400 DualHead, and TNT2 Ultra. I'd be willing to wait for GeForce3 if it had the option.

duckman
04-17-2001, 04:24 PM
Nocturnal wrote:
>Also, has any company fixed the problem of dropping to software
>OpenGL with a dual-monitor-port card with different resolutions on
>each monitor? As far as I am aware, only Matrox allows different
>resolutions in Win2000 at the present time.

I thought 2 different monitors weren't allowed to have different resolutions in 2k. An OS-imposed restriction.

This is one of the reasons I stuck with 98se.

mcraighead
04-17-2001, 06:36 PM
Multimon OGL is hardly a good reason to stick with Win9x. After all, Win9x doesn't support any form of accelerated multimon OGL.

- Matt

Nocturnal
04-18-2001, 05:42 AM
Originally posted by duckman:
Nocturnal wrote:
>Also, has any company fixed the problem of dropping to software
>OpenGL with a dual-monitor-port card with different resolutions on
>each monitor? As far as I am aware, only Matrox allows different
>resolutions in Win2000 at the present time.

I thought 2 different monitors weren't allowed to have different resolutions in 2k. An OS-imposed restriction.

This is one of the reasons I stuck with 98se.

This is one of the reasons I stuck with 98se too until Jan 01, when I got fed up with Win98. So I switched to 2000 and 1 monitor. Then Matrox came out with independent resolutions for Windows 2000 :) Then my 21" broke :(

And in answer to my own question, Matrox came out with a new W2K driver yesterday that supports hardware OpenGL in "true DualHead" mode, which I take to mean independent resolutions (since it already worked in 'fake' DualHead). I have yet to try the new drivers, however.

Funk_dat
04-19-2001, 10:04 AM
I would wait and get the GeForce3 if you are planning on doing games. By next year, a lot of games will be taking advantage of the new features. You don't want to be left behind!

If you are doing content development, a Quadro or one of the newer FireGL cards might be appropriate. The 3Dlabs Wildcat series is usually pretty spiffy but a bit on the expensive side.

It all depends on your budget and goals.

Funk.

dustsmoke
04-19-2001, 01:08 PM
GeForce is not stable by any means. In fact, there are a lot of dev issues with the GeForce chipset AND the base chip drivers. The card was optimized for 16-bit rendering and was later (it took so long to develop) enhanced to do 32-bit, but not as well as it would have had it been designed for 32-bit, and the enhancements mess with system settings to allow quicker builds in 32-bit mode. The technology of these chipsets is actually pretty old. It runs at the level it does not because of designed stability in normal operation but because it is just overclocked by default (this causes the lockups everybody experiences, since they don't add hertz locks to their chips). nVidia has always had issues with its chip design, as most of you can remember: crazy shadows and other such dumps on color palettes, and so on. The fab is old and not even a certified FSSCA lab, just into the micron manufacturing robots. All of this leads to why nVidia does not build their own cards or provide support for them. It works out better that way for them.

But Matrox, the maker of your original card, is a very reputable company with very stable cards in the professional industry. These cards are designed almost exclusively for NT-based kernels. Being designed more for DV editing and 2D office applications, though, does not cut a whole heck of a lot on the 3D side, especially the G400 series chipsets. Very good fabs and an actual R&D team with QA backup. ATI also has a good R&D team with knowledgeable engineers. Too bad for the Rage Pro-128 mistake, otherwise the name would be solid. The Radeon/Rage6 is very redeeming though, and most likely the most advanced chip on the market.

All I am really saying here is that there is not very much to pick from these days, but the nVidia-based cards are the last thing you should be looking at for dev work. That is why more inexperienced companies are developing on them, like Dynamix, and many other companies have so many problems utilizing 3D acceleration and everybody has such big problems with their games. Try to get a card that is recognized and built by the same company and I assure you that you will have a much more successful build. Just don't pick up a hunk of junk from nVidia.

Eric
04-19-2001, 10:37 PM
dustsmoke, that's a pretty interesting, although probably unique, point of view...

Sincerely, loads of people would advise you to get a GeForce (or a Radeon) for developing OGL apps. Are they all so wrong? All developers but you are wrong? Developers don't know which gfx card suits their needs?

You say GeForce is old technology: perhaps it is... I am a software developer, not a hardware guru... And actually, I DO NOT MIND!! Who's providing me with the fastest rendering? nVIDIA (or ATI!)... Who's REALLY making OGL evolve through extensions? nVIDIA (and ATI)... Who's providing me with good developer info? nVIDIA (hem, ATI... ;) ).

As far as lockups and other problems are concerned: yes, GeForce had some issues with some motherboards... Wasn't that because the motherboard manufacturers cheated on their AGP power???

Actually, I just had a look at your profile and this was your very first post on this board... You are probably one of those guys who go to a forum, leave a stupid message just for the sake of it, and never come back again...

Hence, I am gonna stop here...

But I guess you will get more elaborate answers...

Eric

mcraighead
04-20-2001, 11:23 AM
I am utterly confused by most of the things dustsmoke said. I don't even know what he is talking about! For example, our fab is TSMC, and they are a highly-respected foundry. If you want to bash TSMC, you will also have to bash a whole lot of other companies that also use TSMC as their fab, including a number of other graphics companies.

Please, if you're going to bash us, back up your claims with _facts_, not vague and ridiculous assertions.

- Matt

Funk_dat
04-20-2001, 11:43 AM
Dustsmoke,
Although your post seems somewhat well-intentioned, I think dismissing Nvidia hardware as 'a hunk of junk' is unhelpful and misleading.

It's not been a secret that a lot of the Nvidia boards have had some graphical glitches and not exactly stellar 2D performance (although I hear the GeForce3 has improved greatly in these areas). By far the biggest factor influencing your purchase should be the purpose for which you need the adapter. I think a lot of people on this b-board, including the person who started this post (correct me if I'm wrong), use their cards for game development, and anyone in their right mind would recommend a game development card that is:
1) Widely used and supported
2) Has a bunch of neato functionality
3) Has a mad fillrate

The GeForce line (along with the Matrox cards you mentioned and ATI's Radeon) is an ideal recommendation in this respect. Notice that rock-solid stability is not one of the three points I mentioned above.

On the other hand, if your purpose involves CAD, content creation, medical-type things, etc., then obviously your desires are going to be different. Stability, visual quality and compatibility make the list as important features. In this case, the FireGL series or a 3Dlabs Wildcat would be an ideal choice. While these cards may not have ultra fillrates, they are known for their visual quality and impressive wireframe performance.

I don't mean to egg on a troll or to shoot down dustsmoke's opinion, I just think it's important to be informed. Buying a Matrox card (or an Nvidia card for that matter) because someone with some kind of beef told you so just ain't good. You gotta weigh the pros and cons and find out what would work best for you.

Funk.

[This message has been edited by Funk_dat (edited 04-20-2001).]

paddy
04-20-2001, 11:44 AM
Very confusing opinion!
Anyway, if you're a game developer, what you need is a card from each of the various manufacturers to test your software on... a game is supposed to run on various system specs!
So my guess for a good game-dev gfx card config is having at least all of these cards:
- NVidia GeForce, GeForce 2 Ultra
- ATI RadeON DDR
- Matrox G450
- Kyro/Kyro II
- 3dfx Voodoo 5

I keep my RadeON on my main dev PC only because its display is better than the GeForce's and its OpenGL is far better than the Matrox's, but I test my software on many cards before releasing it.

Funk_dat
04-20-2001, 11:58 AM
Sorry for being a little vague in my above post; I just wanted to avoid starting a possible 'my card is better than yours' flame war.

Anyway, Paddy is right: if you are developing games, you need to test with the cards your customers are going to have. End of story.

Funk.

jwatte
04-20-2001, 05:44 PM
If I ever write an RPG, I will make sure you get 1,000,000 every time you run away from a troll rather than try to slay it.