PDA

View Full Version : [OT]: Upgrading video card, would like opinions...



ToolChest
09-03-2002, 09:24 AM
I'm upgrading my video card and I'm looking at a gf4 (I currently have a gf2 and I'm happy with it), but if the next generation of video cards is just around the corner I would rather wait. I guess my question is: are nvidia, ati, or 3dlabs about to release new high-end gaming cards? If so, will the features be worth the wait?

Thanks...

John.

davepermen
09-03-2002, 09:33 AM
i suggest the ati radeon9700.. it is already here (at least, somewhere http://www.opengl.org/discussion_boards/ubb/biggrin.gif), while the nv30 has been delayed and will hopefully be around by christmas..

i would not take a gf4, as, except in speed, it doesn't provide much you can't already get somehow on the gf2.. actually, slightly bigger register combiners, texture shaders, and shadow maps are the main new features (vertex shaders can work in software quite okay..)..

remember, the r300 is the "doom3-card" http://www.opengl.org/discussion_boards/ubb/biggrin.gif

no, but i think it's cool to get a card which is dx9 standard, as there are a lot of fancy things in there.. displacement mapping, fat vertex and pixel programs, floating-point stuff everywhere, and it's quite a bit faster than the gf4..

if you want something cheaper, then the only thing i can suggest is the gf4ti4200.. anything less is stupid..

ToolChest
09-03-2002, 09:40 AM
r300 is 3dlabs right? the nv30 will be the gf5?

Thanks...

John.

harsman
09-03-2002, 09:49 AM
The R300 is the ATI Radeon 9700. It's a full DX9 part, which means, for example, a full floating-point pixel pipeline. Expensive but übercool. If you want something cheaper you might want to wait for the 9500, which should appear somewhere around October or so. It's supposed to be a slowed-down, cheaper 9700.

[This message has been edited by harsman (edited 09-04-2002).]

Morglum
09-03-2002, 10:31 AM
übercool = very cool, I presume...

ToolChest
09-06-2002, 05:25 PM
Ok, I'm thinking of going with nvidia, because that's what I currently have. I do have another question:

I want to start using vertex and fragment programs. I've found a lot of info on vertex programs, but when I look up fragment info on nvidia's developer website all I get is texture shader and pixel shader demos. Can you only use fragment programs with cg? Or is there an opengl extension I should be looking for (I tried looking for NV_fragment_program)?

Thanks…

John.

IT
09-06-2002, 05:52 PM
I just got an ATI Radeon 9700 Pro and it's fast. Very fast. It also supports DX9, but this won't be available until Microsoft releases it sometime in a couple of months.

Here's an interesting report I just saw on Tom's Hardware that links to The Inquirer which links to a Japanese site that states:

nVidia's NV30 may not be coming out until Q1 2003 now.

jwatte
09-06-2002, 06:42 PM
If you want nVIDIA, your best bet is to wait for the NV30, which might be a 3-5 month wait (according to the tom's hardware rumor).

If you want the coolest possible consumer card right now, I believe that distinction goes to the Radeon 9700. $335 on pricewatch.

If you have a lot of nVIDIA specific code, and don't want to change horses, but also want a card NOW instead of later, then look at something like a GeForce 4 Ti4400, or maybe even a 4200. The idea being to save money while still getting a speed and capability bump, so you can upgrade to something cooler later, earlier :-)

Last, a possible alternative would be the 3dlabs Wildcat VP, which is about $450, and is "fully programmable". It allegedly supports large parts of OpenGL 2.0 in hardware already, using extensions (I haven't had the luxury of spending time on this, unfortunately).

Rob The Bloke
09-07-2002, 07:51 PM
I was down at ECTS in London and the most impressive hardware on show was the Radeon 9700 Pro - definitely go for that....

alexsok
09-08-2002, 12:11 AM
Originally posted by john_at_kbs_is:
r300 is 3dlabs right? the nv30 will be the gf5?

Thanks...

John.


3Dlabs' part is the P10, while the R300 is the R9700 Pro from ATI.

nVidia's next-gen card, codenamed NV30, won't carry the GeForce name anymore.

From the latest info, the NV30 will be officially announced at Comdex, November 18-22.

If you want the fastest and best card on the market right now and money is not a problem, definitely go for the R9700 Pro, but if you're on a tight budget and want a good price/performance card, definitely go for the gf4ti4200.

knackered
09-08-2002, 03:20 AM
Just search google groups for the keywords "ATI" "drivers" "crash" "locked up" "help" "is it my code?" "blue screen", then and only then make your decision.

Nutty
09-08-2002, 03:45 AM
I think that's a bit unfair, knackered. ATI's drivers have come a long way since the original radeon. Showing old archives of people with problems doesn't indicate the quality of their drivers _now_, does it?

knackered
09-08-2002, 04:00 AM
Nutty, I recommended my company buy a radeon 8500 to see if it would give us an edge in our simulators - I had to eat humble pie as it consistently crashed on many machines, under various d3d and opengl apps.
See my previous thread on RenderMonkey:- http://www.opengl.org/discussion_boards/ubb/Forum3/HTML/007307.html

I believe ATI are producing fantastic cards, and produce really nice research stuff like RenderMonkey, but their drivers are *still* terrible. Unfortunately, this means we can't target our simulators to their cards, just because of the reliability issue.
This still holds true with their latest drivers.
I'm just giving john_at_kbs_is the benefit of my unfortunate experience.

PH
09-08-2002, 04:15 AM
I've had only 3-4 crashes/lockups since I installed the card ( all from D3D apps, strangely enough ). As for driver bugs, I can mention maybe 3 bugs. The drivers are good, maybe not as good as NVIDIA's but definitely on the right track. As someone mentioned a long time ago, stability is a separate issue from driver correctness ( Carmack in a .plan update ). There are too many outside factors that can affect the stability of the system. I remember reading a post from a gamer on Rage3D complaining about crashes and lockups; it turned out he had overclocked his card/CPU. The drivers seem good enough to run DOOM3 http://www.opengl.org/discussion_boards/ubb/smile.gif.

IT
09-08-2002, 04:54 AM
My 9700 hasn't locked up at all. I've never owned an 8500. My 9700 is in a Pentium 4 based machine.

My GeForce 3 and GeForce 2MX would both lock up consistently if I used nVidia's drivers past v28.11 (or something like that) on my Pentium III setup (AGP 2x). They were rock solid if I kept the driver at v28.11.

So yes, there are a lot of factors at play.




[This message has been edited by IT (edited 09-08-2002).]

Nutty
09-08-2002, 04:57 AM
Well, I've never owned an ATI card, so I can't comment from experience, but I'm sure their drivers are a lot better than when the 8500 first debuted.

As for bugs and stuff, well, I've found a fair few in nvidia drivers too, and I don't even code for PC graphics in my profession. Both companies have bugs in their drivers. As for stability, well, there are lots of things that can influence that. I still get the (very) occasional lockup with a Gf4 in my system. No idea what causes it. More likely something to do with the old via chipset http://www.opengl.org/discussion_boards/ubb/smile.gif

Nutty

PH
09-08-2002, 05:13 AM
I still get the (very) occasional lockup with a Gf4 in my system. No idea what causes it. More likely something to do with the old via chipset.


I had some problems with my GF3 on an Athlon 1.4Ghz. It turned out to be the sound chip on the motherboard causing problems ( disabling it in the BIOS and installing a separate soundcard solved it ). The motherboard used the old VIA chipset too.

knackered
09-08-2002, 05:34 AM
Then I'm even more perplexed...multiple PC's, multiple processors, multiple OS's (98/2k/xp)...doesn't seem to matter with this radeon (it's a replacement from the retailer, but behaves as badly as the initial one).

PH
09-08-2002, 05:46 AM
My 8500 is a Built-by-ATI version ( BIOS version 1.004 came with it ). You could try installing that BIOS - I have no idea how or where you can get these, but it might solve your problems ( control panel/options/details to see the BIOS version ).

V-man
09-08-2002, 05:48 PM
You can drop your AGP speed down if you get lockups. You can also mess around with your memory timings. I have an old VIA board too, but I've only had one lockup, when I overclocked the GPU too much: the screen started to get covered with nicely spaced pixels from top to bottom and the cursor got corrupted.

Guess I should be happy with my current setup http://www.opengl.org/discussion_boards/ubb/smile.gif
V-man

Ixox
09-08-2002, 11:22 PM
I have a question about the ATI 9700.
If I buy one, what can I do today with it using OpenGL that I cannot do with an 8500?
Has ATI released any 9700 OpenGL extensions?
I've never heard anything about that!

Ixox
09-08-2002, 11:33 PM
Oops, I posted my previous message too fast - I've just found the thread:
'No New OGL Extensions. Waiting for DX9? '
Everything is inside.... Sorry.

pbrown
09-09-2002, 03:25 AM
Originally posted by john_at_kbs_is:
Ok, I'm thinking of going with nvidia, because that's what I currently have. I do have another question:

I want to start using vertex and fragment programs. I've found a lot of info on vertex programs, but when I look up fragment info on nvidia's developer website all I get is texture shader and pixel shader demos. Can you only use fragment programs with cg? Or is there an opengl extension I should be looking for (I tried looking for NV_fragment_program)?

Thanks…

John.

John,

You don't need to use Cg to use vertex and fragment programs.

Currently shipping NVIDIA cards (except TNT) support NV_vertex_program and ARB_vertex_program. The extension specs can be found in the OpenGL extension registry (linked off opengl.org).

NV30 additionally supports NV_vertex_program2 and NV_fragment_program. The specs for these are published and can be found at:
http://developer.nvidia.com

The latest "Detonator 40" driver (40.41) posted on NVIDIA's web site includes "NV30 Emulate" support to help developers use the new functionality exposed via these extensions.

While Cg is not necessary to use these features, it may make them easier to use.
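
For what it's worth, here is a minimal sketch of what the ARB_vertex_program route looks like in practice. It assumes a current GL context, the usual GL headers plus <string.h>, that the entry points have already been fetched with wglGetProcAddress, and that the extension string has been checked; the program itself just transforms the position and passes the colour through.

    static const char *vp_src =
        "!!ARBvp1.0\n"
        "# transform the vertex by the modelview-projection matrix\n"
        "PARAM mvp[4] = { state.matrix.mvp };\n"
        "TEMP pos;\n"
        "DP4 pos.x, mvp[0], vertex.position;\n"
        "DP4 pos.y, mvp[1], vertex.position;\n"
        "DP4 pos.z, mvp[2], vertex.position;\n"
        "DP4 pos.w, mvp[3], vertex.position;\n"
        "MOV result.position, pos;\n"
        "MOV result.color, vertex.color;\n"
        "END\n";

    GLuint prog;
    glGenProgramsARB(1, &prog);
    glBindProgramARB(GL_VERTEX_PROGRAM_ARB, prog);
    glProgramStringARB(GL_VERTEX_PROGRAM_ARB, GL_PROGRAM_FORMAT_ASCII_ARB,
                       (GLsizei)strlen(vp_src), vp_src);
    if (glGetError() != GL_NO_ERROR)
    {
        GLint errPos;
        glGetIntegerv(GL_PROGRAM_ERROR_POSITION_ARB, &errPos);
        /* glGetString(GL_PROGRAM_ERROR_STRING_ARB) gives a readable message */
    }
    glEnable(GL_VERTEX_PROGRAM_ARB);  /* geometry drawn from here on runs the program */

ARB_fragment_program, once a driver exposes it, uses the same glBindProgramARB/glProgramStringARB entry points with GL_FRAGMENT_PROGRAM_ARB as the target.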

LaBasX2
09-09-2002, 04:31 AM
Before buying a new card, I'll wait and see whether the Trident XP4 really offers what it promises: a DX9-compatible card for 99 Euros/Dollars. Well, it certainly won't, but I think it's still worth a look http://www.opengl.org/discussion_boards/ubb/wink.gif

ToolChest
09-09-2002, 05:02 AM
Trident XP4??? I didn’t know Trident was still in business… http://www.opengl.org/discussion_boards/ubb/smile.gif Technically if I wrote new drivers that supported DX9 for my old Stealth 3D that would make it a DX9 card, but that still wouldn’t make it a good card… http://www.opengl.org/discussion_boards/ubb/wink.gif

Sorry for the NV_fragment_program post in the middle of this thread (I double posted, my bad). I did get some info on this extension from the other post. I was under the impression that fragment programs were introduced with the gf3. This was an assumption based on the fact that the gf3's core is 'programmable'. Well, come to think of it, my crappy little gf2's core is 'programmable' too (register combiners, env combine), which is not even remotely as flexible as a true fragment program.

So, on the gf3/4 I guess I’m stuck with using the same fragment extensions I’m currently using (register combiners, env combine) or is there something more flexible?

Thanks!

John.

zangel
09-09-2002, 11:57 PM
Maybe this link is a little OT, but I think it's interesting: http://www.bluesnews.com/plans/1/

davepermen
09-10-2002, 01:12 AM
Originally posted by john_at_kbs_is:
So, on the gf3/4 I guess I’m stuck with using the same fragment extensions I’m currently using (register combiners, env combine) or is there something more flexible?

well, it has 8 general combiners instead of four, and it has texture shaders to set up one of several predefined dependent texture reads in front of the register combiners... i would suggest getting at least an ati radeon9700... go for the dx9 standard.. and the ati exts are based on ARB_vertex_program, so quite handy as well..
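
for comparison, this is roughly what a single general combiner looks like on the nvidia path (a sketch only, assuming the NV_register_combiners entry points are already loaded and texture 0 is bound; it just modulates texture 0 with the primary colour, alpha left at the defaults):

    glEnable(GL_REGISTER_COMBINERS_NV);
    glCombinerParameteriNV(GL_NUM_GENERAL_COMBINERS_NV, 1);

    /* combiner 0, RGB portion: spare0 = A * B, with A = texture0, B = primary colour */
    glCombinerInputNV(GL_COMBINER0_NV, GL_RGB, GL_VARIABLE_A_NV,
                      GL_TEXTURE0_ARB, GL_UNSIGNED_IDENTITY_NV, GL_RGB);
    glCombinerInputNV(GL_COMBINER0_NV, GL_RGB, GL_VARIABLE_B_NV,
                      GL_PRIMARY_COLOR_NV, GL_UNSIGNED_IDENTITY_NV, GL_RGB);
    glCombinerOutputNV(GL_COMBINER0_NV, GL_RGB,
                       GL_SPARE0_NV, GL_DISCARD_NV, GL_DISCARD_NV,
                       GL_NONE, GL_NONE, GL_FALSE, GL_FALSE, GL_FALSE);

    /* final combiner computes A*B + (1-A)*C + D; set it up to pass spare0 straight through */
    glFinalCombinerInputNV(GL_VARIABLE_A_NV, GL_SPARE0_NV, GL_UNSIGNED_IDENTITY_NV, GL_RGB);
    glFinalCombinerInputNV(GL_VARIABLE_B_NV, GL_ZERO, GL_UNSIGNED_INVERT_NV, GL_RGB);
    glFinalCombinerInputNV(GL_VARIABLE_C_NV, GL_ZERO, GL_UNSIGNED_IDENTITY_NV, GL_RGB);
    glFinalCombinerInputNV(GL_VARIABLE_D_NV, GL_ZERO, GL_UNSIGNED_IDENTITY_NV, GL_RGB);

a real fragment program collapses all of that bookkeeping into a few readable instructions, which is the main attraction..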

John Jenkins
09-10-2002, 04:17 AM
Their drivers do suck. We regret ever using the ATI FireGL 2.

[This message has been edited by John Jenkins (edited 09-10-2002).]

davepermen
09-10-2002, 04:21 AM
the statement "their drivers suck" wich is so often seen is a) not true and b) stupid. they sucked, they had suck, or how ever spelled.. but its not true anymore, there aren't that much with driver problems..
AND
drivers change.. if they suck, get an update.. its not a natural rule, its just a current state wich can change every minute (okay, every week or month or so http://www.opengl.org/discussion_boards/ubb/biggrin.gif)..
and, nvidia drivers do suck as well.. i know a lot with problems, so they are sucking badass.. in fact, their drivers are very good and its quite normal that they can have problems on some configurations.. possibly another driver or even the hw does not work by 100%, so it messes the nvidia driver up themselfes...
both ati and nvidia drivers are of good quality, personal problems always can happen, but are not the general rule... this is badsaying against a company and illegal http://www.opengl.org/discussion_boards/ubb/biggrin.gif

gumby
09-10-2002, 09:28 AM
Are you saying that their drivers are better now?

knackered
09-10-2002, 10:10 AM
Unfortunately, they've still a fair way to go as far as stability goes. Trust me, I've actually tried them.

davepermen
09-10-2002, 12:18 PM
and trust me, others say they work quite well.. so whom should i believe now? knacky or some other dudes (who outnumber the negative dudes)..

if i count, ati drivers are okay.. and if i count the people i personally know who have bugs with nvidia drivers, it's about 50:50! so nvidia drivers aren't doing as well as people make out..

knackered
09-10-2002, 01:34 PM
Dave, if you haven't tried an ATI card, don't comment on the stability of their drivers - you simply aren't qualified in the most fundamental way.
ATI need to try harder. NVidia are ok.
I'm not being negative deliberately, it's just my experience.

kieranatwork
09-10-2002, 01:42 PM
I can confirm that the latest radeon drivers still have lots of bugs in them - and we're not talking fancy d3d/opengl features here, we're talking basic stability in stress-free situations. Unforgivable.

dorbie
09-10-2002, 02:57 PM
Well I have an NVIDIA card and it's not the most stable. I think these issues often boil down to mobo + card combinations. Carmack seemed to be impressed with ATI's latest effort. Is he qualified knackered?

With my GeForce 3 card I got random flickering polys that only seemed to worsen with each driver release. I upgraded my mobo drivers recently and the problems are largely solved, although I did get some stability issues. A few more bios tweaks like AGP aperture size and PCI spread spectrum seemed to solve them (fingers crossed).

To get a stable system with any card takes, I think, a more holistic approach to the problem. Blaming the graphics card driver seems a little naive. At the very least, if you're trying to engineer a working system in a simulation environment, you want to try a range of mobos and bios settings with the card you are interested in.

V-man
09-10-2002, 06:42 PM
Hey, what is the ATI equivalent of wglAllocateMemoryNV and anything new on R9700?

vertex_array_object seems to hint at being the equivalent, but I can't be sure.

PS: I think it was on ATI's site where I read that ATI intends to change this idea that "their drivers suck". Hope they succeed.

V-man

NitroGL
09-10-2002, 09:41 PM
Originally posted by V-man:
Hey, what is the ATI equivalent of wglAllocateMemoryNV and anything new on R9700?

vertex_array_object seems to hint at being the equivalent, but I can't be sure.

That's the glNewObjectBufferATI function (I think that's it). You can map the array in memory to a pointer using the ATI_map_object_buffer extension; though there isn't a spec for it, it shouldn't be that hard to figure out from the two entries in the glATI.h file.
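
Roughly like this, from memory, so treat it as an untested sketch (num_verts and vertex_data are placeholder names):

    /* create an object buffer for a static array of xyz floats */
    GLsizei bytes = num_verts * 3 * sizeof(GLfloat);
    GLuint buf = glNewObjectBufferATI(bytes, vertex_data, GL_STATIC_ATI);

    /* bind it as the vertex array and draw */
    glEnableClientState(GL_VERTEX_ARRAY);
    glArrayObjectATI(GL_VERTEX_ARRAY, 3, GL_FLOAT, 0, buf, 0);
    glDrawArrays(GL_TRIANGLES, 0, num_verts);

    /* ATI_map_object_buffer: grab a pointer for in-place updates */
    GLfloat *p = (GLfloat *)glMapObjectBufferATI(buf);
    /* ... write new vertex data through p ... */
    glUnmapObjectBufferATI(buf);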

[This message has been edited by NitroGL (edited 09-10-2002).]

knackered
09-11-2002, 06:15 AM
Originally posted by dorbie:
Well I have an NVIDIA card and it's not the most stable. I think these issues often boil down to mobo + card combinations. Carmack seemed to be impressed with ATI's latest effort. Is he qualified knackered?

With my GeForce 3 card I got random flickering polys that only seemed to worsen with each driver release. I upgraded my mobo drivers recently and the problems are largely solved, although I did get some stability issues. A few more bios tweaks like AGP aperture size and PCI spread spectrum seemed to solve them (fingers crossed).

To get a stable system with any card takes, I think, a more holistic approach to the problem. Blaming the graphics card driver seems a little naive. At the very least, if you're trying to engineer a working system in a simulation environment, you want to try a range of mobos and bios settings with the card you are interested in.

Oh god, the big boys are here! http://www.opengl.org/discussion_boards/ubb/wink.gif
Look, stop this "ATI have sorted things out" talk. It's just not true (and I really do wish it were true - the 8500 is a lovely card, on paper).
Dorbie, I have tried 2 (count 'em) radeon 8500s in many different machines, with many different configurations - p3/4, athlon, duron, dual/single processors - with all recent driver releases, and I very commonly get show-stopping bugs - blue screens, mainly.
Why on earth would I be saying this if it weren't true?
Just look through the newsgroups to see people with similar problems.
John Carmack couldn't even get his quake console to render properly, only a few months ago - maybe they've fixed all the bugs that directly affect the paths his doom3 engine takes, but that's no help to the rest of us, is it?
I've no doubt you've had problems getting your nvidia card to work on your specific machine, dorbie - but that's pretty irrelevant to what I'm saying.

ToolChest
09-11-2002, 06:22 AM
Another deciding factor:

The r300 has 8 texture units, right? How many will the nv30 have?


Also, a game-programming theory question:

When doing multi-pass texturing, what is the maximum number of texture layers you would put down to get the right effect on a poly? Right now I'm stuck with a card that supports only 2 texture units, so I end up making 2-3 passes depending on the effects used. The problem is that my pipeline (due to the math involved) may still need to be split into at least 2 passes on a card that supports 8+ units (I'll still end up using only 4).


Just wondering what you guys thought…

John.

Humus
09-11-2002, 09:31 AM
Originally posted by knackered:
John Carmack couldn't even get his quake console to render properly, only a few months ago

Ehm, that was a beta Radeon 8500, that would be more than a year ago.


Originally posted by knackered:
I've no doubt you've had problems getting your nvidia card to work on your specific machine, dorbie - but that's pretty irrelevant to what I'm saying.

Why are your problems more relevant?

ehart
09-11-2002, 09:53 AM
First, I don't want to get into any debate on driver quality, as I feel that is for the rest of you to decide/argue about.

I do want to ask people who have problems to file them with the appropriate IHV. I am sure that all IHV's participating on this board will agree with me on that. Everyone on this dev board can file issues with ATI products at devrel@ati.com.

-Evan

knackered
09-11-2002, 11:21 AM
Originally posted by Humus:
Why are your problems more relevant?

Because I've tried the radeons on more than a single machine, humus - dorbie's talking about problems getting his nvidia card to work with a single motherboard. That is why my 'experiences' are more relevant.

ehart - your drivers blue screen on lots of different configurations - sorry, but I don't have the time to give you the exact conditions these blue screens happen under - unless you've started some kind of paid beta testing scheme for the people who buy your cards.
One thing, try changing fillmode in the bump mapping shader of rendermonkey to points, on a radeon 8500 + abit kt7a + athlon 1.2ghz.
That's something for you to get started on.

PH
09-11-2002, 11:54 AM
Knackered,

Have you maybe considered that the replacement 8500 you got is faulty too ( my first 8500 was broken too )?

I don't get random blue screens. In fact, my system has never been as stable as it is now. Have you looked into what BIOS version is on the card? Have you used a utility to make sure the card has not been overclocked by mistake ( not sure this can happen, but overclocking can be done with a simple utility, so maybe )?

As for not reporting bugs because ATI doesn't pay you - does your company not care about getting their software to run on as much hardware as possible?

I'll repeat my point, the 8500 in *my* system is rock solid. The "correctness bugs" are minor ( and *will* be fixed ). If the 9700 is anything like the 8500 in terms of drivers, then I would recommend it immediately.

knackered
09-11-2002, 12:49 PM
Ok - maybe it's faulty. My apologies to ATI's driver department - my eyes now fall on the hardware manufacturers.
You sure you haven't had any stability problems, PH? What motherboard/cpu are you running with? What OS?

Humus
09-11-2002, 12:55 PM
Originally posted by knackered:
Because I've tried the radeons on more than a single machine, humus - dorbie's talking about problems getting his nvidia card to work with a single motherboard. That is why my 'experiences' are more relevant.

ehart - your drivers blue screen on lots of different configurations - sorry, but I don't have the time to give you the exact conditions these blue screens happen under - unless you've started some kind of paid beta testing scheme for the people who buy your cards.
One thing, try changing fillmode in the bump mapping shader of rendermonkey to points, on a radeon 8500 + abit kt7a + athlon 1.2ghz.
That's something for you to get started on.

If that means anything, I can start pulling out the story of a friend of mine whose GF would cause random spontaneous reboots and lockups every 15 minutes or so. (It was kind of amusing sometimes, seeing him go up and down like a yo-yo on my ICQ contact list http://www.opengl.org/discussion_boards/ubb/smile.gif) Moving it over to my roommate's computer didn't help; well, it would run a little longer, like 30 min, before rebooting or freezing. We did a lot of stuff to that card, but it would never run stable. It might just have been a faulty part, but at that time I heard a lot of similar reports from other people on the net.

I may also pull out the story of another friend of mine; we helped build him a system with a GF2 Pro. It sucked in every way possible. Games would either not work, crash, or have severe image quality problems. UT ran at 6 fps. Quake was fast, but showed severe banding even though we turned TC off and used 32 bit for everything. The screen was very blurry at certain refresh rates - it looked fine at 60 and 85 but not at 75 in some resolutions ... other rules applied to other resolutions ... etc. Yes, we tried it in another machine, same problems. Put a Voodoo5 in that machine, and it worked just fine ...

I also find your attitude on reporting bugs quite irrational. How do you expect ATi to be able to fix bugs if you don't report them? It's impossible to fix bugs they don't know about and can't reproduce. Also, have you considered that your problems may be application problems? I assume the apps you've tested are those you've developed yourself. It has certainly happened a lot of times for me that my faulty application ran just fine on a Radeon, but not on other vendors' hardware. The GameEngine demo I released early this year is a good example. It took several revisions before I got it up and running on GF3s too. The problem was my app, not anyone's driver - well, except that once those problems were fixed there remained a driver problem causing GF3/4 cards to produce random output if anisotropic was enabled, which btw took nVidia like 4 months to fix. I might just as well, while I'm on it, complain about nVidia's lousy developer support. They would not even reply to email when I reported bugs. ATi always does, fixes problems quickly, and keeps you updated.
Also, some vendors' bugs may only show up on other vendors' cards (nVidia + GL_CLAMP ... ). Some vendors also refuse to fix such bugs ... http://www.opengl.org/discussion_boards/ubb/rolleyes.gif

PH
09-11-2002, 01:11 PM
You sure you haven't had any stability problems, PH? What motherboard/cpu are you running with? What OS?


Yes, everything runs great. This is my system,

- Built-by-ATI Radeon 8500 ( 6143 drivers, BIOS 1.004 )
- AthlonXP 1800+
- ASUS A7V266-E motherboard ( has the improved VIA chipset )
- 768 MB DDR ( don't remember the name, but it's not a 'noname' )
- DVD drive is from ASUS ( E616 )
- HD is an 80GB Maxtor ( fluid )
- Windows 2000 SP3 ( worked fine with SP2 but I usually apply these updates )
- The sound chip is on the motherboard ( works fine for what I need ).

Actually, I really like stuff from ASUS. My GeForce3 is from them ( the main reason to continue buying their products ). I've had too many problems with noname products that I don't mind spending a bit more for quality.

I really hope you solve your problems, since you are missing out on some really great hardware http://www.opengl.org/discussion_boards/ubb/smile.gif.

[This message has been edited by PH (edited 09-11-2002).]

knackered
09-13-2002, 10:54 AM
Humus, are you speed reading my posts?
You seem to be missing crucial bits of information that would have saved you a lot of typing.
Do you really think I've got the time to build the PC's we use for our simulators?
We use Fujitsus, SGIs, Intergraph ZX10s, HPs and Dells. These, in case you weren't aware, are top-quality, optimised builds. I've also tried both radeons on my home machine, which I built myself. Same results.
I know you write ATI specific applications, humus, so I can understand your passionate defense of their cards. As I keep saying, I'd love the radeon to be stable, as it's a well designed card, but it just does not seem to be.
It also seems I need to repeat what I have said in previous posts - these are not bugs in my own apps (although the lockups do happen in them too); they are most evident in little-known apps such as 3dsMax4, 3dmark2001 and ATI's very own RenderMonkey.
Now please stop flaming me!

pkaler
09-13-2002, 11:46 AM
Alright, let's get this on the right track before this degenerates into a big poop-slinging match. Both ATI and nVidia need better ways to inform developers about the state of their drivers. This is what I suggest (whether anyone listens is another matter).

- both companies should have Bugzilla databases set up
- make the databases readable to anyone, so that developers know which bugs exist and can build workarounds
- give write access to certain developers out in the community to submit bugs so that the databases don't get out of hand (prevent problems that the mozilla project sometimes encounters)

Will this ever happen? Probably not. Both companies fight tooth and nail for market share and probably don't want this to affect sales. But it is the Right Thing To Do(tm). And it would make development much smoother.

davepermen
09-13-2002, 12:22 PM
knackered, add me and some of my friends to the list of people with huge problems with geforces and drivers. currently it's quite stable, but it took me about 3 years to get an nvidia gpu working _quite_ stable.. on this pc. if i try winxp, for example, it ****s up with blue screens again.. others are having the same problems..


about the "carmack couldn't even get the console properly some months ago". the r300 is the card with wich doom3 was presented on e3. i haven't seen any blue screen on the movie that is not allowed to be seen at all.. and carmack just loves the gpu, sais its awesome, works like a charm, and the drivers are good. same for the 8500, more or less. but there is no statement about problems _AT_ALL_ with the 9700. thats different to the 8500, wich had problems in the past.

no gpu works perfectly on every pc. sometimes there is just some tiny thing wrong that ****s it up. nothing is perfect. you had a bad experience with a visibly faulty ati gpu. so what? i know dudes with the 8500. they don't have such problems. take it as a fact: both companies have good drivers, no company has perfect drivers.


>>When doing multi-pass texturing, what is the maximum number of texture layers you would put down to get the right effect on a poly? Right now I'm stuck with a card that supports only 2 texture units, so I end up making 2-3 passes depending on the effects used. The problem is that my pipeline (due to the math involved) may still need to be split into at least 2 passes on a card that supports 8+ units (I'll still end up using only 4).<<

it depends. in generic opengl you get 8 textures, i think, and that's it. or 6. when you use the new pixel shaders, you can sample up to 16 times from the textures and calculate up to 64 instructions (full floating point), all interleaved.. so the number of textures does not really count anymore.. it's all mixing into one generic shader. cute it is..
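
if you want hard numbers for whatever board you end up with, just ask the driver (a quick sketch, assuming the usual GL and stdio headers; the values in the comment are from memory, so double-check them):

    GLint units = 0;
    glGetIntegerv(GL_MAX_TEXTURE_UNITS_ARB, &units);  /* roughly: 2 on a gf2, 4 on gf3/4, 8 on the r300 fixed-function path */
    printf("texture units per pass: %d\n", units);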

Humus
09-13-2002, 02:55 PM
Originally posted by knackered:
Humus, are you speed reading my posts?
You seem to be missing crucial bits of information that would have saved you a lot of typing.
Do you really think I've got the time to build the PC's we use for our simulators?
We use Fujitsus, SGIs, Intergraph ZX10s, HPs and Dells. These, in case you weren't aware, are top-quality, optimised builds. I've also tried both radeons on my home machine, which I built myself. Same results.

<snip>

It also seems I need to repeat what I have said in previous posts - these are not bugs in my own apps (although the lockups do happen in them too); they are most evident in little-known apps such as 3dsMax4, 3dmark2001 and ATI's very own RenderMonkey.
Now please stop flaming me!

You never mentioned exactly what apps you were talking about. I don't know what kind of work you do, so "our simulators" certainly made it sound like you were developing some kind of simulators which would not run on the Radeons. Under those assumptions I think it's fairly reasonable for me to point out that it might just as well be application errors.
Regardless, I think you're making some seriously bad generalisations. Just because you have had bad luck with an ATi card doesn't make them suck. The fact is that most people run their Radeons without problems. I've had nothing but bad experiences with nVidia cards; do I post claims about nVidia driver suckiness? Nope. I assume I've had bad luck, as most people don't have any problems with their Geforces. The only time I comment on nVidia driver quality is when someone raises their driver developers to the sky with words like "gold standard" etc., in which case I think levelling the field a little may be needed.


Originally posted by knackered:
I know you write ATI specific applications, humus, so I can understand your passionate defense of their cards. As I keep saying, I'd love the radeon to be stable, as it's a well designed card, but it just does not seem to be.

The reason some of my applications are ATi specific is not because of some special passionate love of ATi; it's purely because I currently own an ATi card. I try to develop for as wide a variety of graphics cards as possible, and prefer to use ARB extensions over anything vendor specific. After all, I want as many people as possible to be able to run my demos. I have no interest in giving a particular vendor an advantage. Sometimes I just need special features only available as GL_ATI, or find them very interesting, so some demos obviously end up ATi-specific for that reason. If I had an nVidia card I'd have some nVidia-specific demos too. There are some features on the GF3/4 I'd love to make a demo of, for instance shadow mapping, but I can't, since as a not-so-wealthy student I can't justify buying another graphics card.

NitroGL
09-13-2002, 03:04 PM
Originally posted by john_at_kbs_is:
Another deciding factor:

The r300 has 8 texture units, right? How many will the nv30 have?

16, only 8 texture coords though (I find that a little screwy, but I can live with it). Same on the NV30 (last I checked anyway).

dorbie
09-13-2002, 03:58 PM
Err no, my comments do not stem from a single experience; that was just an example to try and illustrate what I am saying. Knackered, all you've done is illustrate that you can't build a working PC. That's the kind of thing hardware reviewers do regularly, and they post the results all the time. Most of the driver-related issues w.r.t. ATI have been functional issues, and ATI have made dramatic improvements (see Carmack's comments); the driver issues are not BSODs under mundane conditions. Maybe it's your BIOS settings, some other PCI card you use, or EMI from your monitor and bad shielding - I don't care. Just quit pretending your anecdotes of inexperienced PC building are indications of ATI's driver incompetence.

PH
09-14-2002, 05:56 AM
Just a quick comment ( slightly OT perhaps ), I ran the UT2003 benchmark on my 8500 and got some surprising results,

Flyby: 113 fps
Botmatch: 45 fps
at 1024x768

No graphical glitches, no crashes. I also tried the Battlefield 1942 demo, that works for me too ( lots of people seem to have problems with it ).

Humus
09-14-2002, 10:04 AM
I can't see what's surprising there ... did you expect higher/lower?

PH
09-14-2002, 10:41 AM
I was expecting lower for such a new game. I wasn't expecting any problems though.

zed
09-14-2002, 11:19 AM
since they don't have a lounge forum here i'll post here.
i've seen that the 'UT2003' demo is out.
Q/ is it worth downloading from a game developer's point of view? or is it just quake3 with better textures/physics (ie not improved in the graphics technique department, stencil shadows etc)


i ask cause i have a very slow connection + 100mb will take at least a week to download

PH
09-14-2002, 11:57 AM
It looks great ( of course ) but it's too fast for me http://www.opengl.org/discussion_boards/ubb/smile.gif. I'm not sure if there's anything special from a developer's point of view; in that respect nothing comes close to the Quake games. You'll need the editor to open the unreal archives, but it's not included in the demo.
Did I mention it looks great ?

Anyway, no stencil shadows. There are some projected shadow textures ( rotating fan ) and the lightmaps appear to be of very high quality. It looks good but it's not anything special like DOOM 3... so you're probably correct when you say "Quake3 with better graphics".

Humus
09-14-2002, 12:40 PM
The most important aspect of the ut2003 demo is of course that it's fun. http://www.opengl.org/discussion_boards/ubb/smile.gif As a developer the most interesting thing about it I guess is the editor (well, it's not in the demo but in the full game).

ToolChest
09-14-2002, 03:08 PM
Not to start a new argument, but I prefer Unreal to ALL Quake games. When the first Unreal came out it had an impact on me that I hadn’t felt since the original Doom. I know everyone in this forum is all over Carmack and this is basically blasphemy, but Id games always go for graphics over playability. Q3 looked impressive and also sucked…

This however is just my opinion… http://www.opengl.org/discussion_boards/ubb/smile.gif

John.

ToolChest
09-14-2002, 05:27 PM
Just checked out the demo, very nice... Nice water, just a shame that it only reflects a static image. I like... http://www.opengl.org/discussion_boards/ubb/smile.gif

John.

Korval
09-14-2002, 07:39 PM
Id games always go for graphics over playability. Q3 looked impressive and also sucked

Q3 looked impressive? Are you that easily impressed? Unreal (not even UT) looked far better in its high-quality mode than Q3 did, and Unreal came out around Q2's time. Granted, you could run Q3 with some detail and maintain a decent framerate with hardware that was available when it came out (whereas Unreal was really looking for TNT2s, Voodoo3s, and GeForces when TNT1s and Voodoo2s were the only cards around), but that's another issue.

In any case, so that this post actually contributes to the main thrust of the thread, as far as graphics cards, you have the following choices:

GeForce 3/4: Cheap, compared to new stuff.

Radeon 8500/9000: Cheap, but more powerful than GeForce3/4. Not quite as fast, though. Good for a developer, not a gamer.

Radeon 9700: Available now. ATi plans on exposing its features (at least, the vertex and fragment features) via ARB extensions.

NV30: Wait till early next year. nVidia will support ARB_vertex_program, but ARB_fragment_program support is up in the air. Seems to be, functionally, more powerful than the 9700. If you don't mind waiting and using proprietary extensions, choose this.
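
Whichever one you pick, it's safer to test for the extensions at run time than to key off the card name. A quick sketch (naive substring check, assumes a current GL context and the usual <GL/gl.h> and <string.h> headers):

    int has_ext(const char *name)
    {
        const char *exts = (const char *)glGetString(GL_EXTENSIONS);
        return exts != NULL && strstr(exts, name) != NULL;
    }

    /* e.g. prefer the ARB path when it's there, fall back to vendor extensions otherwise */
    if (has_ext("GL_ARB_vertex_program")) { /* ... */ }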

Nutty
09-15-2002, 03:16 AM
Have to say I think Q3 drop kicked UT in the graphics department. Plays soo much better too IMHO http://www.opengl.org/discussion_boards/ubb/smile.gif

The UT2003 demo doesn't look that much different. Except everything's got hi-res textures and stuff. Didn't see anything amazing in there. Just run-of-the-mill stuff done very well.


Q3 looked impressive and also sucked…

I suppose it's personal preference. UT to me seemed just too sloppy; there was no precision in there like you can attain in Q3. I have friends at work that still play Q3; I don't know anyone that still plays UT.

Nutty

[This message has been edited by Nutty (edited 09-15-2002).]

tarantula
09-15-2002, 05:05 AM
I've always felt Q3 looked much better than UT and the gameplay was much better too. I don't know if UT has splash damage, but I keep pumping out rockets near my opponent's legs while he just dances around.

Though I know a _few_ guys who still play UT.

[This message has been edited by tarantula (edited 09-15-2002).]

davepermen
09-15-2002, 05:57 AM
it's a religious war..

imho:
both look good, both can impress me visually, they just look different (and doom3 will be another good-looking game in this respect..)
both are fun to play; they play differently, but both can be fun. i prefer ut for the extensibility (love the level editor), so we can even create racing games and all just with some fancy level design, hehe http://www.opengl.org/discussion_boards/ubb/biggrin.gif
both have good gameplay, just different

the rest is just personal preference..

i'm happy that ut2k3 runs on my system _playable_, never thought it would.. that's cool.. and it looks great.. nothing new, but what they do use, they use well http://www.opengl.org/discussion_boards/ubb/biggrin.gif

knackered
09-15-2002, 07:44 AM
Originally posted by dorbie:
Knackered, all you've done is illustrate that you can't build a working PC.

By that comment, all you've illustrated is you're a better talker than a listener, dorbie. I'm sure it's earned you a lot of respect among your contemporaries. http://www.opengl.org/discussion_boards/ubb/wink.gif

SThomas
09-15-2002, 08:43 AM
posted by knackered:
Do you really think I've got the time to build the PC's we use for our simulators?
We use Fujitsus, SGIs, Intergraph ZX10s, HPs and Dells. These, in case you weren't aware, are top-quality, optimised builds. I've also tried both radeons on my home machine, which I built myself. Same results.

dorbie, it'd be worth your while to read the posts before you start attacking, man.

ToolChest
09-20-2002, 11:10 AM
I hate to reopen a potential argument, but I still have a few questions:

I know that ARB_fragment_program was just released, but is it possible that the 9700's drivers will support it and ARB_vertex_program soon (I read in another post that the full drivers won't be available until DX9 is out)? Also, I know that the nv30 won't be out for a while, but in your opinion will its drivers support these exts? Honestly, the cards seem very evenly matched.

Thanks…

John.

pocketmoon
09-20-2002, 11:19 AM
Originally posted by PK:
Alright, let's get this on the right track before this degenerates into a big poop-slinging match. Both ATI and nVidia need better ways to inform developers about the state of their drivers. This is what I suggest (whether anyone listens is another matter).

- both companies should have Bugzilla databases set up
- make the databases readable to anyone, so that developers know which bugs exist and can build workarounds
- give write access to certain developers out in the community to submit bugs so that the databases don't get out of hand (prevent problems that the mozilla project sometimes encounters)



this would be such a good move!

I've been trying for weeks to get a working nv30 shader for the latest cg compo. All I keep hitting are bugs in the emulator http://www.opengl.org/discussion_boards/ubb/frown.gif

ToolChest
09-20-2002, 12:18 PM
I found the answers I was looking for in here...
http://www.opengl.org/discussion_boards/ubb/Forum3/HTML/007314.html

As soon as the ARB exts are made available I'm grabbing one; crappy drivers or not, I would like to get a jump on development.
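
In the meantime, this is the sort of thing I'm expecting to write once the extension shows up - just a sketch pieced together from the published spec, so untested (it samples texture 0 and modulates it with the interpolated colour):

    !!ARBfp1.0
    # trivial fragment program: texture 0 modulated by the interpolated colour
    TEMP texel;
    TEX texel, fragment.texcoord[0], texture[0], 2D;
    MUL result.color, texel, fragment.color;
    END

It should load through the same glProgramStringARB/glBindProgramARB calls as a vertex program, just with GL_FRAGMENT_PROGRAM_ARB as the target.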

John.