PDA

View Full Version : ATI 8500- useability



Tim Stirling
12-29-2001, 07:56 AM
I have built a new PC and want a new gfx card to put in instead of my TNT2 M64. For the last year I have been wanting a GeForce 3, but now with the ATI 8500 on the market my choice is more difficult. The 8500 is a lot cheaper and as powerful as a GeForce 3 Ti500. The drivers may not be excellent but they are getting better. I know the 8500 actually has better pixel and vertex shaders than the GeForce 3, but how easy are they to use? How much support and documentation is there? I know the nvidia extensions are widely used and supported, with many example programs and tutorials etc.

What are your views??

Tim

NitroGL
12-29-2001, 09:00 AM
I've been using one since September of this year (yes, I know that's BEFORE the card was released), and I haven't had a single driver problem (unlike what everyone says).

As for developer support, I think that ATI has done a GREAT job.

I've found the shader code easy to move from D3D to OpenGL and back again without problems, so you could use D3D PS 1.4 examples for OpenGL (they only differ slightly).

Check out this sample for a good example of how to use ATI's vertex and fragment shaders: http://www.ati.com/na/pages/resource_centre/dev_rel/R8500PointlightShader.html

And there are a few documents on the shaders (not too many though, as it's still quite new); they can all be found on ATI's developer relations page (http://www.ati.com/online/sdk/).
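For anyone who hasn't looked at the extension yet, here is roughly what a trivial ATI fragment shader looks like on the GL side. This is only a sketch (it just modulates texture 0 by the primary colour, the PS 1.4 equivalent of "mul r0, r0, v0"); it assumes the GL_ATI_fragment_shader entry points have already been fetched with wglGetProcAddress, and the token and function names are taken from the extension spec, so double-check them against ATI's headers:

    GLuint shader;

    /* one-time setup: build the shader */
    shader = glGenFragmentShadersATI(1);
    glBindFragmentShaderATI(shader);
    glBeginFragmentShaderATI();

    /* sample texture unit 0 using texcoord set 0 into register 0 */
    glSampleMapATI(GL_REG_0_ATI, GL_TEXTURE0_ARB, GL_SWIZZLE_STR_ATI);

    /* reg0 = reg0 * interpolated primary colour */
    glColorFragmentOp2ATI(GL_MUL_ATI, GL_REG_0_ATI, GL_NONE, GL_NONE,
                          GL_REG_0_ATI, GL_NONE, GL_NONE,
                          GL_PRIMARY_COLOR_ARB, GL_NONE, GL_NONE);

    glEndFragmentShaderATI();

    /* at draw time */
    glEnable(GL_FRAGMENT_SHADER_ATI);
    glBindFragmentShaderATI(shader);
    /* ... draw ... */
    glDisable(GL_FRAGMENT_SHADER_ATI);

The R8500PointlightShader sample linked above shows the same calls doing real per-pixel lighting work.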

[This message has been edited by NitroGL (edited 12-29-2001).]

Tim Stirling
12-30-2001, 01:07 AM
Thanks for the reply. What about performance? Are there any nice techniques like VAR on Nvidia cards?


Tim

BlackJack
12-30-2001, 01:36 AM
You can't compare ATI to NVidia on developer support, in my opinion. NVidia offers tons of GL demos with source and tutorials, and depending on who you are you get a phone line through to the US or UK where you can discuss problems directly; their extensions are much better supported by games and on the net than ATI's, and so on. Here at our firm we've got two or three Radeon 8500s sponsored by ATI and three GeForce3 cards sponsored by NVidia.
ATI has a couple of GL extensions similar to the NVidia ones, but in general they are implemented in a worse way. By worse I mean that you have, for example, no way to directly access AGP or video memory; you can only update it through a command. That's a cleaner solution, but I don't like it: if you do a lot of AGP work you always have to keep a local buffer into which you calculate the data and are then allowed to copy it, which means a double memory transfer. A very positive extension ATI offers is that you can store indices in video memory and save a lot of memory transfer that way, especially when you reuse them often.
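To make that concrete, here is a rough sketch of the flow described above, based on the GL_ATI_vertex_array_object and GL_ATI_element_array specs. The entry points are assumed to be fetched with wglGetProcAddress, the Vertex struct and the counts are placeholders, and the exact element-array binding rules should be checked against the spec; treat this as an illustration, not tested code:

    /* Put the vertex data into a driver-managed object buffer (AGP or video
       memory). Unlike NV_vertex_array_range you never get a pointer into
       that memory back. */
    GLuint vbuf = glNewObjectBufferATI(numVerts * sizeof(Vertex), verts, GL_DYNAMIC_ATI);

    glEnableClientState(GL_VERTEX_ARRAY);
    glArrayObjectATI(GL_VERTEX_ARRAY, 3, GL_FLOAT, sizeof(Vertex), vbuf, 0);

    /* For dynamic geometry you build the data in a system-memory buffer and
       copy it up explicitly -- the "double memory transfer" mentioned above. */
    glUpdateObjectBufferATI(vbuf, 0, numVerts * sizeof(Vertex), verts, GL_DISCARD_ATI);

    /* GL_ATI_element_array lets the driver pull the indices itself (and, per
       the spec, they can be placed in an object buffer as well), which is the
       index-saving extension mentioned above. */
    glEnableClientState(GL_ELEMENT_ARRAY_ATI);
    glElementPointerATI(GL_UNSIGNED_SHORT, indices);
    glDrawElementArrayATI(GL_TRIANGLES, numIndices);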
The Radeon 8500 is MUCH faster than the GeForce3. I don't know if we got any non-consumer-market cards sent by ATI, but in 3DMark2001 the polygon rate of the Radeon was nearly twice that of the G3 Ti500, and I was almost ashamed of my brand-new G3.
So in general: if you want to program for OpenGL, forget about the Radeon, or else you'll push away 90% of the market for maybe 10%. If you just program for yourself, or for DirectX, a Radeon slowly gets interesting. I personally always develop for NVidia first. ATI sent us their cards, so of course I'm fair enough to support their extensions too, but only as far as they are at least reasonably compatible with NVidia's; if not, that's their problem. I want to develop for the mass market, after all, not for the few percent of Radeon users. I don't know what you want to use it for: for gaming, take a Radeon; for OpenGL development, take a GeForce3, because that's the standard. My opinion.

Michael

Korval
12-30-2001, 02:24 AM
The Radeon 8500 has superior per-fragment operations to those of the GeForce 3. That alone is worth it to me.

[This message has been edited by Korval (edited 12-30-2001).]

davepermen
12-30-2001, 03:29 AM
and the extensions are much nicer designed imho.. i go for a radeon.. much cheaper, yeah.. and i like the ps1.4 standard.. instead of the nvidia gpu, this pixel program is programmable :) (i always wanted to say that :) )

mbespalov
12-30-2001, 04:05 AM
I'm going to get it too.
Does it inherit the original Radeon's 3D texture limitations? (no mipmaps, a 3D texture takes up 2 TMUs)
Thanks.



[This message has been edited by mbespalov (edited 12-30-2001).]

NitroGL
12-30-2001, 09:13 AM
Originally posted by mbespalov:
I'm going to get it too.
Does it inherit the original Radeon's 3D texture limitations? (no mipmaps, a 3D texture takes up 2 TMUs)
Thanks.

[This message has been edited by mbespalov (edited 12-30-2001).]

The original Radeon has issues with 3D texturing, but the 8500 is fine (fully orthogonal 3D texturing). The drivers don't expose mipmapped 3D textures right now, but the hardware does support them.
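For what it's worth, a non-mipmapped 3D texture (which is what the drivers expose today, per the above) is just standard OpenGL 1.2 usage. A minimal sketch, with placeholder sizes and "volume" being a W*H*D RGBA array filled elsewhere (on Windows, glTexImage3D still has to be fetched with wglGetProcAddress):

    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_3D, tex);

    /* no mipmaps, so keep the minification filter at GL_LINEAR rather than
       GL_LINEAR_MIPMAP_LINEAR */
    glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_WRAP_R, GL_CLAMP_TO_EDGE);

    glTexImage3D(GL_TEXTURE_3D, 0, GL_RGBA8, W, H, D, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, volume);
    glEnable(GL_TEXTURE_3D);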

Tim Stirling
12-31-2001, 12:52 AM
Thanks. I know the 8500 appears to be a lot faster on paper, but in every current game it runs slower, sometimes by a significant margin. Any thoughts on how much better the 8500 will be compared to the Ti500 in next-gen games like Unreal 2 and Doom 3 powered titles? I have a feeling that the 8500 should be a good bit faster if properly used. Didn't John Carmack say that a GeForce 3 would run Doom 3 at 30 FPS while an 8500 would manage 40 FPS?

Tim

Adrian
12-31-2001, 01:40 AM
Maybe it's because Doom 3 will have a lot more geometry than any current game. The 8500 soundly beats the GF3 in the 3DMark TnL test.

Has anyone done TnL speed tests for the 8500 in OpenGL? I would be interested in seeing some results.

mfugl
12-31-2001, 07:25 AM
BlackJack:

It is dangerous to talk about standards regarding proprietary extensions, which may also be copyrighted. We don't want a single graphics card $company$.

Anyway, it is not that difficult to do advanced OpenGL work and stay away from most proprietary extensions, or at least to hide them behind an interface class. The most advanced 3D gaming company (Id Software) handles this very well.

pulleyk
01-02-2002, 05:20 AM
I've a question for anyone who's tried both an 8500 and a G3. From a lot of benchmarks I've seen, the G3 suffers a heavy speed loss when dealing with 2 or more lights. I also thought I saw a benchmark where the 8500 did really well with 8 lights.

Has anyone run any tests?
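For reference, the GL side of such a test is tiny: enable N fixed-function hardware lights and push a high-poly, normal-bearing mesh through T&L. A sketch (numLights would be 2 or 8 as in the benchmarks mentioned; the position and colour values are placeholders):

    GLfloat pos[4]     = { 0.0f, 10.0f, 0.0f, 1.0f };  /* positional light */
    GLfloat diffuse[4] = { 1.0f, 1.0f, 1.0f, 1.0f };
    int i;

    glEnable(GL_LIGHTING);
    for (i = 0; i < numLights; i++) {
        glLightfv(GL_LIGHT0 + i, GL_POSITION, pos);
        glLightfv(GL_LIGHT0 + i, GL_DIFFUSE, diffuse);
        glEnable(GL_LIGHT0 + i);
    }
    /* ...then draw the mesh repeatedly and time it... */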

PH
01-02-2002, 05:51 AM
I've heard a lot of positive things about the Radeon 8500 and I'm sure the drivers are just fine. The flexible dependent texture lookups alone make this card very attractive.

Buying a new card right now may not be a very good idea - the GeForce3 is almost a year old now ( remember the presentation in Tokyo in Feb 2001 ), so maybe NVIDIA is close to releasing an even more powerful chip...

mbespalov
01-02-2002, 06:32 AM
Buying a new card right now may not be a very good idea - the GeForce3 is almost a year old now ( remember the presentation in Tokyo in Feb 2001 ), so maybe NVIDIA is close to releasing an even more powerful chip...


reminds me of a post by Cass:
"There are great things coming soon!"
:)

Yes, wait a little for a GF4, but ATI doesn't sleep, so wait a little for the next Radeon; my god, there's a GF5 and a resurrected 3dfx with their Voodoo9 on the horizon, etc...
:)


[This message has been edited by mbespalov (edited 01-03-2002).]

Eric
01-02-2002, 07:00 AM
I am pretty sure NV25 is around the corner but there are things I find quite strange:

1) Almost no HW-dedicated web site seems to have info about it (remember the hype/guesses about NV20 ????)
2) There is no indication about this chip in the nv4_display.inf files that come with leaked/official drivers (there used to be hints on coming chips).

That can mean one of two things:

1) There's nothing coming up right now...
2) NV25 is coming. It will be a really NEW product and they managed to keep its details secret.

Seeing that nVIDIA produced the NV2A (X-Box chip), I suppose NV25 will get some of its features... I just hope it will get more than that and that the price will be right...

If the ATI 8500 is that good and that cheap (I haven't followed the recent ATI history ;) ), nVIDIA had better have some good tricks up their sleeves! If they price the NV25 as they did the NV20, they might run into some problems!

That being said, I hope they'll come up with a GOOD card! ;)

Regards.

Eric

V-man
01-02-2002, 10:24 AM
Originally posted by PH:
I've heard a lot of positive things about the Radeon 8500 and I'm sure the drivers are just fine. The flexible dependent texture lookups alone make this card very attractive.


What about that BS about the Quake 3 "optimized" drivers?

The Radeon got me interested for my next card, but after seeing that....

I want clean and solid drivers.

V-man

Elixer
01-02-2002, 10:50 AM
I know I saw some specs for the GF4 and the next ATI R300(?) someplace. It wouldn't be wise for Nvidia OR ATI to come out now and give the specs of the new hardware, since then people wouldn't buy the old stuff.

I went from a Rendition V2200 (great card for its time, and it was programmable...) and then went to a Rage 128 (talk about crappy drivers, it took them more than a year to make halfway decent ones), then I got a GF2. I will skip the GF3 & Radeon/7500/8500, and go to the next-gen card.

So that would be a GF4, or an ATI R300, or a Kyro 4/5, or the Bitboys' eDRAM card (heh!) :)

NitroGL
01-02-2002, 11:02 AM
Originally posted by V-man:
What about that BS about Quake 3 "optimized" drivers.

The Radeon got me interested for my next card, but after seeing that....

I want clean and solid drivers.

V-man

That's exactly what it is, BS. Pay no attention to it; it's been removed for a while now.

PH
01-02-2002, 11:19 AM
V-man,

Like NitroGL said, this has been removed. The new drivers for the Radeon 8500 are supposedly very solid ( with improved image quality and performance ).

I was actually thinking of picking one up tomorrow, but I definitely want the next NVIDIA card, so I hope it won't arrive for a few months.

Does anyone know if the 8500 has hardware support for shadowmaps ?

NitroGL
01-02-2002, 09:53 PM
Originally posted by PH:
Does anyone know if the 8500 has hardware support for shadowmaps ?

Unfortunately, no. But I personally don't think shadow maps are flexible enough anyway, so it's no loss to me.

kieranatwork
01-03-2002, 01:42 AM
I don't understand this - I got hold of a Radeon 8500 a few months ago and could not get it to run on most machines in the office (all of which use Win2k SP2 and are a mixture of Athlons and P3s/P4s), with either the packaged drivers or the latest reference drivers. On the one machine I did get it to run on (1.2GHz Athlon, 266FSB), it performed far worse than the GeForce2 GTS in the same machine.
I guess I must have been unlucky, or had a faulty card (do faulty cards cause poor performance?!).
Have I criticised ATI unjustly?
I've had a GeForce3 for about a month now, and it leaves every other card I've used standing performance-wise, and the register combiners, texture shaders, and vertex programs are a joy to use.

NitroGL
01-03-2002, 09:48 AM
Originally posted by kieranatwork:
I don't understand this - I got hold of a Radeon 8500 a few months ago and could not get it to run on most machines in the office (all of which use Win2k SP2 and are a mixture of Athlons and P3s/P4s), with either the packaged drivers or the latest reference drivers. On the one machine I did get it to run on (1.2GHz Athlon, 266FSB), it performed far worse than the GeForce2 GTS in the same machine.
I guess I must have been unlucky, or had a faulty card (do faulty cards cause poor performance?!).
Have I criticised ATI unjustly?
I've had a GeForce3 for about a month now, and it leaves every other card I've used standing performance-wise, and the register combiners, texture shaders, and vertex programs are a joy to use.

Did you remove all of the old driver files from the previous card? Sometimes old drivers can kill a different card.

Zeno
01-03-2002, 11:05 AM
Let's not get carried away asking if the 8500 can do fancy schmancy stuff like 3d textures...

The real question is, can it do correct trilinear texture filtering yet? ;)

-- Zeno

Elixer
01-03-2002, 11:19 AM
Even though it is a ROYAL pain, you really do have to switch to VGA mode for your primary display adapter (reboot), then install the new drivers (reboot), and then you should be all set.

Also you should uninstall any little utility that comes with the previous drivers, since ATI loves to slam things into the 'autorun' feature of Win2k/NT. (I'm sure they are not alone in doing this.)

kieranatwork
01-04-2002, 02:54 PM
QUOTE: "Even though it is a ROYAL pain, you really do have to switch to VGA mode for your primary display adapter, (reboot), then install the new drivers, (reboot)"

Yes, did that, as with every card I've ever installed (and there have been a great many, and not just nvidia ones) - but it only worked on one machine, as I said. The performance was good, don't get me wrong, but no faster than the GF2 GTS at plain old single-texture triangle rendering.
BTW, all the machines I tried it on still have the ATI display driver in the "add/remove programs" dialog, which simply refuses to be removed, even though the card is but a distant memory to the machine. Maybe I'll edit the registry, but the current nvidia card doesn't seem bothered by it, so why should I be?
I've never had trouble with the nvidia drivers, so maybe ATI should poach someone from nvidia to get their installation process sorted out once and for all.
I'm sure the 8500 is very nice and all that, but I think I'll steer clear for a year or two until they get their act together - I'm the one who gets the grief from my boss and the customers when a card I've chosen for a simulation goes wrong, or never goes right in the first place.

jwatte
01-05-2002, 09:55 AM
kieran,

The way you "switch to VGA" is to choose the "remove/uninstall" option in the control panel for application install/uninstall. Any other way is not correct, IMX.

If you're seeing performance problems, I suggest looking into all the usual culprits: AGP aperture size in the BIOS, AGP chipset drivers for your motherboard, old improperly removed display drivers, etc.

Tim Stirling
01-05-2002, 10:16 AM
I think I will wait for the NV25; there will be an official press release on the 5th of Feb. There is very little info on the NV25 on the net (I've looked everywhere), but you can guess it should at least be faster, have 2 vertex shaders (like the Xbox), possibly PS 1.4 (or better) so that it is at least a DX8.1 or maybe a DX9 card, more features, 6 pixel pipelines, and better FSAA. If those criteria aren't met then it will be a disappointing card. It's been a year since the GF3, so I'm sure NV can do at least that much. As for a totally amazing new card, I doubt it, although NV have purchased the Gigapixel technology for doing tile-based rendering like the Kyro cards; I doubt we will see that until the GF5.

There is also the R300, which is supposedly out in the spring, but I very much doubt it! Then there is the Kyro 3, which will be released soon.

NitroGL
01-05-2002, 11:26 AM
Originally posted by Tim Stirling:
I think I will wait for the NV25; there will be an official press release on the 5th of Feb. There is very little info on the NV25 on the net (I've looked everywhere), but you can guess it should at least be faster, have 2 vertex shaders (like the Xbox), possibly PS 1.4 (or better) so that it is at least a DX8.1 or maybe a DX9 card, more features, 6 pixel pipelines, and better FSAA. If those criteria aren't met then it will be a disappointing card. It's been a year since the GF3, so I'm sure NV can do at least that much. As for a totally amazing new card, I doubt it, although NV have purchased the Gigapixel technology for doing tile-based rendering like the Kyro cards; I doubt we will see that until the GF5.

There is also the R300, which is supposedly out in the spring, but I very much doubt it! Then there is the Kyro 3, which will be released soon.



Heh, it'll probably be another overclocked GeForce3 :)

Korval
01-05-2002, 12:03 PM
Tim, if you just keep waiting for the next big thing, you'll never get anything. There will always be a next big thing just over the horizon.

dorbie
01-05-2002, 12:04 PM
The people at NVIDIA are well aware of tile based concepts, they don't need Gigapixel/3Dfx to show them. These people spend their lives designing hardware.

The philosophical debate over tiled deferred shading has been raging for years. With support for coarse Z, developers implementing blended multipass, and larger on-chip texture and framebuffer caches, the debate probably isn't seen as swinging towards a tile-based approach by people who chose the alternative before these developments, but it might be.

I won't speculate on what's in the next generation, but developers who have implemented complex shading already know where the most serious deficiencies are, both in hardware and in software.

Just ask yourself: what could make this better? Look at the history of Carmack's remarks on this.

As for me, I'm just pleasantly amazed at what you can get for $300, we live in interesting times, and fantastic times if you're a graphics software developer. Think about it, there are two or three companies with engineers busting their butts to outdo each other 24/7 and we reap the benefits for the cost of a CHEAP upgrade. They even spend their time making it easy for us to learn how to use their latest stuff, uploading demos and presentations. In all of this we have it easiest, we need only learn how to use this great stuff.

It's almost too good to be true, if you'd told me 10 or even 5 years ago this is where we'd be I wouldn't have believed you. This has to be the biggest 'free ride' in graphics history.


[This message has been edited by dorbie (edited 01-05-2002).]

Humus
01-05-2002, 02:37 PM
Originally posted by dorbie:
The people at NVIDIA are well aware of tile based concepts, they don't need Gigapixel/3Dfx to show them. These people spend their lives designing hardware.


Probably not, but there may be patents and other crap in the way (damn, I hate patents; knowledge should be free IMO).
Also, while the concept can be quite easy to understand, an implementation of deferred rendering isn't something you just throw together overnight.

dorbie
01-05-2002, 03:29 PM
Well, I'd guess that UNC, HP and Pixel Fusion (now ClearSpeed?) own a fair bit of the related I.P., not just Gigapixel, Oak Technologies, Microsoft, or any of the other more publicised players in that area, although I'm sure most of the big players have enough IP to tie each other in knots. It's like an arms race of mutually assured destruction; it wouldn't stop them pursuing the next viable technology. They have to do it if it's likely to be the next big thing. On the I.P. front they try to dance around each other's designs and keep their powder dry. There are potentially serious negative consequences to winning a suit against a competitor, because of all the antecedents.

I have absolutely no doubt that the people at NVIDIA and ATI would be more than capable of implementing the kinds of systems we're talking about, and that they consider it as an option every time they go back to the drawing board. Most of the UNC/HP guys who worked on PixelFlow now work for NVIDIA; AFAIK they took on a large part of the group based in Chapel Hill. There's more than a mere exchange of published ideas in the industry; it is simply unrealistic to think that any of these companies can't pursue any design philosophy they decide on, or that some of their employees aren't intimately familiar with most of the details involved. You can bet they are.