How are ATI GL drivers?

Thoughts? I've been thinking of getting a 9600 Pro for shader work as well as fixed-pipeline work. Can someone download the editor and try it on an ATI card? I just want to know if the graphics work OK. It should run on 7xxx-series ATI cards and up, as well as on a GF1 and up. Thanks.
http://forged3d.tripod.com

I would need a map to test, but so far I made some geometry (a sphere and a cube) and it ran on my ATI Radeon 9700 Pro.

Sorry, I don't have the loading functions working at this time, so no map. Thank you for giving it a try; I'm glad to hear that it works for you. You can also try applying three sets of textures (diffuse, heightmap and glossmap) to geometry and see what it does. Thanks for the feedback.

Btw, how do you like developing on an ATI R300 card? About half a year ago I posted the same question here, and some people were still having more problems on ATI than on NVIDIA. I just want to know whether the situation has changed. I know ATI has very strong shaders; I'm just wondering about the rest. Thanks.

ATi’s drivers seem to be getting more solid. There are occasional errors, but these days, they aren’t much more than what you would get with an nVidia card. Different from nVidia’s errors, but similar in number and degree.

The only major error I've encountered recently is the issue with 16-bit textures being forced. While certainly annoying, it isn't crippling, and ATI is definitely aware of the problem and has reportedly fixed it in an internal build.

Thanks guys for the feedback, much appreciated.

Ostsol, they aren't forced as far as I know; they're just selected as 16-bit if you don't ask for a 32-bit internal format yourself (GL_RGB(A)8 as the 3rd parameter to glTexImage*D).
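For example (width, height and pixels are just placeholders for your own texture data):

/* Unsized internal format -- the driver picks the depth,
   which on R300 currently means 16-bit: */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA,
             width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, pixels);

/* Sized internal format -- explicitly requests 8 bits per channel: */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8,
             width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, pixels);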

Originally posted by Mazy:
Ostsol, they aren't forced as far as I know; they're just selected as 16-bit if you don't ask for a 32-bit internal format yourself (GL_RGB(A)8 as the 3rd parameter to glTexImage*D).
Yes. I still think that was unintentional (aka “bug”), as it only happens on R300, and not on R100/R200.
Rumor has it that it’ll get “fixed” in an upcoming release, which would confirm the bug status if true.

LOL! I thought I had been doing that and even had my enums set correctly. . . Unfortunately, I didn’t send the correct enums to glTexImage. . . Looks purdy, now!

I see about twice as many issues with ATI cards as with nVidia cards, and the ATI issues are typically worse than the nVidia issues. However, ATI drivers continually improve; in 6 months to a year they might be as good as (or even better than) nVidia’s.

Coriolis, I was afraid of that. Do you remember any bugs that caused you grief? That would be helpful to me. Thanks for the feedback.

Dunno. I'd rather have a driver with some small bugs than a driver with big cheats in it…

I haven't seen any good NVIDIA driver released in the last few months…

And ATI devrel is generally great at helping out.

Anyone else with experience to share?

I've encountered 2 or 3 bugs when using functionality few people use (like… eh… the 3x3 pbuffer), but ATI devrel was very responsive and they fixed the bugs! And consider that you are getting hardware + drivers, and that combination from ATI is much better than NVIDIA's hardware + drivers.

Regards
-Lev

I've also seen minor bugs using advanced features (extensions like vertex and pixel shaders). The core functionality is pretty robust, though, and bugs are usually fixed in the next driver release, so I'm definitely happy with ATI at the moment.

And to be honest, I'm more and more unhappy with NVIDIA's: no hardware clipping plane support, unusable PBuffers due to low performance, not even speaking of the recent VBO problems that don't give the expected performance. Not to mention the “cheats”.

I'll stick with ATI for a few additional months.

Y.

Ysaneya,

Are you saying that ATI Radeon cards have hardware clipping planes (with standard drivers)? I know Quadros have that, and standard GeForces too with SoftQuadro, if I remember correctly.

Yeah, all the Radeons I've seen have hardware clipping planes.

GeForces, on the other hand, fall back to software T&L when using a clipping plane. Quadros I don't know about.
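For reference, enabling one is all it takes to trigger the fallback on a GeForce (the plane equation here is just an example):

/* Clip away everything below y = 0; the equation {A,B,C,D}
   keeps points where A*x + B*y + C*z + D >= 0. */
GLdouble eq[4] = { 0.0, 1.0, 0.0, 0.0 };
glClipPlane(GL_CLIP_PLANE0, eq);   /* stored transformed by the current modelview */
glEnable(GL_CLIP_PLANE0);
/* ... draw ... */
glDisable(GL_CLIP_PLANE0);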

Y.

Originally posted by Ysaneya:
And to be honest, I'm more and more unhappy with NVIDIA's: no hardware clipping plane support, unusable PBuffers due to low performance, not even speaking of the recent VBO problems that don't give the expected performance. Not to mention the “cheats”.

Your list is missing the slow ARB_fragment_program performance.
I don't know how anyone can currently say NVIDIA's drivers are better than ATI's. New stuff has bugs too, older stuff is still not well implemented, and general performance is slow.
Btw, the pbuffer thing: if set up right, you can get it fast. But to set it up right, you first need to fully understand quantum theory and be able to implement it on a GBA in realtime… only then can you start thinking about understanding how to optimize for GeForce FX cards, and others.
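Just to show what “set up right” even means, here's the bare-bones WGL path. This assumes the ARB entry points have already been fetched with wglGetProcAddress, and that hDC/mainRC are your window's device context and main render context; the attribute values are only an educated guess at what the driver likes:

/* Pick a hardware-accelerated, RGBA, pbuffer-capable pixel format. */
int attribs[] = {
    WGL_DRAW_TO_PBUFFER_ARB, GL_TRUE,
    WGL_SUPPORT_OPENGL_ARB,  GL_TRUE,
    WGL_ACCELERATION_ARB,    WGL_FULL_ACCELERATION_ARB,
    WGL_PIXEL_TYPE_ARB,      WGL_TYPE_RGBA_ARB,
    WGL_COLOR_BITS_ARB,      32,
    WGL_DEPTH_BITS_ARB,      24,
    0
};
int format;
UINT count;
wglChoosePixelFormatARB(hDC, attribs, NULL, 1, &format, &count);

/* Create the pbuffer and a context for it. */
static const int pbAttribs[] = { 0 };
HPBUFFERARB pbuffer = wglCreatePbufferARB(hDC, format, 256, 256, pbAttribs);
HDC   pbufDC = wglGetPbufferDCARB(pbuffer);
HGLRC pbufRC = wglCreateContext(pbufDC);
wglShareLists(mainRC, pbufRC);    /* share textures with the main context */
wglMakeCurrent(pbufDC, pbufRC);   /* the context switch is the expensive part */

Supposedly the trick for speed is to avoid that wglMakeCurrent as much as possible: batch all your pbuffer rendering together, and if the pbuffer's pixel format is compatible with the window's, you can reuse the window's HGLRC on the pbuffer DC instead of creating a second context.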

I just have to ask myself what THAT has to do with OpenGL… shouldn't we call it GlideFX after all?

>>>And to be honest, I'm more and more unhappy with NVIDIA's: no hardware clipping plane support, unusable PBuffers due to low performance, not even speaking of the recent VBO problems that don't give the expected performance. Not to mention the “cheats”.<<<

On which GPU exactly? How are the FX 5800 and the 5900?

The bugs you find really depend on what you do. I was doing some VP work and found a few bugs there. Some bugs are weird, like lines not being rendered correctly when clipped by the viewport.
I have even seen screen corruption, but with someone else's demo.
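If it helps anyone reproduce it, the line case was essentially this (coordinates from memory, so treat them as an example; any line with one endpoint outside the view volume should do):

/* 640x480 viewport with a matching ortho projection. */
glViewport(0, 0, 640, 480);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
gluOrtho2D(0.0, 640.0, 0.0, 480.0);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();

glBegin(GL_LINES);
glVertex2f(100.0f, 100.0f);   /* inside the view volume */
glVertex2f(900.0f, 300.0f);   /* x > 640: the line must be clipped */
glEnd();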

It pissed me off, but I've had this R9500 for a couple of months and you can definitely develop on it. I'm almost satisfied.
Too bad I don't have an FX to compare.

And dave, take it easy, man. It's like you want NVidia to go out of business.