3Dlabs’ new GPU

http://www.gamasutra.com/php-bin/product_news_display.php?story=1167

Well, what do you think?
Sounds like a dream…

Ah, they want my e-mail address… I don’t like that. However, Tom’s has got an article on that stuff, too:
http://www.tomshardware.com/graphic/02q2/020503/index.html

Morglum

I never got any spam from Gamasutra, and I’ve been registered for years. It’s the web site for Game Developer magazine, which is a good rag and free if you work in games, or regularly approve purchases of hardware, or whatever their advertiser criteria are.

Ah, well, then I’m going to register. I didn’t know it was the site for Game Developer magazine.

Thank you for your riveting analysis of the P10 chipset.

It looks pretty sweet. Here are my thoughts:

Pro:

  1. I’m all about fully programmable. The more the better. Go Go OGL 2.0!
  2. Programmable texture filtering?? I don’t know much about texture filtering (OK, I could write a trilinear filter, but not an anisotropic one; see the sketch after the cons list). This could potentially make Photoshop-type after-effects way fast.
  3. 8 textures is great, and probably actually enough (base, light, shadow, detail, reflection, normal… others?). Hope that bandwidth promise holds up.

Cons:

  1. I haven’t seen anything rendered by it.
  2. It’ll have to prove its driver stability and conformance. I’ve used Wildcat cards before and had bad luck with them.
  3. It’ll have to beat NV30, R300, and possibly a new card from Matrox (rumors), all of which will probably be on affordable consumer boards at about the same time.
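
Since I brought it up, here is roughly what I mean by “I could write a trilinear filter”: a minimal software sketch, assuming a bilinear_sample() helper that I’m making up for illustration:

```c
/* Minimal software trilinear filter sketch.
   bilinear_sample() is a hypothetical helper returning a bilinearly
   filtered texel from the given mip level; level clamping omitted. */
typedef struct { float r, g, b; } Color;

Color bilinear_sample(int mip_level, float u, float v);

Color trilinear_sample(float u, float v, float lod)
{
    int   lo = (int)lod;        /* lower mip level */
    int   hi = lo + 1;          /* upper mip level */
    float t  = lod - (float)lo; /* blend factor between the two levels */

    Color a = bilinear_sample(lo, u, v);
    Color b = bilinear_sample(hi, u, v);

    /* lerp the two bilinear samples -- hence "tri"linear */
    Color out;
    out.r = a.r + t * (b.r - a.r);
    out.g = a.g + t * (b.g - a.g);
    out.b = a.b + t * (b.b - a.b);
    return out;
}
```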

Times are gettin’ good in the graphics field.

– Zeno

[This message has been edited by Zeno (edited 05-04-2002).]

The moment I get a fast GL2.0 board, I’ll start raytracing with it…

What if it turns out to be the P10? That would be funny.

Looks like a good step IMHO anyway…
Matrox is coming,
3Dlabs is coming,
ATI is coming.

Just Nvidia doesn’t want to move till next year, because they don’t want to have better stuff than the Xbox.

Can’t wait to see this on floats, not 8-bit bytes…: http://tyrannen.starcraft3d.net/loprecisionraytracingonatiradeon8500.jpg

And then with shadows and reflections and textures and supersampling to get everything soft…

I’ll go for 10 years… freeze me now, please!

Originally posted by Zeno:
3) 8 textures is great, and probably actually enough

I don’t agree with this one; IMHO it should be under “Cons”, not “Pro”.

I think 8 simultaneous textures is far from enough (that is, far from enough to let you forget about multipassing).

I hoped the number of textures would be (practically) unlimited, with a limit only on the number of temporary values used while evaluating the shader tree/graph (like registers in a CPU).

Just 8 texture units can’t be called real “virtualisation of multipass”.
Some algorithms can’t be done with multipass (anything order-dependent, transparency for example).
Any number of AUX buffers won’t help either.
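
For clarity, here is the framebuffer round trip I mean by “multipass”, sketched in plain GL 1.x (base_tex, light_tex and draw_scene() are made-up names, just for illustration):

```c
/* Classic two-pass "multitexture": draw the scene with the base map,
   then draw it again, modulating the framebuffer by the lightmap. */
glBindTexture(GL_TEXTURE_2D, base_tex);
glDisable(GL_BLEND);
draw_scene();                        /* pass 1: base color */

glBindTexture(GL_TEXTURE_2D, light_tex);
glEnable(GL_BLEND);
glBlendFunc(GL_DST_COLOR, GL_ZERO);  /* framebuffer *= lightmap */
glDepthFunc(GL_EQUAL);               /* touch exactly the same pixels */
glDepthMask(GL_FALSE);
draw_scene();                        /* pass 2: modulate by lightmap */
```

Real virtualisation would mean the chip loops over any number of textures like this internally, keeping intermediate values in registers instead of going through the framebuffer; that round trip is exactly what breaks for order-dependent cases like transparency.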

One PowerVR guy claimed the Kyro can “theoretically” do an unlimited number of textures per primitive (true on-chip multipass).
But the Kyro’s very limited capabilities for combining them made the feature useless, I guess.

This reminds me of when Creative bought E-mu. The capabilities of a $500 E-mu computer/musician product got crammed into a $50 sound card, which to this day rules the sound card world. The more professional offerings of E-mu have suffered somewhat, but at least the level of PC audio was raised substantially (I’m talking about the SB Live!).

It seems that they’re now after 3D graphics in the same way, after having tried the OEM path just to test the waters first.

“Interesting times” – curse of our times or bane of our existence? Discussion at 11 :)

Just Nvidia doesn’t want to move till next year, because they don’t want to have better stuff than the Xbox.

From interviews I’ve read with Nvidia’s chief, he says the NV30 will be out later this year, and he hinted at it being quite revolutionary, i.e. quite a big step from the GF4. I’m pretty sure NV are keeping a close eye on Matrox, 3Dlabs, and ATI, as losing out at this stage would be a huge blow for NV.

Again, though, it’s not the specs that really matter. NV have a huge, loyal customer base, and excellent drivers (apart from a few bugs in brand-new features) across their whole product line. This is often more important than pure specs alone, especially for game developers looking for cards to target.

Nutty

[This message has been edited by Nutty (edited 05-05-2002).]


I think Nvidia has a reputation for good drivers, and this will help them even if their competitors have better hardware (GF4 vs R8500). I think people will want to know the driver quality of the other competitors before buying into their products; I know I will. I don’t trust ATI or SiS. I’ve never had a Matrox card, but from what I read people didn’t have many driver problems with them (except early OpenGL).

I read that 3Dlabs’ drivers are also flaky, this from a user of their Wildcat card. I had a Permedia 2 chip and their drivers didn’t support some common blending modes and whatnot, so I’m skeptical. Actually, the only company I trust now is Nvidia, since my GF2 is humming along nicely without any problems so far. I’ve done some blending/dot3 stuff and the various modes went fine, this on the 12.41 drivers. Nvidia hardware is almost like Intel’s or AMD’s, where we just expect the hardware to work out of the box. I think this alone will have a big impact on my buying decision come August.

I’ve never had any trouble with the 3Dlabs Wildcat OpenGL drivers. The only problems I had with the Wildcat were its fill rate (bad) and its lack of multitexturing (and, with that, its lack of surface-shading features)… and that’s about it. Other than that, it’s a very fast card: the T&L is astoundingly fast, and it has a huge amount of onboard memory. Its antialiasing is second to none, which sort of counterbalances the low fill rate, since you can run stuff at low resolutions without getting jaggies. It also supports a HARDWARE ACCUMULATION BUFFER! And the entire imaging subset in hardware.
I know this topic isn’t about the Wildcat, but if they add the fill rate of the GeForces and the vertex/pixel shaders of the GF4/8500, then I would certainly recommend it. But it sounds like they’re going to do a hell of a lot more than that.
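
For anyone who hasn’t used one: accumulation-buffer antialiasing is dead simple. A minimal sketch, assuming an accumulation buffer in the pixel format and a draw_scene_jittered() of your own that offsets the projection by a subpixel amount:

```c
/* Full-scene antialiasing with the accumulation buffer:
   render N subpixel-jittered frames and average them. */
#define N 8

glClear(GL_ACCUM_BUFFER_BIT);
for (int i = 0; i < N; ++i) {
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    draw_scene_jittered(i);       /* view shifted by a subpixel offset */
    glAccum(GL_ACCUM, 1.0f / N);  /* accumulate frame / N */
}
glAccum(GL_RETURN, 1.0f);         /* write the average back */
```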

>>I had a Permedia 2 chip and their drivers didn’t support some common blending modes<<

That’s because the card physically can’t do them, not because of bad drivers. The same goes for the Riva 128 (an Nvidia card): it physically can’t do certain blending modes (Quake 3 looks disgusting on it).
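
For example, these are the kinds of setups I’d call “common blending modes” (each call below is an alternative setting, and which of them those chips actually lack in hardware is my guess, not something I’ve verified):

```c
/* Common glBlendFunc setups games of this era rely on.
   Only the last call made is active; listed here as alternatives. */
glEnable(GL_BLEND);

glBlendFunc(GL_DST_COLOR, GL_ZERO);                /* modulate (lightmaps)   */
glBlendFunc(GL_SRC_ALPHA, GL_ONE);                 /* additive (glows, fire) */
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA); /* standard alpha blend   */
```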