geforce3 and cost



kaber0111
05-19-2001, 11:06 AM
i'm just curious, but did anyone actually pick up the geforce3?

i didn't....
ended up spending all my money on ps2, accessories, and games/dvd's :)


laterz.

Nutty
05-19-2001, 11:16 AM
PS2 sucks.. hardware is lame. My money's on Xbox and Gamecube to clean up. :)

I got my Gf3 at reduced cost from nvidia anyways! :P

Nutty

Zeno
05-19-2001, 11:19 AM
I bought mine for $400 at ebworld and it's worth every penny. Orders of magnitude better than my TNT2 (which was $250 IIRC).

-- Zeno

ET3D
05-19-2001, 11:23 AM
But then, a GeForce2 Pro would have cost you half as much (or less), and still be orders of magnitude faster than your TNT2.

Zeno
05-19-2001, 11:45 AM
That's true, but it would be slower, wouldn't have usable anti-aliasing, wouldn't have hardware support for vertex programs, texture shaders, or depth buffer based shadows. Oh, and I'd have to upgrade it sooner.

I am a developer after all, and this new stuff is fun :)

-- Zeno

Humus
05-19-2001, 02:19 PM
"Usable anti-aliasing", are you're refering to Quincunx? It's the up to date most useless FSAA implementation IMO. It does nothing but blurring the screen, and there are still visible jaggies with it. OGSS FSAA is also a waste of time for both consumers and driver writers, the only usable FSAA implementation in consumer level cards yet is the V5's RGSS FSAA.

About the GF3 though, I'd get one if it had 3d textures. It looks really nice and the price is actually realistic and getting lower, but I really do need 3d textures :(
Now once again there are rumours saying that 3d textures are indeed supported, more specifically in the forum at beyond3d.com ... a guy there claims to have talked to a tech guy from nVidia who told him that volumetric textures are supported in hardware and will be available in future drivers ... now this is most likely BS, but if it were true I'd certainly reconsider getting a GF3
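Back on the FSAA point: the sample-pattern difference Humus draws can be sketched in a few lines of C. The offsets below are illustrative, not the exact hardware patterns; the point is that 4x OGSS reuses only two distinct x and two distinct y positions, while 4x RGSS (the Voodoo5 style) uses four of each, so a near-horizontal or near-vertical edge gets five coverage steps instead of three.

/* 4x supersample positions inside one pixel (coordinates in 0..1).
   Illustrative offsets only, assumed for the sake of the sketch. */
typedef struct { float x, y; } Sample;

/* Ordered grid: just 2 distinct x and 2 distinct y positions. */
static const Sample ogss4[4] = {
    { 0.25f, 0.25f }, { 0.75f, 0.25f },
    { 0.25f, 0.75f }, { 0.75f, 0.75f },
};

/* Rotated grid: 4 distinct x and 4 distinct y positions. */
static const Sample rgss4[4] = {
    { 0.125f, 0.625f }, { 0.375f, 0.125f },
    { 0.625f, 0.875f }, { 0.875f, 0.375f },
};

/* Coverage of a horizontal edge at height e: counts samples below it.
   With ogss4 this only ever returns 0, 2 or 4; with rgss4 it can
   return all of 0..4, i.e. smoother gradients along the edge. */
static int edge_coverage(const Sample s[4], float e)
{
    int i, hits = 0;
    for (i = 0; i < 4; i++)
        hits += (s[i].y < e) ? 1 : 0;
    return hits;
}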

Zeno
05-19-2001, 02:46 PM
Humus -

I agree and have also bemoaned the lack of 3d texturemap support. I'm keeping my fingers crossed that new drivers will enable this feature.

Have you actually used Quincunx in a game? It doesn't appear blurry to me, but edges do seem smoother. For me to really tell, though, I guess I'd have to take a screenshot and look at it with and without Quincunx on.

-- Zeno

kaber0111
05-19-2001, 03:04 PM
Originally posted by Nutty:
PS2 sucks.. hardware is lame. My money's on Xbox and Gamecube to clean up. :)

I got my Gf3 at reduced cost from nvidia anyways! :P

Nutty


so?
the nintendo 64 hardware was even lamer..
yet i still bought it and had lots of fun with it.
i'm currently having a lot of fun with tekken tag, and hopefully gta3 will be _awesome_.

laterz

Tim Stirling
05-20-2001, 12:08 AM
I hope you weren't comparing the N64 to the PS1. No contest, the N64 is far more powerful: a 94 MHz 64-bit main processor (plus a ~70 MHz 64-bit gfx co-processor) compared to a 33 MHz 32-bit one (I know this isn't really a fair comparison), 4 MB of RAM plus a 4 MB upgrade compared to 2 MB, faster loading times, a more powerful gfx chip (64-bit vs 32-bit), and far faster flops.

Anyway, I have now seen a PS2 in action and I have to say it looks crap; half the games look like N64 games, the other half look like PC games from last year, and all have zero gameplay. The controller is also terrible and DVD playback is a joke.

[This message has been edited by Tim Stirling (edited 05-20-2001).]

Nutty
05-20-2001, 02:40 AM
Thanks Tim,
Yes I've worked on the N64, and think it was an excellent piece of kit.

I'm not too impressed with the PS2.

Nutty

Humus
05-20-2001, 06:54 AM
Originally posted by Zeno:
Humus -

I agree and have also bemoaned the lack of 3d texturemap support. I'm keeping my fingers crossed that new drivers will enable this feature.

Have you actually used Quincunx in a game? It doesn't appear blurry to me, but edges do seem smoother. For me to really tell, though, I guess I'd have to take a screenshot and look at it with and without Quincunx on.

-- Zeno

No, since I don't have the card I haven't used Quincunx, but judging by the screenshots all over the web I'm not impressed by the quality it produces.

Eric
05-20-2001, 11:16 PM
Superb double-theme thread! ;)

1) About FSAA: why the hell do you still need it when the GF3 enables you to play at 1600x1200????

2) About PS2 vs X-BOX vs DC vs N64 vs Jaguar (the first 64-bit console, whatever Nintendo could say....): the best console is the one with the best games (as long as the gfx are not too lame ;)). DC was far better than PS1 on paper, but Sega failed (not enough good games). PS2 might not have fantastic games at the moment, but let's wait for GT3, GTA3, Metal Gear Solid: Sons of Liberty, FFX, FFXI, and the others.... Now, the X-BOX is nothing more than a PC to me... and I already have a PC (yep: a Cyrix 133+ with a Matrox gfx card!!!). Let's wait until the X-BOX is released... What are you saying? The X-BOX will have zillions of games on D-Day? So they said for PS2 but they couldn't deliver... Let's wait... Let's wait...

Regards.

Eric

JD
05-21-2001, 01:56 AM
Well, I don't know about xbox = pc. Doesn't the xbox have like two t&l units? Vrooommm :)

Adrian
05-21-2001, 02:49 AM
Originally posted by JD:
Well, I don't know about xbox = pc. Doesn't the xbox have like two t&l units? Vrooommm :)

Which raises the question of why the Geforce3 has only got one T&L unit. I presume there is a technical reason, like the AGP bus on the PC can't feed enough vertex data to utilise both units? Anyone know?
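A back-of-the-envelope check of that hypothesis, assuming AGP 4x's theoretical peak of roughly 1 GB/s and a typical 32-byte vertex (position, normal, one texture coordinate pair):

1066 MB/s / 32 bytes per vertex ≈ 33 million vertices per second over the bus

That is in the same region as what a single unit can already consume, so a second AGP-fed unit would spend most of its time starved; the Xbox feeds geometry from unified local memory and doesn't have that bottleneck.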

ET3D
05-21-2001, 11:15 AM
Eric, one reason not to run at 1600x1200 is to get better refresh rates.

About the Xbox, I think that the games available for it will mainly be games that were designed for the PC, snatched by MS by "bribing" the developers to sign an exclusivity agreement. Am I being too cynical? ;)

zed
05-21-2001, 01:04 PM
having an off-topic forum would hopefully put all these threads in one place.
i have a question: are there any xbox games that don't look like pc games?

linuxedge
05-21-2001, 02:39 PM
Xbox is just a toy... PC hardware will be way better less than 6 months after the xbox is released.... Also, I doubt the xbox will support OpenGL, and that makes it even lamer hehe (unless you prefer D3D, in which case there is no Linux for you and I am sorry)

If you really gotta get a console go with the Nokia Media Station - it's got Linux and OpenGL and OpenAL and SDL...

:)



Originally posted by JD:
Well, I don't know about xbox = pc. Doesn't the xbox have like two t&l units? Vrooommm :)

Siwko
05-22-2001, 03:50 AM
Originally posted by Eric:
2) About PS2 vs X-BOX vs DC vs N64 vs Jaguar (the first 64-bit console, whatever Nintendo could say....)

Just throwin' this in... the Jaguar was a 32-bit console. It had dual 32-bit CPUs (the old trick: a 32-bit main CPU and a 32-bit GFX unit = 64 bits, right? Actually, I think it was only a 24-bit GFX engine).

While we're at it, how about the PC Engine, aka TurboGrafx-16? Far ahead of its time, and an excellent system. Blazing Lasers, now THAT was a game. It was compared with the Genesis and the Super Nintendo, both 16-bit machines, and while it was called the TurboGrafx-16, ie: 16-bit... it actually had an 8-bit CPU (a Z80, I think).

Little bit of history for you.

Oh well..

Siwko

Eric
05-22-2001, 05:44 AM
Siwko, according to this site:
http://www.gamezero.com/team-0/articles/features/jaguar_93/

"The Jaguar will feature a 16.7 million color palette with true 32-bit graphics, as well as 16-bit CD quality sound. The 64-bit RISC based processor can process at 55 MIPS (million instructions per second), allowing animation speeds in excess of 850 million pixels per second."

I'll check the spec out but it seems it was a 64-bit RISC...

Regards.

Eric

P.S.: and you can find more at: http://www.geocities.com/SiliconValley/Vista/3015/jagspecs.html

[This message has been edited by Eric (edited 05-22-2001).]

Chromebender
05-22-2001, 08:21 AM
I just got the GeForce 3, and I think Quincunx works pretty well, considering that it only gives you a small performance hit compared to FSAA on the Voodoo 5 and GeForce 2. And no offense, but saying that Quincunx doesn't look good after seeing a screenshot (which was almost certainly compressed) is like saying you saw a Viper parked on the street and it didn't *look* fast.

To be fair, I got little or no speed improvement at 800x600 and 1024x768, but Serious Sam was quite playable at 1280x1024 in 32-bit color with all graphics options turned on and Quincunx enabled. I still run with FSAA turned off, though, because I prefer the cleaner look and noticeable speed improvement. I do think I'll try 4x AA and see how much nicer looking/how much slower it is than Quincunx.

[This message has been edited by Chromebender (edited 05-22-2001).]

Tim Stirling
05-22-2001, 09:54 AM
The Jaguar was a dual 32-bit system as far as I know. Anyway, what the hell happened to it? It was only around for about 5 minutes.

The GameCube has some very promising titles for its release date. The graphics also look amazing, just as good as the Xbox screenshots.

When (if :)) do you think the GeForce3 Ultra will be released? More importantly, when do you think the next NV chip (NV30?) will come out? I read somewhere it would be around Christmas??

Chromebender
05-22-2001, 11:06 AM
I heard that Nvidia is scheduled to release a new chipset approximately every 6 months.

Humus
05-22-2001, 02:20 PM
Aaaarrrrggghhh ... don't know if I can take anymore:

"Q: Does the Geforce3 support 3d textures and Volumetric Texture Compression?
A: Yes , the Geforce3 supports 3d textures and VTC. Microsoft hasn't added them to DirectX and we're waiting for Microsoft. "
http://www.techextreme.com/display.asp?ID=371&Page=3

This is from an interview with nVidia's director of product development. So, is it real?

j
05-22-2001, 03:08 PM
I heard that Nvidia is scheduled to release a new chipset approximately every 6 months.

Yeah, usually it is a new chipset every spring and an updated one that fall. Example: Geforce2, Spring 2000; Geforce2 Ultra, Fall 2000.

However, the "new" chipset isn't always that new. For example, the Geforce2 is basically a Geforce256 with a faster clock rate and a smaller chip die.

j

ffish
05-23-2001, 12:02 AM
Humus,

Very interesting. I hate all the rumours about 3D textures. I just wish they'd tell us. I'm sure there'd be a few people (me included) who would buy a GF3 if it did support 3D textures in hardware (maybe not you Radeon owners :)).

What I'd hate even more is if 3D textures are supported in hardware but not yet exposed because of DX, and we have to wait for a new version of DX to get them, just because of some deal between nVidia and Microsoft.

Tom Nuydens
05-23-2001, 06:15 AM
Originally posted by ffish:
What I'd hate even more is if 3D textures are supported in hardware but not yet exposed because of DX, and we have to wait for a new version of DX to get them, just because of some deal between nVidia and Microsoft.

That's what it's starting to sound like to me. If this is the case then yes, it's kind of lame, but I'd rather wait a bit for a driver upgrade than not get 3D textures at all. I'd hate to think that I spent $500 on a piece of hardware whose drivers don't expose all of its features :)

- Tom

Nutty
05-23-2001, 08:28 AM
Why do they have to wait for M$ to expose the 3D texture hardware? Surely they could just expose the GL extension while they wait on M$ putting it into DX?

Doesn't make sense.

Nutty

Humus
05-23-2001, 08:33 AM
This is what I was thinking too. And since the Radeon's 3d textures are already working in D3D, I can't see why the GF3 would need special support.

JD
05-23-2001, 09:17 AM
>> Why do they have to wait for M$ to expose the 3D texture hardware? Surely they could just expose the GL extension while they wait on M$ putting it into DX? <<

Just speculating, but perhaps MS doesn't want to compete with OGL? And since nvidia has their chips in the xbox, maybe nvidia doesn't want to damage their relationship with MS. Of course this doesn't hold for DX7 vs. OGL register combiners. Or perhaps nvidia is too busy, or is in the process of negotiating with other ARB members, or red tape, or... OK, I'm done :)

JD
05-23-2001, 09:29 AM
Just saw nvidia posted a new OGL extensions PDF doc on their developer site. Also wanted to ask the nvidia folks: where is MS heading with OpenGL? Does MS tolerate you because you license technology to them as a tradeoff? I read Bill Gates is very paranoid about competition, and now that Linux is shaping up its 3D, I wonder where this puts nvidia in Microsoft's eyes. It's amazing to me that you've been able to release OGL drivers for Linux; looks like you guys have some serious button-pushing power :)

Nutty
05-24-2001, 12:07 AM
Why the hell _shouldn't_ Nvidia release drivers for Linux??!?! Who the hell does M$ think they are, if they can dictate another company's OS support!!!

If that suspicion is true, and M$ are stopping Nvidia releasing hardware 3D textures in OpenGL because they don't want the competition for D3D, then it just amplifies my contempt for them!

Nutty

mcraighead
05-24-2001, 02:35 PM
Whoah... hold it with the conspiracy theories, there. There is no anti-Linux conspiracy. (And yes, I am including not just us, but also MS in that statement.) And regardless of your views on MS, it is completely absurd to suggest that they somehow restrain us from releasing Linux drivers, seeing as (1) we release the drivers and (2) they _don't_ restrain us, or anyone else (to my knowledge).

- Matt

Nutty
05-24-2001, 11:21 PM
I'm glad to hear it. It's just that someone implied that was the case on another thread.

Nutty 'The Truth Is Out There' :)

Chromebender
05-25-2001, 08:10 AM
Just thought I'd toss out another post on Quincunx:
I took a closer look at the Geforce 3 using 2x, Quincunx, and 4x anti-aliasing in Serious Sam and Summoner at 1024x768, and both 2x and 4x are much less blurry than Quincunx. Plus, on an 800 MHz machine with 256 MB of RAM, 4x anti-aliasing doesn't hurt game performance much more than Quincunx (I'd guess it drops from maybe 40 to 30 fps, but with much better visual quality), so I really don't understand what Quincunx is supposed to do...
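For what it's worth, the blur has a simple explanation. As Quincunx is usually described, the card renders 2 samples per pixel and resolves each pixel from 5 taps arranged like the five on a die; the weights below are the commonly cited ones, so treat them as approximate rather than confirmed:

/* Resolve one pixel from its center sample plus the four shared corner
   samples (quincunx pattern: center weighted 1/2, corners 1/8 each).
   The corner taps sit on pixel boundaries, so neighbouring pixels get
   averaged in: edge smoothing and texture blur come from the same
   filter, which is why 2x and 4x look sharper. */
static float quincunx_resolve(float center, const float corner[4])
{
    return 0.500f * center
         + 0.125f * (corner[0] + corner[1] + corner[2] + corner[3]);
}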

cruelworld
05-25-2001, 11:45 AM
On the NVIDIA website's developer info there's a listing of all of their OpenGL extensions, and EXT_texture3D is listed. Is this the same as 3D texture support?
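A listing in a document is not the same thing as the driver actually exporting the extension, though; the only way to know is to ask a live context. A minimal check in C (the string GL_EXT_texture3D is the registered extension name; the rest is boilerplate):

#include <string.h>
#include <GL/gl.h>

/* Requires a current OpenGL context. Note strstr() is a sloppy match
   (a longer extension name containing this one would false-positive),
   but it is fine for a quick probe. */
int driver_lists_texture3d(void)
{
    const char *ext = (const char *) glGetString(GL_EXTENSIONS);
    return ext != NULL && strstr(ext, "GL_EXT_texture3D") != NULL;
}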

jwatte
05-25-2001, 12:14 PM
> they _don't_ restrain us, or anyone else (to my knowledge).

Microsoft restrains IHVs from shipping non-MS OSes. Not only has it been found to be true in court, but I've seen it myself.

davepermen
05-25-2001, 12:22 PM
I can get pointers to functions like glTexture3D (no EXT or ARB!), I can call them, I can bind GL_TEXTURE_3D and render with it, and there is no error.. at least with the nv20 emulator which I use on my gf2..

On the other side, there is nothing on screen if I enable GL_TEXTURE_3D..

But as far as I can see, it's theoretically supported in the drivers (it has to be, because they say the gf3 supports OpenGL 1.2, and texture3D is GL 1.2..)

Can't wait to see the day the screen is not black anymore ;) would be so nice.. wouldn't it?

Oh, and just to say, DX8 can create 3d textures, they are just not available on the hardware drivers... the software renderer can use them (very slowly..)

Hope to see texture3D on the xbox.. really hope so..

Oh, and Matt.. who knows if Billy isn't standing next to you with a gun pointed at your head while you write this? ;) No, seriously.. somehow no one from nvidia gives a real answer, and this creates a lot of rumours.. and because everyone is waiting for the xbox, and because the xbox is not cheap for Microsoft, and because games are delayed to appear first on the xbox and then on the PC, we don't know what Microsoft will do to have something great with its xbox.. I know just one thing.. I will buy an xbox, then download the free patch to code for the xbox with VC (or something like that.. don't know..) and then I will do free stuff for it.. so Microsoft pays for my xbox and I don't pay for their games..

Oh, and of course I will buy a gamecube, nintendo simply rocks ;)
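A minimal sketch, in C, of the kind of probe davepermen describes, assuming a current Windows/WGL context. glTexImage3D is the OpenGL 1.2 name for the 3D-texture upload call; as his black screen shows, a non-NULL pointer and no GL error still wouldn't prove the path is hardware accelerated:

#include <windows.h>
#include <GL/gl.h>

#ifndef GL_TEXTURE_3D
#define GL_TEXTURE_3D 0x806F            /* token from the OpenGL 1.2 spec */
#endif

typedef void (APIENTRY *PFNGLTEXIMAGE3DPROC)(GLenum target, GLint level,
    GLint internalFormat, GLsizei width, GLsizei height, GLsizei depth,
    GLint border, GLenum format, GLenum type, const GLvoid *pixels);

/* Returns 1 if the driver exports glTexImage3D and accepts a tiny 4x4x4
   RGBA texture without raising a GL error. */
int probe_texture3d(void)
{
    static GLubyte texels[4 * 4 * 4 * 4];   /* 4x4x4 texels, 4 bytes each */
    PFNGLTEXIMAGE3DPROC pTexImage3D =
        (PFNGLTEXIMAGE3DPROC) wglGetProcAddress("glTexImage3D");

    if (pTexImage3D == NULL)
        return 0;
    while (glGetError() != GL_NO_ERROR)
        ;                                   /* flush any stale errors */
    pTexImage3D(GL_TEXTURE_3D, 0, GL_RGBA, 4, 4, 4, 0,
                GL_RGBA, GL_UNSIGNED_BYTE, texels);
    return glGetError() == GL_NO_ERROR;
}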

davepermen
05-25-2001, 12:24 PM
Oh, and btw, I prefer playing Q3 at 800x600 with 2x FSAA to 1600xsomething.. I can see the edges even there and I don't like them.. the very very detailed textures are filtered perfectly (with the best settings, of course) but the edges are edges anyway.. that's why I like 800x600.. then everything looks the same.. which is very realistic.. let's see if quincadjldlfdf looks good.. we'll see, but one thing I know, it's a terrible name ;)

mcraighead
05-25-2001, 02:02 PM
Well, Dell ships Linux systems today, and we give them Linux drivers to use. Looks like no such thing is happening right now.

Then again, if MS wanted to withhold Windows from an OEM that threatened to use another OS, I would see nothing wrong with that.

- Matt

kaber0111
05-25-2001, 02:37 PM
Originally posted by mcraighead:
Whoah... hold it with the conspiracy theories, there. There is no anti-Linux conspiracy. (And yes, I am including not just us, but also MS in that statement.) And regardless of your views on MS, it is completely absurd to suggest that they somehow restrain us from releasing Linux drivers, seeing as (1) we release the drivers and (2) they _don't_ restrain us, or anyone else (to my knowledge).

- Matt

ROFL*3


btw jwatte, couldn't have said it better myself..



[This message has been edited by kaber0111 (edited 05-25-2001).]

Hull
05-25-2001, 04:20 PM
Matt is starting to scare me.

Humus
05-25-2001, 06:39 PM
Originally posted by mcraighead:
Then again, if MS wanted to withhold Windows from an OEM that threatened to use another OS, I would see nothing wrong with that.

- Matt

I don't know about American law, but AFAIK at least here in Sweden it would be illegal to do that; possibly the same over there.

mcraighead
05-25-2001, 07:12 PM
Ah, but as I've said before, I'm opposed to all antitrust laws. :)

- Matt

kaber0111
05-25-2001, 07:41 PM
Originally posted by Humus:
I don't know about American law, but AFAIK at least here in Sweden it would be illegal to do that; possibly the same over there.


they did this to ibm..
with the whole Office deal, limiting the licenses they would sell them...

it's in the testimony...

Zeno
05-25-2001, 07:48 PM
What's to prevent OEMs or IBM from going down to Best Buy and picking up whatever MS products they want? I'm sure it'd cost more than their sweet volume discounts, but I don't think there's anything MS could do to stop it.

-- Zeno

Tim Stirling
05-26-2001, 01:04 AM
Anyone hear of the theories surrounding MS buying out Corel to try to stop Corel Linux eating into the 'windoze idiots' market share?
There was also a suggestion that MS would make a Linux distribution! Strange, but what if it were true! :( Next we will be buying MS TVs, eating MS pizzas cooked in an MS oven, in a house made by MS, in an entire city run and created by MS.
Whose idiotic idea was it to split MS up? Now it is just harder to destroy them; if they were one, you could nuke them (not literally!) and get rid of the lot. Now they will keep splitting, and it won't be long before an MS hardware division is created; we already have MS mice, keyboards, joysticks and pads. The new world order, noooo, let's escape to the mountains (and hope the FBI/whoever don't bother us)

:)



[This message has been edited by Tim Stirling (edited 05-26-2001).]

davepermen
05-26-2001, 02:17 AM
Anyone know the story of "Big Brother is watching you"? (Not the TV show, the original..)

big billy..

onyXMaster
05-26-2001, 02:57 AM
Too much Deus Ex, right? :))
With all those conspiracies and such...

Auto
05-27-2001, 03:31 PM
xbox smecksbox... why would i ever want one?
OK, if they release an OpenGL dev kit/driver set for it then sure, much more interesting, but then won't that be the same as my PC? Actually probably less than my PC (dual 1GHz) [+ GF4 by the time it comes out in the UK]

I'm borrowing a PS2 atm and I kinda like it; sure, the games are a bit ropey atm but GT3 is great (played it @milia). Also I think the XB is too large for my living room. Apparently the Japanese at TGS were a bit confused by its size too; some even commented on it being more like a tea table than a console.

I'm just not sure the XBOX will really take off, and everyone wants to back the winning horse.

Anyway, there's my two pence worth.

(BTW I would love a GF3, but I think I've still got a lot of mileage left in my GF1 yet ;))

rts
05-27-2001, 03:47 PM
At the risk of dragging out yet another wildly off-topic thread...


Originally posted by Humus:
I don't know about American law, but AFAIK at least here in Sweden it would be illegal to do that; possibly the same over there.

Matt said he would see nothing "wrong" with it.

What is right and wrong, and what is legal and illegal, are often two very different things.

And on this I agree with Matt. And no, I am not a fan of Microsoft (I run Linux exclusively at home). Rather, I'm a fan of liberty.

My Cdn$0.02

Humus
05-27-2001, 05:50 PM
Well, I didn't label it as right or wrong, I just pointed out that it was illegal. But I do think it's wrong to do that, and I fully support such laws, even though I understand that other people may not share my views or opinions.

[This message has been edited by Humus (edited 05-27-2001).]

Tim Stirling
05-27-2001, 11:00 PM
Size matters, the smaller the better! The XBox is just too big; that's why I like the GameCube: the size of 10 CD cases, it sits in the palm of your hand and yet still has the same power as the xbox. Also, the GameCube will have THE best games.
Time to start a new thread: Intel v AMD...

Did you know that even though AMD is doing very well with ever-increasing sales, their sales figures are tiny compared to Intel's?

Heard about the rumours that Intel and MS are basically the same company run together, but carefully presented as 2 separate companies?

Who will win the 64-bit war: AMD's early x86 64-bit, or Intel's later, first non-x86 CPU in years and years? What about the supply of mobos for these? How much of a difference will 64-bit technology make?

zed
05-28-2001, 12:12 AM
Though AMD is still a small fish compared to Intel, they have certainly picked up a bit in the last year and can't really be considered tiny.
>>For chip manufacturer AMD, the year 2000 meant a surge in sales due to an immensely popular processor, the Athlon. Between the processors sold early in 2000 based on the K75 core, and those sold in the second half of 2000 based on the Thunderbird core, the Athlon has achieved a 24% unit share, bringing AMD's total desktop share to 38%. Not bad for a company that previously struggled for a waning position on the value market - not bad at all.
So how has AMD emerged from 2000 in such a successful position? Not only have they matched and surpassed the performance of Intel's Pentium III platform on a clock for clock basis, but you can pick up an Athlon at 1.2GHz for about $50 less than Intel's Pentium III 1GHz. Value and performance made AMD the fastest growing CPU manufacturer last year.<<

All I can say about MS is they're doing just what any other company would do in their place. They're all dishonest :) Looking at the xbox + the 'PC-like games' that will come with it, I've got the feeling MS have made a big mistake that is gonna end up costing them billions. After seeing what a few of the nintendo games looked like (on the web), my money's on them in the battle of the 3.

kistompika
05-28-2001, 02:43 AM
Does anyone know an OpenGL-related discussion forum :)?

MrCalab
05-29-2001, 05:37 AM
Originally posted by Zeno:
That's true, but it would be slower, wouldn't have usable anti-aliasing, wouldn't have hardware support for vertex programs, texture shaders, or depth buffer based shadows. Oh, and I'd have to upgrade it sooner.

I am a developer after all, and this new stuff is fun :)

-- Zeno

Hi Zeno,
about your article about the Geforce 3 and the matter that you are a software developer. I am a software developer as well, but not even in my craziest dreams would I buy a Geforce3 because of this, not at this simply perverse price for nothing. Yes, anti-aliasing surely looks better, but a simple blur is not anti-aliasing, just a fake, which is even a negative in a couple of games such as Counterstrike, in which you really need to look for every pixel, because it could be an enemy. The Geforce3 surely offers a lot of possibilities, but to come back to the matter that you are a software developer as well: what market value do you think supporting the Geforce 3 has for your game or whatever? I get a monthly statistic of which cards are bought and supported by the mass, and the Geforce 3 is not even listed, meaning it's under one percent; I would be surprised if it had even 0.1%, and this will not change for the next months. Even NVidia all in all doesn't own the market. The mass has Intel and ATI chips, there are a lot of TNTs in use and still a lot of Voodoos as well. Slowly the Geforce 1 is becoming more and more a standard, and soon it will be the Geforce 2 MX, but otherwise I see no value in the Geforce 3 at the moment. It's surely fast and it can surely render a lot of vertices per second, but you should never forget that these vertices also need to be calculated first. Before a Geforce 3 really makes sense to buy, the processor needs to be faster, the AGP port anyway, and the price has to be around $150; then it'll slowly gain importance, and until that date I will not waste my time on pixel shaders or whatever. And I am sure that if no money or hardware floated from NVidia's side to some developer firms, they would never support this card directly, because it would never be a win for them in the next year. I am personally very NVidia-religious, their developer support is great, whether by phone or by e-mail, and I would never put any card except one from NVidia into my PC, but my second PC has an ATI card and the third one an Intel chip, because this is, however sad it is, the market right now; and surely all of us developers would love to have scenes with 10,000 polygons per tree, bumpmapped, shadowed and with pixel shader support in all ways, but for today, and also for the coming Christmas season, when our actual products will be published: VALUELESS.

Michael Ikemann / Virtual XCitement GmbH

Chromebender
05-29-2001, 07:06 AM
Mr. Calab, exactly what kind of software development are you employed in? I am a 3D graphics developer in an American university research lab, and like Zeno I bought a GeForce 3 to explore the new hardware T&L features of that card. I'm curious as to why you think the vast majority of the world does not need the capabilities of the GeForce 2, much less the GeForce 3. I count quite a few of my most recent game purchases (of the last year) that would perform poorly on less than a GeForce 2: Black and White, Nocturne, Serious Sam, Alice, and Blade of Darkness, just to name a few. Many of these games also run even better on the GeForce 3, playable at resolutions of up to 1280x1024 in 32-bit color (the maximum resolution of my monitor), with Quincunx or 4x anti-aliasing turned on!

Plus, a quick browse of the Nvidia website shows another dozen games in development that will specifically exploit the vertex and pixel shading capabilities of the GeForce 3. I realize that the average PC does in fact have an ATI, TNT, or Intel low-end 3D accelerator, but the fact is that a computer gamer's machine is not the average PC, and both developers and consumers need to take advantage of new hardware to continue advancing the realism and entertainment value of PC games.

MrCalab
05-31-2001, 04:40 AM
Originally posted by Chromebender:
Mr. Calab, exactly what kind of software development are you employed in? I am a 3D graphics developer in an American university research lab, and like Zeno I bought a GeForce 3 to explore the new hardware T&L features of that card. I'm curious as to why you think the vast majority of the world does not need the capabilities of the GeForce 2, much less the GeForce 3. I count quite a few of my most recent game purchases (of the last year) that would perform poorly on less than a GeForce 2: Black and White, Nocturne, Serious Sam, Alice, and Blade of Darkness, just to name a few. Many of these games also run even better on the GeForce 3, playable at resolutions of up to 1280x1024 in 32-bit color (the maximum resolution of my monitor), with Quincunx or 4x anti-aliasing turned on!

Plus, a quick browse of the Nvidia website shows another dozen games in development that will specifically exploit the vertex and pixel shading capabilities of the GeForce 3. I realize that the average PC does in fact have an ATI, TNT, or Intel low-end 3D accelerator, but the fact is that a computer gamer's machine is not the average PC, and both developers and consumers need to take advantage of new hardware to continue advancing the realism and entertainment value of PC games.

Hi Chromebender, I'm working at a game software firm in Mühlheim, West Germany, and it's surely true that a Geforce 3 is faster than a Geforce 2 or 1, but this is not what I wanted to say. You said that the gamer's machine is not the average, so I ask myself why not one of my friends, who are surely not poor and have bought tons of games in the last months, has got a Geforce 3? And just about one in ten a Geforce 2, although they are all absolutely fanatic gamers? Of course the Geforce 2 and 3 offer a big amount of possibilities, the Geforce 2 has some great reflection/refraction effects for example and so on, and I have already seen some movies of pixel shaders in action, and yes, surely they look great, and it's no question that this will be a standard sometime in the future, but really just "sometime" and not at the moment. I think I will implement some Geforce 2 features into our game, T&L in any case, as the Geforce 1 is slowly getting a little piece of the market, but pixel shaders still this year?! I hope, I really hope, that NVidia will publish another new chipset in the coming months, so that the price of the Geforce3 will fall like a stone, but if not, then I will surely not waste my time on pixel shaders, because then they also won't have any market value. I know that there are a couple of firms supporting Geforce3 features already, but only because they don't need to look at the money, just at the image. The firm I am working in has to look at the proportion between investment and profit, and investing in the Geforce3 at the moment will surely not bring in a penny more than without it; that's my opinion. And at the moment there is just one feature I am really waiting for, and that is hardware-rendered volumetric soft shadows without needing stencil; if a card were sold which supported this feature, that would be something really new. But when Carmack said "And now we are even able to render the pores of the skin", and he showed just one human in the Doom 3 demo, and the scene was already not fluid anymore but like a half slide-show, yeah, "And this is the Geforce 3" ;-). It's fast, surely, but in no way is it a revolution; a card which can render ten million polygons at a framerate of 100, completely soft-shadowed, with 50 point light sources and reflecting objects, that would be a revolution... and then the "pores" Carmack talked about would also be a bit more realistic than his, in my opinion, more than ironic sentence.

Michael

Tom Nuydens
05-31-2001, 06:00 AM
MrCalab, no offense, but could you start a new paragraph every now and then in your future posts? It makes it easier on the rest of us ;)

Anyway, you're overlooking the fact that this is a developer forum. It's us, the developers, who have to adopt new cards like the GeForce3, not the end users. The end users will upgrade when they see games that make full use of the hardware.

Sure, if you want to ship a mass-market game like The Sims or whatever, you can't have it require a GeForce3. If, on the other hand, you are working on a high-profile title, or if you are just a big graphics geek like most of us here, then why wouldn't you want to experiment with the latest and greatest hardware? (If you can afford it, of course)

- Tom

Chromebender
05-31-2001, 09:25 AM
Yes, I think that advancement in the field of 3D graphics is a back-and-forth effort between the hardware vendors and us, the graphics programmers. First the chip designers add a few new features to their hardware, then it is up to the developers to entice consumers to pay for it. Sure, the average user doesn't need to pay $400 for a little more eye candy, but remember, critics once said the same thing about VGA too :)

dorbie
05-31-2001, 03:52 PM
Microsoft is more subtle in its tactics. It does things like refusing to release OpenGL 1.2 for Windows (even though they could have) while accusing the ARB of being glacial in its progress, and promoting D3D's functionality and streamlined development, even though huge swaths of D3D functionality are exclusive to a single IHV or even a single card. Fortunately Microsoft can't stop IHVs from implementing OpenGL 1.2 + extensions in their own ICDs. IHVs do this mainly because the market demands it, thanks largely to developers like John Carmack who leverage the popularity of their games to keep the world a freer place for us all.

Zeno
05-31-2001, 10:13 PM
I am a software developer as well, but not even in my craziest dreams would I buy a Geforce3 because of this, not at this simply perverse price for nothing


MrCalab -

First off, I'd like to second Tom's request for a new paragraph every so often ;)

Now I'll address some of your questions / statements.

About the price of the Geforce 3... could you back up your claim of the price being "perverse"? I believe you can pick up these cards for $320 US now. Consider that this gets you a GPU with more transistors than any CPU, and 64 MB of RAM on a bus that is faster than any motherboard's. Not only that, but it's $100 less than the GF2 Ultra, and I don't remember such whining about the price of that thing when it came out. What's a Pentium 4 or high-end Athlon costing these days? $300-$500? As such, a nice graphics card is only about 1/4 the cost of a new computer. Not too bad IMHO.

In case you were claiming that the price is just plain high (not high with respect to what you get): most college-educated people make that much money in a day or two of work, and most people who live in California pay many times that much a month just for a small place to live :( In the scheme of things, it's not such a huge expense.

About the card not being revolutionary: you first define revolutionary to mean (basically) a real-time raytracer that can handle pixel-sized polys with 5x overdraw, and then claim that the GF3 is not this. Fine, but that is a narrow definition of revolutionary. Many consider it a revolutionary thing to have specialized hardware that is, to a large extent, programmable.

Think about where this is leading us: The ultimate thing would be for the programmer to have complete control over all stages of the rendering of his world (currently only attainable in software) and have it work with blazing speed (currently only attainable in hardware). This card is bridging that gap...turning the market in the direction of complete control AND extreme speed. In that sense, it is revolutionary.

Finally, you are right that the card may not have the market penetration yet to support a game (though I think the GF2 class is more common than you think, at least here). However, many developers (such as myself and Chromebender) are not making games. I work on simulations for businesses and the government. Do you know how excited they get when you tell them that they won't have to buy a 250 THOUSAND dollar SGI machine to run the sim, but instead only a $2000 computer?? AND it will do the same (or more) stuff! A $400 graphics card with this much power is like a dream come true for them.

-- Zeno
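To make the "programmable" point concrete, here is roughly what the smallest useful GF3 vertex program looks like under the NV_vertex_program extension: a hand-written replacement for the fixed-function transform. The tokens and entry points below are from the published extension spec; fetching them through wglGetProcAddress is the usual Windows boilerplate, and error checking is omitted to keep the sketch short.

#include <windows.h>
#include <GL/gl.h>

/* Tokens from the NV_vertex_program extension spec. */
#define GL_VERTEX_PROGRAM_NV        0x8620
#define GL_MODELVIEW_PROJECTION_NV  0x8629
#define GL_IDENTITY_NV              0x862A

typedef void (APIENTRY *PFNGLBINDPROGRAMNVPROC)(GLenum, GLuint);
typedef void (APIENTRY *PFNGLLOADPROGRAMNVPROC)(GLenum, GLuint, GLsizei,
                                                const GLubyte *);
typedef void (APIENTRY *PFNGLTRACKMATRIXNVPROC)(GLenum, GLuint, GLenum,
                                                GLenum);

/* Transform the vertex by the tracked modelview-projection matrix
   (constant registers c[0]..c[3]) and pass the color straight through:
   the same thing the fixed pipeline does, but now under your control. */
static const GLubyte vp[] =
    "!!VP1.0\n"
    "DP4 o[HPOS].x, c[0], v[OPOS];\n"
    "DP4 o[HPOS].y, c[1], v[OPOS];\n"
    "DP4 o[HPOS].z, c[2], v[OPOS];\n"
    "DP4 o[HPOS].w, c[3], v[OPOS];\n"
    "MOV o[COL0], v[COL0];\n"
    "END";

void enable_passthrough_vp(GLuint id)
{
    PFNGLLOADPROGRAMNVPROC pLoadProgramNV = (PFNGLLOADPROGRAMNVPROC)
        wglGetProcAddress("glLoadProgramNV");
    PFNGLTRACKMATRIXNVPROC pTrackMatrixNV = (PFNGLTRACKMATRIXNVPROC)
        wglGetProcAddress("glTrackMatrixNV");
    PFNGLBINDPROGRAMNVPROC pBindProgramNV = (PFNGLBINDPROGRAMNVPROC)
        wglGetProcAddress("glBindProgramNV");

    pLoadProgramNV(GL_VERTEX_PROGRAM_NV, id, sizeof(vp) - 1, vp);
    pTrackMatrixNV(GL_VERTEX_PROGRAM_NV, 0, GL_MODELVIEW_PROJECTION_NV,
                   GL_IDENTITY_NV);
    pBindProgramNV(GL_VERTEX_PROGRAM_NV, id);
    glEnable(GL_VERTEX_PROGRAM_NV);
}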

Eric
06-01-2001, 01:17 AM
Originally posted by Zeno:
The ultimate thing would be for the programmer to have complete control over all stages of the rendering of his world (currently only attainable in software)

My post has nothing to do with the discussion itself (sorry!) but I was struck by something when I read Zeno's post: we should now stop using the words "hardware rendering" and "software rendering"... Because, when you program your GPU, you are actually writing software for it! Now we should say CPU-Rendering and GPU-Rendering... ;)

Regards.

Eric

Chromebender
06-01-2001, 03:35 AM
Nice point, Eric. And as far as I can tell, Zeno is right about new graphics hardware really reducing the cost of real-time visualization. Sitting on my desk at work is an SGI Octane with a 3D monitor that cost more than my car, but what do I do my 'real' graphics development on? A P3 with a GF3 that cost a fraction of what my 'high-end' graphics machine at work cost just 4 or 5 years ago (er, that's just the computer price; the monitor is new).

P.S. It might be instructive to consider that, since this SGI was new, my PC has gone from no hardware acceleration to Voodoo1, Voodoo2, Voodoo3, Voodoo5, GeForce 2, GeForce 3... the gap between consumer-grade graphics hardware and 'industrial strength' machines is quickly shrinking.