PDA

View Full Version : GeforceFX



KRONOS
11-21-2002, 07:36 AM
I suppose everyone has heard of the GeForce FX, but go to the NVIDIA site, then to Products, then GeForce FX, then Demos, and choose the Time Machine demo and view some of the pics. Now check the window border: it's named "NRenderGL".

It's nice to see NVIDIA choosing OpenGL to render these demos, especially if all of them are rendered with GL.

Nice move Nvidia.... http://www.opengl.org/discussion_boards/ubb/wink.gif

That makes me think: "Why don't they support GL2 officially?!"... But that's just me.

davepermen
11-21-2002, 08:42 AM
demos are easier to code in GL since you can "simply" write a proper driver that has exactly what is needed as an extension (GL_NVX_ for the demo..)

and they don't support GL2 because they have better things, you know, Cg and that stuff.. and they don't want to follow other standards and be dictated to by them. and their hw is not capable of GL2 yet, so why expose half of the GL2 features? because it could be future-safe and help developers? you should not use GL2 anyway, according to nvidia..

Korval
11-21-2002, 10:14 AM
Why don't they support GL2 officially?!

Because GL 2.0 doesn't exist yet? There's no reason to support an unfinished specification.

KRONOS
11-21-2002, 10:59 AM
<sigh> Support making GL2. OF COURSE it is not out yet!!! Jesus....

Sure, davepermen, it would be great to have a monopoly in graphics, only doing what NVIDIA wants us to do. Like: "Play the way WE want". Heil davepermen!! Heil monopoly!!

I love NVIDIA, and I just hope that they embrace standards, or help build them.....

Humus
11-21-2002, 11:02 AM
The ATi Linux driver release today really confused me. http://www.ati.com/companyinfo/press/2002/4574.html



The new unified driver provides robust OpenGL® 2.0 support for many of ATI’s award-winning graphics boards including:
RADEON™ 8500,
RADEON™ 8500LE,
ALL-IN-WONDER® RADEON™ 8500DV,
RADEON™ 9000 PRO,
RADEON™ 9500 PRO,
RADEON™ 9700 PRO,
ALL-IN-WONDER® 9700 PRO and
FIRE GL™ family of workstation products.


[This message has been edited by Humus (edited 11-21-2002).]

cass
11-21-2002, 11:05 AM
NVIDIA participates in the ARB-GL2 working group.

NVIDIA will support OpenGL 2.0 when it is defined.

KRONOS
11-21-2002, 11:11 AM
It does?! Great, I didn't know...
Thanks for the good news cass... http://www.opengl.org/discussion_boards/ubb/smile.gif
By the way, could you tell us if all of the demos were rendered with GL?

cass
11-21-2002, 11:27 AM
Yes. http://www.opengl.org/discussion_boards/ubb/smile.gif

davepermen
11-21-2002, 11:29 AM
Originally posted by KRONOS:
<sigh> Support making GL2. OF COURSE it is not out yet!!! Jesus....

Sure, davepermen, it would be great to have a monopoly in graphics, only doing what NVIDIA wants us to do. Like: "Play the way WE want". Heil davepermen!! Heil monopoly!!

I love NVIDIA, and I just hope that they embrace standards, or help build them.....

hey nazi, **** off! http://www.opengl.org/discussion_boards/ubb/biggrin.gif

i hope you realized my post was a little sarcastic..

at least, it'll be fun again. incompatible exts rulez http://www.opengl.org/discussion_boards/ubb/biggrin.gif

zed
11-21-2002, 11:30 AM
nvidia has always been very good at supporting opengl, e.g. when 1.3 came out they were first out with drivers, the same with 1.4. i see no reason why this will change in the future.
+ if doom3 does have a gl2.0 path then they (+ others) will have to implement it http://www.opengl.org/discussion_boards/ubb/biggrin.gif

knackered
11-21-2002, 11:42 AM
You haven't heard Matt's opinions on GL2, then? Doesn't sound like he supports it - he even threatened to resign if it carried on in the direction it's heading...
I'm sure he is merely echoing what's obviously being discussed at nvidia's R&D department. Maybe they'll resurrect Glide.

davepermen
11-21-2002, 11:56 AM
if they cared that much about standards, then:
- they would not suggest using NV_vertex_program over ARB_vertex_program
- they would make Cg emit standard-compliant ARB_fragment_program code, since they want it to be a global standard, not only on their gpu (they say..)
- they would finally start to develop standard compliant hw.. ps3.0? no.. ps2.0? YEAH WE HAVE MORE THAN THAT USE THAT MORE WE KNOW YOU CANT SUPPORT ANYONE ELSE THEN BUT USE OURS WE HAVE EVERYTHING FOR YOU JUST FORGET OTHERS ARE THERE..

sure they'll support the standard GL when it's defined. they would lose their standing as a real GL supporter otherwise.
but supporting the standard, and always trying to support (even future) standard-compliant features instead of developing their own versions everywhere, are _TWO_ _DIFFERENT_ _THINGS_.

anyways, as i said, the extension misery will be fun again.. http://www.opengl.org/discussion_boards/ubb/biggrin.gif

Tom Nuydens
11-21-2002, 01:06 PM
Originally posted by davepermen:
they would finally start to develop standard compliant hw.. ps3.0? no.. ps2.0? YEAH WE HAVE MORE THAN THAT USE THAT MORE WE KNOW YOU CANT SUPPORT ANYONE ELSE THEN BUT USE OURS WE HAVE EVERYTHING FOR YOU JUST FORGET OTHERS ARE THERE..

Can I get you a Valium, Dave?

-- Tom

FXO
11-21-2002, 03:37 PM
Humus, the quote was actually:


The new unified driver provides robust OpenGL® 1.4 support, with 2.0 extensions, for many of ATI's award-winning graphics boards including:

Or they have changed it since you visited.

On the GL2 issue:
Didn't 3Dlabs release the P10 with full GL2 support?
Carmack even implemented support for OpenGL2 in Doom3 when he got a sample.

I also wonder whether NVIDIA and ATI will implement GL2 support on their current top hardware (R300/NV30) when the standard is finished.

I haven't really seen anybody from ATI on this board though..

dorbie
11-21-2002, 05:23 PM
Ahh the old standards game. The problem is not so much that one company or other won't share their way of doing things, it's that each vendor won't adopt the other guy's way of doing things, at least for most contentious extensions.

Yes, a common high-bandwidth dispatch API would be a good thing, but let's face it: this has changed so many times in recent years that any common spec agreed before now would already have been deprecated. For a while this seemed to change with each major new architecture revision.

rgpc
11-21-2002, 06:24 PM
Originally posted by KRONOS:
It's nice to see Nvidia choosing OpenGL to render these Demos. Especially if all of them are rendered with GL.


You mean their DX wrapper? (Or does it state in the GL specification that GL uses DX error messages when it bombs?) http://www.opengl.org/discussion_boards/ubb/wink.gif

pkaler
11-21-2002, 07:33 PM
Originally posted by dorbie:

Yes, a common high-bandwidth dispatch API would be a good thing, but let's face it: this has changed so many times in recent years that any common spec agreed before now would already have been deprecated. For a while this seemed to change with each major new architecture revision.

That's why GL2 needs to look 10 years into the future. GL was introduced in the early '90s and did not start to get extension-crazy until after 1999 or so. There are features (such as the accumulation buffer) that are not widely supported at all. But it was insightful to have a stencil buffer designed in from the start, and it did not really matter that it only became hardware-accelerated quite recently.

But I see your point. There's VAR and VAR2. There's NV_vp and NV_vp1_1. Then there are the EXT and ARB extensions that have changed.

It was kind of a "this is what the hardware can do" solution. Rather than a general solution that the hardware could partially implement.

I should go off and re-review the GL2 specs, and see what the 9700 and GeForce FX are capable of with respect to the spec.

Btw, any word on an ARB_VAR or ARB_VAO extension?

nukem
11-21-2002, 07:41 PM
NVIDIA is pretty cool and their new card looks great. They're going to support GL2, especially if Doom III has it. I was looking at their page the other day and they're working with id Software on it, or at least they've got something going since they link to them. NVIDIA seems to be going more with OpenGL; most of the games they link to on their page are OGL. They're going to wait till a stable OpenGL 2.0 comes out to cover their @$$'s http://www.opengl.org/discussion_boards/ubb/wink.gif

davepermen
11-21-2002, 07:47 PM
Originally posted by FXO:
I haven't really seen anybody from ATI on this board though..

hm, there are quite a few around..

richardve
11-21-2002, 10:19 PM
Originally posted by rgpc:
You mean their DX wrapper? (Or does it state in the GL specification that GL uses DX error messages when it bombs?) http://www.opengl.org/discussion_boards/ubb/wink.gif

Man, you're talking out of your butt.

Only Microsoft wraps OpenGL to DX in their XP-GL drivers..
(they do not have the time to create an updated 1.4 driver though.. hm, oh well.. why would I mind.. I'm not using Doors(tm) anyway http://www.opengl.org/discussion_boards/ubb/biggrin.gif)

zed
11-21-2002, 11:54 PM
>>You mean their DX wrapper? (Or does it state in the GL specification that GL uses DX error messages when it bombs?)<<

if u mean what i think u mean.
like d3d, opengl often uses directdraw under windows for the actual rendering

and in case your next question is
"in that case d3d must work better cause d3d + directdraw are a team"
the answer is
NO http://www.opengl.org/discussion_boards/ubb/smile.gif

masterpoi
11-22-2002, 02:24 AM
Humus: now it reads:


The new unified driver provides robust OpenGL® 1.4 support, with 2.0 extensions, for many of ATI's award-winning graphics boards including:

Humus
11-22-2002, 03:41 AM
Originally posted by FXO:
Humus, the quote was actually:

The new unified driver provides robust OpenGL® 1.4 support, with 2.0 extensions, for many of ATI's award-winning graphics boards including:

Or they have changed it since you visited.

Ah, they have changed it now. I merely did a copy-and-paste from that page. I suppose more confused people were asking, so they clarified it.

[This message has been edited by Humus (edited 11-22-2002).]

mrbill
11-22-2002, 04:36 AM
Originally posted by Humus:
Ah, they have changed it now. I merely did a copy-and-paste from that page. I suppose more confused people were asking, so they clarified it.

Blush.

You'll want to check the page in a few hours as it gets clarified yet again.

It will eventually just say "OpenGL 1.4 support for many of ATI's award winning graphics boards...."

ATI does *not* currently export any GL2 extensions in the Linux Driver Version 2.4.3.


ATI is currently leading the OpenGL 2.0 Working Group, aka arb-gl2.
http://www.sgi.com/newsroom/press_releases/2002/september/opengl.html

The OpenGL 2.0 Working Group is currently working on a draft shading language specification and three draft GL2 extensions. See the September 2002 ARB minutes.


Repeat, ATI does *not* currently export any GL2 extensions in the Linux Driver Version 2.4.3.

Sorry for the confusion.

-mr. bill

FXO
11-23-2002, 11:04 AM
What puzzles me is this: while OpenGL2 is not complete, how can 3Dlabs ship a card with GL2 support, and how can Carmack support a GL2 rendering path in Doom3?

Since nobody from NVIDIA/ATI has responded, do any of you know their plans for supporting GL2?
Will they ship new GL2-specific boards, or will they also make some of the "older" boards GL2-compliant through drivers?

jwatte
11-23-2002, 11:21 AM
The reason OpenGL was "looking into the future" when it was first released was that they were looking at what high-end graphics stations could do at the time, and tailoring the API to that. Once consumer hardware caught up, they started to have to add extensions, because the API didn't go further than that.

Unfortunately, there's not a whole lot that high-end stations do today that's much different from consumer hardware -- in many ways, the consumer hardware is leading the way. So they no longer have an already working, optimized implementation to borrow all the good bits from.

Also, I don't think OpenGL 1.0 looked into the future as much as some would have it. Little things like TEXTURE OBJECTS were missing from that version...

mrbill
11-23-2002, 12:19 PM
Originally posted by FXO:

Since nobody from NVIDIA/ATI has responded, do any of you know their plans for supporting GL2?



I *am* from ATI. (And Cass is from NVIDIA btw.)

ATI is leading the OpenGL 2.0 working group.

ATI helped present the 1/2 day Siggraph 2002 OpenGL 2.0 course.

ATI also presented a technology demo of the GL2 shading language at Siggraph 2002, running on a Radeon 9700.

Hopefully the next time ATI announces OpenGL 2.0 support (more correctly, GL2 extension support) I won't have to post a retraction. [insert silly smiley face]

-mr. bill

FXO
11-23-2002, 12:56 PM
Thanks, I'll check out the siggraph presentation.

Sounds like there's a slight chance that the R300 will become GL2-compliant later on; correct me if I'm wrong.

zeroprey
11-23-2002, 02:19 PM
Doesn't it need loops in the fragment programs to be OGL2-compliant? Also, for that presentation, did you just use non-looping shaders to show it off?

V-man
11-23-2002, 07:21 PM
Originally posted by FXO:
What puzzles me is this: while OpenGL2 is not complete, how can 3Dlabs ship a card with GL2 support, and how can Carmack support a GL2 rendering path in Doom3?

They made a GL2 driver (beta) for working on GL2. It is distributed to all the companies that are collaborating. Not much hype happening now, but GL2 is getting ready.

I'm not sure if memory is failing me, but I think I read "the first GL2 card" somewhere. It can do loops in vertex programs. It is supposed to be fully programmable, so it should be GL2-ready.

Basically, they want to make a GPU as programmable as a CPU, but a zillion times faster at 3D graphics.

Who knows, but maybe in 10 years all we will need is a GPU whose circuits could also double as a standard CPU. For example, the MMX unit could be used for certain 2D graphics operations, but could also be used for sound.

The GPU might become the center of the PC.

V-man