GeForce FX

I suppose everyone has heard of the GeForce FX, but go to the NVIDIA site, then to Products, then to GeForce FX, then Demos, choose the Time Machine demo and view some of the pics. Now check the window border: it's named "NRenderGL".

It's nice to see NVIDIA choosing OpenGL to render these demos, especially if all of them are rendered with GL.

Nice move Nvidia…

That makes me think, "Why don't they support GL2 officially?!"… But that's just me.

demos are easier to code in gl, as you can "simply" write a proper driver exposing exactly the extension that's needed… (a GL_NVX_ extension just for the demo…)
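For context, the extension mechanism mentioned above works through a space-separated extension string the driver advertises; a demo (or game) probes it for the tokens it needs. A minimal sketch in plain C of the token-exact check such code might use (`has_extension` is my own helper name, not anything from the thread; no GL context is needed to show the logic):

```c
#include <string.h>

/* Check for an extension name in a space-separated GL extension
 * string.  A plain strstr() is not enough: it would report
 * "GL_ARB_vertex_program" as present even if the list only contained
 * a longer name with that prefix.  Match whole tokens instead. */
static int has_extension(const char *ext_list, const char *name)
{
    size_t len = strlen(name);
    const char *p = ext_list;

    while ((p = strstr(p, name)) != NULL) {
        /* token must start at the list start or after a space ... */
        int starts = (p == ext_list) || (p[-1] == ' ');
        /* ... and end at a space or at the end of the string */
        int ends = (p[len] == ' ') || (p[len] == '\0');
        if (starts && ends)
            return 1;
        p += len;
    }
    return 0;
}
```

In a real program the list would come from `glGetString(GL_EXTENSIONS)`; a vendor-only demo would simply test for its private GL_NVX_ token here.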

and they don't support gl2 because they have better things, you know, cg and that stuff… and they don't want to follow other standards or be dictated to by them. and their hw is not capable of gl2 yet, so why expose half the gl2 features? because it could be future-safe and help developers? you shouldn't use gl2 anyway, according to nvidia…

Why don't they support GL2 officially?!

Because GL 2.0 doesn't exist yet? There's no reason to support an unfinished specification.

<sigh> Support MAKING GL2. OF COURSE it is not out yet!!! Jesus…

Sure, davepermen, it would be great to have a monopoly in graphics, only doing what NVidia wants us to do. Like: “Play the way WE want”. Heil davepermen!! Heil monopoly!!

I love NVIDIA, and I just hope that they embrace standards, or help build them…

The ATi Linux driver release today really confused me. http://www.ati.com/companyinfo/press/2002/4574.html

The new unified driver provides robust OpenGL® 2.0 support for many of ATI’s award-winning graphics boards including:
RADEON™ 8500,
RADEON™ 8500LE,
ALL-IN-WONDER® RADEON™ 8500DV,
RADEON™ 9000 PRO,
RADEON™ 9500 PRO,
RADEON™ 9700 PRO,
ALL-IN-WONDER® 9700 PRO and
FIRE GL™ family of workstation products.

[This message has been edited by Humus (edited 11-21-2002).]

NVIDIA participates in the ARB-GL2 working group.

NVIDIA will support OpenGL 2.0 when it is defined.

It does?! Great, I didn't know…
Thanks for the good news, cass…
By the way, could you tell us whether all of the demos were rendered with GL?

Yes.

Originally posted by KRONOS:
[b]<sigh> Support MAKING GL2. OF COURSE it is not out yet!!! Jesus…

Sure, davepermen, it would be great to have a monopoly in graphics, only doing what NVIDIA wants us to do. Like: “Play the way WE want”. Heil davepermen!! Heil monopoly!!

I love NVIDIA, and I just hope that they embrace standards, or help build them…[/b]

hey nazi, **** off!

i hope you realized my post was a little sarcastic…

at least, it’ll be fun again. incompatible exts rulez

nvidia have always been very good at supporting opengl, e.g. when 1.3 came out they were first out with drivers, and the same with 1.4. i see no reason why this will change in the future.

  • if doom3 does have a gl2.0 path then they (+ others) will have to implement it

You haven't heard Matt's opinions on GL2, then? It doesn't sound like he supports it - he even threatened to resign if it carried on in the direction it's heading…
I'm sure he is merely echoing what's obviously being discussed in NVIDIA's R&D department. Maybe they'll resurrect Glide.

if they would care that much about standards, then:
they would not suggest using NV_vertex_program over ARB_vertex_program
they would bring out a standard-compliant ARB_fragment_program profile for cg, since they want it to be a global standard, not only on their gpus (so they say…)
they would finally start to develop standard-compliant hw… ps3.0? no… ps2.0? YEAH WE HAVE MORE THAN THAT, USE THAT MORE, WE KNOW YOU CAN'T SUPPORT ANYONE ELSE THEN, BUT USE OURS, WE HAVE EVERYTHING FOR YOU, JUST FORGET THE OTHERS ARE THERE…

sure, they'll support the standard gl when it's defined. they would cease to be a real gl supporter otherwise.
but supporting a standard once it's defined, and always trying to support (even future) standard-compliant features instead of developing their own versions everywhere, are TWO DIFFERENT THINGS.

anyways, as i said, the extension misery will be fun again…
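The ARB-versus-vendor complaint above amounts to a selection policy: an application should probe for the ARB standard first and fall back to the vendor extension only when it must. A minimal sketch in plain C (the function name and return strings are my own; `strstr` is used for brevity, with the usual caveat about whole-token matching):

```c
#include <string.h>

/* Pick a vertex-program path, preferring the ARB standard over the
 * NVIDIA-specific extension.  strstr() is used here for brevity;
 * production code should match whole tokens in the extension string. */
static const char *pick_vp_path(const char *ext_list)
{
    if (strstr(ext_list, "GL_ARB_vertex_program"))
        return "arb";            /* portable, standard path */
    if (strstr(ext_list, "GL_NV_vertex_program"))
        return "nv";             /* vendor-specific fallback */
    return "fixed-function";     /* no programmable path available */
}
```

The point of the post is that this priority order only works if the vendor actually ships and recommends the ARB path rather than steering developers to its own extension.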

Originally posted by davepermen:
they would finally start to develop standard-compliant hw… ps3.0? no… ps2.0? YEAH WE HAVE MORE THAN THAT, USE THAT MORE, WE KNOW YOU CAN'T SUPPORT ANYONE ELSE THEN, BUT USE OURS, WE HAVE EVERYTHING FOR YOU, JUST FORGET THE OTHERS ARE THERE…

Can I get you a Valium, Dave?

– Tom

Humus, the quote was actually:

The new unified driver provides robust OpenGL® 1.4 support, with 2.0 extensions, for many of ATI’s award-winning graphics boards including:

Or they have changed it since you visited.

On the GL2 issue:
Didn't 3Dlabs release the P10 with full GL2 support?
Carmack even implemented OpenGL2 support in Doom3 when he got a sample.

I also wonder whether NVIDIA and ATI will implement GL2 support on the current top hardware (R300/NV30) when the standard is finished?

I haven't really seen anybody from ATI on this board, though…

Ahh, the old standards game. The problem is not so much that one company or another won't share its way of doing things; it's that each vendor won't adopt the other guy's way of doing things, at least for the most contentious extensions.

Yes, a common high-bandwidth dispatch API would be a good thing, but let's face it: this has changed so many times in recent years that any common spec agreed on before now would already have been deprecated. For a while it seemed to change with each major new architecture revision.

Originally posted by KRONOS:
It's nice to see NVIDIA choosing OpenGL to render these demos, especially if all of them are rendered with GL.

You mean their DX wrapper? (Or does it state in the GL specification that GL uses DX error messages when it bombs?)

Originally posted by dorbie:

Yes, a common high-bandwidth dispatch API would be a good thing, but let's face it: this has changed so many times in recent years that any common spec agreed on before now would already have been deprecated. For a while it seemed to change with each major new architecture revision.

That's why GL2 needs to look 10 years into the future. GL was introduced in the early '90s and did not start to get extension-crazy until after 1999 or so. There are features (such as the accumulation buffer) that are not widely supported at all. But it was insightful to have a stencil buffer designed in from the start, and it did not really matter that it only became hardware-accelerated very recently.

But I see your point. There’s VAR and VAR2. There’s NV_vp and NV_vp1_1. Then there are the EXT and ARB extensions that have changed.

It was kind of a "this is what the hardware can do" solution, rather than a general solution that the hardware could partially implement.

I should go off and re-review the GL2 specs, and see what the 9700 and GfFX are capable of with respect to the spec.

Btw, any word on an ARB_VAR or ARB_VAO extension?

NVIDIA is pretty cool and their new card looks great. They're going to support GL2, especially if Doom III has it. I was looking at their page the other day and they're working with id Software to make it, or they've got something going, since they link to them. NVIDIA seems to be going more with OpenGL; most of the games they link to on their page are all OGL. They're going to wait till a stable OpenGL 2.0 comes out to cover their @$$'s.

Originally posted by FXO:
I haven't really seen anybody from ATI on this board, though…

hm, there are quite a few around…

Originally posted by rgpc:
You mean their DX wrapper? (Or does it state in the GL specification that GL uses DX error messages when it bombs?)

Man, you’re talking out of your butt.

Only Microsoft wraps OpenGL to DX, in their XP GL drivers…
(they don't have the time to create an updated 1.4 driver, though… hm, oh well… why would I mind… I'm not using Doors™ anyway )