And the OGL 2.0 HLSL is!



no-one
09-16-2002, 02:34 PM
Apparently it's 3Dlabs' glslang!

Read about it at
http://www.theinquirer.net/?article=5418

Comments? Thoughts?

V-man
09-16-2002, 03:45 PM
I'm not surprised. I think 3Dlabs' P10 and their push for GL2 gave them the edge.

Is there any other source for this news? And where exactly are the ARB minutes stored? Nice to hear that the wheels are in motion.

V-man

dorbie
09-16-2002, 04:06 PM
No, this is still up in the air.

The biggest question is NVIDIA's position w.r.t. glslang. They ship the most hardware out there. Was this a straw poll, or was it an official vote?

If they aren't going to support it, then the ARB has f*'d things up royally.

I'm getting annoyed with all sides in this fiasco. The last thing these *******s want is to be under Microsoft's control, but they just can't help themselves competing over IP and bragging rights for having created 'the' shader language. Unfortunately, they're selling the rest of us down the river in the process.

I thought they were supposed to be working on some middle ground that everyone could agree on. What the heck happened to that?

PH
09-16-2002, 04:15 PM
I couldn't have said it better myself, dorbie. A very good (and graphic :) ) description of the current situation.

Zeno
09-16-2002, 04:27 PM
I just hope they can agree on something before the burden of writing special code for five different graphics chipsets/APIs forces everyone to D3D :(

Korval
09-16-2002, 04:37 PM
This is what happens when you have a group made up entirely of competitors. They will screw each other over as much and as often as possible.

There is something to be said for having one guy decide what the shading language will be; at least then it's uniform (with the obvious downside of making that one guy extremely powerful).

PH
09-16-2002, 04:37 PM
I agree, Zeno. That was my first thought when I heard of Cg. I'm sure it's just as good as the 3Dlabs proposal, but it has already created a mess. There's no way NVIDIA is going to drop Cg.

Humus
09-16-2002, 04:47 PM
Language choice doesn't matter much to me; glslang or Cg, no real difference. With 3Dlabs' proposed language already out there and Microsoft supposedly adding some kind of HLSL to Direct3D, I don't really see why NVIDIA developed Cg.

no-one
09-16-2002, 04:48 PM
>Is there any other source for this news? And where exactly are the ARB minutes stored?

Not yet, and I don't know.

>There's no way NVIDIA is going to drop Cg.

What NVIDIA does, if the above is indeed true, will give a lot of insight into their position in the ARB, and towards GL2 altogether.



IT
09-16-2002, 04:53 PM
I don't see the point of a "one size fits all" shading language anymore, since most apps do either OGL or D3D, not both, and the feature gaps between OGL and D3D are a lot smaller than they used to be. Granted, some may feel the need to support both.

If OGL has an HLSL and D3D has an HLSL, great - as long as they both support current hardware (to a reasonable degree) and look to the future.

vshader
09-16-2002, 05:15 PM
I just really wish we could convince NVIDIA that pushing Cg is a really bad idea... like Humus pointed out, it's pretty much redundant given glslang and the DX9 HLSL, and it's going to cause no end of problems with GL2... ideally they would try to merge it with glslang, or at least make it compatible (i.e. Cg = glslang + extras + can be compiled to DX9).

NVIDIA could have been one of the driving forces behind GL2, but now their vested interest is probably in promoting some kind of GL1.x + tons of proprietary extensions... it all really sux.

Doesn't it make sense to them that it's better for business if all the cool new graphics features are consolidated in one ARB-approved cross-platform API? Kudos to ATI for choosing to go with ARB_fragment_program for the 9700 (if that's true). It's a step in the right direction... nV should do the same...

zed
09-16-2002, 08:57 PM
Contrary to my initial reaction, Cg was good for OpenGL (though NVIDIA's method of introducing it wasn't, e.g. not having loops 'yet' because their hardware couldn't handle loops 'yet', but anyway).
Unfortunately, Cg looks to be dead (practically no one's using it). Oh well, roll on the next victim, er, I mean candidate :D Hopefully that one will succeed better.

GeLeTo
09-17-2002, 01:05 AM
NVIDIA can support both Cg and the GL2 HLSL in the Cg toolkit. I remember reading somewhere that initially the Cg shading language was different from the DX9 HLSL, and NVIDIA later changed it to match. They can do the same with the GL2 SL (while still keeping support for the old Cg SL). They could also add support for Cg into OpenGL through a GL_GL2_shader_objects interface - that would be nice for supporting older hardware.
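Roughly, such a shader-objects interface might look like this (a hypothetical sketch only - the names are illustrative, modeled on the flavor of the 3Dlabs proposal, not a shipped API; a Cg front end would just feed it source strings):

  /* Hypothetical GL2-style shader-objects interface; all names illustrative. */
  GLhandle sh = glCreateShaderObject(GL_FRAGMENT_SHADER);
  glShaderSource(sh, 1, &sourceString, NULL);  /* e.g. output of a Cg front end */
  glCompileShader(sh);

  GLhandle prog = glCreateProgramObject();
  glAttachShaderObject(prog, sh);
  glLinkProgram(prog);
  glUseProgramObject(prog);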

davepermen
09-17-2002, 01:14 AM
Originally posted by GeLeTo:
NVIDIA can support both Cg and the GL2 HLSL in the Cg toolkit. I remember reading somewhere that initially the Cg shading language was different from the DX9 HLSL, and NVIDIA later changed it to match. They can do the same with the GL2 SL (while still keeping support for the old Cg SL). They could also add support for Cg into OpenGL through a GL_GL2_shader_objects interface - that would be nice for supporting older hardware.

Cool, dialects for every problem. Why not use the actual language for each problem? The GL language for GL, the DX language for DX. The wonderful all-in-one language does not exist, and Cg is not it. Too many version conflicts will exist, and while the errors then come in a nice way (not a crash but a message from the system that the feature is not supported by your hardware), it doesn't help: you still need to code another version for each GPU.

Support for old hardware would be nice, but old hardware is so restricted in design that you can't really fit a truly advanced language onto it so that it works there as well. You could, but it's not worth the effort.

Go for glslang, just because the name is funny :D and because it's a real language which will be fully there from the ground up. Cg is quite useless: once you have an NV30 you will not code GF3 shaders anymore, and you know it. They will not be compliant anymore, so you need to code different code paths again. So what?

I have a problem both with the fact that NVIDIA _did_ Cg - it's just useless somehow - and with Cg's design, which is far from nice as well.

Can't wait for OpenGL 2.0 and OpenRT cards. Back to the future...

Tom Nuydens
09-17-2002, 01:28 AM
The R300 cards are out, but there's no HLSL to use for them. The NV30 should be out "soon" as well -- by that time DX9 will also be out, and the D3D crowd will have their HLSL, but where does that leave OpenGL?

Given that OpenGL 2.0 wasn't around for the R300 release, and that it won't be around for the NV30 release either, I can perfectly understand why NVIDIA came up with their own HLSL. In fact, I'm surprised ATI didn't do the same!

The usual "upgrade path" for OpenGL these days is: IHVs implement new features the way they think they should be implemented. They present their new extensions to the ARB, and over time they can get merged into a vendor-independent standard. This is happening in GL 1.4 and in the 2.0 proposals with things like memory management, synchronization and so on. This is excellent, and if GL 2.0 can save me from duplicating lots of low-level code for different vendors (e.g. VAR vs VAO), then I'm all for it, and the sooner the better.

But might it not be too early to do the same for the HLSL? The fact that it's being labeled as "3Dlabs' HLSL" sums it up for me. Like dorbie said, why aren't they trying to come up with some middle ground? If only ATI had their own HLSL, I'd be in favor of shipping GL 2.0 without the shading language. Or call it GL 1.5 (whatever), and take the time to get the standard shading language right.

-- Tom

ehart
09-17-2002, 03:46 AM
I can't really say too much, as this is work in progress taking place under the participants' agreement. What I can say is that I think a lot of people here are reading too much into what they've found on the internet. I do feel safe in saying that no particular contribution has been thrown out of the working group.

Think back to the vertex programming debate. There were several specs all before the ARB. At some point, one had to be chosen as the basis for editing to the final version. In the end, that spec had a lot of changes based on other contributions to the working group.

Hopefully this month's ARB meeting notes will shed a bit of light on all this, so people aren't piecing things together from rumor and innuendo. BTW, this month's meeting is later than usual, so don't expect something to be posted tomorrow.

-Evan

mcraighead
09-17-2002, 01:45 PM
Originally posted by Tom Nuydens:
The R300 cards are out, but there's no HLSL to use for them.

Actually, we've had Cg working on both ATI and 3Dlabs hardware for some time now.

- Matt

pocketmoon
09-18-2002, 12:24 AM
Originally posted by mcraighead:
Actually, we've had Cg working on both ATI and 3Dlabs hardware for some time now.

- Matt

And when ATI release the RenderMonkey SDK, there's nothing to stop someone (NVIDIA?) writing a plugin to support Cg in RenderMonkey :)

- Rob

Tom Nuydens
09-18-2002, 12:53 AM
Originally posted by mcraighead:
Actually, we've had Cg working on both ATI and 3Dlabs hardware for some time now.

Cool, I hadn't heard about this. Using OpenGL?

-- Tom

knackered
09-18-2002, 01:09 AM
Originally posted by mcraighead:
Actually, we've had Cg working on both ATI and 3Dlabs hardware for some time now.

- Matt

So what you're saying is that Cg works with vertex/pixel shaders in D3D - so yes, you've had it working with other hardware for some time, but that's just the nature of D3D: it's hardware-abstracted.
Obviously, vertex shaders will work on all hardware in OpenGL if your profile targets ARB_vertex_program, but are you also saying that you've got a version of nvparse that converts PS code into the various vendor-specific pixel shader API calls?
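(For anyone who hasn't used it: nvparse today takes a combiner script and makes the NV_register_combiners calls for you - something like the snippet below, with the combiner syntax quoted from memory. An equivalent for other vendors' fragment interfaces is what I'm asking about.)

  // nvparse: text in, NV_register_combiners calls out.
  // (Needs nvparse.h and the GL headers.)
  nvparse(
      "!!RC1.0\n"
      "{ rgb { spare0 = tex0 * col0; } }\n"  // modulate texture by primary color
      "out.rgb = spare0.rgb;\n"
      "out.a = spare0.a;\n");
  glEnable(GL_REGISTER_COMBINERS_NV);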

knackered
09-18-2002, 01:56 PM
You're obviously referring to D3D then - so why mention that on an OpenGL forum?
It's like Microsoft claiming WinXP works on 'Linux hardware'...

mcraighead
09-18-2002, 03:34 PM
Ummm, no, I'm talking about OpenGL. We have Cg programs running as ARB_vertex_programs on ATI hardware, and as NV_vertex_programs (would you believe it?) on 3Dlabs hardware.
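(Concretely, the same Cg source just gets compiled to a different profile per target - e.g. with the offline compiler, where the filename here is made up:

  cgc -profile arbvp1 diffuse.cg    (emits an ARB_vertex_program listing)
  cgc -profile vp20 diffuse.cg      (emits an NV_vertex_program listing)

and the resulting program text is loaded through the corresponding extension.)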

- Matt

knackered
09-18-2002, 11:48 PM
Umm, Matt, that's only half of Cg - the pixel shaders are what most people are interested in - you know, the stuff we can't do on the CPU?

SirKnight
09-19-2002, 05:51 AM
... and as NV_vertex_programs (would you believe it?) on 3Dlabs hardware.


Didn't surprise me! :D I remember John Carmack talking about his experience with the P10 that 3Dlabs sent him. He said the P10 has basic support for NV_vertex_program (version 1.0) and NV_register_combiners. I find it pretty cool that a card that's _NOT_ a GeForce has basic support for some of the NV GeForce extensions. If only all cards shared extensions like this. ;)

-SirKnight

V-man
09-19-2002, 07:39 AM
I thought I had seen Mesa getting the register combiner extension. There was even a request for some help on it.

Must have been my imagination, 'cause it's not there now. Hmmm......

V-man

davepermen
09-19-2002, 08:00 AM
Originally posted by SirKnight:
Didn't surprise me! :D I remember John Carmack talking about his experience with the P10 that 3Dlabs sent him. He said the P10 has basic support for NV_vertex_program (version 1.0) and NV_register_combiners. I find it pretty cool that a card that's _NOT_ a GeForce has basic support for some of the NV GeForce extensions. If only all cards shared extensions like this. ;)

-SirKnight

Well, it's not the job of a GPU vendor to adopt other vendors' extensions. I bet you would prefer to get real, standard extensions instead of that proprietary stuff.

And I would prefer better-designed extensions that fit all systems (I mean, sorry, register combiners: they are great fun, but handy? No. Cool, nonetheless, but not handy.)

knackered
09-19-2002, 10:33 AM
Matt, so that's that. No OpenGL Cg fragment stuff on anything but NVIDIA hardware, right?

SirKnight
09-19-2002, 11:56 AM
I was not being totally serious in that last remark - hence the ';)' face. The point I was making, though, was that it would be nice if all cards supported the same things the same way. Like you say, none of this proprietary stuff. :)

-SirKnight

mcraighead
09-19-2002, 01:18 PM
I think it's very safe to say that we will make sure Cg has support for ARB_fragment_program.

- Matt

knackered
09-19-2002, 01:53 PM
LOL! :)