And the OGL 2.0 HLSL is…

apparently 3Dlabs' glslang!

Read it at:
http://www.theinquirer.net/?article=5418

Comments? Thoughts?

I'm not surprised. I think 3Dlabs' P10 and their push for GL2 gave them the edge.

Any other source for this news? Where are the ARB minutes stored, exactly? Nice to hear that the wheels are in motion.

V-man

No, this is still up in the air.

The biggest question is what NVIDIA's position is w.r.t. GLslang. They do support the most hardware out there. Was this a straw poll, or was it some official vote?

If they aren’t going to support it then the ARB has f*'d things up royally.

I’m getting annoyed with all sides in this fiasco. The last thing these *******s want is to be under Microsoft’s control, but they just can’t help themselves competing over IP and bragging rights for having created ‘the’ shader language. Unfortunately, they’re selling the rest of us down the river in the process.

I thought they were supposed to be working on some middle ground that everyone could agree on; what the heck happened to that?

I couldn’t have said it better myself, dorbie. Very good (and graphic) description of the current situation.

I just hope they can agree on something before the burden of having to write special code for 5 different graphics chipsets/APIs forces everyone to D3D.

I agree Zeno. That was my first thought when I heard of Cg. I’m sure it’s just as good as the 3Dlabs proposal but it has already created a mess. There’s no way NVIDIA is going to drop Cg.

This is what happens when you have a group composed entirely of competitors. They will screw each other over as much and as often as possible.

There is something to be said for having one guy decide what will be the shading language; at least then, it’s uniform (having the obvious downside of making that one guy extremely powerful).

Language choice doesn’t matter much to me; glslang or Cg, no real difference. With 3Dlabs' proposed language already out there and Microsoft supposedly adding some kind of HLSL to Direct3D, I don’t really see why nVidia developed Cg.

>Any other source for this news. Where are the ARB minutes stored exactly?

Not yet, and I don’t know.

>There’s no way NVIDIA is going to drop Cg.

What NVidia does, if the above is indeed true, is going to give a lot of insight into their position in the ARB, and towards GL2 altogether.

[This message has been edited by no-one (edited 09-16-2002).]

I don’t see the point of a “one size fits all” shading language anymore, since most apps do either OGL or D3D, not both, and the feature gap between OGL and D3D is a lot smaller than it used to be. Granted, there are some who may feel the need to support both.

If OGL has a HLSL and D3D has a HLSL, great, as long as they both support the current hardware (to a reasonable degree) and look to the future.

I just really wish we could convince nVidia that focusing on Cg is a really bad idea… Like Humus pointed out, it is pretty much redundant given the GLslang and DX9 HLSLs, and it is going to cause no end of problems with GL2… Ideally they would try to merge it with GLslang, or at least make it compatible (i.e. Cg = GLslang + extras + can be compiled to DX9).

nVidia could have been one of the driving forces behind GL2, but now their vested interest is probably in promoting some kind of GL 1.x + tons of proprietary extensions… It all really sux.

Doesn’t it make sense to them that it’s better for business if all the cool new graphics features are consolidated in one ARB-approved cross-platform API? Kudos to ATI for choosing to go with GL_ARB_FRAGMENT_PROGRAM for the 9700 (if that's true). It’s a step in the right direction… nV should do the same…

Contrary to my initial reaction, Cg was good for OpenGL (though nVidia's method of introducing it wasn't, e.g. not having loops 'yet' because their hardware couldn't handle loops 'yet', but anyway).
Unfortunately, Cg looks to be dead (practically no one's using it). Oh well, roll on the next victim, er, I mean candidate; hopefully that one will succeed better.

NVidia can support both Cg and the GL2 HLSL in the Cg toolkit. I remember reading somewhere that initially the Cg shading language was different from the DX9 HLSL, and NVidia later changed it to match. They can do the same with regard to the GL2 SL (while still keeping support for the old Cg SL). Also, they could add support for Cg into OpenGL via a GL_GL2_shader_objects interface - that would be nice for supporting older hardware.
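Roughly what I have in mind (the entry point names here are made up in the style of the proposed GL2 shader objects interface, so treat this as a sketch, not a real header):

/* Sketch only: feeding a Cg (or glslang) string through a GL2-style
   shader objects interface. Names and signatures are assumed, not
   taken from any shipped header. */
GLhandle prog = glCreateProgramObject();
GLhandle vsh  = glCreateShaderObject(GL_VERTEX_SHADER);

const char *src = vertexShaderSource;      /* source text in whatever language the front end accepts */
glShaderSource(vsh, 1, &src, NULL);        /* hand the text to the driver/front-end compiler */
glCompileShader(vsh);

glAttachObject(prog, vsh);                 /* combine compiled shaders into a program */
glLinkProgram(prog);
glUseProgramObject(prog);                  /* program now replaces the fixed-function path */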

Originally posted by GeLeTo:
NVidia can support both Cg and the GL2 HLSL in the Cg toolkit. I remember reading somewhere that initially the Cg shading language was different from the DX9 HLSL, and NVidia later changed it to match. They can do the same with regard to the GL2 SL (while still keeping support for the old Cg SL). Also, they could add support for Cg into OpenGL via a GL_GL2_shader_objects interface - that would be nice for supporting older hardware.

Cool, dialects for every problem. Why not use the actual language for the problem? The GL language for GL, the DX language for DX… The wonderful all-in-one wonder-language does not exist, and Cg is not it… There will be too many version conflicts, and while the errors then come in a nice way (not a crash, but a message from the system that the feature is not supported by your hardware), it doesn't help… You still need to code another version for each GPU again…

Support for old hardware would be nice. But old hardware is so restricted in design that you can't really fit a real advanced language onto it so that it works there as well. You could, but it's not worth the effort…

Go for glslang, just because the name is funny and because it's a real language which will be fully there from the ground up. Cg is quite useless… Once you have an NV30 you will not code GF3 shaders anymore, and you know it… It will not be compliant anymore… So you need to code different codepaths again… So what?

I have a problem both with nVidia doing Cg at all (it's just useless somehow) and with Cg's design, which is far from nice as well…

Can't wait for OpenGL 2.0 and OpenRT cards… back to the future…

The R300 cards are out, but there’s no HLSL to use for them. The NV30 should be out “soon” as well – by that time DX9 will also be out, and the D3D crowd will have their HLSL, but where does that leave OpenGL?

Given that OpenGL 2.0 wasn’t around for the R300 release, and that it won’t be around for the NV30 release either, I can perfectly understand why NVIDIA came up with their own HLSL. In fact, I’m surprised ATI didn’t do the same!

The usual “upgrade path” for OpenGL these days is: IHVs implement new features the way they think they should be implemented. They present their new extensions to the ARB, and over time they can get merged into a vendor-independent standard. This is happening in GL 1.4 and in the 2.0 proposals with things like memory management, synchronization and so on. This is excellent, and if GL 2.0 can save me from duplicating lots of low-level code for different vendors (e.g. VAR vs VAO), then I’m all for it, and the sooner the better.
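To give an idea of the kind of duplication I mean, here's a rough, from-memory sketch of the two vendor paths for static vertex data (don't take the exact signatures as gospel):

/* nVidia path: NV_vertex_array_range (roughly, from memory) */
void *mem = wglAllocateMemoryNV(size, 0.0f, 0.0f, 1.0f);   /* ask for fast (AGP/video) memory */
memcpy(mem, vertices, size);                               /* copy vertex data into it */
glVertexArrayRangeNV(size, mem);
glEnableClientState(GL_VERTEX_ARRAY_RANGE_NV);
glVertexPointer(3, GL_FLOAT, 0, mem);

/* ATI path: ATI_vertex_array_object (roughly, from memory) */
GLuint buf = glNewObjectBufferATI(size, vertices, GL_STATIC_ATI);
glArrayObjectATI(GL_VERTEX_ARRAY, 3, GL_FLOAT, 0, buf, 0);

/* ...plus a plain vertex array fallback for everything else. */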

But might it not be too early to do the same for the HLSL? The fact that it’s being labeled as “3Dlabs’ HLSL” sums it up for me. Like Dorbie said, why aren’t they trying to come up with some middle ground? If only ATI had their own HLSL, I’d be in favor of shipping GL 2.0 without the shading language. Or call it GL 1.5 (whatever) and take the time to get the standard shading language right.

– Tom

I can’t really say too much as this is work in progress taking place under the participant’s agreement. What I can say is I think a lot of people here are reading too much into what they have found on the internet. I do feel safe in saying that no particular contribution has been thrown out of the working group.

Think back to the vertex programming debate. There were several specs all before the ARB. At some point, one had to be chosen as the basis for editing to the final version. In the end, that spec had a lot of changes based on other contributions to the working group.

Hopefully this month’s ARB meeting notes will shed a bit of light on all this, so people aren’t getting pieces of rumor and innuendo. BTW, this month’s meeting is later than usual, so don’t think something is going to be posted tomorrow.

-Evan

Originally posted by Tom Nuydens:
The R300 cards are out, but there’s no HLSL to use for them.

Actually, we’ve had Cg working on both ATI and 3Dlabs hardware for some time now.

- Matt

Originally posted by mcraighead:
Actually, we’ve had Cg working on both ATI and 3Dlabs hardware for some time now.

- Matt

And when ATI release the RenderMonkey SDK, there’s nothing to stop someone (NVidia?) writing a plugin to support Cg in RenderMonkey.

- Rob

Originally posted by mcraighead:
Actually, we’ve had Cg working on both ATI and 3Dlabs hardware for some time now.

Cool, I hadn’t heard about this. Using OpenGL?

– Tom

Originally posted by mcraighead:
Actually, we’ve had Cg working on both ATI and 3Dlabs hardware for some time now.

- Matt

So what you’re saying is that Cg works with vertex/pixel shaders in D3D, so yes, you’ve had it working with other hardware for some time, but that is just the nature of D3D - it’s hardware-abstracted.
Obviously, vertex shaders will work on all hardware in OpenGL if your profile targets the ARB vertex program extension, but are you also saying that you’ve got a version of nvparse that converts PS code into the various vendor-specific pixel shader API calls?