Which hlsl will be in the 2.0 core: Cg or glslang?

Ok, hopefully, I’m getting close to understanding this :slight_smile:
The way I get it is that neither DirectX nor OpenGL has a high-level shading language, but they both have programmable pixel/fragment shaders (similar in features). Now, Cg is an hlsl from NVIDIA that can be used instead of the assembly-like pixel shaders, for both Direct3D and OpenGL, but it only supports NVIDIA’s FX family of cards. So far so good? Now, glslang is similar to Cg, but it is from 3Dlabs (by the way, which is more powerful, Cg or glslang?). Now, speaking of OpenGL 2.0, it is supposed to have an hlsl. Will it simply be either Cg or glslang added into the core, like with a lot of previous extensions, or will it be something totally different?
Hopefully I’m getting somewhere with all this :slight_smile:
Thanks again for all your help,
Luke

Originally posted by BigShooter:
The way I get it is that neither DirectX nor OpenGL has a high-level shading language, but they both have programmable pixel/fragment shaders (similar in features).
DX has HLSL (btw, HLSL really is the name of the language, which is kind of confusing). So speaking of an hlsl is something general, while speaking of HLSL means D3D’s hlsl specifically. Hope this makes sense. OpenGL has glslang (as an extension).

Now, Cg is an hlsl from NVIDIA that can be used instead of the assembly-like pixel shaders, for both Direct3D and OpenGL, but it only supports NVIDIA’s FX family of cards. So far so good?
No. Cg is compiled into assembly-like pixel shaders, like you said, and those can be executed on any compliant hardware, not just NVIDIA’s (it depends on the chosen compilation target).
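To make that flow concrete, here is a minimal sketch under some assumptions: the file names are hypothetical, and the ARB_fragment_program entry points are taken to be already loaded (e.g. via a loader like GLEW). You would first compile your Cg source offline with NVIDIA’s cgc compiler, for instance `cgc -profile arbfp1 -o tint.fp tint.cg`, and then load the resulting assembly text in your app:

```c
#include <GL/glew.h>   /* assumed loader; provides the ARB_fragment_program entry points */
#include <stdio.h>
#include <stdlib.h>

/* Load ARB_fragment_program assembly (e.g. emitted by cgc -profile arbfp1)
   from a text file and bind it as the current fragment program.
   "tint.fp" is a hypothetical file name. Returns 0 on failure. */
GLuint load_fp_assembly(const char *path)
{
    FILE *f = fopen(path, "rb");
    if (!f) return 0;
    fseek(f, 0, SEEK_END);
    long len = ftell(f);
    fseek(f, 0, SEEK_SET);
    char *src = malloc(len + 1);
    fread(src, 1, len, f);
    src[len] = '\0';
    fclose(f);

    GLuint prog;
    glGenProgramsARB(1, &prog);
    glBindProgramARB(GL_FRAGMENT_PROGRAM_ARB, prog);
    glProgramStringARB(GL_FRAGMENT_PROGRAM_ARB, GL_PROGRAM_FORMAT_ASCII_ARB,
                       (GLsizei)len, src);
    free(src);

    /* The driver reports -1 here when the assembly was accepted. */
    GLint errpos;
    glGetIntegerv(GL_PROGRAM_ERROR_POSITION_ARB, &errpos);
    return (errpos == -1) ? prog : 0;
}
```

Since ARB_fragment_program is a vendor-neutral extension, that assembly runs on any card whose driver exposes it, which is why Cg is not tied to the FX family.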

And about glslang, it will likely be integrated into the core next year.

Ok, I think I almost got it :slight_smile:
Thanks for all the explanation though.
OK, two more questions if you don’t mind:

  • I use an NVIDIA card. Can I still use glslang from 3Dlabs, or is it only a vendor-specific extension?

  • Which is better and more powerful to learn as a shading language: Cg or glslang?

Thank you,
Luke

Cg works now… just download it and test it.

glslang has just been released as a specification, but I believe it will be implemented very soon by the big vendors (3Dlabs, ATI and NVIDIA). And glslang is an ARB extension, so it’s not vendor-specific, but it is still an extension.

Originally posted by BigShooter:
- I use an NVIDIA card. Can I still use glslang from 3Dlabs, or is it only a vendor-specific extension?

glslang is an ARB extension, meaning that all hardware vendors are free to implement it (free, not forced). As long as NVIDIA supports glslang, you can use it. Note that since glslang has only just been approved by the ARB, there is no official driver today that supports it. This should change quickly, however; I’m sure the next crop of official drivers will have support for it (ATI already has experimental entry points in their drivers).
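If you want to check whether a given driver exposes glslang yet, you can simply look for the ARB extension strings it is published under. A minimal sketch (naive substring matching, good enough for illustration):

```c
#include <GL/gl.h>
#include <string.h>

/* glslang is exposed through a small family of ARB extensions:
   ARB_shader_objects (the API), ARB_vertex_shader / ARB_fragment_shader
   (the programmable stages) and ARB_shading_language_100 (the language).
   Requires a current GL context. */
int has_glslang(void)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    return ext
        && strstr(ext, "GL_ARB_shader_objects")
        && strstr(ext, "GL_ARB_fragment_shader")
        && strstr(ext, "GL_ARB_shading_language_100");
}
```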

  • Which is better and more powerful to learn as a shading language: Cg or glslang?

I wouldn’t say that one is more powerful than the other (in terms of what you can do with them). The question is more about how you want to manage your shader code. Well, in fact there is a key difference: Cg is an offline language. You write your shader in Cg and compile it to, say, ARB_fragment_program (i.e. your Cg code is translated into assembly-like ARB_fp code), and that assembly file is then used by your app as a resource, much like you create your textures offline and load them at run time. glslang shaders, on the other hand, can be generated and compiled on the fly.
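To make that contrast concrete, here is a minimal sketch of the glslang side, assuming the ARB_shader_objects / ARB_fragment_shader entry points are already loaded (e.g. via a loader like GLEW). The high-level source is just a C string handed to the driver at run time; error checking is mostly trimmed:

```c
#include <GL/glew.h>   /* assumed loader; provides the ARB_shader_objects entry points */
#include <stddef.h>

/* Compile and link a glslang fragment shader at run time.
   The source string could just as well be generated on the fly.
   Returns 0 on compile failure. */
GLhandleARB build_glsl_program(const char *fragment_src)
{
    GLhandleARB sh = glCreateShaderObjectARB(GL_FRAGMENT_SHADER_ARB);
    glShaderSourceARB(sh, 1, &fragment_src, NULL);
    glCompileShaderARB(sh);   /* the driver itself compiles the high-level source */

    GLint ok = 0;
    glGetObjectParameterivARB(sh, GL_OBJECT_COMPILE_STATUS_ARB, &ok);
    if (!ok) return 0;

    GLhandleARB prog = glCreateProgramObjectARB();
    glAttachObjectARB(prog, sh);
    glLinkProgramARB(prog);
    return prog;
}

/* Hypothetical example source, built as an ordinary C string: */
static const char *tint_src =
    "void main() {\n"
    "    gl_FragColor = gl_Color * vec4(1.0, 0.5, 0.5, 1.0);\n"
    "}\n";
```

Calling glUseProgramObjectARB(build_glsl_program(tint_src)) would then activate it; the point is that nothing had to be compiled offline, so the source could just as well have been assembled at run time.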