Has anyone ever used Nvidia Shader Library?

Has anyone ever used the NVIDIA Shader Library? There are tons of great Cg shaders there, and I would really like to use them in a project. I’ve used GLSL but became discouraged by the lack of resources, like not being able to find code for environment cube mapping. Let me know if you have used NVIDIA’s shaders before, or if you know of a good place to start with them. I have never used Cg before and would appreciate some pointers on how to get started; NVIDIA’s website just sort of throws this stuff out there with no entry point :frowning:
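
For reference, since it is apparently hard to find: a minimal sketch of eye-space environment cube mapping in 1.x-era GLSL, stored here as C++ string constants ready to hand to glShaderSource(). The variable names are made up, and a world-anchored cube map would need the extra rotation noted in the comment.

```cpp
// Hypothetical minimal GLSL 1.10 environment cube mapping shader pair.
const char* kCubeMapVS = R"(
varying vec3 reflectDir;
void main()
{
    // Eye-space normal and position (fixed-function matrices, GLSL 1.10 era).
    vec3 n = normalize(gl_NormalMatrix * gl_Normal);
    vec3 p = vec3(gl_ModelViewMatrix * gl_Vertex);
    // Reflect the eye-space view vector about the normal. For a cube map
    // anchored in world space you would rotate this vector back by the
    // inverse view rotation; omitted here for brevity.
    reflectDir = reflect(normalize(p), n);
    gl_Position = ftransform();
}
)";

const char* kCubeMapFS = R"(
varying vec3 reflectDir;
uniform samplerCube envMap; // bind a GL_TEXTURE_CUBE_MAP to this unit
void main()
{
    gl_FragColor = textureCube(envMap, reflectDir);
}
)";
```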

BTW, here is my GLSL shader gallery video: [url=http://bluebomber128.wordpress.com/2009/07/11/opengl-shader-gallery/]GLSL Shader Gallery Video[/url]. Such a great language for OpenGL, but just not enough community support :frowning:

Cg is based on GLSL. Cg is NVIDIA-specific. Cg has no support on ATI chips. This forum targets languages that work on both vendors’ chips.

The NVIDIA forum may have more members using Cg.

Some people have good reasons to use Cg instead of GLSL. One could be that they like the syntax or the provided development tools, and especially that its API supports both OpenGL and D3D.

But the sad thing is that ATI hardware still only supports the old arbvp1 and arbfp1 profiles, if I am not mistaken.

What I have heard is that Cg is better optimized and runs faster on NVIDIA chips than GLSL. As a small example, unlike GLSL, where setting up a shader program takes 4-5 API steps, Cg does it in 1-2.
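
For what it’s worth, here is roughly what that step-count difference looks like on the host side. This is only a sketch: the file name, entry point, and source-string parameters are placeholders, and error checking is omitted.

```cpp
#include <GL/glew.h>   // or your GL loader of choice
#include <Cg/cg.h>
#include <Cg/cgGL.h>

// GLSL path: create, source, and compile each stage, then attach and link.
GLuint makeGlslProgram(const char* vsSrc, const char* fsSrc)
{
    GLuint vs = glCreateShader(GL_VERTEX_SHADER);      // step 1
    glShaderSource(vs, 1, &vsSrc, NULL);               // step 2
    glCompileShader(vs);                               // step 3
    GLuint fs = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(fs, 1, &fsSrc, NULL);
    glCompileShader(fs);
    GLuint prog = glCreateProgram();                   // step 4
    glAttachShader(prog, vs);
    glAttachShader(prog, fs);
    glLinkProgram(prog);                               // step 5
    return prog;
}

// Cg path: one call compiles from file, one loads it for the profile.
CGprogram makeCgProgram(CGcontext ctx)
{
    CGprofile fp = cgGLGetLatestProfile(CG_GL_FRAGMENT);
    CGprogram prog = cgCreateProgramFromFile(ctx, CG_SOURCE, "shader.cg",
                                             fp, "mainFP", NULL);  // step 1
    cgGLLoadProgram(prog);                                         // step 2
    return prog;
}
```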

Another point is that when Cg arrived, texture rectangle support was already included. GLSL rejected it early on, and even with higher shader models the support still has to be explicitly enabled.
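
To illustrate the explicit opt-in on the GLSL side, here is a hypothetical fragment shader (the sampler name is made up), again as a C++ string constant:

```cpp
// GLSL requires an explicit #extension directive for rectangle textures.
const char* kRectFS = R"(
#extension GL_ARB_texture_rectangle : enable
uniform sampler2DRect tex; // addressed with non-normalized pixel coordinates
void main()
{
    // Sample the rectangle texture at the current pixel.
    gl_FragColor = texture2DRect(tex, gl_FragCoord.xy);
}
)";
```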

You can use GLSL profiles to “compile” Cg to GLSL. Unfortunately, no UBO support yet.

Can you please explain or give a link?
I did not know this before.

This is the list of profiles you can compile your Cg shaders to: http://developer.nvidia.com/object/cg_profiles.html
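
In runtime terms, using the GLSL back-end is just a matter of which profile enum you hand to the compiler. A rough sketch, assuming a Cg 2.x runtime with the glslv/glslf profiles; the file name and entry point are made up:

```cpp
#include <Cg/cg.h>
#include <Cg/cgGL.h>

// Force the GLSL back-end instead of asking for the latest hardware profile.
CGprogram loadViaGlslProfile(CGcontext ctx)
{
    CGprogram vp = cgCreateProgramFromFile(ctx, CG_SOURCE, "shader.cg",
                                           CG_PROFILE_GLSLV, "mainVP", NULL);
    cgGLLoadProgram(vp); // hands the generated GLSL to the driver
    // cgGetProgramString(vp, CG_COMPILED_PROGRAM) returns the emitted GLSL.
    return vp;
}
```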

[quote]You can use GLSL profiles to “compile” Cg to GLSL. Unfortunately, no UBO support yet.[/quote]

OK, thank you for this clarification, Eosie. :slight_smile:

No, Cg was around before GLSL. If anything, GLSL was developed based on the Cg and HLSL languages.

[quote]Cg is Nvidia specific.[/quote]

No, Cg is owned, developed, and maintained by NVidia, but as has been pointed out, you can compile for other cards (like ATI/AMD) using either the arbvp1/arbfp1 profiles (ARB_vertex_program / ARB_fragment_program level capability), or GLSL profiles as back-end compilers.

I think it would be more correct to say that Cg is developed to take best advantage of NVIDIA hardware.

That’s what I gather. However, there is support for uniform buffers in the language and the Cg API (cgCreateBuffer, cgSetBufferSubData, etc.), which on NVidia G80+ hardware will allegedly use hardware bindable buffers (NV_parameter_buffer_object, which is like UBOs but NVidia-specific), but fall back to individual uniform sets on older or non-NVidia hardware. Folks have reported dramatic speed improvements after rearchitecting their code to use them (link).
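
A rough sketch of that buffer path, in case it helps. The struct layout, parameter name, and update policy here are invented for illustration; in real code they must match what the shader declares:

```cpp
#include <Cg/cg.h>

// Hypothetical per-frame constants; must mirror the shader's buffer layout.
struct PerFrame { float modelViewProj[16]; float lightDir[4]; };

void bindPerFrame(CGcontext ctx, CGprogram prog, const PerFrame& data)
{
    // Create once. On G80+ the runtime can back this with
    // NV_parameter_buffer_object; elsewhere it falls back to plain uniforms.
    static CGbuffer buf = cgCreateBuffer(ctx, sizeof(PerFrame), &data,
                                         CG_BUFFER_USAGE_DYNAMIC_DRAW);

    // Each frame: refresh the contents and attach to the program's slot.
    cgSetBufferSubData(buf, 0, sizeof(PerFrame), &data);
    CGparameter p = cgGetNamedParameter(prog, "PerFrame");
    cgSetProgramBuffer(prog, cgGetParameterBufferIndex(p), buf);
}
```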

But yes, UBO support in Cg would be a great addition, especially since the language-level and API-level support for it is already there. If Cg is still trying to pitch itself as a competitive cross-platform shading language, NVIDIA will probably add it, just as they added the GLSL back-end.

Wow, that stinks that Cg doesn’t automatically work with ATI cards… more reason for me to just stick with GLSL. I am not concerned with performance, since I am not trying to create super complex effects (I wish I could hehe). I’ll make a new topic on GLSL resources. Thanks for the info, guys :slight_smile:

It does work with ATI cards; it just doesn’t necessarily support all the latest-and-greatest features of those cards, unless you can get at them through the GLSL profiles, of course.

There are even hints regarding ATI hardware in the Cg reference manual. It’s in NVIDIA’s best interest to support hardware other than their own; otherwise Cg would not be that popular.

One of the main advantages of the Cg toolkit is that it can be used even on crippled Intel hardware without GLSL support. And that’s not all: it can even be used as a standalone GLSL compiler with the “-oglsl” parameter, targeting any profile you want, even a D3D one.
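
For the command-line route, an invocation along these lines should work (the file name is made up; -profile picks the output profile, and -oglsl, per the post above, tells cgc to parse the input as GLSL):

```
cgc -oglsl -profile arbfp1 myshader.glsl
```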