View Full Version : Has anyone ever used Nvidia Shader Library?



Bluebomber357
07-25-2009, 12:16 PM
http://developer.download.nvidia.com/shaderlibrary/webpages/shader_library.html

Has anyone ever used the Nvidia Shader Library? There are tons of great Cg shaders there. I would really like to use them in a project. I've used GLSL and became discouraged with the lack of resources, like not being able to find code for environment cube mapping. Let me know if anyone has used Nvidia's shaders before, or if anyone knows of a good place to start for using them. I have never used Cg before and would appreciate some pointers on how to start; Nvidia's website just sorta throws this stuff out there with no entry point :(

BTW, here is my GLSL Shader Gallery Video: http://bluebomber128.wordpress.com/2009/07/11/opengl-shader-gallery/ Such a great language for OpenGL, but just not enough community support :(

awhig
07-25-2009, 03:13 PM
Cg is based on glsl. Cg is Nvidia specific. Cg has no support on ATI chips. This forum targets languages that can work on both vendors' chips.

The NVIDIA forum may have more members using Cg.

dletozeun
07-25-2009, 03:42 PM
Cg is based on glsl. Cg is Nvidia specific. Cg has no support on ATI chips. This forum targets languages that can work on both vendors' chips.

The NVIDIA forum may have more members using Cg.

Some people have good reasons to use Cg instead of glsl. One could be that they like the syntax, the provided development tools, and so on, and especially that its API supports both OpenGL and D3D.

But the sad thing is that ATI hardware still only supports the old arbvp1 and arbfp1 profiles, if I am not mistaken.

awhig
07-25-2009, 05:10 PM
What I heard is that Cg is better optimized and runs faster on NVIDIA chips than GLSL. As a small example, unlike GLSL, where setting up a shader program takes 4-5 steps, Cg does it in 1-2 steps.
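To make that "1-2 steps" comparison concrete, here is a minimal sketch of setting up a fragment shader through the Cg runtime (cg.h / cgGL.h); the shader file name brick.cg and the entry point mainFP are just placeholders, not anything from the Shader Library:

#include <Cg/cg.h>
#include <Cg/cgGL.h>

static CGcontext ctx;
static CGprofile fragProfile;
static CGprogram fragProgram;

/* One-time setup: create a context, pick a profile, compile and load. */
void initCg(void)
{
    ctx         = cgCreateContext();
    fragProfile = cgGLGetLatestProfile(CG_GL_FRAGMENT);   /* best profile the driver reports */
    cgGLSetOptimalOptions(fragProfile);
    fragProgram = cgCreateProgramFromFile(ctx, CG_SOURCE, "brick.cg",
                                          fragProfile, "mainFP", NULL);
    cgGLLoadProgram(fragProgram);
}

/* Per draw call: enable the profile and bind the program. */
void drawWithCg(void)
{
    cgGLEnableProfile(fragProfile);
    cgGLBindProgram(fragProgram);
    /* ... issue geometry here ... */
    cgGLDisableProfile(fragProfile);
}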

Another point is that when Cg came out, texture rectangle support was already included. GLSL rejected it at first, and even with higher shader models the support still has to be explicitly enabled.

Eosie
07-25-2009, 05:11 PM
But the sad thing is that ATI hardware still only supports the old arbvp1 and arbfp1 profiles, if I am not mistaken.
You can use GLSL profiles to "compile" Cg to GLSL. Unfortunately, no UBO support yet.

awhig
07-25-2009, 05:15 PM
Can you please explain or give a link?
I did not know this before.

Eosie
07-25-2009, 05:18 PM
This is the list of profiles you can compile your Cg shaders to: http://developer.nvidia.com/object/cg_profiles.html
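From the runtime side, targeting one of those profiles by hand looks roughly like the sketch below, so the same Cg source can fall back to the GLSL or ARB profiles on non-NVIDIA hardware. The fallback order here is only an illustration, not a recommendation:

#include <Cg/cg.h>
#include <Cg/cgGL.h>

/* Pick a fragment profile explicitly instead of cgGLGetLatestProfile(). */
CGprofile pickFragmentProfile(void)
{
    if (cgGLIsProfileSupported(CG_PROFILE_GLSLF))    /* Cg compiled down to GLSL */
        return CG_PROFILE_GLSLF;
    if (cgGLIsProfileSupported(CG_PROFILE_ARBFP1))   /* ARB_fragment_program level */
        return CG_PROFILE_ARBFP1;
    return cgGLGetLatestProfile(CG_GL_FRAGMENT);     /* otherwise let the runtime decide */
}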

dletozeun
07-26-2009, 02:26 AM
But the sad thing is that ATI hardware still only supports the old arbvp1 and arbfp1 profiles, if I am not mistaken.
You can use GLSL profiles to "compile" Cg to GLSL. Unfortunately, no UBO support yet.

Ok, thank you for the clarification, Eosie. :)

Dark Photon
07-27-2009, 05:50 AM
Cg is based on glsl.
No, Cg was around before GLSL. If anything, GLSL was developed based on the Cg and HLSL languages.


Cg is Nvidia specific.
No, Cg is owned, developed, and maintained by NVidia, but as has been pointed out, you can compile for other cards (like ATI/AMD) using either the arbvp1/arbfp1 profiles (ARB_vertex_program / ARB_fragment_program level capability), or GLSL profiles as back-end compilers.

I think it would be more correct to say that Cg is developed to take best advantage of NVidia hardware.

Dark Photon
07-27-2009, 06:06 AM
Unfortunately, no UBO support yet.
That's what I gather. However, there is support for uniform BUFFERs in the language and the Cg API (cgCreateBuffer, cgSetBufferSubData, etc.), which on NVidia G80+ hardware will allegedly use hardware bindable buffers (NV_parameter_buffer_object, which is like UBOs but NVidia-specific), but fall back to individual uniform sets on older or non-NVidia hardware. Folks have reported dramatic speed improvements after rearchitecting their code to use them (link (http://developer.nvidia.com/forums/index.php?showtopic=1013)).
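A rough sketch of that buffer path, built only from the calls named above; the block name "PerFrame" and its layout are made up for illustration, and the usage flag is just one of the standard Cg buffer usage enums:

#include <Cg/cg.h>

/* Hypothetical uniform block layout. */
typedef struct { float mvp[16]; float lightDir[4]; } PerFrame;

/* Create a Cg buffer and attach it to the program's buffer slot. */
CGbuffer createPerFrameBuffer(CGcontext ctx, CGprogram prog, const PerFrame *initial)
{
    CGbuffer buf = cgCreateBuffer(ctx, sizeof(PerFrame), initial,
                                  CG_BUFFER_USAGE_DYNAMIC_DRAW);
    CGparameter block = cgGetNamedParameter(prog, "PerFrame");
    cgSetProgramBuffer(prog, cgGetParameterBufferIndex(block), buf);
    return buf;
}

/* Each frame: one upload for the whole block instead of many individual uniform sets. */
void updatePerFrameBuffer(CGbuffer buf, const PerFrame *data)
{
    cgSetBufferSubData(buf, 0, sizeof(PerFrame), data);
}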

But yes, UBO support in Cg would be a great addition, especially since the Cg language and Cg API support for it is already there. If Cg is still trying to pitch itself as a competitive cross-platform shading language, they'll probably add this, just as they added the GLSL back-end.

Bluebomber357
07-27-2009, 11:14 AM
Wow, that stinks that Cg doesn't automatically work with ATI cards... more reason for me to just stick with GLSL. I am not concerned with performance, and I am not trying to create super complex effects (I wish I could hehe). I'll make a new topic on GLSL resources, thanks for the info guys :)

Dark Photon
07-27-2009, 11:17 AM
Wow, that stinks that CG doesn't automatically work with ATI cards...
It does, though it just doesn't necessarily support all the latest-and-greatest features of those cards, unless you can get to them through GLSL of course.

Eosie
07-27-2009, 03:15 PM
There are even hints regarding ATI hardware in the Cg reference manual. It's in their best interest to support hardware other than theirs, otherwise Cg would not be that popular.

One of the main advantages of the Cg toolkit is that it can be used even on crippled Intel hardware without GLSL support. And that's not all: it can even be used as a standalone GLSL compiler with the parameter "-oglsl", targeting any profile you want, even a D3D one.
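For reference, that standalone use looks roughly like this (file names are placeholders; -profile and -o are the usual cgc options and -oglsl is the switch mentioned above, so take this as a sketch rather than exact documentation):

cgc -oglsl -profile arbfp1 myshader.frag -o myshader.fp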