GLSL should learn something from NVIDIA Cg

The OpenGL Shading Language is a very powerful tool for graphics programmers to create cool effects.

But because flow control is expensive in the GPU pipeline, most of us use multiple shaders instead of one super-shader. And if we use lots of shaders, it becomes very hard to manage the source code and the program handles. In addition, calling glUseProgram() several times per frame may have some cost.

In NVIDIA Cg there are "technique" and "pass" constructs, similar to a multi-pass rendering pipeline, in a single file: CgFX. We can select a technique by name before drawing something, and we can also choose which pass to use to generate the effect we want.
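For illustration, here is roughly how an application drives a CgFX technique and its passes. A minimal sketch using the standard Cg runtime calls (cgCreateEffectFromFile, cgGetNamedTechnique, cgSetPassState); drawGeometry() is a hypothetical stand-in for the real draw code, and the file and technique names are made up:

```cpp
#include <Cg/cg.h>
#include <Cg/cgGL.h>

extern void drawGeometry();  // hypothetical: the application's draw call

void drawWithEffect(CGcontext ctx) {
    cgGLRegisterStates(ctx);  // let CgFX apply GL render state from the file

    CGeffect fx = cgCreateEffectFromFile(ctx, "scene.cgfx", NULL);
    CGtechnique tech = cgGetNamedTechnique(fx, "Toon");  // pick by name
    if (cgValidateTechnique(tech) != CG_TRUE)
        return;  // technique not supported on this hardware

    // One loop over the passes replaces many explicit glUseProgram() calls.
    for (CGpass pass = cgGetFirstPass(tech); pass; pass = cgGetNextPass(pass)) {
        cgSetPassState(pass);    // binds this pass's programs and render state
        drawGeometry();
        cgResetPassState(pass);
    }
}
```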

Is my suggestion valuable?

But because flow control is expensive in the GPU pipeline, most of us use multiple shaders instead of one super-shader. And if we use lots of shaders, it becomes very hard to manage the source code and the program handles. In addition, calling glUseProgram() several times per frame may have some cost.
You can compile and link GL shaders intelligently to good effect. I don’t really see this as an issue.
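For example, GLSL shader objects can be compiled once and attached to any number of programs, which already goes a long way toward managing lots of shaders. A sketch, assuming a GL 2.0 function loader is in place; error checking omitted:

```cpp
#include <GL/gl.h>  // assumes a GL 2.0 function loader is in place

// One way to manage many shaders: compile shared pieces once and
// attach the same shader object to several programs.
GLuint buildProgram(GLuint sharedLightingShader, const char *mainSrc) {
    GLuint fs = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(fs, 1, &mainSrc, NULL);
    glCompileShader(fs);

    GLuint prog = glCreateProgram();
    glAttachShader(prog, sharedLightingShader);  // reused, compiled once
    glAttachShader(prog, fs);                    // per-effect main()
    glLinkProgram(prog);
    return prog;
}
```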

In NVIDIA Cg there are "technique" and "pass" constructs, similar to a multi-pass rendering pipeline, in a single file: CgFX. We can select a technique by name before drawing something, and we can also choose which pass to use to generate the effect we want.
Actually that was borrowed from D3D Effects (fx), but they’ve added their own twist to it, I believe.

Techniques as a mechanism are easy to create yourself, as a layer on top of Cg/HLSL; it’s just an encapsulation of render state and a constant table manager. What’s nice is that the core API itself exposes a semantic that the shader can use to communicate information to the renderer. I believe that’s all that’s needed in GLSL, though I’m not at all opposed to some more higher level stuff.
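To make that concrete, here is a minimal sketch of such a layer over plain GLSL; every name in it (Pass, Technique, applyState) is invented for the example, and nothing here needs driver support:

```cpp
#include <functional>
#include <string>
#include <vector>
#include <GL/gl.h>  // assumes a GL 2.0 function loader is in place

// One pass = a linked GLSL program plus the render state it expects.
struct Pass {
    GLuint program;                    // from glCreateProgram/glLinkProgram
    std::function<void()> applyState;  // e.g. blending, depth func, culling
};

// A technique is just an ordered list of passes under a name.
struct Technique {
    std::string name;
    std::vector<Pass> passes;
};

// Render with every pass of a technique, mimicking CgFX's technique loop.
void drawWithTechnique(const Technique &tech,
                       const std::function<void()> &drawGeometry) {
    for (const Pass &pass : tech.passes) {
        pass.applyState();            // set this pass's render state
        glUseProgram(pass.program);   // bind this pass's GLSL program
        drawGeometry();
    }
    glUseProgram(0);
}
```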

Perhaps you might use COLLADA FX?
http://www.khronos.org/collada/

Not used it myself…

Good suggestion; I will spend some time on COLLADA FX.

I think a more valuable lesson from Cg (in that it’s more difficult to implement as an abstraction layer) is literal uniforms. Basically, you tell the compiler that a constant isn’t going to change much (maybe it’s a quality knob on a control panel). That hints that it should be constant-folded, and cause a recompile every time it’s changed.

You can do some of this with #define, but then you’re reparsing the shader completely from scratch every time you change the uniform, plus you end up with awkwardly different interfaces for literal and non-literal uniforms.
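For comparison, the #define route looks something like this. The multi-string form of glShaderSource is real GL 2.0, while compileWithQuality() is a hypothetical helper:

```cpp
#include <cstdio>
#include <GL/gl.h>  // assumes a GL 2.0 function loader is in place

// Emulate a "literal uniform": paste a #define in front of the shader
// body and recompile whenever the knob changes. (If the body carried a
// #version directive, that line would have to stay first instead.)
GLuint compileWithQuality(const char *fragmentBody, int quality) {
    char define[64];
    std::snprintf(define, sizeof(define), "#define QUALITY %d\n", quality);

    const char *strings[2] = { define, fragmentBody };
    GLuint shader = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(shader, 2, strings, NULL);  // prepend the "literal"
    glCompileShader(shader);  // a full reparse, every single change
    return shader;
}
```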

That hints that it should be constant-folded, and cause a recompile every time it’s changed.
This is already going to happen in Longs Peak (except that it’s not a hint but a command, and you have to do the recompiling yourself when you change it). Personally, I prefer the Longs Peak method.

I love the DX/Cg technique-and-passes approach. Also the offline compilation, to partially protect my IP and to see the compiled code. I seriously miss annotations in GLSL too.

However, the new API approach is going to change all this, as Korval mentioned. See http://www.opengl.org/pipeline/article/vol002_1/

From my point of view, the best thing Cg possesses is interfaces; it would be nice to have something like this in GLSL.

Re effects systems, also see the recent Khronos press release regarding the formation of the “glFX” working group. This is a new activity for Khronos, so there’s nothing to say yet beyond the release - but there’s quite a bit of interest, and some solid starting points in existing projects.

Is there, or will there be a ‘bloom’ type effect in the future of OpenGL?

Is there, or will there be a ‘bloom’ type effect in the future of OpenGL?
One, OpenGL doesn’t have “effects” anymore; they (and the graphics industry) abandoned such things once programmable GPUs became possible. OpenGL exposes the programmable features of the GPU; nothing more.

Two, what is a “bloom” effect anyway? You’re going to have to provide more information than that to get an answer.

Three, I imagine that the crux of your question is whether GL has render-to-texture. The answer is yes. It’s had it for about a year now.
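For reference, "render-to-texture" here means framebuffer objects, via GL_EXT_framebuffer_object. A minimal sketch, with error checking and completeness tests omitted:

```cpp
#include <GL/gl.h>
#include <GL/glext.h>  // assumes an extension loader supplies the EXT entry points

// Create a texture and hook it up as the color buffer of an FBO,
// so subsequent rendering lands in the texture.
GLuint makeRenderTarget(int w, int h, GLuint *fboOut) {
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);

    glGenFramebuffersEXT(1, fboOut);
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, *fboOut);
    glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                              GL_TEXTURE_2D, tex, 0);
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);  // back to the window
    return tex;
}
```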

Bloom is a graphics effect from games like Oblivion and Medieval 2: Total War where the textures give the appearance of depth when you look at them dead-on. It’s a lighting effect, I suppose. When you look at it from the side it’s just as flat as any other texture, of course, but the effect, whether software or hardware rendered, is nice. I see it on all sorts of DirectX stuff now.

Sorry about my assumption. I am here to learn, and as you can see, I’ve a long way to go. :smiley:

Bloom is a graphics effect from games like Oblivion and Medieval 2: Total War where the textures give the appearance of depth when you look at them dead-on.
That’s called “Bump mapping,” and OpenGL had that well before D3D, thanks to the GeForce 256 (the first GeForce) and its register combiners.

Generally, "bloom" (the shader effect; the stupid forum won’t let me link directly to the article because it has parentheses in the name) is what is referred to when a bright light appears to expand beyond its boundaries. The trick to that effect is that it is image based; the blooming will expand around anything that occludes part of it.
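In outline, the usual image-based recipe goes like this. A sketch only: every helper here (renderScene, filterPass, additiveBlend) is a hypothetical stand-in for application code, and the FBO/texture/program handles are assumed to be set up already:

```cpp
#include <GL/gl.h>
#include <GL/glext.h>  // assumes an extension loader for the EXT FBO calls

// Hypothetical application helpers.
extern void renderScene(GLuint targetFBO);             // draw scene into an FBO
extern void filterPass(GLuint srcTex, GLuint dstFBO,
                       GLuint program);                // full-screen quad pass
extern void additiveBlend(GLuint srcTex);              // blend onto framebuffer

// Classic image-based bloom. Because it operates on the rendered image,
// the glow spills around anything that partially occludes the light.
void renderFrameWithBloom(GLuint sceneFBO, GLuint sceneTex,
                          GLuint pingFBO, GLuint pingTex,
                          GLuint pongFBO, GLuint pongTex,
                          GLuint brightProg, GLuint blurHProg, GLuint blurVProg)
{
    renderScene(sceneFBO);                        // 1. scene to texture
    filterPass(sceneTex, pingFBO, brightProg);    // 2. keep bright pixels only
    filterPass(pingTex, pongFBO, blurHProg);      // 3a. separable blur, horizontal
    filterPass(pongTex, pingFBO, blurVProg);      // 3b. separable blur, vertical
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);  // 4. back to the window...
    additiveBlend(sceneTex);                      //    ...draw the scene,
    additiveBlend(pingTex);                       //    then add the glow on top
}
```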

Cut him some slack Korval, the kid’s just turned 12 years old.

lol From what I remember reading about it, it does the same sort of effect as bump-mapping without using so many system resources. If you tried to use bump-mapping on all the things the bloom effect is used on, your computer would probably lock up or something. I remember the games bump-mapping was used on; most people usually had to turn it off, if they could.

For all I know about the effects a video card can do, I might as well be 12. :smiley: I just came here because I know that OpenGL has always been superior, or at least far more resource-friendly, than DirectX. I remember back in the day when a lot of games gave you a choice between running in DirectX or OpenGL. You would have been an idiot to choose DirectX over OpenGL.

I…am…speechless.
What do you want? Why are you contributing to a thread called “GLSL should learn something from NVIDIA Cg”? What’s going on?
Korval’s linked to a page telling you what bloom is, but you still don’t seem to have the faintest idea what you’re talking about.
http://tinyurl.com/op4fn

:smiley: