Pixel Shaders with Cg

Is it true that pixel shaders don’t work for OpenGL in Cg right now?
When will they?

NVidia will release a profile soon.
It’s doubtful whether any other vendor will release a profile at all - so you’ll only be able to use Cg on NVidia cards.

GL_ARB_vertex_program is out soon, and www.cgshaders.org states that it works on all GL 1.4 hardware, so vertex functions can be coded soon. Pixel functions, well… GL 1.5.

  1. Isn’t there a good chance NVidia will compile Cg down to GL2 once it becomes more standard?

  2. When will OpenGL 1.5 be released?

  3. What to do if I want to write OpenGL pixel shaders right now?

-Ninja

What to do if I want to write OpenGL pixel shaders right now?

You’ll have to use the vendor-specific extensions (ATI_fragment_shader, or NV_texture_shader/2/3 together with NV_register_combiners/2).
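
For illustration, a single-stage register combiner setup in C looks roughly like the sketch below - just modulating texture 0 with the primary colour, i.e. the simplest thing you’d otherwise do with standard texenv. This is only a sketch: it assumes the extension is present and its entry points have already been fetched from the driver (via wglGetProcAddress or an extension loader), and the function name is made up for the example.

    /* Sketch: NV_register_combiners, out = texture0 * primary colour.
       Assumes the extension entry points are already available. */
    #include <GL/gl.h>
    #include <GL/glext.h>

    void setup_modulate_combiner(void)   /* hypothetical helper name */
    {
        glEnable(GL_REGISTER_COMBINERS_NV);
        glCombinerParameteriNV(GL_NUM_GENERAL_COMBINERS_NV, 1);

        /* General combiner 0, RGB portion: A = tex0, B = primary, A*B -> spare0 */
        glCombinerInputNV(GL_COMBINER0_NV, GL_RGB, GL_VARIABLE_A_NV,
                          GL_TEXTURE0_ARB, GL_UNSIGNED_IDENTITY_NV, GL_RGB);
        glCombinerInputNV(GL_COMBINER0_NV, GL_RGB, GL_VARIABLE_B_NV,
                          GL_PRIMARY_COLOR_NV, GL_UNSIGNED_IDENTITY_NV, GL_RGB);
        glCombinerOutputNV(GL_COMBINER0_NV, GL_RGB,
                           GL_SPARE0_NV, GL_DISCARD_NV, GL_DISCARD_NV,
                           GL_NONE, GL_NONE, GL_FALSE, GL_FALSE, GL_FALSE);

        /* Final combiner computes A*B + (1-A)*C + D, so set A = spare0,
           B = 1 (zero with unsigned-invert mapping), C = D = 0 */
        glFinalCombinerInputNV(GL_VARIABLE_A_NV, GL_SPARE0_NV,
                               GL_UNSIGNED_IDENTITY_NV, GL_RGB);
        glFinalCombinerInputNV(GL_VARIABLE_B_NV, GL_ZERO,
                               GL_UNSIGNED_INVERT_NV, GL_RGB);
        glFinalCombinerInputNV(GL_VARIABLE_C_NV, GL_ZERO,
                               GL_UNSIGNED_IDENTITY_NV, GL_RGB);
        glFinalCombinerInputNV(GL_VARIABLE_D_NV, GL_ZERO,
                               GL_UNSIGNED_IDENTITY_NV, GL_RGB);
    }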

I’m not sure if it’s worth bothering with Cg in OpenGL, to be honest.

  1. It only works on NVidia hardware in OpenGL. Other vendors probably won’t support it… everyone’s got their own agenda… just look at the proliferation of vendor-specific extensions to see what I mean.
  2. It’s very limited on even the most up-to-date NVidia hardware.
  3. Because of these limitations, you’ll have to learn about vertex programs and register combiners anyway, just to understand why you can’t do what you want to do in Cg - which somewhat defeats the point.
  4. There’s far more documentation and help available for using register combiners, vertex programs, fragment shaders etc. at the moment, and probably will be for a good while yet (lots of gotchas etc.).
  5. It looks like it’s far more limiting than the shader language proposed in the OpenGL 2.0 specs… so why bother?

You get the point?

It looks like it will be quite useful in D3D though.

  1. That won’t be true when ARB_vertex_program comes out. And if ARB_fragment_program comes out as well, then pixel “shaders” will work on all GL hardware that supports it.

  2. Only limited by the quality of the code produced by the compiler, which will get better.

  3. Nothing new there. Try to write and debug a full game project without knowing anything about the assembler on the target hardware. It’s a bit hard, but you don’t need to be a god at it; that’s where the compiler does the work for you. Knowledge of the hardware still helps immensely with problems and debugging.

  4. Well yeah, but how much documentation and how many examples were there when vertex programs first came out? I remember doing stuff with them on Detonator version 7, and there wasn’t much material about.
    http://opengl.nutty.org/cgtest1.zip My first example

  5. Not really; there’s stuff unsupported as yet. But it’s basically C, and you can do anything in C. There’s no reason why further language support can’t come out later to make it easier to code for future hardware.

Nutty

Why would someone choose Cg over the gl2 shading language?
3Dlabs’ Wildcat VP does support that shading language now (to a large extent). The new Matrox chipset also looks as if it could. The GL2 shading language has really cool things in it, like being able to take the frame-buffer pixel value as input to your shader, thus unifying combiners and blending.
On current hardware (i.e. non-GL2-compliant) pixel/vertex shaders are not at all difficult to write using the basic assembly-like opcodes that have been introduced, because hardware limits keep them under a certain size and complexity anyway.
As soon as those hardware limits are overcome, the hardware will probably be GL2-compliant, and therefore able to use the GL2 shading language… where does that leave Cg? Looking pretty dated, I would imagine.
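
To show what those assembly-like opcodes look like in practice, here is a rough C sketch of loading a trivial program (transform the position, pass the colour through) via ARB_vertex_program. Since that extension isn’t shipping everywhere yet, treat the entry points and tokens as assumptions until your driver exposes them.

    /* Sketch: load a trivial ARB_vertex_program (transform + pass colour).
       Assumes the ARB_vertex_program entry points are already obtained
       from the driver (wglGetProcAddress or similar). */
    #include <stdio.h>
    #include <string.h>
    #include <GL/gl.h>
    #include <GL/glext.h>

    static const char *vp_src =
        "!!ARBvp1.0\n"
        "PARAM mvp[4] = { state.matrix.mvp };\n"
        "DP4 result.position.x, mvp[0], vertex.position;\n"
        "DP4 result.position.y, mvp[1], vertex.position;\n"
        "DP4 result.position.z, mvp[2], vertex.position;\n"
        "DP4 result.position.w, mvp[3], vertex.position;\n"
        "MOV result.color, vertex.color;\n"
        "END\n";

    void load_vertex_program(void)   /* hypothetical helper name */
    {
        GLuint prog;
        glGenProgramsARB(1, &prog);
        glBindProgramARB(GL_VERTEX_PROGRAM_ARB, prog);
        glProgramStringARB(GL_VERTEX_PROGRAM_ARB, GL_PROGRAM_FORMAT_ASCII_ARB,
                           (GLsizei)strlen(vp_src), vp_src);

        /* A parse error leaves its offset and message queryable */
        if (glGetError() == GL_INVALID_OPERATION) {
            GLint pos;
            glGetIntegerv(GL_PROGRAM_ERROR_POSITION_ARB, &pos);
            fprintf(stderr, "VP error at %d: %s\n", pos,
                    (const char *)glGetString(GL_PROGRAM_ERROR_STRING_ARB));
        }
        glEnable(GL_VERTEX_PROGRAM_ARB);
    }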

This is all irrelevant anyway, as gamers hardware is always eons behind the current generation.

>>where does that leave Cg? Looking pretty dated, I would imagine.<<

Nah mate, they’ll release an updated Cg spec.
Personally, this is related to what sucks the most about Cg.

Q: Why is there no support for looping and similar capabilities?
A: Because NVidia cards don’t support it yet (no other reason).

What a very fair standard that is.

(adapted from Blackadder, where the Witchsmeller explains how they tell if a person’s a witch)

Why would someone choose Cg over the gl2 shading language?

Erm… because Cg is here and working right this moment, and GL2 isn’t - as far as consumer hardware goes, anyway.

I don’t think you understand precisely what Cg is. It’s not an API. It’s a language, and languages don’t go out of date just because some new hardware comes out.

That’s like saying we’re going to have to start coding Unix in Java because C is out of date.

Why are there no loops or branching? Well, probably because no hardware at all supports them, except the P10-based cards, but these features will probably appear in the compiler pretty soon anyway, with the profile determining whether they can be used or not.

On current hardware vertex/fragment programs are not difficult, true, but Cg creates a common interface to all of them, including GL2 as well, when it eventually arrives and is implemented on consumer-class hardware. There’s nothing stopping you from using Cg and OpenGL 2.0 together.
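
To give an idea of what that common interface looks like on the GL side, here’s a rough sketch driving a pass-through vertex shader through the Cg runtime. The entry points and profile handling follow the Cg runtime docs as I understand them, so treat them as assumptions, and the shader source itself is just a hypothetical example.

    /* Sketch: compile and bind a Cg vertex shader via the Cg GL runtime.
       Entry points per the Cg runtime docs - check your toolkit release. */
    #include <Cg/cg.h>
    #include <Cg/cgGL.h>

    static const char *cg_src =
        "void main(float4 pos : POSITION, float4 col : COLOR0,\n"
        "          out float4 hpos : POSITION, out float4 ocol : COLOR0,\n"
        "          uniform float4x4 mvp)\n"
        "{\n"
        "    hpos = mul(mvp, pos);\n"
        "    ocol = col;\n"
        "}\n";

    void setup_cg_vertex_shader(void)   /* hypothetical helper name */
    {
        CGcontext ctx = cgCreateContext();
        /* Let the runtime pick the best vertex profile the hardware supports */
        CGprofile profile = cgGLGetLatestProfile(CG_GL_VERTEX);
        CGprogram prog = cgCreateProgram(ctx, CG_SOURCE, cg_src,
                                         profile, "main", NULL);
        cgGLLoadProgram(prog);
        cgGLEnableProfile(profile);
        cgGLBindProgram(prog);

        /* Track the current modelview-projection matrix into the mvp uniform */
        CGparameter mvp = cgGetNamedParameter(prog, "mvp");
        cgGLSetStateMatrixParameter(mvp, CG_GL_MODELVIEW_PROJECTION_MATRIX,
                                    CG_GL_MATRIX_IDENTITY);
    }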

Nutty

>>Why are there no loops or branching? Well, probably because no hardware at all supports them, except the P10-based cards,<<

What a very fair standard that is


Edmund: Witchsmeller, my dear, if you do happen to come across someone who’s a bit – you know, um – witchy, how do you prove him guilty?

Witchsmeller: By trial or by ordeal.

Edmund: Ah, the ordeal by water…

Witchsmeller: No, by axe.

Edmund: Oh!

Witchsmeller: The suspected witch has his head placed upon a block, and an axe aimed at his neck. If the man is guilty, the axe will bounce off his neck, so we burn him; if he is not guilty, the axe will simply slice his head off.

Edmund: What a very fair test that is.

Cg is NVidia-specific, Nutty. So no generic solution exists until GL2.

I don’t think you understand precisely what Cg is. It’s not an API. It’s a language, and languages don’t go out of date just because some new hardware comes out.

I understand it’s supposed to be a language. But it’s a language with some major restrictions (no looping or branching). I’m sorry, but even COBOL had branching, so I’m loath to call Cg a language in its current state.

but Cg creates a common interface to all of them, including GL2 as well.

No it doesn’t. It creates an interface that will only work on NVidia hardware in OpenGL, until the ARB introduces a fragment_shader extension - and people will always complain that that is too limiting (NVidia and ATI have vastly different hardware capabilities): “use register combiners, they’re more powerful!”… just the same arguments people came up with when D3D introduced pixel shaders - “they’re not as powerful as register combiners!”.

But it all comes down to choice. Do you want NVidia to control the capabilities of your shading language, or do you want a committee of vendors to agree on a standard?

First: hi Nutty.

Originally posted by Nutty:
Erm… because Cg is here and working right this moment, and GL2 isn’t - as far as consumer hardware goes, anyway.

Yeah, okay, the current compilers are crap, but that will (I hope) soon not be an issue. Currently it’s the best, because it’s the only thing to use. BUT coding the stuff in assembler is not THAT hard, because most programs are rather short - they can’t get THAT long anyway.
Nonetheless, it’s here, and for now it’s the best thing to have…

I don’t think you understand precisely what Cg is. It’s not an API. It’s a language, and languages don’t go out of date just because some new hardware comes out.

Yes, a language, but a language which is not yet working to its own standard. Cg will change for every piece of additional hardware with an additional feature. Old Cg code still works, but backward compatibility is not provided by Cg - it can’t be. That’s no fault of Cg (except that it’s too early/useless, because GL2 is what Cg wants to be the moment the hardware supports it, etc.), it’s a fault of the hardware. Today’s hardware is too restricted (mainly the “pixel shaders”, which you can’t yet call pixel programs, not even pixel functions… more like extended texture stages…).

And I don’t like to use a language which is not a standard.

That’s like saying we’re going to have to start coding Unix in Java because C is out of date.

No it’s not. We would choose C#.

Why are there no loops or branching? Well, probably because no hardware at all supports them, except the P10-based cards, but these features will probably appear in the compiler pretty soon anyway, with the profile determining whether they can be used or not.

And the moment loops come in, the first incompatible Cg code will appear, and version conflicts are here again… like HTML on web pages, like DLLs in Windows, like GL 1.0, 1.1, 1.2, 1.3, 1.4 etc… like DX. It’s nothing better…

On current hardware vertex/fragment programs are not difficult, true, but Cg creates a common interface to all of them, including GL2 as well, when it eventually arrives and is implemented on consumer-class hardware. There’s nothing stopping you from using Cg and OpenGL 2.0 together.

Two parts. First: fragment programs on today’s hardware will not fit into GL_ARB_fragment_program - that’s why NVidia is saving it for GL 1.5… we will have to code a fallback for GF3/GF4 (and pre-GF3 as well) manually, I guess. So the common interface is ****ed up… Cg’s common interface is as common as GL is, so, well, it doesn’t help us there much (for the community, I mean… (hm?!)).

Second: why use Cg on OpenGL 2 hardware? Cg for GL2 will feature a lot of stuff that will not compile for non-GL2 hardware anyway, so it’s no different from using the GL2 language directly.
There’s nothing stopping me from doing this, except that it’s useless…

Cg is cool - right now. But I don’t see more than a (very cool) nvparse in it. Not now, not for the future. Sorry…
www.davepermen.net <<soon

Originally posted by Ninja:
2. When will OpenGL 1.5 be released?

After GL 1.4?
When will GL 1.4 be released? Soon? I don’t know - ask Matt or Cass, they possibly know…

…and still you people refuse to consider Direct3D - all I can say is you must know an awful lot of people running Linux/IRIX! I know game developers don’t…

There is no prerequisite that states all programming languages must have loops and branches. NVidia said they will appear, but at this moment in time it’s pointless putting them in. They’d rather get it out for people to use now.

No it doesn’t. It creates an interface that will work on NVidia hardware in OpenGL,

I’m sorry, but it does create a common interface. If you look at the DX side of things, it will work on any DX8-compatible hardware.

Secondly, there is nothing stopping ATI from releasing a profile for ATI_VertexProgram, or whatever it’s called, before ARB_vertex_program.

And secondly, I’m not refusing to consider D3D. Again, using Cg has benefits there: provided suitable fragment program profiles come out, you can use the exact same Cg shader for D3D and OpenGL.
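
To make that concrete: the same Cg source can in principle be compiled against different profiles, roughly as in the hypothetical sketch below, where only the profile constant changes between an OpenGL build and a DirectX 8 build. The profile names are taken from the Cg docs as I understand them, the helper names are made up, and the D3D-side loading calls are left out since they differ per runtime.

    /* Same Cg source, two profiles: ARB_vertex_program for GL, vs_1_1 for
       DirectX 8. Only the profile passed to the compiler changes. */
    #include <Cg/cg.h>

    CGprogram compile_for_gl(CGcontext ctx, const char *src)
    {
        return cgCreateProgram(ctx, CG_SOURCE, src,
                               CG_PROFILE_ARBVP1, "main", NULL);
    }

    CGprogram compile_for_d3d8(CGcontext ctx, const char *src)
    {
        return cgCreateProgram(ctx, CG_SOURCE, src,
                               CG_PROFILE_VS_1_1, "main", NULL);
    }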

I really don’t understand why so many people bash it just because it was created by a single company. I still think it’s going to be a fair while before GL2 appears on NVidia/ATI hardware. Once some decent common interfaces for vertex and fragment programs arrive, Cg will work across D3D and OpenGL on all hardware that supports those interfaces. AND even if vendors don’t support the common interfaces, they can still make a profile for their vendor-specific extensions.

Nutty

Nutty, NVidia will not enable looping/branching until their hardware can support it; then they will enable it, even if other vendors aren’t able to support it.
It just isn’t right that NVidia decides this.

As for whether it’s a language or not, I’d say it’s a very limited script, not a programming language. Without branching and looping, you’re basically feeding values into a script, which can then apply a small subset of mathematical functions to those values and output results. That’s a programming language? In the same way HTML is a programming language? We’ll have to agree to disagree on the usage of the phrase ‘programming language’.

NVidia will not enable looping/branching until their hardware can support it; then they will enable it, even if other vendors aren’t able to support it.

And?!? Well, you could say it’s not fair that GL2 will support features that other vendors can’t support in hardware. It’s the same situation. Will you call the ARB unfair because gl2BlahBlahBlah is only implemented in hardware on certain cards?

As to whether it is a programming language or not, that’s neither here nor there. It’s based on C with a few things removed. Does that suddenly make it not a language? I personally don’t think so.

It just isn’t right that NVidia decides this.

Why isn’t it right? NVidia are in a free country, are they not? They have the right to do what they want. IF it’s a bad mistake, they’ll pay for it by losing support - and perhaps a couple of million dollars of R&D down the drain.

IS it fair that M$ dictates what goes in DX? WAS it fair that SGI dictated the original OpenGL API all those years ago?

BTW, please don’t think I’m having a go. It’s just that lots of people seem to bash Cg, so I thought I’d play devil’s advocate, as not many people seem to be supporting it!

Personally I think it’s good - provided that we get fragment profiles for GL, and other vendors participate.

Nutty

I really don’t understand why so many people bash it just because it was created by a single company. I still think it’s going to be a fair while before GL2 appears on NVidia/ATI hardware.

Maybe because of the possibility that GL2 will not appear on NVidia hardware, and NVidia will use Cg as an excuse not to support GL2. I would have been much more enthusiastic about Cg if NVidia had shown some commitment to GL2. This could be much worse than Glide vs. OpenGL, because 3dfx at least did support OpenGL. The difference between pure GL2 and GL 1.x plus NVidia extensions is very big, so we’ll have to either use two different APIs or choose D3D. And I am not talking just about the shader stuff.


Originally posted by knackered:
…and still you people refuse to consider Direct3D - all I can say is you must know an awful lot of people running Linux/IRIX! I know game developers don’t…

I’ve read some of your previous posts and I’m really wondering what you’re doing here on an OpenGL forum…

Are you perhaps working for Microsoft?
Maybe you’re the monkeyboy himself?
Or the king of nerds: Bill Gates?

Now please go away or I will send you a penguin with a red hat.