Cg - nice, but makes me dislike NVIDIA

3Dlabs has been working for quite a while on OpenGL 2.0, one of whose parts is exactly this kind of compiler. Now NVIDIA brings out its own implementation to take all the publicity before 3Dlabs gets any, and force everyone to use its own definition of the language, giving it an advantage in the market. It’s a classic Microsoft move, and I really dislike that.

No it’s NOT exactly this kind of compiler.

I can just see the next ten replies in this thread:

ET3D : YES, it is.

dorbie : NO, it’s not

ET3D : YES, it is.

dorbie : NO, it’s not

ET3D : YES, it is.

dorbie : NO, it’s not

ET3D : YES, it is.

dorbie : NO, it’s not

Please, people, if you are trying to argue about something, bring some proof.

Gorg, you’re a hoot. But funny as you are, please don’t put words into people’s mouths. You might find that they find it less funny than you think.

dorbie, if you’d be kind enough to explain the difference…

You know, Creative (3Dlabs) is not the only one working on OpenGL 2.0. ATI, NVIDIA, and a few other players are also in the ballgame.

No one is forcing you to use Cg, and as we all know it takes forever to ratify new specs, so until that happens, Cg is perfectly fine for now. The alternative is to make your own. You up for that?

Well, in fact, I’m already using DirectGraphics (which in some respects is a refreshing change after using OpenGL), and as I understand it, Cg will be part of DX9. I certainly don’t claim that what NVIDIA did isn’t effective or useful for developers. It does work well for Microsoft, and I’m sure it’ll work well for NVIDIA. It just hurts my sensibilities, that’s all. I keep my sensibilities separate from my development decisions (which is why I have no problem using DirectGraphics). It bothers me that in the public eye, NVIDIA will look like the original innovator, just like it bothers me that Microsoft can be considered innovative doing things that others have done before them.

[This message has been edited by ET3D (edited 06-22-2002).]

Gorg, if you have something to contribute then say it, otherwise you appear to have less to say than anyone else in this thread. It’s not my fault an obvious statement of fact confuses you because you know nothing about the subject being discussed.

ET3D, the specs are there; you can download them. Matt has some interesting comments in this thread: http://www.opengl.org/discussion_boards/ubb/Forum3/HTML/006690.html
I don’t see a need to start another one here.

[This message has been edited by dorbie (edited 06-22-2002).]

Personally I can’t see how Cg benefits MS at all; they’re coming out with their own standard with D3D9. HLSL != Cg (contrary to what some people have written). It’s a wonder MS allows NVIDIA to release this, but then again the relations between the two companies aren’t exactly rosy.

I don’t think this is an M$ scheme at all. This is a typical marketing endorsement; they might be in simply because of minor assistance with the D3D profile.

Sorry, guys, if you found my joke in bad taste.

I was actually interested in both of you bringing more to the discussion, because even though Cg and the OGL 2.0 shading language are different in design (what I think dorbie meant), they actually aim to achieve the same thing by giving a common interface for all hardware (what I think ET3D meant).

So if they try to achieve the same thing, aren’t they of the same kind? I am currently trying Cg, and I thoroughly read the OGL 2.0 specs when they came out, and I’d tend to say yes — and that one of them will at some point become redundant if Cg ever includes all the stuff from OGL 2.0 shaders. But I am not sure. Since you guys seem to have made up your minds, I wanted to know your thought process to see what I can take out of it.
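To show why they look like “the same kind” of language from a distance, here’s a sketch of a minimal Cg fragment program — syntax as I understand it from NVIDIA’s Cg toolkit examples, so treat it as an illustration rather than anything authoritative:

```c
// Minimal Cg fragment program (sketch): modulate a texture by the
// interpolated vertex color. The binding semantics after the colons
// (TEXCOORD0, COLOR0, COLOR) map parameters to hardware registers.
float4 main(float2 uv          : TEXCOORD0,
            float4 vertexColor : COLOR0,
            uniform sampler2D tex) : COLOR
{
    // tex2D samples the bound texture at the given coordinates
    return vertexColor * tex2D(tex, uv);
}
```

The OGL 2.0 proposal expresses roughly the same thing with its own type names and built-ins, which is exactly why the two look interchangeable at a glance, even if the designs underneath differ.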

There’s now ANOTHER thread discussing this, see “HLL vs HLSL” thread.

Originally posted by dorbie:
There’s now ANOTHER thread discussing this, see “HLL vs HLSL” thread.

I don’t really care about that.

My question is about when OGL 2.0 comes out. If Cg is already there with all the same features (looping, branching…), then what’s the point of the OGL 2.0 language? There has been some discussion on this, but I didn’t feel a proper answer was given.

Perhaps try to understand the points of discussion in those threads. If you dismiss the key differences, then of course everything looks equal.

[This message has been edited by dorbie (edited 06-23-2002).]