Response to "Cg won't work" op-ed in The Register

http://www.theregister.co.uk/content/35/25780.html

Yep, I’ve read the article too; the guy has some serious counter-arguments. :slight_smile:

I’m glad NVIDIA has produced Cg. It’s a bold step. I just hope to God that ATI, Matrox and 3Dlabs produce ‘profiles’, otherwise it will be utterly pointless.

Developer support will force them to create profiles.

If NVIDIA supplies a default profile for vanilla OpenGL and developers use Cg, ATI et al. will then be pressured to improve their performance by providing profiles that exploit their hardware and extensions.

That at least is one scenario NVIDIA is probably hoping for. You want profiles? Support Cg.

Or maybe they’ll create their own versions of Cg (yawn) and then support each other’s profiles. Much like loading an MS Word 6.0 file into WordPerfect.

Pardon my ignorance but…
What are profiles?
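
In short, a profile is the back-end target that the Cg compiler generates code for (vp20/fp20 for NVIDIA’s GL extensions, dx8vs/dx8ps for DirectX 8 shaders, and whatever other vendors choose to add), and the application picks one when it compiles a shader. Here is a rough sketch of what that looks like; the function and enum names are from NVIDIA’s Cg runtime headers and may not match the current beta exactly, and “shader.cg”/“main” are made-up names, so treat it purely as an illustration:

    // Illustrative only: compile one Cg source against an NVIDIA-specific
    // profile via the Cg runtime. CG_PROFILE_VP20 targets NV_vertex_program.
    #include <GL/glut.h>
    #include <Cg/cg.h>
    #include <Cg/cgGL.h>
    #include <cstdio>

    int main(int argc, char **argv)
    {
        glutInit(&argc, argv);
        glutCreateWindow("cg profile test");   // need a GL context before cgGL calls

        CGcontext ctx = cgCreateContext();
        CGprogram prog = cgCreateProgramFromFile(ctx, CG_SOURCE, "shader.cg",
                                                 CG_PROFILE_VP20, "main", NULL);
        if (!prog) {
            const char *listing = cgGetLastListing(ctx);
            printf("Cg compile failed:\n%s\n", listing ? listing : "(no listing)");
            return 1;
        }

        cgGLLoadProgram(prog);                 // hand the generated code to the driver
        cgGLEnableProfile(CG_PROFILE_VP20);
        cgGLBindProgram(prog);

        /* ...draw something... */

        cgDestroyContext(ctx);
        return 0;
    }

The point is that the application code doesn’t change per vendor; another vendor only has to supply its own CG_PROFILE_* target (and a driver path for it) for the same source to run on its hardware.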

Don’t you also feel that OpenGL is a large leap behind? I am a bit worried about the fragment shader functionality. How long will it take until we get a fragment shader for OpenGL? I want to use Cg or an equivalent, but as long as the output varies so much between profiles, it is difficult to make something generic…

The NV20 fragment profile is expected “soon”. Probably when NVIDIA finishes its NV_fragment_program extension, which will likely combine register combiners and texture shaders into one easier API, like ATI’s.

Nutty

I sure hope it is soon. The Cg fragment shader I wrote for OpenGL is just dying to be used. At first I didn’t know that the OpenGL fragment part of Cg was not working yet, and I just about pulled my hair out trying to figure out why in the heck the shader loading function was crashing horribly. Then I read the post on cgshaders.org which mentioned that fragment Cg programs under OpenGL aren’t finished. Doh! Maybe I’ll give myself a crash course on D3D so I can test it out. It’s been a long time since I last used D3D 8. Well, since it first came out, actually.

-SirKnight

Originally posted by SirKnight:
Maybe I’ll give myself a crash course on D3D so I can test it out.

No! You don’t have to do that to use fragment programs…

Read the post from Cass in the following topic: http://www.cgshaders.org/forums/viewtopic.php?t=24&start=20

Just use the dx8ps profile for fragment programs and use nvparse (the version that supports ps1.1, which should be at NVIDIA’s site ‘soon’) to set the register combiner and texture shader states, then switch to the fp20 profile when it’s done/available (with the next release at SIGGRAPH?).
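
For the curious, that workaround would look something like the sketch below. It follows the nvparse usage pattern from NVIDIA’s SDK samples; setup_fragment_state and ps11_text are made-up names, and whether cgc’s dx8ps output can really be fed to nvparse unmodified depends on that ps1.1-capable nvparse that’s supposedly coming ‘soon’, so take it as a guess at the shape of the code rather than a recipe:

    // Sketch: program the NV20 fragment path from DX8 ps.1.1 text until an
    // fp20 profile exists. 'ps11_text' is assumed to hold the output of
    //   cgc -profile dx8ps -entry main shader.cg
    #include <nvparse.h>   // from the NVIDIA SDK
    #include <GL/gl.h>
    #include <cstdio>

    // Enum values from the NV extension specs, in case glext.h isn't handy.
    #ifndef GL_REGISTER_COMBINERS_NV
    #define GL_REGISTER_COMBINERS_NV 0x8522
    #endif
    #ifndef GL_TEXTURE_SHADER_NV
    #define GL_TEXTURE_SHADER_NV 0x86DE
    #endif

    void setup_fragment_state(const char *ps11_text)
    {
        nvparse(ps11_text);   // needs the ps1.1-capable nvparse build

        // Standard nvparse error-reporting idiom.
        for (char *const *err = nvparse_get_errors(); *err; ++err)
            fprintf(stderr, "nvparse: %s\n", *err);

        // nvparse only loads the register combiner / texture shader state;
        // the application still has to enable both units (assumption).
        glEnable(GL_TEXTURE_SHADER_NV);
        glEnable(GL_REGISTER_COMBINERS_NV);
    }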

Given how truly little we know about NV30, this guy is making some pretty bold claims about nVidia having the best hardware. They just released GeForce 4, months after the Radeon 8500, and the 8500 is still, fundamentally, more versatile and more powerful than the GeForce 4. Given that, I don’t think it’s wise to be throwing around comments like this:

…and although Nvidia is the 800 lb gorilla of graphics, they also have the most interesting and innovative hardware currently on the way.

ATi beat them once before. They’ve got a pretty good head start (including a fragment shader API that is very extensible and flexible). To be D3D 9-compliant in the pixel shader department, ATi simply has to add a few more passes to the hardware. nVidia has to build an entirely new pixel pipeline. Granted, because of this, nVidia could build a very good one, but it is just as likely that ATi can build off of their current one and make one better than nVidia’s.

And that says nothing about what 3DLabs is bringing to the table.

You assume that the writer knows no more than you.

Anyway, I’m glad someone corrected the original article; it was terrible. You don’t have to pick favorites in the graphics card war to see that.

Very interesting to read: http://www.extremetech.com/article2/0,3973,183940,00.asp

Very interesting,

the battle lines are getting drawn :slight_smile:

Hmmm, that sounds pretty interesting, richardve. I might have to see if I can get that working; that’s if the new nvparse is online.

-SirKnight

Am I being paranoid, or has NVIDIA knocked up this Cg language because it’s unsure that it can produce competitive hardware that will support the shading language proposed in GL 2.0?

Originally posted by knackered:
Am I being paranoid, or has NVIDIA knocked up this Cg language because it’s unsure that it can produce competitive hardware that will support the shading language proposed in GL 2.0?

knackered, I think you’re just being paranoid.

I don’t know if NVIDIA can produce a competitive chip that supports the GL 2.0 shading language, but that being said, I don’t know if anybody can…

As far as I am concerned, Cg is a good thing and I don’t understand why people keep trying to put it down even before trying it. Bottom line is, if you don’t like it, stick to ASM-style shaders… BTW, why did you move to C/C++ instead of staying in the fantastic world of x86???

As to whether NV30 will be a truly fantastic chip or not (cf. Korval’s post), I’d say it will (which does not mean that ATI or someone else cannot produce something better).

Have I missed something important in the past few weeks? It looks like everyone has something against NVIDIA these days…

Regards.

Eric

knackered, there’s absolutely nothing to back up that suspicion, and something like Cg is definitely not “knocked up”. It looks like an intentionally minimalist low level approach to shader compilation.

Yes, ‘knocked up’ was a bad choice of words.
You’re right, I’ve no facts to back up my suspicion.
All I say is this: whenever D3D is mentioned on this newsgroup, you people have been quick to point out that OpenGL is an open API, governed by a body with no single commercial interest, whereas with D3D, Microsoft plays the tune that everyone must dance to. It seems to me that (albeit to a smaller extent) NVIDIA is attempting to do the same with OpenGL. They will govern what can be added to the language… but this is OK by you guys? It’s in some way morally different, is it?
Eric, get in the real world: nobody is anti-NVIDIA, it’s just healthy suspicion of a commercially driven organisation.
If you think NVIDIA thinks there’s room for more than one consumer hardware vendor, and that it’s “healthy competition”, then I suggest you read up on how to run a business successfully: high on the list of priorities is eliminating the competition.


Originally posted by knackered:
Eric, get in the real world: nobody is anti-NVIDIA, it’s just healthy suspicion of a commercially driven organisation.

I am sorry, but I am in the real world; you aren’t. I am very conscious of the commercial issues behind what NVIDIA is doing (although I think the guys who are developing Cg are not the ones who are commercially interested in it…).

You complain about the commercial side of things, yet this is something you should expect these days. Who’s in DreamLand, then?

Regards.

Eric