PDA

View Full Version : glslang... when?



pedrosl
07-04-2003, 03:27 AM
Hi,

Does anyone have any idea of when I, an NVIDIA GeForce user, will be able to work with the OpenGL Shading Language?

Thanks

YK
07-04-2003, 05:10 AM
GL_NEVER ???

pedrosl
07-04-2003, 05:23 AM
glEnable(GL_REALLY_REALLY_HOPE_NOT)

sqrt[-1]
07-04-2003, 04:48 PM
When you say "GeForce", what do you mean? GeForce FX? If so, then you can expect a glslang extension for it. If you mean GeForce 1-4, then no. (If you are just developing on a tight budget, you can't go past the cheap GeForce FX 5200, ~75 USD.)

pedrosl
07-05-2003, 02:14 AM
What do you mean? glslang won't be enabled on a GeForce 4? NOOOOOOOOOOOOOOO :(

Mazy
07-05-2003, 03:22 AM
The GF4 doesn't have the fragment pipeline to support it. Maybe (since the language is split in two) you can have glslang's vertex part?

Lurking
07-05-2003, 09:39 AM
I also can't wait to work with glslang. But until NVIDIA comes out with drivers that support it, I guess we just have to use asm.

SirKnight
07-05-2003, 12:36 PM
Originally posted by Lurking:
I also can't wait to work with glslang. But until NVIDIA comes out with drivers that support it, I guess we just have to use asm.

What's wrong with using Cg? Cg is a pretty nice tool to use atm. Sure beats having to write shaders in asm, IMHO.

-SirKnight

Lurking
07-05-2003, 06:18 PM
To be honest, I really don't have anything against Cg. It's just that, as much as I would like one standard setup for OpenGL, I would like it to be run by the ARB and not just one company (NVIDIA). I love NVIDIA, but when the ARB supports something, you know both NVIDIA and ATI will go all the way with those features. When you have NVIDIA running the shader language, you give NVIDIA all the power over decisions. This is one of the reasons I don't work with DirectX. If NVIDIA keeps going the way it is, ATI could take its place, and then we would be stuck with a static language, Cg. I also like the idea of bringing glslang into the core of OpenGL, which Cg can't do, because it supports both DirectX and OpenGL.

Robert Osfield
07-06-2003, 05:18 AM
Originally posted by Lurking:
I also can't wait to work with glslang. But until NVIDIA comes out with drivers that support it, I guess we just have to use asm.

You can use Cg as an offline tool and get it to compile shaders into ARB_fragment_program and ARB_vertex_program. This way you can keep your runtime clean of single-vendor APIs, without hacking in lower-level program strings.
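For anyone unfamiliar with that workflow: compiling offline with something like `cgc -profile arbfp1 shader.cg` (arbvp1/arbfp1 are the Cg profiles targeting the ARB programs; the filename is just an example) leaves you with a plain assembly string and no Cg dependency at run time. A trivial sketch of what ARBfp1.0 output looks like, modulating a texture by the interpolated color:

```
!!ARBfp1.0
# Sample texture unit 0 at the first interpolated texcoord set.
TEMP col;
TEX col, fragment.texcoord[0], texture[0], 2D;
# Modulate by the interpolated primary color and write the result.
MUL result.color, col, fragment.color;
END
```

A string like that can go straight to glProgramStringARB, so your shipping code only ever sees the ARB extensions.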

Once GL2 is widely available, you'll be able to clean up your workflow. I certainly look forward to that day, and hope that NVIDIA supports it as well as they currently support GL1.4.

Robert.

pedrosl
07-06-2003, 05:37 AM
I certainly look forward to that day, and hope that NVIDIA supports it as well as they currently support GL1.4.


Amen

JD
07-07-2003, 01:12 PM
Reading between the lines in NV's FAQs, I've gathered that NV is feverishly working on GL2. I think when the NV40/R400 cards arrive we'll have some GL2 functionality in their drivers.

Lurking
07-07-2003, 03:40 PM
I guess my question is why we wouldn't have that functionality in the current NV35s, seeing that these cards have the power to handle fragment shaders well. I just got my 5900 Ultra and am very happy with it. I don't see why you wouldn't be able to use glslang on any NV3x model card.

Azrael
07-08-2003, 03:15 AM
ATI already has glslang support in their latest drivers (Catalyst 3.5), and it works.
Somebody in here already posted something about that. You can get a Delphi version of it
from DelphiGL (http://www.delphigl.com).

At the moment I prefer ASM shaders, because the glslang part inside the Catalyst driver is not official, and all the functions changed when updating from Cat 3.4 to 3.5.
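Since the entry points are shifting between driver releases, it's worth gating any glslang path on the extension string at run time before fetching function pointers. A minimal sketch (the helper name is mine; in a real program the first argument would come from glGetString(GL_EXTENSIONS)):

```c
#include <string.h>

/* Return 1 if `name` appears as a complete, space-delimited token in
   `extensions`. A plain strstr() is not enough: it would also match
   when `name` is merely a prefix of a longer extension name. */
int has_extension(const char *extensions, const char *name)
{
    size_t len = strlen(name);
    const char *p = extensions;

    while ((p = strstr(p, name)) != NULL) {
        int starts_token = (p == extensions) || (p[-1] == ' ');
        int ends_token   = (p[len] == '\0') || (p[len] == ' ');
        if (starts_token && ends_token)
            return 1;
        p += len;  /* false hit; keep scanning */
    }
    return 0;
}
```

Then something like `has_extension((const char *)glGetString(GL_EXTENSIONS), "GL_ARB_shader_objects")` tells you whether the glslang entry points are even advertised before you go looking for them.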

see you,
Azrael.

M/\dm/\n
07-08-2003, 03:27 AM
Cg is NOT NV-only: it works with ARB_fragment_program too; the pure fragment_program profile is just the only one ATI supports. Moreover, all you get in the end is ASM, and as far as I know it should work with GF3/GF4 under the fp20 profile.
But I think glslang can be put on any card that has register combiners, although that's a tough task.
As stated, Cg is NV's gift to the world, so why shouldn't we use it?

pedrosl
07-08-2003, 03:53 AM
Another thing: Doom3 is using ARB_vertex & fragment programs, right?

matt_weird
07-09-2003, 08:20 AM
pedrosl:
Another thing: Doom3 is using ARB_vertex & fragment programs, right?

Here's what JC wrote in his .plan file:


The R300 can run Doom in three different modes: ARB (minimum extensions, no specular highlights, no vertex programs), R200 (full featured, almost always single pass interaction rendering), ARB2 (floating point fragment shaders, minor quality improvements, always single pass).

The NV30 can run DOOM in five different modes: ARB, NV10 (full featured, five rendering passes, no vertex programs), NV20 (full featured, two or three rendering passes), NV30 (full featured, single pass), and ARB2.

Some "funny" facts about ARB vertex/fragment programs, from the specs:



ARB_vertex_program

Contact Pat Brown, NVIDIA Corporation

IP Status: NVIDIA claims to own intellectual property related to this extension, and has signed an ARB Contributor License agreement licensing this intellectual property.

Status: Microsoft claims to own intellectual property related to this extension.


ARB_fragment_program

Contact Benj Lipchak, ATI Research

IP Status: Microsoft claims to own intellectual property related to this extension.

:-/ ;)


M/\dm/\n:

As stated Cg is NVs gift to the world, so why we shouldn't use it?


Thinking this way, Direct3D is Microsoft's gift to the world too, so why shouldn't we use it? ;) :P

[This message has been edited by nasty_moderator :P :D (edited 07-09-2003).]

[This message has been edited by matt_weird (edited 07-09-2003).]

Nutty
07-10-2003, 04:15 AM
Because it ties you down to a single operating platform, and forces you to wait for the next version of D3D before new hardware features are revealed.

Cg has neither of these limitations... Cg is a tool, not an API like OpenGL or D3D.

matt_weird
07-10-2003, 09:17 AM
Nutty ;), that was sarcastic ;), mind you ;), and the point was about the "gift" basis ;), not about whatsoever it is ;). Should I put some more winks next time to prove it? ;) ;)

SirKnight
07-10-2003, 09:40 AM
That's a lot of winking faces!

-SirKnight

Jan
07-10-2003, 12:12 PM
[This message has been edited by nasty_moderator (edited 07-09-2003).]

pedrosl
07-10-2003, 01:31 PM
Just another thing: do you guys know of any demo showing bump mapping with Cg (in OpenGL) using NORMAL MAPS (not height maps)?

Thanks

jra101
07-10-2003, 02:16 PM
Yes, check out the cg_bump_mapping demo.
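If you just want the gist of it, the core of a normal-map shader is only a few lines. A hypothetical minimal Cg fragment program (a sketch, not the actual demo code; it assumes the vertex program has already rotated the light vector into tangent space and passed it through TEXCOORD1):

```
// Sketch of tangent-space normal-map diffuse lighting.
float4 main(float2 uv        : TEXCOORD0,
            float3 lightVecT : TEXCOORD1,   // tangent-space light vector
            uniform sampler2D normalMap) : COLOR
{
    // Normal maps store unit normals packed into [0,1]; expand to [-1,1].
    float3 N = normalize(tex2D(normalMap, uv).rgb * 2.0 - 1.0);
    float3 L = normalize(lightVecT);

    // Standard diffuse term; clamp so back-facing texels go black.
    float diff = max(dot(N, L), 0.0);
    return float4(diff, diff, diff, 1.0);
}
```

The difference from a height-map approach is just that the per-texel normal is read directly from the texture instead of being derived from height differences.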

Lurking
07-10-2003, 10:30 PM
Grrr... waiting for drivers sucks. I've been waiting a long time to try out glslang, yet didn't want to put my money on a Wildcat VP. Now that it's released (at least by the ARB), I have to wait for NVIDIA to come out with drivers that support it, and I don't know how long that will take. I'm just giddy for the new language. Yes, I've tried Cg and like it, but I want to try glslang too! Hurry, NVIDIA, before I explode! ; ^ )

- Lurking

velco
07-10-2003, 11:03 PM
Originally posted by M/\dm/\n:
Moreover all you get in the end is ASM, and as far as I know it should work with GF3/GF4 under fp20.

It does not. See the Cg User's Guide, Appendix B, "OpenGL NV_texture_shader and NV_register_combiners Profile (fp20)".

~velco