
View Full Version : Any ideas when Nvidia will release their OpenGL drivers that include GLSL?



fuxiulian
01-09-2004, 04:38 AM
I have been waiting for over a month now... I am not in the mood to learn any sort of assembler-like language, since I already did that with Intel asm and I don't use it anymore. I am not fond of using Cg either, since it doesn't work well on anything but Nvidia hardware. I want a clean start with GLSL, but it seems Nvidia is not very devoted to supporting this new ARB extension...

Any ideas how long it will take for them to release their new drivers with GLSL support?

Zengar
01-10-2004, 02:57 AM
I don't like it either, but I find it very wise of Nvidia to delay glSlang support. Maybe we'll have it in the NV40 launch driver.

PanzerSchreck
01-10-2004, 04:13 AM
Originally posted by Zengar:
I don't like it either, but I find it very wise of Nvidia to delay glSlang support. Maybe we'll have it in the NV40 launch driver.

What's so wise about that? The only area where NV was still ahead of ATI was developer support, and now they're falling behind ATI even there. So after designing not-so-top-notch hardware, losing one exclusive partner after another (Abit was the latest) and cheating their customers, they're now struggling with developer support as well.

The way ATI went with their beta glSlang support is the best route a hardware vendor could take (NV did something similar with NV30 emulation, for example), so I totally can't understand why it takes NV so long to implement it in their oh-so-great drivers.
Sure, there are glSlang features you can't use on current ATI hardware, but at least you can get started with glSlang.

I won't say that NV will go the same way as 3dfx, because they don't rely solely on graphics hardware. But they aren't cutting a good figure right now.

P.S.: Sorry for my off-topic rant, but I really don't understand what NV is trying to do... self-destruction is surely not the way to success.

P.P.S.: I read somewhere that NVidia intended to release drivers with glSlang support by the end of the year. But as you may have noticed, it just didn't happen ;)


eirikhm
01-20-2004, 01:24 PM
and they keep waiting...

Korval
01-20-2004, 02:46 PM
The way ATI went with their beta glSlang support is the best route a hardware vendor could take (NV did something similar with NV30 emulation, for example), so I totally can't understand why it takes NV so long to implement it in their oh-so-great drivers.
Sure, there are glSlang features you can't use on current ATI hardware, but at least you can get started with glSlang.

Well, there's something to be said for being able to trust your glSlang compiler. I would rather wait a few weeks or months for a solid release with a mostly bug-free compiler than deal with one that is moderately buggy.

Granted, ATi's glslang support, not including features that require emulation, is pretty decent.


mw
01-22-2004, 04:02 AM
Given the fact that many errors will only be found by users trying unconventional stuff, I think it's better to release a beta version of an extension - but tell people that the implementation is faulty.
If I know that glSlang support isn't finished, I won't use it in a commercial application - but at least I can play with it.

imr1984
01-23-2004, 02:48 PM
I could be wrong, but Mesa 6 has support for OpenGL 1.5, so this means glSlang is available to everyone.

PanzerSchreck
01-23-2004, 03:18 PM
Originally posted by imr1984:
I could be wrong, but Mesa 6 has support for OpenGL 1.5, so this means glSlang is available to everyone.

That's wrong. glSlang is not an OpenGL 1.5 feature, so it's not included with Mesa 6.

imr1984
01-23-2004, 11:47 PM
I stand corrected. glSlang is part of GL2 then?

Corrail
01-24-2004, 02:36 AM
Yes, glSlang is a core part of GL2, but for now it is available through extensions.

Zengar
01-25-2004, 02:02 AM
The reason Nvidia is delaying glSlang may be that it isn't compatible with Nvidia's hardware. NV40, if we can believe the rumours, would also contain fixed-point math units (16-bit precision), and glSlang doesn't reflect that. It's the same as with DirectX 9.0 - originally it was supposed to contain fixed-point, then it was dropped, and Nvidia, which had already designed NV30, had no time to change it. I think fixed-point math is a very powerful feature - all normal calculations can be carried out with it at very high quality. 16-bit fixed point grants roughly the same precision as ATI's 24-bit floats.
No wonder Nvidia isn't quick to support a feature set that doesn't match its hardware.
glSlang has several serious issues - that's my opinion - and I don't understand why everyone is so fond of it. The idea is very nice, but the language is a bit shabby.

lc_overlord
01-25-2004, 04:09 AM
FX cards do support the features of glSlang.
The glSlang implementation is really just a compiler that compiles glSlang code into GPU assembly (see the sketch below).
In theory you should be able to run glSlang shaders on a GF3 (although not as blazingly fast as you'd like).
I think the reason Nvidia hasn't released it yet is that they want to ship a fully functional version instead of a beta like ATI did.
I don't think they will release the new drivers at the launch of NV40, since that's too far off.
My guess is that Nvidia will release a glSlang-supporting driver within a month.
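
For reference, "only a compiler" still means you drive it through the ARB_shader_objects API. A minimal sketch in C, assuming a current GL context and that the ARB entry points have already been resolved (e.g. via wglGetProcAddress); error handling is trimmed:

/* Minimal sketch: compiling a GLSL fragment shader through
   ARB_shader_objects / ARB_fragment_shader. */
#include <stdio.h>
#include <GL/gl.h>
#include <GL/glext.h>

static const GLcharARB *fragSrc =
    "void main() {\n"
    "    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);\n"
    "}\n";

GLhandleARB compileFragmentShader(void)
{
    GLhandleARB shader = glCreateShaderObjectARB(GL_FRAGMENT_SHADER_ARB);
    GLint compiled = 0;
    char log[4096];

    glShaderSourceARB(shader, 1, &fragSrc, NULL);
    glCompileShaderARB(shader);

    glGetObjectParameterivARB(shader, GL_OBJECT_COMPILE_STATUS_ARB, &compiled);
    if (!compiled) {
        /* The driver's compile errors end up in the info log. */
        glGetInfoLogARB(shader, sizeof(log), NULL, log);
        fprintf(stderr, "GLSL compile failed:\n%s\n", log);
    }
    return shader;
}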

Zengar
01-25-2004, 06:21 AM
I mean that glSlang is not compatible with Nvidia's hardware, not the other way round. I just recalled all the PACK/UNPACK instructions. They ARE useful, but I don't see support for them in glSlang. Don't take me too seriously, but:
Why should Nvidia support a shading language that doesn't match its hardware? It would be the same situation as with DX.
Since different programmable hardware has different features, it is far too big a simplification to create the one SL. It will need an extension mechanism of its own if it wants to be "the one", and we will still end up writing different paths for different cards. I don't consider that bad, by the way - different versions of the same SL shader are not as annoying as different versions of assembly programs (written in different assembly dialects). I liked Cg's profile functions.
Why not
#ifdef GeForceFX
...
#else
#ifdef Radeon
...
#endif
#endif

?
Wouldn't that be best? I don't see any form of API that could truly unify the hardware. Nvidia will never encourage glSlang in its current form - and not because they are bad, selfish and arrogant (well, maybe later :) ), but because they have a different approach.

PanzerSchreck
01-25-2004, 07:08 AM
@NVidia:
I can think of two reasons why NV hasn't published a driver set with GLSL support. The first is that they're slightly upset that the ARB decided against Cg as the GL shading language, so NVidia wants to hinder glSlang's success. The second reason (maybe more likely than the first) is that they have kind of "panicked". They haven't had really competitive hardware for two generations (if you count the R9700 as one generation and the R9800 as the other), and rumor has it that ATI's R420 is one month ahead of NVidia's NV40. And since the R420 is said to have already taped out, NVidia surely has better things to do than implement glSlang support. So I think they won't have it in their drivers until the NV40 is out.

@ATI's beta glSlang implementation:
As already stated: it's better to have a beta implementation you can play with than to have nothing. Since ATI released their first driver set with glSlang, I've been dabbling with it for many hours, and I've enjoyed using it thanks to its clear high-level syntax (although I don't really like C++ ;) ).

V-man
01-25-2004, 08:13 PM
R300 and R350 are in the same generation.
I don't know the codenames for the 9600XT and the 5700, but it looks like Nvidia did something right with the 5700.

There are cases where the 5700 is faster and vice versa, instead of it always being behind.
Example: http://www.hardocp.com/article.html?art=NTU5LDE=

As for GLSL... I don't know. Why don't you get in touch with one of the developers?

Humus
01-26-2004, 05:10 AM
Originally posted by Zengar:
I mean that glSlang is not compatible with Nvidia's hardware, not the other way round. I just recalled all the PACK/UNPACK instructions. They ARE useful, but I don't see support for them in glSlang. Don't take me too seriously, but:
Why should Nvidia support a shading language that doesn't match its hardware? It would be the same situation as with DX.
Since different programmable hardware has different features, it is far too big a simplification to create the one SL. It will need an extension mechanism of its own if it wants to be "the one", and we will still end up writing different paths for different cards. I don't consider that bad, by the way - different versions of the same SL shader are not as annoying as different versions of assembly programs (written in different assembly dialects). I liked Cg's profile functions.
Why not
#ifdef GeForceFX
...
#else
#ifdef Radeon
...
#endif
#endif

?
Wouldn't that be best? I don't see any form of API that could truly unify the hardware. Nvidia will never encourage glSlang in its current form - and not because they are bad, selfish and arrogant (well, maybe later :) ), but because they have a different approach.

There's nothing preventing nVidia from adding fixed-point math, pack/unpack, and other features to GLSL. They just need a regular extension (see the sketch below).

I think the main reason nVidia doesn't support GLSL yet is that they are behind schedule, not that they don't intend to support it. Remember that adding GLSL support is a large project; it can easily take half a year to get to beta status, depending of course on how many people you put on the project. If nVidia started implementing it around the time the spec was ratified, they may still be a couple of months away from beta status.
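
To illustrate the mechanism: a shader opts into extended functionality through GLSL's standard #extension directive, so a vendor could expose fixed-point support the same way. The extension name GL_NV_fixed_point and the fixed4 type below are hypothetical - invented purely for this sketch; no such extension exists:

/* Hypothetical: GL_NV_fixed_point and 'fixed4' do not exist. They only
   illustrate how GLSL's #extension directive could expose
   vendor-specific fixed-point math. */
static const GLcharARB *hypotheticalSrc =
    "#extension GL_NV_fixed_point : require  // hypothetical name\n"
    "void main() {\n"
    "    fixed4 c = fixed4(0.5, 0.5, 0.5, 1.0);  // hypothetical type\n"
    "    gl_FragColor = vec4(c);\n"
    "}\n";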

Zengar
01-26-2004, 06:38 AM
Humus, about a week ago I started writing my own HLSL compiler. It's a Pascal-like language, rather simple (but no less powerful than glSlang). I haven't spent a lot of time on it so far, but it can already produce some useful output. It converts the shader to ARB_vp, ARB_fp or NV_vp2, NV_fp. If I continue at the same tempo, it'll be ready within a month. Now look: if I, a person with no prior compiler experience, can do it this fast, Nvidia can do it much faster. Don't forget: they have their unified compiler and Cg; they could simply combine the two. It took me about 2 hours to write a full-featured lexical scanner (see the sketch below)! So I assume it's not so difficult to make the Cg compiler support glSlang.
So: after this first-hand experience with compilers, I don't think they are that difficult to write. A shader compiler doesn't have to deal with complicated memory management, a stack, OOP, and so on.
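
For what it's worth, the scanner really is the easy part. A minimal sketch in C of the kind of lexer a small shading language needs - the token set is my own invention for illustration:

#include <ctype.h>

/* Minimal lexical scanner sketch for a small shading language. */
typedef enum { TOK_IDENT, TOK_NUMBER, TOK_PUNCT, TOK_EOF } TokKind;

typedef struct {
    TokKind kind;
    const char *start;  /* points into the source buffer */
    int length;
} Token;

Token nextToken(const char **src)
{
    const char *p = *src;
    Token t;

    while (isspace((unsigned char)*p)) p++;   /* skip whitespace */

    t.start = p;
    if (*p == '\0') {
        t.kind = TOK_EOF; t.length = 0;
    } else if (isalpha((unsigned char)*p) || *p == '_') {
        while (isalnum((unsigned char)*p) || *p == '_') p++;
        t.kind = TOK_IDENT; t.length = (int)(p - t.start);
    } else if (isdigit((unsigned char)*p)) {
        while (isdigit((unsigned char)*p) || *p == '.') p++;
        t.kind = TOK_NUMBER; t.length = (int)(p - t.start);
    } else {
        p++;  /* single-character punctuation: + - * / ( ) ; etc. */
        t.kind = TOK_PUNCT; t.length = 1;
    }
    *src = p;
    return t;
}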

Zeross
01-26-2004, 11:12 AM
nVidia WILL support glSlang; they already have glSlang-enabled drivers available in a limited beta. It's just a question of time.

PanzerSchreck
01-26-2004, 11:32 AM
@Zengar: That has already been done, but only for ARB_VP/ARB_FP: http://www.sourceforge.net/projects/fxpascal
I used it before glSlang and it's a great alternative.

Korval
01-26-2004, 01:33 PM
So: after this first-hand experience with compilers, I don't think they are that difficult to write. A shader compiler doesn't have to deal with complicated memory management, a stack, OOP, and so on.

And what about the features that nVidia (and ATi) cards don't support, like looping in the fragment shader or texture accesses (ugh) in the vertex shader? The latter can be emulated on the CPU relatively easily, but the former requires a full software rasterizer. That is not trivial to write.

JD
02-11-2004, 04:02 PM
NV glSlang will probably be released in April. You can play with the buggy and slow ATI glSlang, or with buggy Cg. Cg 1.2 should be out in a month or so as well. Your best bet right now is NV's asm fragment programs for speed, and ARB_fp for cross-hardware support (but slower on NV hardware). I honestly don't know why you complain. Try working with register combiners and then you'll love asm shaders :) Doesn't glSlang lack some of the asm functionality? Cg does.

Zak McKrakem
02-12-2004, 03:04 AM
Originally posted by JD:
NV glSlang will probably be released in April. You can play with the buggy and slow ATI glSlang, or with buggy Cg. Cg 1.2 should be out in a month or so as well. Your best bet right now is NV's asm fragment programs for speed, and ARB_fp for cross-hardware support (but slower on NV hardware). I honestly don't know why you complain. Try working with register combiners and then you'll love asm shaders :) Doesn't glSlang lack some of the asm functionality? Cg does.

Have you tried the current ATI glSlang implementation? It can be a little buggy in some areas but, in general, it works pretty well. Slow? It is as fast as (if not faster than) hand-generated asm.

eek2121
02-17-2004, 01:51 AM
IIRC the GeForce FX 5700 has full support for floating-point shaders.

Nutty
02-17-2004, 10:11 AM
The latest leaked drivers (53.56 IIRC) support GLSL with some registry tweaks, according to some people.

CybeRUS
02-17-2004, 10:41 PM
Maybe GL_EXT_cg_shader is the explanation for the delay.

You can try this (I haven't checked it myself yet):
[HKEY_LOCAL_MACHINE\SOFTWARE\NVIDIA Corporation\Global\OpenGl]

[HKEY_LOCAL_MACHINE\SOFTWARE\NVIDIA Corporation\Global\OpenGl\Debug]
"ShaderObjects"=dword:00000001
"WriteProgramObjectAssembly"=dword:00000000

Zengar
02-18-2004, 01:53 AM
I was able to run all the existing glSlang demos I could find with the 56.56 drivers.

DarkWIng
02-18-2004, 09:35 AM
Funny. With that registry patch, GLSL's GL_ARB_shader_objects and GL_ARB_shading_language_100 are exposed in the extension string even on a GF4. Isn't this a bit pointless, since there is no ARB_FP on the GF4? Or did someone find a magic way to run GLSL on a GF4?

P.S. (and a bit OT): what is GL_NV_pixel_buffer_object in the new drivers for? I can't find it in the extension registry.

jra101
02-18-2004, 12:07 PM
Originally posted by DarkWIng:
Funny. With that registry patch, GLSL's GL_ARB_shader_objects and GL_ARB_shading_language_100 are exposed in the extension string even on a GF4. Isn't this a bit pointless, since there is no ARB_FP on the GF4? Or did someone find a magic way to run GLSL on a GF4?

You will be able to use ARB_vertex_shader, but not ARB_fragment_shader, on GeForce 4 GPUs.

jeickmann
02-18-2004, 12:33 PM
Originally posted by jra101:
You will be able to use ARB_vertex_shader, but not ARB_fragment_shader, on GeForce 4 GPUs.

Well, technically you can also use the fragment shader, except you'll get software rendering.
That's something to look out for, since you won't get any errors - just 0.5 fps.
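
There is no formal query for "did I fall back to software?", but in practice drivers often say so in the info log after linking. A hedged sketch using the ARB_shader_objects entry points - the string match is a heuristic, not a guaranteed API, and it assumes the program object already has its shaders attached:

#include <string.h>

/* Heuristic check after linking: many drivers mention a software
   fallback in the info log, even when the link itself succeeds. */
int probablyRunsInHardware(GLhandleARB program)
{
    GLint linked = 0;
    char log[4096];

    glLinkProgramARB(program);
    glGetObjectParameterivARB(program, GL_OBJECT_LINK_STATUS_ARB, &linked);
    glGetInfoLogARB(program, sizeof(log), NULL, log);

    if (!linked)
        return 0;                     /* hard link failure */
    if (strstr(log, "software"))
        return 0;                     /* driver hinted at a fallback */
    return 1;
}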

Jan

jra101
02-18-2004, 12:59 PM
Well, ARB_fragment_shader will only be exposed on GeForce 4 GPUs if NV30 emulation is enabled, in which case you should expect it to be slow.

MZ
02-18-2004, 07:12 PM
I've been running Humus' Portals demo (GFFX 5200, 56.55) and playing with tweaks to its GLSL shader text. Funny - it appears you can use some of Cg's functions and types in GLSL code. That's probably what GL_EXT_Cg_shader was made for (see the sketch after the lists below).

Things that worked:
- fp30/vp30-profile specific functions: lit, pack/unpack, saturate, clip
- half, fixed, half2, fixed2, ... (they did affect output asm code, but I can't say that about speed: 0% difference, as usual on 5200)

Things that didn't work:
- fp20-profile specific functions (even 'expand' didn't work)
- I was unable to make the compiler produce code using the RFL instruction. Does anyone know how that can be done in Cg?
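
In other words, something like the following apparently compiles as "GLSL" on these drivers, even though half4 and saturate() are Cg rather than glSlang - presumably what GL_EXT_Cg_shader enables. A sketch only, not verified beyond the tweaking described above:

/* Cg constructs accepted by the GLSL compiler on these drivers,
   presumably via GL_EXT_Cg_shader. half4 and saturate() are Cg,
   not standard glSlang. */
static const GLcharARB *cgFlavouredSrc =
    "void main() {\n"
    "    half4 c = half4(2.0, 0.0, 0.0, 1.0);  // Cg type\n"
    "    gl_FragColor = vec4(saturate(c));     // Cg function\n"
    "}\n";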

Zengar
02-18-2004, 07:43 PM
glSlang is compiled using Cg on Nvidia boards (the compiler is included in the driver), so it's probably not a surprise. :)

Hampel
02-18-2004, 10:22 PM
@CybeRUS: what is the reg key "WriteProgramObjectAssembly" for? By setting it to 0x0, do you disable it?

Zengar
02-18-2004, 10:51 PM
@Hampel: you use that flag to get the asm output of the shader. These are txt files that get created in the directory of your exe.

imr1984
03-03-2004, 04:16 AM
glSlang works on nVidia cards? Sweet :) I have an ATI Radeon 9800, and I'm wondering how glSlang on an FX card compares with glSlang on a 9800.

jmpCrash
03-29-2004, 12:02 PM
ForceWare 56.68, which is only available on the GDC SDK DVD, officially supports GLSL.

Hey, come on, NVIDIA guys out there - the ordinary mob demands support too!