GLSL Parser Test

I'm trying to get started with shaders, so I found this program called GLSL Parser Test
and ran it:

GLSL Parser Test
vendor: ATI Technologies Inc.
renderer: RADEON 9700 PRO x86/SSE2
OpenGL version: 1.4.4103 WinXP Release 

pass   The GLSL implementation parsed this shader correctly.  
fail   The GLSL implementation did not parse this shader correctly.  
crash   The GLSL implementation crashed while parsing this shader.  

passed: 154
failed: 20
score: 89%
10/13/04 22:36:10

Is this good or bad? Should I get 100% if I want to start coding shaders,
or does my graphics card suck and should I just forget about it?

If you want to code GLSL, an ATI card is exactly what you want. ATI still has the best GLSL implementation around. The last time I saw any numbers, NVIDIA’s scores were at 50-60% or thereabouts; it may have improved since. 3Dlabs, I believe, scores 100%, but that’s no surprise since they wrote the test. It’s by no means an impartial tool.

You may want to start by upgrading your drivers. I get 97% success with the latest drivers. I see you have OpenGL 1.4 there. ATI drivers have supported OpenGL 1.5 for I don’t know how long.

Thanks for the reply. I just need a good litmus test for shaders to tell me where I stand
before I spend any serious time on it.

My experience has been the reverse. Mind you, I’m using July ATI drivers, because that’s all that Dell has released for my Mobility 9700. But I’m also using July-era NVIDIA drivers, so that’s fair.

The two most annoying things with the ATI GLSL support are:

  1. Literal integers (“0” or “1” say) don’t auto-convert to floats; you have to write them as “0.0” or “1.0” or you’ll get type mismatches.
  2. The #line preprocessor directive is missing, so I can’t easily trace compile errors back to the right line in my assembled shader fragments.
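
A minimal sketch of point 1 (example code only, not my actual shader):

// Fragment shader sketch of the literal-integer issue.
// float scale = 1;     // rejected by a strict GLSL compiler: an int literal
//                      // does not implicitly convert to float
void main()
{
    float scale = 1.0;  // accepted: write the literal as a float
    gl_FragColor = gl_Color * scale;
}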

The fact that standard GL state isn’t correctly tracked (i.e., glColor4ub seems to “leak” or “stick” between passes) doesn’t help, either. At this point, I’m pretty sure it’s the drivers, not my code.

How many of these problems are fixed with current ATI drivers? And, how can I make them actually work on my Dell Inspiron laptop?

Originally posted by jwatte:
How many of these problems are fixed with current ATI drivers? And, how can I make them actually work on my Dell Inspiron laptop?
Try the Omega drivers (www.omegadrivers.net); they’re modded to run on mobile GPUs.

jwatte:
I thought literal integers were not allowed to be converted to floats automatically (i.e., according to the spec)? Unless they have changed it since the last time I read it?

I’d be curious to hear more about this glColor bug…

Moving on up! :) Just installed the latest ATI drivers.

GLSL Parser Test
vendor: ATI Technologies Inc.
renderer: RADEON 9700 PRO x86/SSE2
OpenGL version: 1.5.4582 WinXP Release 

pass   The GLSL implementation parsed this shader correctly.  
fail   The GLSL implementation did not parse this shader correctly.  
crash   The GLSL implementation crashed while parsing this shader.  

passed: 168
failed: 6
score: 97%

10/14/04 01:37:38

Originally posted by jwatte:
1) Literal integers (“0” or “1” say) don’t auto-convert to floats; you have to write them as “0.0” or “1.0” or you’ll get type mismatches.

That’s actually according to spec; it isn’t allowed. I think disallowing this kind of well-defined conversion is overzealous, but that’s what they agreed on. At least that’s the case in the 100 version; it might be fixed in 110, but I haven’t checked the spec. I think the shaders ATI fails to parse with the recent drivers are version 110 as well, so they don’t seem to support that version yet.

Most of the failures refer to 110 in the filename:

CorrectExtension1.V110.frag    success   error   	fail
CorrectExtension4.V110.frag    success   error   	fail
CorrectExtension10.V110.frag   success   error   	fail

CorrectVersion.V110.frag       success   error   	fail
CorrectVersion1.V110.frag      success   error   	fail
ParseTest4.frag                error   success   	fail

So, I guess I can throw those cases out and assume I got a 99.9%. :)
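
For reference, the V110 cases exercise the preprocessor directives that came in with GLSL 1.10; here is a minimal sketch of the kind of header they check (not the actual test files, just the directives):

#version 110             // requires a GLSL 1.10 compiler
#extension all : warn    // extension-behavior directive from the 1.10 spec

void main()
{
    gl_FragColor = vec4(1.0);
}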

Originally posted by jwatte:
My experience has been the reverse. Mind you, I’m using July ATI drivers, because that’s all that Dell has released for my Mobility 9700. But I’m also using July-era NVIDIA drivers, so that’s fair.

The two most annoying things with the ATI GLSL support are:

  1. Literal integers (“0” or “1” say) don’t auto-convert to floats; you have to write them as “0.0” or “1.0” or you’ll get type mismatches.
  2. The #line preprocessor directive is missing, so I can’t easily trace compile errors back to the right line in my assembled shader fragments.

The fact that standard GL state isn’t correctly tracked (i.e., glColor4ub seems to “leak” or “stick” between passes) doesn’t help, either. At this point, I’m pretty sure it’s the drivers, not my code.

How many of these problems are fixed with current ATI drivers? And, how can I make them actually work on my Dell Inspiron laptop?
No. 1 is according to spec and won’t be changed unless the spec changes. I agree it should be, though.
No. 2 works just fine. No idea how long it’s been in the drivers; I just tested it now. I might as well add it to my framework now that you’ve brought it up; it’s pretty useful functionality.
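
For anyone else assembling shaders from multiple source fragments, a minimal sketch of the directive in action (the numbers are arbitrary):

// Reset the reported line number at the start of an appended fragment so
// compile errors map back to that fragment's own numbering. The second
// argument is the source-string number (GLSL 1.10 form).
#line 1 2

void main()
{
    gl_FragColor = gl_Color;
}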

As for the glColor bug, I’m not even sure what that’s supposed to mean. Care to give an example?

Here’s a tool to get regular Catalyst drivers to install on your laptop:
http://www.driverheaven.net/patje/

I thought literal integers were not allowed to be converted to floats automatically.
That’s one of the most absurd things I’ve heard today. Standards bodies never cease to amaze me. It’s a friggin’ LITERAL! I think NVIDIA is doing the right thing by accepting this :)

I tried using #version 110, but neither of my drivers would accept it.

the glColor bug … Care to give an example?
Well… I call glBindProgram(). I call glColor4f(). I read the color using gl_Color in the vertex shader and pass it straight to gl_FrontColor. I read gl_Color in the fragment shader and modulate the result of various other calculations with it. On NVIDIA, the right objects get the right colors modulated. On ATI (Dell drivers), various objects get various unintended colors modulated. The wrong colors are consistent, though: the terrain always gets a dark muddy brown (the color used on car wheels); the left front car wheel always gets a bright blue (used on the stack of boxes); etc.
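
Stripped down, the shader pair in question looks roughly like this (the real shaders do a lot more; the names here are simplified):

// Vertex shader (sketch): pass the current color through.
void main()
{
    gl_FrontColor = gl_Color;
    gl_Position   = gl_ModelViewProjectionMatrix * gl_Vertex;
}

// Fragment shader (sketch): modulate the computed result by the color.
void main()
{
    vec4 lighting = vec4(1.0);   // stands in for the other calculations
    gl_FragColor  = lighting * gl_Color;
}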

If it persists across driver upgrades, I might be able to send a copy to devrel@ati.

http://www.driverheaven.net/patje/
Lovely.

Although NVIDIA is only 56% GLSL-compatible vs. 97% for ATI, I can still do more things with my NVIDIA card, because it allows far more instructions in fragment shaders.
I can’t stomach ATI’s limit of four texture indirections…

So I tried the patcher from DriverHeaven.net on the latest Cat 4.10 download, selecting Mobility Radeon 9700 as the target, and the ATI Catalyst 4.10 drivers still wouldn’t install on my laptop after I used it: the installer got a little further, but then said that no device matching the driver was found on the system.

In the end, I used “update driver” and forced Windows to install that particular driver, which seems like it might have worked (control panel says Catalyst 4.10, Radeon 9500 Pro/9700). However, the bug is still there (as is a cube map sampler issue, which I might report separately).

Looking at it, it seems as if the tracking of glColor into gl_Color is treated like a user uniform: the value is remembered per whatever program is active at the time of the call, rather than being treated as GL state that gets reflected into the built-in uniforms when the program is invoked. It’s unclear whether the NVIDIA interpretation or the ATI interpretation is correct, but clearly both can’t be!

On the off-topic gl_Color bug originally posted by jwatte:

Two possible things to rule out:

1 - are you using glVertexAttrib4*v( 3, ) by any chance?

NVIDIA’s current implementation incorrectly trashes gl_Color with glVertexAttrib4*v( 3, ) (and vice versa), by aliasing glColor*v and glVertexAttrib4*v( 3, ).

2 - have you inadvertently set two-sided lighting by enabling GL_VERTEX_PROGRAM_TWO_SIDE?

A quick sanity check:

// Your vertex shader
// ...
gl_FrontColor = gl_Color; // Existing 
gl_BackColor = gl_Color;  // Add this line temporarily for sanity check

If the bug goes away, that’s where you need to look.

-mr. bill

  1. No, I’m not. Only plain GL state.
  2. No, I haven’t. Back faces are culled away, even.

Originally posted by mrbill:
On the off-topic gl_Color bug originally posted by jwatte:

Two possible things to rule out:

1 - are you using glVertexAttrib4*v( 3, ) by any chance?

NVIDIA’s current implementation incorrectly trashes gl_Color with glVertexAttrib4*v( 3, ) (and vice versa), by aliasing glColor*v and glVertexAttrib4*v( 3, ).
According to NVIDIA’s release notes for GLSL:

NVIDIA’s GLSL implementation does not allow built-in vertex attributes to collide with a generic vertex attribute that is assigned to a particular vertex attribute index with glBindAttribLocationARB. For example, you should not use gl_Normal (a built-in vertex attribute) and also use glBindAttribLocationARB to bind a generic vertex attribute named “whatever” to vertex attribute index 2, because gl_Normal aliases to index 2.
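
As a concrete sketch of that collision (the attribute name and index are just for illustration; the glBindAttribLocationARB call happens on the application side):

// Vertex shader sketch. Assumption: the application has called
// glBindAttribLocationARB(prog, 2, "whatever") before linking.
attribute vec4 whatever;   // generic attribute bound to index 2 by the app

void main()
{
    // gl_Normal is a built-in attribute that aliases to index 2 on NVIDIA,
    // so using it together with "whatever" is exactly the case the release
    // notes warn against.
    vec4 offset = whatever * vec4(gl_Normal, 0.0);
    gl_Position = gl_ModelViewProjectionMatrix * (gl_Vertex + offset);
}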

The fact is that NVIDIA’s OpenGL Shading Language implementation is far more permissive than 3Dlabs’ or ATI’s. Non-standard (or not-yet-standard) features are there for compatibility reasons, since the compiler used to compile GLSL is the same one used for Cg and HLSL (the Cg compiler), just with another “target”.

For a more complete answer to a lot of the questions in this thread, I highly recommend this reading.

++

The specific bug I was referencing turns out to be an issue with GL_COLOR_MATERIAL and gl_FrontLightProduct when using GLSL on current Catalyst drivers.
