ATI problem with passing constant indices to functions?

The following vertex shader does not compile with ATI drivers:

varying vec4 lightPos;

vec4 GetLightPos(int i)
{
    return gl_LightSource[i].position;
}

void main(void)
{
    gl_Position = ftransform();
    gl_FrontColor = gl_Color;
    lightPos = GetLightPos(0);
}

giving the error:
ERROR: 0:5: ‘[’ : array must be redeclared with a size before being indexed with a variable

Obviously, the following does work:

varying vec4 lightPos;

void main(void)
{
    gl_Position = ftransform();
    gl_FrontColor = gl_Color;
    lightPos = gl_LightSource[0].position;
}

So, apparently, the ATI drivers aren’t inlining functions before checking for variable indices.

However, 3Dlabs’ ShaderGen produces code that does pass light source indices as parameters to functions. Needless to say, that code does not compile with my ATI drivers.

Can anyone shed some light on this problem? Is ATI not conforming to the standard, or is ShaderGen producing code that yields unspecified results on some cards?

I’m using ATI drivers 8.18 on a 9600 pro.

The drivers just don’t like it.
Are you running this on Linux?
I just tested on Windows (catalyst 5.9) and it gives the same message.

It has nothing to do with standards; either it is supported or it isn’t. I think ATI’s cards can use the address register for this, but they haven’t coded the support into their drivers.

Write

static vec4 GetLightPos(int i)

instead of

vec4 GetLightPos(int i)

Write
static vec4 GetLightPos(int i)
instead of
vec4 GetLightPos(int i)
Using static won’t work; it is a reserved keyword in GLSL and should cause the shader to fail to compile.

I’m using windows as well. I tried the shader with nVidia drivers, and it seems to work.

Alarmingly enough, adding “static” still compiles and works with nVidia drivers. I’ll try to use it with ATI drivers tonight and see if it works.

So is it safe to say the ATI driver is broken?

If this is not part of the OGSL standard, wouldn’t it be a bad idea to write code that uses it at all? In that case, shouldn’t 3DLabs ShaderGen be producing code that can run on all cards?

If this is not part of the OGSL standard, wouldn’t it be a bad idea to write code that uses it at all?
Correct, writing code that is not part of the standard may result in broken shaders and most likely will break portability.

In that case, shouldn’t 3DLabs ShaderGen be producing code that can run on all cards?
GLSL ShaderGen generates shaders that follow the OpenGL Shading Language specification. GLSL is still a fairly new language, and not all implementations handle all of the language elements used in GLSL ShaderGen.

The tool is written to create portable GLSL code that strictly follows the specification, so it should run on all hardware advertising support for the OpenGL Shading Language. We only have control over our own implementation; it is impossible to guarantee GLSL ShaderGen will run on every vendor’s hardware, since we have no control over their implementations.

GLSL ShaderGen does have an editable text box; if you need to make changes to the generated shaders on the fly, you may do so by editing the fragment and vertex shaders within it. Press Compile and then Link, and the edited shader will be applied.

We also provide tools to help in the creation of portable shaders, even if you’re using a compiler that does not adhere to the specification. These tools are GLSL Validate and GLSL Parser Test. GLSL Validate uses a strict compiler to parse your shader and gives specific error messages if the shader doesn’t adhere to the spec. GLSL Parser Test runs a group of shaders through the compiler on your machine and gives a detailed report on the compiler’s specification compliance.

Thanks for the post, 3DlabsDevRel. Those tools will come in handy for me.

I just upgraded to the latest ATI drivers (Catalyst 5.10) and it seems that the problem persists. I wonder if anyone from ATI could tell us whether the problem is being addressed.

I reported the same problem a year ago, and it’s still not fixed. But please bug them too; it’s really annoying (basically making any decent form of dynamic multiple-light rendering impossible):
http://www.opengl.org/discussion_boards/cgi_directory/ultimatebb.cgi?ubb=get_topic;f=11;t=000461

The problem boils down to the fact that you just can’t access the gl_LightSource array in any semi-dynamic way. It’s a simple compiler/parser internal-state error, as it actually works for the very first compilation in a context. (I have provided them a simple test case that compiles the same simple program twice.)

A workaround is to use uniforms instead:

uniform gl_LightSourceParameters glLightSource[8];
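A minimal vertex shader using this workaround might look like the following. This is an untested sketch: because glLightSource here is an ordinary user uniform (the name is just chosen to mirror the built-in), the application has to upload the light data itself via glUniform* calls instead of relying on the fixed-function light state.

```glsl
// Sketch of the uniform-array workaround. Declaring our own array
// of the built-in gl_LightSourceParameters struct type sidesteps
// the variable-indexing restriction on gl_LightSource[], but the
// application must now fill in the data (e.g. glUniform4fv on
// "glLightSource[0].position").
uniform gl_LightSourceParameters glLightSource[8];

varying vec4 lightPos;

vec4 GetLightPos(int i)
{
    // Variable indexing into a user uniform array compiles here.
    return glLightSource[i].position;
}

void main(void)
{
    gl_Position = ftransform();
    gl_FrontColor = gl_Color;
    lightPos = GetLightPos(0);
}
```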

Thanks, guys. I’m guessing it’s a known problem to ATI, but I’ll bug it anyway.

Another option (which I haven’t tested) would be to pass in “gl_LightSourceParameters” instead of “int” to the functions:

GetLightPos(gl_LightSource[0]);

instead of:

GetLightPos(0);
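For illustration, the original shader rewritten that way might look like this (an untested sketch):

```glsl
varying vec4 lightPos;

// Take the whole light struct instead of an index, so no
// indexing happens inside the function body at all.
vec4 GetLightPos(gl_LightSourceParameters light)
{
    return light.position;
}

void main(void)
{
    gl_Position = ftransform();
    gl_FrontColor = gl_Color;
    // The constant index is applied at the call site instead.
    lightPos = GetLightPos(gl_LightSource[0]);
}
```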

Since you seem to be calling the function with a constant parameter anyways, have you tried declaring GetLightPos() to take a const int instead? It might hint the compiler into not being such a jerk :smiley:

I tried “const int” to a function and then “const gl_LightSourceParameters”, but neither of them worked (although the latter gave a different, ATI-only error message).
So the pretty ugly multiline #define AddLight(i) has to stay :frowning:
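For illustration, such a macro-based workaround might look like the following hypothetical sketch (the names AddLight, accum, and the lighting math are made up for the example; only the multiline-#define technique is from the thread):

```glsl
// Hypothetical sketch of the macro workaround: the preprocessor
// pastes a literal constant index into gl_LightSource[], so the
// compiler never sees a variable index. The backslashes continue
// the macro body across lines.
#define AddLight(i) \
    accum += gl_LightSource[i].diffuse * \
             max(dot(n, normalize(gl_LightSource[i].position.xyz)), 0.0);

varying vec4 lightColor;

void main(void)
{
    vec3 n = normalize(gl_NormalMatrix * gl_Normal);
    vec4 accum = vec4(0.0);
    AddLight(0)   // each use expands with a literal index
    AddLight(1)
    gl_Position = ftransform();
    lightColor = accum;
}
```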

Using custom uniforms instead isn’t really viable for the general case here, as it requires quite a bit of structure around the shader setup (especially since I assume only the actually used elements/members are handled when using the standard array).

Yes, I’ve noticed passing a “const gl_LightSourceParameters” does not work properly, however, the following does work:

    gl_LightSourceParameters lsp = gl_LightSource[0];
    lightPos = GetLightPos(lsp);

But I’m pretty sure that’s actually copying the light parameters structure, because the following does NOT work:

    const gl_LightSourceParameters lsp = gl_LightSource[0];
    lightPos = GetLightPos(lsp);

Which makes me wonder why the GLSL spec didn’t force built-in uniforms (like gl_LightSource[]) to be const. I think that would make the language stricter and clearer.
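Putting the pieces together, the complete working workaround might look like this (an untested sketch combining the struct-parameter function with the non-const local copy):

```glsl
varying vec4 lightPos;

// Takes the light struct by value, so the function body never
// indexes gl_LightSource[] itself.
vec4 GetLightPos(gl_LightSourceParameters light)
{
    return light.position;
}

void main(void)
{
    gl_Position = ftransform();
    gl_FrontColor = gl_Color;
    // Non-const local copy of the light struct; adding "const"
    // here is what broke the ATI compiler in the tests above.
    gl_LightSourceParameters lsp = gl_LightSource[0];
    lightPos = GetLightPos(lsp);
}
```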
