Swapping a fragment shader

Hi All,

I am completely new to shaders and need help with the following. I have two shaders working (see below), and I replaced the fragment shader with the one at the bottom (Gaussian blur). Why do I get a GL_INVALID_VALUE error? Should I initialize some values?

Thanks,

Alberto

void main()
{
   gl_Position = ftransform();
}

void main()
{

   gl_FragColor = vec4(0.9,0.4,0.8,1.0);

}
/////////////////////////////////////////////////
// 7x1 gaussian blur fragment shader
/////////////////////////////////////////////////

 varying vec2 v_Coordinates;

 uniform vec2 u_Scale;
 uniform sampler2D u_Texture0;

 const vec2 gaussFilter[7] = 
 { 
              -3.0,	0.015625,
               -2.0,	0.09375,
               -1.0,	0.234375,
               0.0,	0.3125,
               1.0,	0.234375,
               2.0,	0.09375,
               3.0,	0.015625
};

void main()
{
     vec4 color = 0.0;
     for( int i = 0; i < 7; i++ )
     {
        color += texture2D( u_Texture0, vec2( v_Coordinates.x+gaussFilter[i].x*u_Scale.x, v_Coordinates.y+gaussFilter[i].x*u_Scale.y ) )*gaussFilter[i].y;
     }

     gl_FragColor = color;

}

Assuming that you also updated the vertex shader to generate the v_Coordinates varying, there are the following syntax errors in the pixel shader.

The array is initialized incorrectly. It should be something like


vec2 gaussFilter[7] = vec2[7]( 
             vec2( -3.0,  0.015625 ),
             vec2(  -2.0, 0.09375 ),
             vec2(  -1.0, 0.234375 ),
             vec2(  0.0,  0.3125 ),
             vec2(  1.0,  0.234375 ),
             vec2(  2.0,  0.09375 ),
             vec2(  3.0,  0.015625 )
);

The color vector is initialized incorrectly. It should be:


vec4 color = vec4( 0.0 );

Additionally, some hardware does not support dynamic indexing into constant arrays in the pixel shader. If on such hardware the driver fails to unroll the for loop, compilation might also fail. You can use the glGetShaderInfoLog function to retrieve a string describing compilation or linking errors.

If you want to see the reason for your INVALID_VALUE errors, try compiling your shader manually, using the Cg compiler for example, or, at run time, check the shader compilation info log, which contains all the syntax errors that caused the compilation to fail.

Look at the glGetShaderInfoLog function, which is very useful for retrieving information about your shaders.

dletozeun,

Please send me some links to programs for testing shader code.

Thanks.

Alberto

Komat,

I found this code inside a tutorial on the internet and thought it was fine.

Thanks for your help, I will try to fix it and let you know.

Alberto

Nvidia drivers are by default very lax in syntax checking, so if the author of such a tutorial only has an Nvidia card, he will likely never notice that.

External applications for checking shaders are:

GLSL Validate: This is a somewhat older tool without support for newer versions of the GLSL language. In my experience it can sometimes be confused by #ifdef statements.

GPU Shader Analyzer: Shows how the shader compiles for various ATI GPUs.

Cg Toolkit: While intended for the Cg language, it can be used to compile GLSL shaders using the -oglsl command line parameter. The same compiler is used in the Nvidia drivers.

GLExpert: This program can enable automatic dumping of compiled shaders and log files into the directory of the program that compiled them. It works only on Nvidia cards.

And here is what you are looking for if you want to check that shaders are valid at runtime.
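A minimal sketch of that runtime check (this fragment cannot run on its own — it assumes a current OpenGL context and a `shader` object that has just been passed to glCompileShader):

```c
/* Sketch only: assumes a valid GL context and a compiled shader object. */
GLint status = GL_FALSE;
glGetShaderiv(shader, GL_COMPILE_STATUS, &status);
if (status != GL_TRUE)
{
    GLint length = 0;
    glGetShaderiv(shader, GL_INFO_LOG_LENGTH, &length);
    char *log = (char *)malloc(length);
    glGetShaderInfoLog(shader, length, NULL, log);
    fprintf(stderr, "Shader compile error:\n%s\n", log);
    free(log);
}
```

The same pattern works for program linking with glGetProgramiv(GL_LINK_STATUS) and glGetProgramInfoLog.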

Hi Komat,

I downloaded the 3D Labs GLSL Validate tool and cleaned up the code, but I still get a strange error:

Parsing fragment shader 'fra.txt'....
Failure.

ERROR: 0:8: 'gaussFilter' : syntax error parse error
ERROR: 1 compilation errors.  No code generated.

Here is my fragment shader code:

varying vec2 v_Coordinates;

uniform vec2 u_Scale;
uniform sampler2D u_Texture0;

vec2 gaussFilter[];

gaussFilter[0] = vec2(-3.0, 0.015625);
gaussFilter[1] = vec2(-2.0, 0.09375);
gaussFilter[2] = vec2(-1.0, 0.234375);
gaussFilter[3] = vec2(0.0, 0.3125);
gaussFilter[4] = vec2(1.0, 0.234375);
gaussFilter[5] = vec2(2.0, 0.09375);
gaussFilter[6] = vec2(3.0, 0.015625);

void main()
{

   vec4 color = vec4(0.0);

   for( int i = 0; i < 7; i++ )
   {
      color += texture2D( u_Texture0, vec2( v_Coordinates.x+gaussFilter[i].x*u_Scale.x, v_Coordinates.y+gaussFilter[i].x*u_Scale.y ) )*gaussFilter[i].y;
   }

   gl_FragColor = color;
 
}

Everything looks fine to me. Where is the problem?

Thanks,

Alberto

What are you trying to do with this?

vec2 gaussFilter[];

Even in C this would not work, since you are not allocating the memory for this array.

You have to put a constant value between the brackets to define the size of the array. Dynamic allocation and pointers are not supported in GLSL shaders; you can only create static arrays. Some Nvidia cards will allow you to put a variable between the brackets when compiling with their own specific profile.

EDIT:

I have never tested it, but I think you can do something like:

float plop[] = float[](0.0, 1.0, 2.0);

since the compiler will then know the array size at compile time.

The Orange Book, “OpenGL Shading Language”, page 73, says that it is possible. By the way, even setting

vec2 gaussFilter[7];

nothing changes.

Thanks,

Alberto

I don’t have this book, but yeah, you are right, it looks like the compiler implicitly determines the array size.

Move the array initialization into a function.

I have tried this and it works on a GeForce 8800 GTS; I am not sure it would work on all hardware.

Just a clarification: as a programmer I am a little concerned about all these ‘should work’, ‘on my GeForce it works’, ‘Nvidia GPUs are more tolerant’ remarks. Isn’t the shading language “A STANDARD”?

My last fragment shader apparently has no syntax errors, so why doesn’t it work yet?

Thanks.

Alberto

Because different vendors have their own ideas, because the implementations are buggy and because the hardware differs. It’s a shame, but GLSL, while being a “standard”, isn’t just standard enough…

Yes, I completely share your concern… GLSL is a standard in my opinion, but not pushed far enough. It looks like IHVs can’t agree on standards and always add their own fancy functionality to differentiate themselves… this is a shame and the cause of many headaches for game developers.

And did putting the array initialization in a function like main resolve the problem or not? If it didn’t, it may still work all the same in your application. I don’t know how GLSL Validate works, but the Cg compiler, for instance, gives me more information about syntax errors.

Do you mean like this:


varying vec2 v_Coordinates;

uniform vec2 u_Scale;
uniform sampler2D u_Texture0;

void main()
{
   vec2 gaussFilter[7];

   gaussFilter[0] = vec2(-3.0, 0.015625);
   gaussFilter[1] = vec2(-2.0, 0.09375);
   gaussFilter[2] = vec2(-1.0, 0.234375);
   gaussFilter[3] = vec2(0.0, 0.3125);
   gaussFilter[4] = vec2(1.0, 0.234375);
   gaussFilter[5] = vec2(2.0, 0.09375);
   gaussFilter[6] = vec2(3.0, 0.015625);

   vec4 color = vec4(0.0);

   for( int i = 0; i < 7; i++ )
   {
      color += texture2D( u_Texture0, vec2( v_Coordinates.x+gaussFilter[i].x*u_Scale.x, v_Coordinates.y+gaussFilter[i].x*u_Scale.y ) )*gaussFilter[i].y;
   }

   gl_FragColor = color;
}

Yes, it works!

Now I get the error only if I uncomment the commented line:

varying vec2 v_Coordinates;

uniform vec2 u_Scale;
uniform sampler2D u_Texture0;


void main()
{
   vec2 gaussFilter[7];

   gaussFilter[0] = vec2(-3.0, 0.015625);
   gaussFilter[1] = vec2(-2.0, 0.09375);
   gaussFilter[2] = vec2(-1.0, 0.234375);
   gaussFilter[3] = vec2(0.0, 0.3125);
   gaussFilter[4] = vec2(1.0, 0.234375);
   gaussFilter[5] = vec2(2.0, 0.09375);
   gaussFilter[6] = vec2(3.0, 0.015625);

   vec4 color = vec4(0.0);

   for( int i = 0; i < 7; i++ )
   {
      color += texture2D( u_Texture0, vec2( v_Coordinates.x+gaussFilter[i].x*u_Scale.x, v_Coordinates.y+gaussFilter[i].x*u_Scale.y ) )*gaussFilter[i].y;
    //  color += texture2D( u_Texture0, vec2( gaussFilter[i].x*u_Scale.x,gaussFilter[i].x*u_Scale.y ) )*gaussFilter[i].y;
   }

   gl_FragColor = color;
 
}

I initialize the uniforms like this:

glUseProgram(myProgram);
glUniform2f(glGetUniformLocation(myProgram, "u_Scale"), 1, 1);
glUniform1i(glGetUniformLocation(myProgram, "u_Texture0"), 0);

What should I pass for the v_Coordinates one?

I cannot understand what it is for…

Thanks again,

Alberto

I meant that you can keep the gaussFilter array as a global variable and do the initialization in a function, but if you don’t need a global variable, leave your shader like this.

As for the next problem, I don’t see the cause; what is the error?

EDIT:

On some hardware, indexing arrays with variables causes problems, for example on some ATI cards, but it works fine on Nvidia ones. I myself have trouble understanding these disparities.

What is amazing is that the line just before it is very similar to the commented one and doesn’t cause any compilation errors…

Finally I have the following working without errors!

Now the question is: why is the texture not affected at all by this shader?

I draw black triangles on a transparent texture (cleared with glClearColor(0, 0, 0, 0)) and get back the same initial transparent texture.

Thanks again for your help!

Vertex:

varying vec2 v_Coordinates;

void main()
{	
   v_Coordinates = gl_Vertex.xy;
   gl_Position = ftransform();
}

Fragment:

varying vec2 v_Coordinates;

uniform vec2 u_Scale;
uniform sampler2D u_Texture0;

vec2 gaussFilter[7];

void main()
{

gaussFilter[0] = vec2(-3.0, 0.015625);
gaussFilter[1] = vec2(-2.0, 0.09375);
gaussFilter[2] = vec2(-1.0, 0.234375);
gaussFilter[3] = vec2(0.0, 0.3125);
gaussFilter[4] = vec2(1.0, 0.234375);
gaussFilter[5] = vec2(2.0, 0.09375);
gaussFilter[6] = vec2(3.0, 0.015625);

   vec4 color = vec4(0.0);

   for( int i = 0; i < 7; i++ )
   {
      color += texture2D( u_Texture0, vec2( v_Coordinates.x+gaussFilter[i].x*u_Scale.x, v_Coordinates.y+gaussFilter[i].x*u_Scale.y ) )*gaussFilter[i].y;
   }

   gl_FragColor = color;
 
}

Uniform initialization:


glUseProgram(myProgram);
glUniform2f(glGetUniformLocation(myProgram, "u_Scale"), 1, 1);
glUniform1i(glGetUniformLocation(myProgram, "u_Texture0"), 0);

I have never written a Gaussian blur shader, so maybe my question is silly, but:

Are you sure about your texture coordinates? Shouldn’t v_Coordinates be the coordinates after applying ftransform()?
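For what it’s worth, if the quad is drawn with per-vertex texture coordinates (glTexCoord2f or a texcoord array), one common pattern is to forward the texture coordinate of unit 0 instead of the raw gl_Vertex.xy. A possible (untested) variant of the vertex shader:

```glsl
varying vec2 v_Coordinates;

void main()
{
   // Forward the texture coordinate of unit 0 rather than the raw vertex position,
   // so the fragment shader samples u_Texture0 in [0, 1] texture space.
   v_Coordinates = gl_MultiTexCoord0.xy;
   gl_Position = ftransform();
}
```

Also note that u_Scale is then a step in texture space: for one-texel steps it would typically be (1.0 / textureWidth, 1.0 / textureHeight) rather than (1, 1).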