Fragment shader not working in OpenGL 2/GLSL 1.20



jas511
06-03-2011, 09:59 PM
I have a fragment shader program that is the following code:


#version 120
void main(void) {
    if ( gl_Color.a > 0 ) {
        gl_FragColor.rgb = gl_Color.rgb;
        gl_FragColor.a = 0.2;
    }
}

I run this on my machine, which has an NVIDIA card with GLSL 1.3, and it works fine. I have a user who tried to run it on their machine with an ATI graphics card (OpenGL 2.1, GLSL 1.2), and there it does not set the fragment alpha value.

Is there something that wouldn't work here on OpenGL 2.1/GLSL 1.2? I'm stumped...

Thanks,
Jeff

V-man
06-05-2011, 06:24 AM
The obvious thing is that you aren't always outputting to gl_FragColor: when the condition is false, it is left undefined. You should probably init it to vec4(0.0) first.
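
For example, something along these lines (just a sketch; the only change from your shader is the unconditional write at the top):

#version 120
void main(void) {
    // Give gl_FragColor a defined value on every path, so fragments
    // that fail the alpha test are not left with undefined output.
    gl_FragColor = vec4(0.0);
    if ( gl_Color.a > 0 ) {
        gl_FragColor.rgb = gl_Color.rgb;
        gl_FragColor.a = 0.2;
    }
}

Whether vec4(0.0) is the right fallback is up to you; passing gl_Color straight through when the test fails is another option.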

Besides that, it could be that it isn't a shader problem at all, but a buffer problem.

jas511
06-05-2011, 10:41 AM
V-man, thanks. What do you mean it could be a buffer problem?

V-man
06-05-2011, 10:59 AM
Perhaps you don't have an alpha buffer.
You can check it with glGetIntegerv(GL_ALPHA_BITS, ...) if you are using a backward-compatible context.
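
Something like this (a quick sketch; the function name is mine, and it has to run after the GL context is current):

#include <stdio.h>
#include <GL/gl.h>

/* Call once the context is current, e.g. right after creating the window. */
static void check_alpha_bits(void)
{
    GLint alphaBits = 0;
    glGetIntegerv(GL_ALPHA_BITS, &alphaBits);
    printf("alpha bits in the default framebuffer: %d\n", alphaBits);
    /* If this prints 0, the pixel format has no destination alpha,
       so whatever the shader writes to gl_FragColor.a is discarded. */
}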

jas511
06-05-2011, 02:49 PM
I was also having the same issue with a different fragment shader, one that changes the color rather than the alpha. That machine has the same problem: the shader doesn't work.

BionicBytes
06-06-2011, 01:02 AM
So what happens if you take the IF statement out:

#version 120
void main(void) {
    // if ( gl_Color.a > 0 ) {
    gl_FragColor.rgb = gl_Color.rgb;
    gl_FragColor.a = 0.2;
    // }
}

Presumably the problem with this shader is that it does not always set gl_FragColor, because the write depends upon the incoming per-vertex color's alpha being > 0.

jas511
06-07-2011, 12:35 PM
I'll have our users test that out if possible. However, the shader does work on most graphics cards. My PC runs GLSL 1.3 and it runs fine. The one we have problems with has GLSL 1.2, so I wouldn't have expected it to work on one but not the other. But we'll give it a shot.

BionicBytes
06-07-2011, 02:56 PM
if ( gl_Color.a > 0 ) {
One more thought: some GLSL 1.20 compilers are strict about integer literals used where a float is expected and want them written as floats. So the above if statement should be

if ( gl_Color.a > 0.0 ) {

jas511
06-08-2011, 05:54 AM
Thanks, I'll give that a try as well.

Will the version number in #version 120 cause problems too, since 120 is also an integer?

V-man
06-08-2011, 09:00 AM
That's how the version is written.
I don't know why you have to leave the "." out, but that is what the ARB decided.

jas511
06-08-2011, 09:10 PM
Hmm...still no luck. I'm completely stumped. I have it working on numerous machines, but not on this Dell Optiplex 980 with an ATI Radeon 3450 (latest drivers from Dell, though not from AMD).

jas511
06-09-2011, 01:02 PM
It turns out there are other issues with some rendering to a frame buffer I am doing. I am making an additional post regarding that.

frank li
06-12-2011, 07:37 PM
So it works on every other platform, just not on the Dell Optiplex 980. Why don't you install the official Catalyst 11.5 driver and give it a try?