View Full Version : Fragment Shader can't negate a value

01-02-2007, 07:40 AM
Putting a minus sign in front of gl_ModelViewMatrixInverse[0][0] or other uniform matrix built-ins has no effect.

The first version of the test fragment shader below shows the problem. The second version shows a weird variation: assigning the value to a local float seems to transfer the sign-ignoring behavior. The matrix is the identity.

I am new to shaders so maybe I just don't understand something. Any help would be appreciated.

It may be a vendor problem. I am using an NVIDIA Quadro FX 350M.

Thank You.

vec4 pos_color = vec4(0.0, 1.0, 0.0, 1.0);
vec4 neg_color = vec4(1.0, 0.0, 0.0, 1.0);

#if 1

// Both versions of the conditional show
// green for positive. So try the next main.
void main()
{
    if (-gl_ModelViewMatrixInverse[0][0] >= 0.0)
    // if (gl_ModelViewMatrixInverse[0][0] >= 0.0)
        gl_FragColor = pos_color;
    else
        gl_FragColor = neg_color;
}

#else


void main()
{
    float a;

    a = gl_ModelViewMatrixInverse[0][0];

    // These don't "show red" for negative a
    // a = -a;
    // a = -1.0*a;
    // These do work properly
    // a = -1.1*a;
    // a = 0.0 - a;

    if (a >= 0.0)
        gl_FragColor = pos_color;
    else
        gl_FragColor = neg_color;
}

#endif


01-02-2007, 01:50 PM
I think there is a sign() function (I've never used it), but I assume you could do something like this, which should also be faster since it avoids the if statement:

gl_FragColor = mix( neg_color, pos_color, sign( a ) );

01-02-2007, 01:51 PM
Wait, sorry: I think sign() returns -1 or 1, so you will need to clamp it to 0, or remap it with sign(a)*0.5 + 0.5 // probably slower
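
Putting the two suggestions together, a branch-free sketch might look like this (untested; it assumes the driver handles sign()/clamp()/mix() correctly even though it mishandles negation):

// Branch-free color selection: remap sign(a), which is -1.0, 0.0,
// or 1.0, into the [0,1] range so it is a valid mix() weight.
vec4 pos_color = vec4(0.0, 1.0, 0.0, 1.0); // green for a >= 0
vec4 neg_color = vec4(1.0, 0.0, 0.0, 1.0); // red for a < 0

void main()
{
    float a = gl_ModelViewMatrixInverse[0][0];

    // clamp() maps sign's -1.0 to 0.0, so t is 0.0 (neg_color)
    // or 1.0 (pos_color). Note sign(0.0) is 0.0, so a == 0.0
    // shows red here, unlike the original >= 0.0 test.
    float t = clamp(sign(a), 0.0, 1.0);
    gl_FragColor = mix(neg_color, pos_color, t);
}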