View Full Version : No Matrix-Vektor-Mult in Fragment-Shader with ATI on Linux

02-02-2005, 06:39 AM

I have a strange problem here. If I use this simple fragment shader:

varying vec4 shaded_color;

void main(void) {
gl_FragColor = shaded_color;
}

everything works fine. But if I change it like this:

varying vec4 shaded_color;

void main(void) {
mat4 test = mat4(
    1.0, 0.0, 0.0, 0.0,
    0.0, 1.0, 0.0, 0.0,
    0.0, 0.0, 1.0, 0.0,
    0.0, 0.0, 0.0, 1.0
);
vec4 textVec = test * gl_FragCoord;

gl_FragColor = shaded_color;
}

all I get is a black window. It doesn't matter which matrix I multiply with which vector, or whether they are 2-, 3-, or 4-dimensional. As soon as I multiply a matrix with a vector, the fragment shader seems to discard the fragment. But I don't get any error message.

I use Gentoo Linux with a Radeon 9600 Pro and the latest ATI drivers.

Any idea?

02-02-2005, 09:27 AM
I was wrong: it only breaks if I use gl_FragCoord as the vector. With vec4(1.0, 2.0, 3.0, 4.0) instead, it works. But I really need gl_FragCoord...
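One thing that might be worth trying (an untested sketch, not a confirmed fix): copy gl_FragCoord into a local vec4 first and multiply the copy, in case the driver only mishandles the built-in when it appears directly as a multiplication operand:

```
varying vec4 shaded_color;

void main(void) {
    mat4 test = mat4(1.0);       // identity matrix via the scalar constructor
    vec4 fc = gl_FragCoord;      // copy the built-in into a local variable first
    vec4 testVec = test * fc;    // multiply the copy, not gl_FragCoord directly
    gl_FragColor = shaded_color;
}
```

If this version renders while the direct multiplication does not, that would narrow the bug down to how the compiler feeds the built-in into the multiply.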

02-03-2005, 03:55 AM
I'm not sure (somebody correct me if I'm wrong), but I think ATI doesn't support gl_FragCoord, nor the derivatives (dFdx, dFdy, fwidth).


02-03-2005, 05:13 AM
I didn't get what you are using textVec for... ATI should support gl_FragCoord, but doesn't support dFdx, dFdy, etc.

02-03-2005, 05:27 AM

There is no point to the calculation of textVec; it is only for testing.

I just tried it on an NVIDIA card (FX5700), and there everything works fine. It seems to be an ATI problem.

02-03-2005, 05:55 AM
Well, this is no real proof. ;)
The compiler should remove the useless instructions and keep only this:

varying vec4 shaded_color;

void main(void) {
    gl_FragColor = shaded_color;
}

You need to route gl_FragCoord into an output to be sure.

02-03-2005, 06:49 AM
I just tried:

gl_FragColor = normalize(gl_FragCoord);

which produces a nice color gradient on NVIDIA, and a black window on ATI.
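If the driver really does mishandle gl_FragCoord itself, one possible workaround (a sketch, untested on the affected driver; the viewport_size uniform is an assumption you would have to set from the application) is to rebuild a window-space coordinate yourself from a varying instead of the built-in:

```
// Vertex shader: forward the clip-space position to the fragment stage.
varying vec4 clip_pos;

void main(void) {
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
    clip_pos = gl_Position;
}
```

```
// Fragment shader: reconstruct a gl_FragCoord-like value without the built-in.
varying vec4 clip_pos;
uniform vec2 viewport_size; // assumed uniform: window size in pixels

void main(void) {
    // Perspective divide to normalized device coordinates,
    // then map [-1,1] to window coordinates.
    vec3 ndc = clip_pos.xyz / clip_pos.w;
    vec2 win = (ndc.xy * 0.5 + 0.5) * viewport_size;
    gl_FragColor = vec4(win / viewport_size, ndc.z * 0.5 + 0.5, 1.0);
}
```

This avoids gl_FragCoord entirely, at the cost of one extra varying and a division per fragment.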