gl_NormalMatrix doesn't work, but gl_ModelViewMatrix does?

I am currently trying to implement per-pixel lighting in my app (an OpenGL modification of Descent 2).

The shader is the usual simple affair for vanilla per-pixel lighting, but it just didn't work until I changed

normal = normalize (gl_NormalMatrix * gl_Normal);

to

normal = normalize (vec3 (gl_ModelViewMatrix * vec4 (gl_Normal, 1.0)));

Why the heck is that?
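For context, the whole vertex shader is basically the textbook setup; roughly something like this (simplified, the variable names are just mine):

// Sketch of the vertex shader described above; variable names are made up.
varying vec3 normal;   // eye-space normal, interpolated for the fragment shader
varying vec3 vertPos;  // eye-space vertex position

void main ()
{
    // The line in question; swapping it for the gl_ModelViewMatrix
    // variant above is the only change that makes the lighting work.
    normal  = normalize (gl_NormalMatrix * gl_Normal);

    vertPos = vec3 (gl_ModelViewMatrix * gl_Vertex);
    gl_Position = ftransform ();
}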

I double-, triple- and quadruple-checked (and then some) that the normals set with glNormal are correct, so I have no idea why gl_NormalMatrix doesn't do what I expect it to (transform the normal properly).

Do I have to set it explicitly, like the modelview matrix? I thought it was derived from the modelview matrix.

Any ideas? Please enlighten me; maybe I am making some fundamental mistake I am not aware of.

You should do:
normal = normalize (vec3 (gl_ModelViewMatrix * vec4 (gl_Normal, 0.0)));

because if you use 1.0, then your normal gets translated.
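Just as a sketch of what the w component does here (same names as in the snippets above):

// w = 1.0 treats the normal as a point, so the translation column of
// the modelview matrix is added to it and the lighting goes wrong.
// w = 0.0 treats it as a direction, so only the rotation/scale part
// of the matrix is applied, which is what a normal needs.
normal = normalize (vec3 (gl_ModelViewMatrix * vec4 (gl_Normal, 0.0)));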

I wouldn't know why gl_NormalMatrix wouldn't work. It is derived automatically by the driver from the modelview matrix (it is the transpose of the inverse of its upper-left 3x3), so if the driver has problems…
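If the built-in really is broken on your driver, here is a minimal sketch of a drop-in alternative. It assumes your modelview matrix contains no non-uniform scaling (in which case the inverse-transpose is just the upper-left 3x3 itself) and needs GLSL 1.20 for the mat3-from-mat4 constructor:

// Upper-left 3x3 of the modelview matrix; matches gl_NormalMatrix as long
// as the matrix contains only rotation, translation and uniform scaling.
mat3 nmat = mat3 (gl_ModelViewMatrix);
normal = normalize (nmat * gl_Normal);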

I had the same problem. It was a driver bug on OS X with a Radeon 9600M.

V-man,

true, thanks.

Now the ATI driver crashes when it has to link my per-pixel lighting shader programs …

ATI drivers seem to be a bunch of crap.
