I’m having some trouble with gl_ModelViewMatrixInverseTranspose and gl_NormalMatrix. I have not applied scaling to the modelview matrix, so I expect that the following three lines should do the same thing:
vec3 A = normalize(gl_ModelViewMatrix * vec4(gl_Normal, 0.0)).xyz;
vec3 B = normalize(gl_ModelViewMatrixInverseTranspose * vec4(gl_Normal, 0.0)).xyz;
vec3 C = normalize(gl_NormalMatrix * gl_Normal);
A is exactly what I expect, but B and C are not. (From visual inspection of the results, B = C, which makes sense if gl_NormalMatrix is just the upper-left 3x3 of gl_ModelViewMatrixInverseTranspose.) Am I mistaken in believing that all three should produce the same vector when no scaling is involved?
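As a sanity check on the math (a NumPy sketch, not GLSL): for a pure rotation R, the inverse-transpose of R is R itself, so the two transforms really should agree when the modelview contains no scale.

```python
import numpy as np

# A pure rotation about the z-axis (orthogonal matrix, no scaling).
theta = 0.7
R = np.array([
    [np.cos(theta), -np.sin(theta), 0.0],
    [np.sin(theta),  np.cos(theta), 0.0],
    [0.0,            0.0,           1.0],
])

# An arbitrary unit normal.
n = np.array([0.3, 0.5, 0.8])
n /= np.linalg.norm(n)

a = R @ n                   # the gl_ModelViewMatrix route
b = np.linalg.inv(R).T @ n  # the gl_ModelViewMatrixInverseTranspose route

print(np.allclose(a, b))    # True: inverse(R).T == R for any rotation
```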
I was trying to implement very simple lighting, with a non-attenuated point light hard-coded at (0, 0, 10). I get what I expect when I use vector A (see above) as the normal in the diffuse shading calculation. With B or C, the light appears to come from (0, 10, 0) instead, and its position stays fixed relative to the object as I rotate the object, rather than remaining stationary at (0, 0, 10).
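For reference, the diffuse term I mean is just the Lambert term, clamped at zero with no attenuation. A minimal NumPy sketch of the same calculation (the helper name `diffuse` is mine, not from the shader):

```python
import numpy as np

def diffuse(normal, frag_pos, light_pos, kd, light_color):
    """Unattenuated Lambertian diffuse: kd * lightColor * max(0, N.L)."""
    L = light_pos - frag_pos
    L = L / np.linalg.norm(L)
    N = normal / np.linalg.norm(normal)
    return kd * light_color * max(0.0, float(N @ L))

# Surface at the origin facing straight at the light -> full intensity.
c = diffuse(np.array([0.0, 0.0, 1.0]),   # normal
            np.zeros(3),                 # fragment position
            np.array([0.0, 0.0, 10.0]),  # light position
            np.ones(3),                  # Kd
            np.ones(3))                  # light color
print(c)  # [1. 1. 1.]
```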
Have I blundered, or encountered a genuine bug?
Vertex Shader:
#version 120
void main()
{
vec3 lightPosition = vec3(0.0, 0.0, 10.0);
vec3 Kd = vec3(1.0, 1.0, 1.0);
vec3 lightColor = vec3(1.0, 1.0, 1.0);
vec3 N = normalize(gl_ModelViewMatrix * vec4(gl_Normal, 0.0)).xyz; // A: works
// vec3 N = normalize(gl_ModelViewMatrixInverseTranspose * vec4(gl_Normal, 0.0)).xyz; // B
// vec3 N = normalize(gl_NormalMatrix * gl_Normal); // C
vec3 L = normalize(lightPosition - gl_Vertex.xyz);
gl_FrontColor.rgb = Kd * lightColor * max(0.0, dot(N, L));
gl_FrontColor.a = 1.0;
gl_Position = ftransform();
}
Fragment Shader:
#version 120
void main()
{
gl_FragColor = gl_Color;
}