Shading influenced by camera's orientation

I’m rather embarrassed to ask for help on this, but I can’t figure out what the problem is. I want to implement basic Phong shading in GLSL. However, the resulting shading is influenced by the camera’s orientation. Can you spot my mistake? I’ve tried switching spaces, to no avail.

Thank you!

Vertex shader:

#version 330 core

layout(location = 0) in vec4 in_vertexPos;
layout(location = 1) in vec2 in_textureCoords;

layout(location = 2) in vec4 in_weights;
layout(location = 3) in ivec4 in_boneIds;

layout(location = 4) in vec3 in_normal;

out Vertex
{
	vec3 normal;
	
	vec3 eyeVector;
	vec3 lightVector;

	vec2 textureCoords;
} vertex;

layout(std140) uniform;

uniform Transform
{
	mat4 view;
	mat4 model;

	mat3 normal;
	mat4 modelView;

	vec4 animate;
	mat4 bones[64];
} transform;

uniform Projection
{
	mat4 perspective;
	mat4 orthographic;
} projection;

vec3 vLightPosition = vec3( 5.0, 5.0, 5.0 );

void main()
{
	vertex.normal = transform.normal * in_normal;
	vertex.lightVector = mat3(transform.modelView) * (vLightPosition - in_vertexPos.xyz);

	gl_Position = projection.perspective * transform.modelView * in_vertexPos;
}

Fragment shader:

#version 330 core

in Vertex
{
	vec3 normal;
	
	vec3 eyeVector;
	vec3 lightVector;

	vec2 textureCoords;
} vertex;

out vec4 finalColor;

uniform Material
{
	vec4 ambient;
	vec4 diffuse;
	vec4 specular;

	float shininess;
} material;

uniform bool renderWireframes;
uniform sampler2D u_diffuseTexture;

void main()
{
	float diff = max(0.0, dot(normalize(vertex.normal), normalize(vertex.lightVector)));

	finalColor = vec4( vec3(diff), 1.0);
}

Here’s how I set up the uniform matrices:

m_graphicsEngine->updateUniformBuffer( m_transformBuffer, &m_viewMatrix, "view" );

const mat4 &modelMatrix = transform->getModelMatrix();
m_graphicsEngine->updateUniformBuffer( m_transformBuffer, &modelMatrix, "model" );

const mat4 modelViewMatrix = m_viewMatrix*modelMatrix;
m_graphicsEngine->updateUniformBuffer( m_transformBuffer, &modelViewMatrix, "modelView" );

const mat3 normalMatrix = glm::inverseTranspose( mat3(modelViewMatrix) );
m_graphicsEngine->updateUniformBuffer( m_transformBuffer, &normalMatrix, "normal" );

vec3 vPosition3 = vPosition4.xyz / vPosition4.w;

Not necessary, since you’re not projecting with the modelview matrix. Furthermore, I’d transform the world-space light position to eye coordinates. What you do here is compute the difference of an eye-space and a world-space vector:


vertex.lightVector = normalize(vLightPosition - vPosition3);

You could instead do:


vertex.lightVector = mat3(transform.modelView) * (vLightPosition - in_vertexPos.xyz);

This assumes that the rotational part of modelView is orthogonal.

Also, normalizing in the vertex shader is wasted effort, since for correct results you’ll have to re-normalize the interpolated vector in the fragment shader anyway. Interpolating vectors that were normalized per vertex still doesn’t yield a unit-length vector in general.

This should do it.

Just to be clear: it is not only wasted, it is plain wrong, as the interpolated directions will be incorrect.
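A quick way to see why: linearly interpolating two unit vectors generally yields a vector shorter than unit length, so the fragment shader must normalize again. A minimal numeric sketch (plain Python standing in for what the rasterizer does with a smooth varying, not GLSL):

```python
import math

def normalize(v):
    # Scale a 3D vector to unit length.
    length = math.sqrt(sum(c * c for c in v))
    return [c / length for c in v]

def lerp(a, b, t):
    # Component-wise linear interpolation, as the rasterizer does for
    # 'smooth' varyings (ignoring perspective correction here).
    return [(1 - t) * ac + t * bc for ac, bc in zip(a, b)]

n0 = normalize([1.0, 0.0, 0.0])
n1 = normalize([0.0, 1.0, 0.0])

mid = lerp(n0, n1, 0.5)  # interpolated normal at the midpoint
mid_len = math.sqrt(sum(c * c for c in mid))

print(mid_len)  # ~0.707, not 1.0: re-normalization is required
```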

Thank you for your quick answer!
I adopted your suggestions; however, the problem remains.

Model and camera are positioned at the origin here, and no rotation is applied to the model.

Turns out the problem seems to be the normal matrix!

If I replace this:

vertex.normal = transform.normal * in_normal;

with this:

vertex.normal = transpose(inverse(mat3(transform.modelView))) * in_normal;

the shading is no longer influenced by the camera’s orientation. However, it sometimes “flips around” when I move the camera close to the origin.

Is the normal matrix correct? You could try


vertex.normal = mat3(transform.modelView) * in_normal;

Mathematically, since the transformation is linear, you can also express the light vector computation as


vec3 eLightPosition = mat3(transform.modelView) * vLightPosition;
vec3 ePosition = (modelView * in_vertexPos).xyz;
vertex.lightVector = eLightPosition - ePosition;

but if the input values are correct the light vector shouldn’t be any different.
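For what it’s worth, the two formulations are only equivalent when modelView has no translation. A quick numeric check (plain Python standing in for the GLSL math, with a made-up rotation and translation) shows that mat3(modelView) * (L - p) and mat3(modelView) * L - (modelView * p).xyz differ by exactly the translation column of modelView:

```python
import math

# Hypothetical modelView: rotation about Z by 90 degrees plus translation (1, 2, 3).
c, s = math.cos(math.pi / 2), math.sin(math.pi / 2)
R = [[c, -s, 0.0],
     [s,  c, 0.0],
     [0.0, 0.0, 1.0]]          # mat3(modelView), the rotational part
t = [1.0, 2.0, 3.0]            # the translation column of modelView

def mat3_mul(M, v):
    # 3x3 matrix times 3-vector.
    return [sum(M[r][k] * v[k] for k in range(3)) for r in range(3)]

L = [5.0, 5.0, 5.0]            # world-space light position
p = [1.0, 0.0, 0.0]            # vertex position

# Variant 1: mat3(modelView) * (L - p)
v1 = mat3_mul(R, [L[i] - p[i] for i in range(3)])

# Variant 2: mat3(modelView) * L - (modelView * p).xyz
Rp = mat3_mul(R, p)
RL = mat3_mul(R, L)
v2 = [RL[i] - (Rp[i] + t[i]) for i in range(3)]

diff = [v1[i] - v2[i] for i in range(3)]
print(diff)  # equals t, so the variants disagree whenever modelView translates
```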


vertex.normal = transpose(inverse(mat3(transform.modelView))) * in_normal;

This is redundant. Since it’s probably the case that mat3(transform.modelView) is orthogonal, we have (M^-1)^T = (M^T)^T = M.

if the input values are correct the light vector shouldn’t be any different.

This does indeed result in different shading!

This is redundant. Since it’s probably the case that mat3(transform.modelView) is orthogonal, we have (M^-1)^T = (M^T)^T = M.

But isn’t that how one is supposed to calculate the normal matrix? Also, shouldn’t this result in the same shading as when I’m using the matrix computed on the CPU?

You rarely need a separate normal matrix. If you only have translation, rotation, and uniform scale, you can use the modelview matrix for the normal transform as well: the inverse transpose of such a matrix is the matrix itself up to a uniform scale factor, which normalization removes anyway. Only if you add non-uniform scaling, or something more interesting, do you need the separate inverse transpose.
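To back this up numerically (a plain-Python sketch of the linear algebra with a made-up non-uniform scale, not GLSL): with non-uniform scaling, transforming a normal like a position and transforming it by the inverse transpose produce genuinely different directions, even after normalization:

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return [c / length for c in v]

def mul(M, v):
    # 3x3 matrix times 3-vector.
    return [sum(M[r][k] * v[k] for k in range(3)) for r in range(3)]

# Non-uniform scale: stretch X by 2.
S = [[2.0, 0.0, 0.0],
     [0.0, 1.0, 0.0],
     [0.0, 0.0, 1.0]]
# Inverse transpose of a diagonal matrix: just the reciprocal diagonal.
S_inv_T = [[0.5, 0.0, 0.0],
           [0.0, 1.0, 0.0],
           [0.0, 0.0, 1.0]]

n = normalize([1.0, 1.0, 0.0])      # normal of the plane x + y = const

wrong = normalize(mul(S, n))        # transforming the normal like a position
right = normalize(mul(S_inv_T, n))  # proper normal-matrix transform

print(wrong)  # ~[0.894, 0.447, 0]
print(right)  # ~[0.447, 0.894, 0], a genuinely different direction
```

For a pure rotation, both transforms agree, which is why the shortcut works until non-uniform scaling shows up.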

Edit: I typically do something like this:

gl_Position = model_to_clip_matrix * vec4(_position, 1.0);
vec4 position = model_to_world_matrix * vec4(_position, 1.0);
v_normal = vec3(model_to_world_matrix * vec4(_normal, 0.0));

Yeah, I’m aware of that. But I might apply non-uniform scaling to my model matrices at a later point, which is why I’m doing it this way. Either way, I would like to understand why the normal matrix I compute on the CPU using GLM does not work while the matrix computed in the vertex shader does.
Also, I still haven’t figured out why the shading “flips around” when I move the camera close to the origin.

Well, it turns out I had a bug in my uniform buffer update code… the 3x3 normal matrix was uploaded as if it were a 4x4 matrix, which corrupted the other matrices during the update.
Thank you for helping me fix the shading, though :slight_smile:
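For anyone hitting a similar bug: under std140 layout each column of a mat3 is padded to vec4 alignment, so the matrix occupies 48 bytes rather than the 36 of a tightly packed 3x3, and uploading the tight version shifts every member that follows it in the block. A minimal Python sketch of the two byte layouts (illustration only; query the actual offsets from the driver or the spec):

```python
import struct

# A 3x3 normal matrix, column-major as GLM stores it.
columns = [(1.0, 0.0, 0.0),
           (0.0, 1.0, 0.0),
           (0.0, 0.0, 1.0)]

# Tightly packed upload: 9 floats = 36 bytes. Wrong for std140.
packed = b"".join(struct.pack("3f", *col) for col in columns)

# std140 upload: each column padded to a vec4, 48 bytes total.
std140 = b"".join(struct.pack("4f", *col, 0.0) for col in columns)

print(len(packed), len(std140))  # 36 vs 48: the 12-byte difference shifts
                                 # every member that follows the mat3
```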