Camera and lighting

Hello everybody,

I’m new to OpenGL programming, and I have a problem.
I have a scene with one model. The model renders correctly, but the lighting doesn’t: when I turn the camera around it, the light moves with the camera and produces wrong results.

I created a camera class. I think it’s OK, because I can orbit around my model correctly:


class Camera
{
public:
    Camera()
        : _yaw(0), _pitch(0), _roll(0)
    {
    }

    Matrix4x4f getMatrix() const
    {
        Vector3f up     = _rotation.rotatedVector(Vector3f(0, 1, 0));              // rotated up vector
        Vector3f target = _rotation.rotatedVector(Vector3f(0, 0, -1)) + _position; // target = position + rotated forward vector

        Matrix4x4f viewMatrix;
        viewMatrix.lookAt(_position, target, up);

        return viewMatrix;
    }
    
    void move(Vector3f distance)
    {
        _position += _rotation.rotatedVector(distance);
    }
 
    void rotate(float yaw, float pitch, float roll = 0)
    {
        _yaw   += yaw;
        _pitch += pitch;
        _roll  += roll;

        Quaternionf y = Quaternionf::fromAxisAndAngle(Vector3f(0, 1, 0), _yaw);
        Quaternionf p = Quaternionf::fromAxisAndAngle(Vector3f(1, 0, 0), _pitch);
        Quaternionf r = Quaternionf::fromAxisAndAngle(Vector3f(0, 0, 1), _roll);
        _rotation = y * p * r;
    }

    Vector4f position() const
    {
        return Vector4f(_position.x(), _position.y(), _position.z(), 1.0f); // w = 1 for a position
    }

    void setPosition(const Vector3f &pos)
    {
        _position = pos;
    }

private:
    Vector3f    _position;
    Quaternionf _rotation;
    float _yaw, _pitch, _roll;
    float _fov, _near, _far;
};

Then I have a function that sets uniforms:


glUseProgram(_programID);

Matrix4x4f camMatrix = scene->activeCamera().getMatrix();
Matrix4x4f mvMatrix = camMatrix * object->modelMatrix();

    
glUniformMatrix4fv(mvMatrixUniformLocation, 1, GL_FALSE, mvMatrix);
glUniformMatrix4fv(mvpMatrixUniformLocation, 1, GL_FALSE, scene->projectionMatrix() * mvMatrix);
glUniformMatrix3fv(normalMatrixUniformLocation, 1, GL_FALSE, mvMatrix.normalMatrix());
    
glUniform4fv(diffuseUniformLocation, 1, object->model()->diffuse());

//glUniform4fv(l0PosUniformLocation, 1, (mvMatrix * scene->light().position()));

For the last line: when I want to send the light position, do I need to transform it by the modelview matrix?

Finally, my shaders:

#version 150

// Per vertex
in vec4 in_Vertex;
in vec3 in_Normal;
in vec2 in_TexCoord;

// Per batch
uniform mat4 mvMatrix;
uniform mat4 mvpMatrix;
uniform mat3 normalMatrix;


// To fragment program
smooth out vec3 vEyeNormal;
smooth out vec3 vEyeVec;
smooth out vec4 vEyeLightDir;


void main(void)
{
    gl_Position  = mvpMatrix * in_Vertex;
    vEyeNormal   = (normalMatrix * in_Normal).xyz;
    vec3 vVertex = (mvMatrix * in_Vertex).xyz;
    vEyeLightDir = vec4(vec3(10.0, 0.0, 0.0) - vVertex, 0.0); // hard-coded light position
    vEyeVec      = -vVertex;
}


#version 150

// From vertex program
smooth in vec3 vEyeNormal;
smooth in vec3 vEyeVec;
smooth in vec4 vEyeLightDir;

// Per batch
uniform mat4 mvMatrix;
uniform mat4 mvpMatrix;
uniform mat3 normalMatrix;

uniform vec4 diffuse;


// Final color
out vec4 out_Color;


void main(void)
{
    vec3 N = normalize(vEyeNormal);
    vec3 L = normalize(vEyeLightDir.xyz);
    float NdotL = max(0.0, dot(L, N));
    out_Color = vec4(diffuse.xyz * NdotL, 1.0);
}

So I don’t understand why my lighting moves when I rotate my camera.
Can you help me?

Thanks a lot

Tipoun

Typically, lighting is easiest to do in eye space, so all of the inputs to the lighting calculation have to be converted to eye space for correct results.
There are three places where this can go wrong:
1. Bad normals in the model.
2. An incorrect normal matrix (see the sketch after this list).
3. A light position that has not been converted to eye space.
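
For point 2, as a reference: the normal matrix should be the transpose of the inverse of the upper-left 3x3 block of the modelview matrix (which reduces to just that 3x3 block when the modelview contains only rotation and translation). A minimal GLSL 1.50 sketch that computes it in the vertex shader, useful for debugging even though you would normally precompute it on the CPU:

    // Debug-only: derive the normal matrix from the modelview matrix in the shader.
    mat3 debugNormalMatrix = transpose(inverse(mat3(mvMatrix)));
    vEyeNormal = normalize(debugNormalMatrix * in_Normal);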

smooth in vec4 vEyeLightDir;

Here you should use flat, because the light direction vector is constant across a triangle and does not need interpolating.

uniform mat3 normalMatrix;

You would need to check that you have calculated a correct normal matrix. If you just output the transformed normals as the colour in the fragment shader, then what you see on screen should be mostly blue, with a hint of red as you turn the camera.
e.g.


    vec3 N = normalize(vEyeNormal);
    out_Color = vec4(N, 1.0);

vEyeLightDir = vec4(vec3(10.0, 0.0, 0.0) - vVertex, 0.0); // hard-coded light position

Are you sure (10, 0, 0) is in eye space? I seriously doubt that.
To transform the light’s world-space position into eye space, you must multiply it by the camera view matrix only (not by the full modelview matrix).

when I want to send light position do I need to convert it in model view transformation

No, use the camera’s view matrix, not the modelview matrix.
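
Building on the line you commented out, a minimal sketch of how that could look (assuming, as elsewhere in your code, that your Vector4f and Matrix4x4f types convert to the pointer glUniform4fv expects):

// Transform the light's world-space position into eye space using the
// camera view matrix only, not the per-object modelview matrix.
Matrix4x4f viewMatrix  = scene->activeCamera().getMatrix();
Vector4f   lightEyePos = viewMatrix * scene->light().position();
glUniform4fv(l0PosUniformLocation, 1, lightEyePos);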

Thanks for your answers!

I found a bug in my normal matrix :o
So now, everything is alright :slight_smile:


smooth in vec4 vEyeLightDir;

Here you should use flat, because the light direction vector is constant across a triangle and does not need interpolating.

OK, but can I compute the light direction in the fragment shader?

Thanks again

Tipoun

but can I compute the light direction in the fragment shader?

Yes, but why? All the information you need for the calculation is already available in the vertex shader, and the light direction vector is constant per vertex anyway.
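
For reference, if you ever do want the direction per fragment (e.g. with very large triangles, where interpolating a per-vertex direction becomes inaccurate), you would interpolate the eye-space position instead and do the subtraction in the fragment shader. A sketch based on your shaders, where the varying vEyePos and the uniform l0EyePos are illustrative names and the light position is assumed to already be in eye space:

#version 150

smooth in vec3 vEyeNormal;
smooth in vec3 vEyePos;   // eye-space vertex position, written by the vertex shader

uniform vec4 l0EyePos;    // light position, already transformed into eye space
uniform vec4 diffuse;

out vec4 out_Color;

void main(void)
{
    vec3 N = normalize(vEyeNormal);
    vec3 L = normalize(l0EyePos.xyz - vEyePos); // per-fragment light direction
    float NdotL = max(0.0, dot(N, L));
    out_Color = vec4(diffuse.xyz * NdotL, 1.0);
}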

OK, thank you!