Piotrek Janisz

04-04-2011, 07:31 AM

Hi everyone,

I'm rendering spheres as point sprites (I compute the normals in the fragment shader).
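For context, the per-fragment reconstruction does the following (a Python sketch of the same math the shader performs; `point_coord` plays the role of gl_PointCoord):

```python
import math

def sphere_normal(point_coord):
    """Reconstruct an eye-space sphere normal from a point-sprite coordinate.

    point_coord is an (s, t) pair in [0, 1] x [0, 1], like gl_PointCoord.
    Returns None for fragments outside the sphere's silhouette.
    """
    # Map [0, 1] x [0, 1] to [-1, 1], flipping t as the shader does.
    x = point_coord[0] * 2.0 - 1.0
    y = point_coord[1] * -2.0 + 1.0
    mag = x * x + y * y
    if mag > 1.0:
        return None  # corresponds to discard
    z = math.sqrt(1.0 - mag)
    return (x, y, z)
```

At the sprite centre (0.5, 0.5) this yields (0, 0, 1), a normal pointing straight at the camera, which is why the reconstructed normals live in eye space rather than world space.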

I have a scene with a sphere and a directional light, and I can rotate and move the camera around the scene.

The problem is that when I rotate the camera the sphere's shading changes - that is, the diffuse intensity changes because the angle between the lightDirection vector and the sphere normal changes.

I can't just multiply the normals by the camera rotation matrix, as I do with the light direction vector, because then the shading would stay constant regardless of the camera position.
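That outcome is expected: a dot product is invariant when both vectors are rotated by the same matrix, so if the light and the normals rotate together the diffuse term never changes. A small Python check, using a made-up rotation about the z-axis:

```python
import math

def rotate_z(v, angle):
    """Rotate a 3-vector about the z-axis by `angle` radians."""
    c, s = math.cos(angle), math.sin(angle)
    return (c * v[0] - s * v[1], s * v[0] + c * v[1], v[2])

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

normal = (1.0, 0.0, 0.0)
light = (1.0, 0.0, 0.0)
angle = 1.0

# Rotating only the light changes the diffuse term...
only_light = dot(normal, rotate_z(light, angle))

# ...but rotating both the normal and the light leaves it unchanged.
both = dot(rotate_z(normal, angle), rotate_z(light, angle))
```

Here `only_light` equals cos(angle) while `both` stays at 1.0, which is exactly the "constant shading" behaviour described above.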

I want the shading to change with the camera's position, but not when the camera is merely rotated.

So my question is: how should I transform the normals to achieve that?

This is how I do it right now:

vertex shader:

uniform mat4 projectionMatrix;
uniform mat4 modelViewMatrix;

in vec4 vertex;

void main()
{
    gl_Position = projectionMatrix * modelViewMatrix * vertex;
}

fragment shader:

uniform vec3 lightDirection; // already multiplied by the camera rotation matrix

out vec4 fragColor;

void main()
{
    // Reconstruct the eye-space normal from the point sprite coordinates.
    vec3 normal;
    normal.xy = gl_PointCoord.xy * vec2(2.0, -2.0) + vec2(-1.0, 1.0);

    // Discard fragments outside the sphere's silhouette.
    float mag = dot(normal.xy, normal.xy);
    if (mag > 1.0)
        discard;

    normal.z = sqrt(1.0 - mag);
    normal = normalize(normal);

    float diffuseIntensity = max(0.0, dot(normal, lightDirection));
    fragColor = vec4(0.1, 0.1, 0.1, 1.0) * diffuseIntensity * vec4(1.0, 0.0, 0.0, 1.0);
}
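The lighting itself is a plain Lambert diffuse term; in Python the same final-colour computation would look like this (reusing the hard-coded grey and red factors from the shader):

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def lambert_color(normal, light_direction):
    """Diffuse (Lambert) shading as in the fragment shader above."""
    intensity = max(0.0, dot(normal, light_direction))
    grey = (0.1, 0.1, 0.1, 1.0)
    red = (1.0, 0.0, 0.0, 1.0)
    # Note: multiplying the whole vec4 by intensity also scales alpha,
    # so back-facing fragments end up fully transparent as well as black.
    return tuple(g * intensity * r for g, r in zip(grey, red))
```

When the normal faces the light this gives (0.1, 0.0, 0.0, 1.0), i.e. a dim red; when it faces away, everything including alpha drops to zero.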
