per vertex lighting to per pixel

Can someone explain how this code transforms something from per vertex lighting to per pixel?

In a tutorial there was a diffuse value calculation of the type


float diffuse_value = max(dot(vertex_normal, vertex_light_position), 0.0);

…on the vertex shader.

That was supposed to produce per vertex lighting when, later on, the fragment shader did…

gl_FragColor = gl_Color * diffuse_value;

Then, when he moved the first line to the fragment shader (appropriately, by outputting vertex_normal and vertex_light_position from the vertex shader), it supposedly turned the method into “per pixel shading”.

How is that so? The first method appears to be doing the diffuse_value calculation every pixel anyway!

The first version performs the lighting calculation at each vertex and interpolates the resulting diffuse_value, i.e. a color is interpolated.
The second version interpolates the normal and performs the lighting calculation for each pixel.
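A small numeric sketch of the difference, in plain Python standing in for the shader math (the normals and light direction are made-up values):

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def lerp(a, b, t):
    return tuple(a_i + (b_i - a_i) * t for a_i, b_i in zip(a, b))

def diffuse(normal, light):
    # same formula as the shader: max(dot(N, L), 0.0)
    return max(sum(n_i * l_i for n_i, l_i in zip(normal, light)), 0.0)

light = (0.0, 0.0, 1.0)
# two vertex normals tilted away from the light in opposite directions
n0 = normalize((-1.0, 0.0, 1.0))
n1 = normalize((1.0, 0.0, 1.0))
t = 0.5  # a fragment halfway between the two vertices

# version 1: diffuse computed per vertex, then the *result* is interpolated
per_vertex = diffuse(n0, light) + (diffuse(n1, light) - diffuse(n0, light)) * t

# version 2: the *normal* is interpolated (and renormalized), then diffuse computed
per_pixel = diffuse(normalize(lerp(n0, n1, t)), light)

print(per_vertex)  # ~0.707
print(per_pixel)   # 1.0 -- the midpoint normal points straight at the light
```

Same inputs, different results at the same fragment: that is the whole difference between the two methods.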

How is that so? The first method appears to be doing the diffuse_value calculation every pixel anyway!

No: diffuse_value is only calculated at each vertex and then interpolated. That is different from calculating the value for each pixel from an interpolated normal (which is what the second version does).

Thanks very much, that interpolation difference makes it clear.

Since both interpolate, I don’t see how they would produce different results. I usually read that ‘per pixel is superior’, but here both versions interpolate; it’s not as if one is ‘absolutely right’ and the other ‘interpolates and hence is blunt’.

EDIT: …but looking into it, I read that the interpolated inputs need to be normalized…

tobindax, what matters is what is linearly interpolated.

Example :

  1. y = a*x + b

compute y0 and y1 from x0 and x1: you can then linearly interpolate between y0 and y1 and get exactly the same values as the original function. This is the vertex shader.

Now with :

  2. z = y*y

it is no longer possible to get a good approximation of z from just a few values of y: you need to compute z for (almost) every pixel, or the curve will not look right. This is the fragment shader.
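The same point in runnable form (plain Python, with arbitrary values for a, b and the endpoints):

```python
a, b = 3.0, 1.0
y = lambda x: a * x + b   # 1) linear
z = lambda x: y(x) ** 2   # 2) nonlinear

x0, x1 = 0.0, 2.0
t = 0.5                   # midpoint, like a fragment between two vertices
xm = x0 + (x1 - x0) * t

# interpolating the endpoint values of the linear function is exact...
y_interp = y(x0) + (y(x1) - y(x0)) * t
print(y_interp, y(xm))    # 4.0 4.0

# ...but for the nonlinear one it is not
z_interp = z(x0) + (z(x1) - z(x0)) * t
print(z_interp, z(xm))    # 25.0 16.0
```

Lighting is nonlinear in the interpolated inputs (dot product of a renormalized vector, clamping, specular powers), which is why it belongs in the fragment shader.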

To recap, doing both 1) and 2) in the vertex shader will reveal more artifacts than doing 1) in the vertex shader and 2) in the fragment shader.

In reality, it would be even better to have per-pixel normals (no interpolation at all) : that is the idea behind normal mapping.
http://en.wikipedia.org/wiki/Normal_mapping

EDIT: yes, linear interpolation between 2 unit-length vectors gives a vector shorter than unit length, so normalization is needed after interpolation.
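A quick check of that in Python (two made-up unit vectors, 90 degrees apart):

```python
import math

def length(v):
    return math.sqrt(sum(c * c for c in v))

n0 = (1.0, 0.0, 0.0)
n1 = (0.0, 1.0, 0.0)

mid = tuple((a + b) * 0.5 for a, b in zip(n0, n1))  # linear interpolation at t = 0.5
print(length(mid))  # ~0.707, no longer unit length

renorm = tuple(c / length(mid) for c in mid)        # what normalize() restores
print(length(renorm))  # ~1.0
```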

OK, I’m looking for a simple example.

Right now I get a single solid color over the whole of each triangle (it changes, but only as a whole). I guess per-pixel shading should show differences along the length of the triangle.

main code involved:

out vec3 vertex_normal;         
out vec3 vertex_light_position; 

… on the vertex shader.

with

vertex_normal = normalize(NormalMatrix * in_Normal);

// old gl_NormalMatrix: "transpose of the inverse of the upper
// leftmost 3x3 of gl_ModelViewMatrix"

mat3 NormalMatrix = transpose(inverse(mat3(ModelViewMatrix)));
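As an aside, why the inverse-transpose is needed at all can be seen numerically. Here is a small Python sketch with a made-up non-uniform scale as the upper 3x3 (diagonal, so matrices are stored as their diagonals and the inverse-transpose is just the elementwise reciprocal):

```python
# Made-up non-uniform scale (x is doubled). Everything is diagonal here,
# so transpose(inverse(M)) is simply the reciprocals of the diagonal.
M       = (2.0, 1.0, 1.0)
M_inv_T = (0.5, 1.0, 1.0)

def mul(m, v):   # diagonal matrix times vector
    return tuple(m_i * v_i for m_i, v_i in zip(m, v))

def dot(a, b):
    return sum(a_i * b_i for a_i, b_i in zip(a, b))

tangent = (1.0, 1.0, 0.0)    # a direction lying in the surface
normal  = (1.0, -1.0, 0.0)   # perpendicular to it
print(dot(tangent, normal))  # 0.0

t2 = mul(M, tangent)                  # the surface direction after the transform
wrong = dot(t2, mul(M, normal))       # 3.0 -- transforming N by M breaks perpendicularity
right = dot(t2, mul(M_inv_T, normal)) # 0.0 -- the normal matrix preserves it
print(wrong, right)
```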

and on fragment shader:

float diffuse_value = max(dot(vertex_normal, vertex_light_position), 0.0);

gl_FragColor = out_Color * diffuse_value;

even if I normalize vertex_normal in the fragment shader, it still ends up a solid color per triangle :\

I guess per-pixel shading should show differences along the length of the triangle.

Do the vertex normals have different values? Try just drawing the per-fragment normals (writing them out as colors).

Yes, they were different.

I found a solution that gives per pixel differences on the same triangle.

vec3 Light = normalize(light_position - vec3(out_Vertex));
float diffuse_value = max(dot(vertex_normal, Light), 0.0);

I had to compute Light as the fixed light position minus the vertex position. I have yet to fully understand it.

I guess the vertex position output by the vertex shader is interpolated in the fragment shader; otherwise the Light vector wouldn’t change gradually across the triangle.
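That guess can be checked numerically. A small Python sketch of the same math (made-up light and fragment positions, flat constant normal):

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def diffuse_at(p, normal, light_position):
    # same math as the fixed shader: L = normalize(light_position - position)
    L = normalize(tuple(l - p_i for l, p_i in zip(light_position, p)))
    return max(sum(n_i * l_i for n_i, l_i in zip(normal, L)), 0.0)

light_position = (0.0, 0.0, 1.0)  # made-up point light above the origin
normal = (0.0, 0.0, 1.0)          # flat triangle: the same normal everywhere

# two interpolated fragment positions on that triangle
d0 = diffuse_at((0.0, 0.0, 0.0), normal, light_position)
d1 = diffuse_at((2.0, 0.0, 0.0), normal, light_position)
print(d0, d1)  # 1.0 ~0.447 -- shading varies even though the normal is constant
```

With a constant normal, only the interpolated position makes the light direction vary per fragment, which is why subtracting the position produced a gradient.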

The initial tutorial (http://www.opengl.org/sdk/docs/tutorials/ClockworkCoders/lighting.php) had out_Vertex here as gl_ModelViewMatrix * gl_Vertex, though I used the very final output vertex; I guess I should refine that.