Vertex shader vs Fragment shader computations

Hello, I’m following a tutorial on modern OpenGL, but I have trouble understanding something in the Gouraud and Phong shading section: if we do the lighting computations in the vertex shader, the fragment shader does not seem to accept the out color given by the vertex shader for the fragments that are not vertices, whereas if we do the same calculations in the fragment shader, they are applied and the final color takes them into account. Does anyone know how this works?

Hold up, Geronimo! You’re going too fast.

If you’re lighting and shading triangles (which is what you imply), “for the fragments that are not vertices” makes no sense. In this case, fragments are never vertices. The vertex shader transforms vertices and computes output values; those values are interpolated across each triangle, and the interpolated values are passed to the fragment shader for each fragment inside the triangle.

Please back up and tell us what you tried, what happened that was unexpected, and what conclusions you draw from that. Post some code to help illustrate your question.

…and why, if we do the same calculations in the fragment shader, they will be done and the final color will take into account the calculations. Anyone knows how this works?

Lots of folks here know how this works, but I’m really struggling to understand what your problem is and what you’re asking about. Please try again: source code snippet, observed result, expected result, your conclusion.

I need to do some calculations for the lighting, and the output colors are different depending on whether I do them in the vertex shader or the fragment shader. I don’t understand why the output is not the same. This is the result: https://youtu.be/Fl5i-UnlQps

Original fragment and vertex shaders:

vertex shader

#version 330 core
layout (location = 0) in vec3 position;
layout (location = 1) in vec3 normal;

out vec3 Normal;
out vec3 FragPos;

uniform mat4 model;
uniform mat4 view;
uniform mat4 projection;

void main()
{
    gl_Position = projection * view *  model * vec4(position, 1.0f);
    FragPos = vec3(model * vec4(position, 1.0f));
    Normal = mat3(transpose(inverse(model))) * normal;  
} 

fragment shader

#version 330 core
out vec4 color;

in vec3 FragPos;  
in vec3 Normal;  
  
uniform vec3 lightPos; 
uniform vec3 viewPos;
uniform vec3 lightColor;
uniform vec3 objectColor;

void main()
{
    // Ambient
    float ambientStrength = 0.1f;
    vec3 ambient = ambientStrength * lightColor;
  	
    // Diffuse 
    vec3 norm = normalize(Normal);
    vec3 lightDir = normalize(lightPos - FragPos);
    float diff = max(dot(norm, lightDir), 0.0);
    vec3 diffuse = diff * lightColor;
    
    // Specular
    float specularStrength = 0.5f;
    vec3 viewDir = normalize(viewPos - FragPos);
    vec3 reflectDir = reflect(-lightDir, norm);  
    float spec = pow(max(dot(viewDir, reflectDir), 0.0), 32);
    vec3 specular = specularStrength * spec * lightColor;  
        
    vec3 result = (ambient + diffuse + specular) * objectColor;
    color = vec4(result, 1.0f);
} 

Now here are the modified shaders, where I do the lighting computations in the vertex shader:

vertex shader

#version 330 core
layout (location = 0) in vec3 position;
layout (location = 1) in vec3 normal;

out vec3 result;

uniform mat4 model;
uniform mat4 view;
uniform mat4 projection;

uniform vec3 lightPos; 
uniform vec3 viewPos;
uniform vec3 lightColor;
uniform vec3 objectColor;

void main()
{
    gl_Position = projection * view *  model * vec4(position, 1.0f);
    vec3 FragPos = vec3(model * vec4(position, 1.0f));
    vec3 Normal = mat3(transpose(inverse(model))) * normal;

	// Ambient
    float ambientStrength = 0.1f;
    vec3 ambient = ambientStrength * lightColor;
  	
    // Diffuse 
    vec3 norm = normalize(Normal);
    vec3 lightDir = normalize(lightPos - FragPos);
    float diff = max(dot(norm, lightDir), 0.0);
    vec3 diffuse = diff * lightColor;
    
    // Specular
    float specularStrength = 0.5f;
    vec3 viewDir = normalize(viewPos - FragPos);
    vec3 reflectDir = reflect(-lightDir, norm);  
    float spec = pow(max(dot(viewDir, reflectDir), 0.0), 32);
    vec3 specular = specularStrength * spec * lightColor;  
        
    result = (ambient + diffuse + specular) * objectColor;
} 

fragment shader

#version 330 core
in vec3 result;
out vec4 color;

void main()
{
    
    color = vec4(result, 1.0f);
} 

It seems that the vertex shader is executed once for each vertex, and then the fragment shader is executed for each fragment without accepting any new “in” values, only working with the values given at the vertices (in this case, the 8 corners).

Your video looks good (both vertex lighting and fragment lighting). It appears the light source is located toward the top of the video w.r.t. how it’s rendered.

The whole point of doing fragment lighting is that the output isn’t exactly the same. Instead of only computing a lighting value at each vertex and interpolating that color across each triangle, you compute a unique lighting value per “fragment”.

You ask: why is the result different? With the fragment lighting solution, a unique lightDir is computed for each fragment, generating a new lit color value.

Put the light source closer to directly above the center of your quad (using fragment lighting), and the importance of the lighting differences between your two cases (fragment vs. vertex lighting) will be more obvious.
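To make that concrete, here’s a small numeric sketch (plain Python standing in for the shader math; the light/eye positions and the quad are hypothetical values chosen so the highlight falls between the vertices). It evaluates the same specular formula as the shaders above, once at the quad’s corners with the results interpolated (vertex lighting at the centre), and once directly at the centre fragment (fragment lighting):

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def reflect(i, n):
    # same as GLSL reflect(): i - 2 * dot(i, n) * n
    d = dot(i, n)
    return tuple(x - 2.0 * d * y for x, y in zip(i, n))

def specular(frag_pos, light_pos, view_pos, normal, strength=0.5, shininess=32):
    # the specular term from the fragment shader above
    light_dir = normalize(tuple(l - p for l, p in zip(light_pos, frag_pos)))
    view_dir = normalize(tuple(v - p for v, p in zip(view_pos, frag_pos)))
    reflect_dir = reflect(tuple(-c for c in light_dir), normal)
    return strength * max(dot(view_dir, reflect_dir), 0.0) ** shininess

normal = (0.0, 1.0, 0.0)                 # a flat quad facing straight up
light_pos = view_pos = (0.0, 1.0, 0.0)   # light and eye directly above the centre
corners = [(-1.0, 0.0, -1.0), (1.0, 0.0, -1.0),
           (1.0, 0.0, 1.0), (-1.0, 0.0, 1.0)]

# vertex (Gouraud) lighting: specular evaluated at the corners, then interpolated
corner_specs = [specular(c, light_pos, view_pos, normal) for c in corners]
gouraud_at_centre = sum(corner_specs) / 4.0

# fragment (Phong) lighting: specular evaluated at the centre fragment itself
phong_at_centre = specular((0.0, 0.0, 0.0), light_pos, view_pos, normal)

print(gouraud_at_centre)  # 0.0 -- the highlight between the vertices is lost
print(phong_at_centre)    # 0.5 -- full-strength highlight at the centre
```

None of the corners sees the reflection, so the interpolated corner values miss the highlight entirely; only the per-fragment evaluation catches it.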


This depends on the type of calculation you do in your lighting.

Typically (I’m simplifying a little here) the outputs from a vertex shader are linearly interpolated to form the inputs to a fragment shader. That works, and works well, so long as the calculations in a vertex shader can be plotted along a straight line. However, lighting calculations typically involve normalization, and normalization involves square roots. The plot of a square root isn’t a straight line, it’s a curve, and so you should expect there to be differences.
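To illustrate that nonlinearity with a minimal sketch (plain Python for the arithmetic, with made-up vectors): linearly interpolating two unit normals produces a vector that is no longer unit length, which is exactly why normalizing per fragment changes the result.

```python
import math

def normalize(v):
    # unit-length scaling involves a square root -- the nonlinear step
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def lerp(a, b, t):
    # linear interpolation, roughly what the rasterizer does with vertex outputs
    return tuple((1.0 - t) * x + t * y for x, y in zip(a, b))

# two unit normals at adjacent vertices
n0 = (1.0, 0.0, 0.0)
n1 = (0.0, 1.0, 0.0)

# halfway between the two vertices
mid = lerp(n0, n1, 0.5)       # (0.5, 0.5, 0.0): length ~0.707, no longer unit
renorm = normalize(mid)       # what normalizing per fragment restores

print(math.sqrt(sum(c * c for c in mid)))  # ~0.707, not 1.0
print(renorm)                              # (0.707..., 0.707..., 0.0)
```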


OK, thanks, I got it. :)
