Trying to understand this one line of code used for brightness

Excuse me if I’m wrong, but my understanding of the dot product of 2 vectors is that it’s the angle between them, like the following drawing (sorry, made it quickly):

I’m taking a lighting tutorial and there’s a portion of the following line of code I don’t understand:


float brightness = dot(normal, surfaceToLight) / (length(surfaceToLight) * length(normal));

I think I understand the left side of the division, which hopefully matches the drawing I made above, but I can’t wrap my head around the right side:

(length(surfaceToLight) * length(normal))

Btw, I understand what the magnitude of a vector is, but I don’t understand what it’s used for in the above line of code.

you are wrong :doh:
the dot product = the cosine of the angle between both vectors, multiplied by the magnitudes of both vectors

dot(A, B) = |A| * |B| * cos(phi)

the right side (after the division sign) makes sure that the vectors are normalized
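That identity is easy to check numerically. Here's a quick sketch in plain Python (not GLSL; `dot` and `length` are hand-rolled stand-ins for the GLSL built-ins):

```python
import math

# Check dot(A, B) = |A| * |B| * cos(phi) for two sample vectors.
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def length(v):
    return math.sqrt(dot(v, v))

A = (3.0, 0.0, 0.0)
B = (1.0, 1.0, 0.0)

# Recover the angle from the dot product, then plug it back into the identity.
phi = math.acos(dot(A, B) / (length(A) * length(B)))  # about 45 degrees here
assert abs(dot(A, B) - length(A) * length(B) * math.cos(phi)) < 1e-9
```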

[QUOTE=john_connor;1283890]you are wrong :doh:
the dot product = the cosine of the angle between both vectors, multiplied with the magnitude of both vectors …[/QUOTE]

I’m glad I asked, thanks John, and yeah you’re right about the cosine of the angle, I’m reading about it and the article also proves I was wrong.

Thanks again, your explanation helped.

by the way: it should rather be

dot(normal, -surfaceToLight)

than

dot(normal, surfaceToLight)

to get a positive value from the dot product

More accurately, it eliminates the requirement that they are normalised.

To obtain the cosine of the angle, the equation

dot(A, B) = |A| * |B| * cos(phi)

can be rewritten either as

cos(phi) = dot(A/|A|, B/|B|)

or as

cos(phi) = dot(A, B)/(|A| * |B|)

The first version (normalising the vectors) is probably more common, and arguably results in cleaner code, but the second is more efficient: there’s only one division, and it’s performed on a scalar rather than on vectors.

However, if one or both of the vectors are used in multiple calculations (e.g. diffuse component, specular component, reflected vector for environment mapping, etc), it may end up being more efficient to just normalise that vector.
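For what it's worth, a quick numeric check that the two rewrites agree (plain Python, with hand-rolled stand-ins for the GLSL built-ins):

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def length(v):
    return math.sqrt(dot(v, v))

def normalize(v):
    l = length(v)
    return tuple(x / l for x in v)

A = (0.0, 2.0, 0.0)
B = (0.0, 3.0, 4.0)

# Version 1: normalise both vectors first (divisions on vectors).
cos1 = dot(normalize(A), normalize(B))
# Version 2: a single scalar division by the product of the lengths.
cos2 = dot(A, B) / (length(A) * length(B))

assert abs(cos1 - cos2) < 1e-9  # both give the cosine of the angle (0.6 here)
```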

[QUOTE=GClements;1283893]More accurately, it eliminates the requirement that they are normalised. …[/QUOTE]

Thanks GClements, that helped me understand it better.

The light equation (Gouraud shading) is actually pretty simple and straightforward. Basically, the vector dot product of the surface normal and the direction to the light is the percentage of color you apply. In plain English, that means that if the surface perfectly faces the light source it gets 100% of the color. If it faces 90 degrees or more away it gets 0% of the light or color. A different light (ambient light) is used if you don’t want the dark side to be pitch black. Any angle between facing straight into the light and 90 degrees gets the corresponding percentage of the color. For white light this would be a grayscale amount.
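To make that concrete, here's a small sketch (plain Python rather than GLSL) of the diffuse percentage at a few angles; both vectors are assumed to be unit length:

```python
import math

# Lambert diffuse: percentage = max(dot(N, L), 0) with unit vectors N and L.
def diffuse_percentage(normal, to_light):
    d = sum(n * l for n, l in zip(normal, to_light))
    return max(d, 0.0)  # surfaces facing away get 0%, never negative light

N = (0.0, 0.0, 1.0)  # surface normal pointing straight up

facing = diffuse_percentage(N, (0.0, 0.0, 1.0))   # straight into the light: 100%
grazing = diffuse_percentage(
    N, (math.sin(math.radians(60)), 0.0, math.cos(math.radians(60))))  # 60 deg off: 50%
behind = diffuse_percentage(N, (0.0, 0.0, -1.0))  # light behind the surface: 0%
```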

(Ambient light is just color applied to everything equally, which will give you a silhouette rather than 3D if used by itself. But it can be used effectively to fill in the dark side where Gouraud fills in the light side. Usually I mix both so that the light side gets both colors.)

I like to mix with the surface color through multiplication, especially if the surface color is a texture. White is 1 for r, g, and b and Black is 0. So, you can easily imagine what decreasing the light color will do to the surface color.
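A minimal sketch of that multiplicative mixing (plain Python; `modulate` is a hypothetical helper name, not a GLSL built-in):

```python
# Mixing light color with surface color by componentwise multiplication:
# white light (1,1,1) leaves the surface color unchanged, black (0,0,0)
# kills it, and a dimmed light scales every channel down proportionally.
def modulate(light, surface):
    return tuple(l * s for l, s in zip(light, surface))

surface = (0.8, 0.4, 0.2)
assert modulate((1.0, 1.0, 1.0), surface) == surface          # white: unchanged
assert modulate((0.0, 0.0, 0.0), surface) == (0.0, 0.0, 0.0)  # black: no light
half = modulate((0.5, 0.5, 0.5), surface)                     # half-intensity gray light
```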

The equation

(length(surfaceToLight) * length(normal))

can be simplified because by definition the length of any normal is 1. If it’s not 1 then it’s not a normal. And anything multiplied by 1 is unchanged. So, the length(normal) factor could just be removed from the equation.

I think all that’s in there to normalize the vectors, but there’s a normalize function to take care of that. I believe the vector dot product won’t work as expected unless both vectors are normalized. To normalize a vector, you divide by its length/magnitude. So, they’re basically doing the normalization and dot product for the percentage as one equation. It’s unnecessary if the vectors are already normalized, but there’s also the question of how much you can trust the input coming in to be normalized.
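For reference, normalization in plain Python (`normalize` here mimics the GLSL built-in):

```python
import math

# Normalizing a vector: divide each component by the vector's length,
# so the result has length 1 but keeps the same direction.
def normalize(v):
    length = math.sqrt(sum(x * x for x in v))
    return tuple(x / length for x in v)

v = (3.0, 4.0, 0.0)  # length 5
n = normalize(v)     # (0.6, 0.8, 0.0), length 1
assert abs(math.sqrt(sum(x * x for x in n)) - 1.0) < 1e-9
```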

I use


	LightDirection = -normalize(DiffuseLightDirection);	//Normal must face into the light, rather than WITH the light to be lit up.
	DiffuseLightPercentage = max(dot(VertexNormal, LightDirection), 0.0);	//Percentage is based on angle between the direction of light and the vertex's normal. 
	DiffuseLight = clamp((DiffuseLightColor * InputColor) * DiffuseLightPercentage, 0.0, 1.0);	//Apply only the percentage of the diffuse color. Saturate clamps output between 0.0 and 1.0.


I think the max with 0 was put in there to avoid anything going negative. The clamp likewise is there to prevent values from going out of range. In theory that may not be necessary. The negative light direction is kind of important because otherwise it will probably work backwards.
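A tiny sketch of what those guards prevent (plain Python; the numbers are made up for illustration):

```python
# Why the max(..., 0) and clamp matter: a raw dot product can go negative
# (surface tilted away from the light), and summing several lighting terms
# can push a color channel past 1.
def clamp(x, lo, hi):
    return max(lo, min(x, hi))

raw_dot = -0.3                   # surface facing away from the light
percentage = max(raw_dot, 0.0)   # clamped to zero: no "negative light"
assert percentage == 0.0

channel = 0.9 + 0.4              # e.g. diffuse + specular overshooting 1.0
assert clamp(channel, 0.0, 1.0) == 1.0
```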

The specular highlight is the hard part. This part is just using the dot product to get a percentage of the light color to mix with the surface color.

This is really all you need to achieve 3D, but the specular highlight can give a glossy effect that is common. These days you are more likely to apply it with a gloss map, which gets a little more complex, but not that much more complex than texturing.

Anyway, this is exactly the sort of thing I cover in my HLSL video series. The shader below is basically the exact same shader you have by the end of my HLSL shader series written in GLSL except the GLSL version has fog and I never covered that in the HLSL series. Probably should have. Probably should have covered normal mapping too, which would have been the next video in the series, but I got sidetracked with other things for the past couple of years.


#version 450 core

in vec2 TextureCoordinates;
in vec3 VertexNormal;
in vec4 RGBAColor;
in float FogFactor;
in vec4 PositionRelativeToCamera;
in vec3 WorldSpacePosition;

layout (location = 0) out vec4 OutputColor;


uniform vec4 AmbientLightColor;
uniform vec3 DiffuseLightDirection;
uniform vec4 DiffuseLightColor;
uniform vec3 CameraPosition;
uniform float SpecularPower;
uniform vec4 FogColor;
uniform float FogStartDistance;
uniform float FogMaxDistance;
uniform bool UseTexture;
uniform sampler2D Texture0;



vec4 BlinnSpecular(in vec3 LightDirection, in vec4 LightColor, in vec3 PixelNormal, in vec3 CameraDirection, in float SpecularPower)
{
	vec3 HalfwayNormal;
	vec4 SpecularLight;
	float SpecularHighlightAmount;


	HalfwayNormal = normalize(LightDirection + CameraDirection);
	SpecularHighlightAmount = pow(clamp(dot(PixelNormal, HalfwayNormal), 0.0, 1.0), SpecularPower);
	SpecularLight = SpecularHighlightAmount * LightColor; 

	return SpecularLight;
}


vec4 PhongSpecular(in vec3 LightDirection, in vec4 LightColor, in vec3 PixelNormal, in vec3 CameraDirection, in float SpecularPower)
{
	vec3 ReflectedLightDirection;	
	vec4 SpecularLight;
	float SpecularHighlightAmount;


	ReflectedLightDirection = 2.0 * PixelNormal * clamp(dot(PixelNormal, LightDirection), 0.0, 1.0) - LightDirection;
	SpecularHighlightAmount = pow(clamp(dot(ReflectedLightDirection, CameraDirection), 0.0, 1.0), SpecularPower);
	SpecularLight = SpecularHighlightAmount * LightColor; 
	

	return SpecularLight;
}


void main()
{
	vec3 LightDirection;
	float DiffuseLightPercentage;
	vec4 SpecularColor;
	vec3 CameraDirection;	//Float3 because the w component really doesn't belong in a 3D vector normal.
	vec4 AmbientLight;
	vec4 DiffuseLight;
	vec4 InputColor;

	
	if (UseTexture) 
	{
		InputColor = texture(Texture0, TextureCoordinates);
	}
	else
	{
		InputColor = RGBAColor; // vec4(0.0, 0.0, 0.0, 1.0);
	}


	LightDirection = -normalize(DiffuseLightDirection);	//Normal must face into the light, rather than WITH the light to be lit up.
	DiffuseLightPercentage = max(dot(VertexNormal, LightDirection), 0.0);	//Percentage is based on angle between the direction of light and the vertex's normal. 
	DiffuseLight = clamp((DiffuseLightColor * InputColor) * DiffuseLightPercentage, 0.0, 1.0);	//Apply only the percentage of the diffuse color. Saturate clamps output between 0.0 and 1.0.

	CameraDirection = normalize(CameraPosition - WorldSpacePosition);	//Create a normal that points in the direction from the pixel to the camera.

	if (DiffuseLightPercentage == 0.0f) 
	{
		SpecularColor  = vec4(0.0f, 0.0f, 0.0f, 1.0f);
	}
	else
	{
		//SpecularColor = BlinnSpecular(LightDirection, DiffuseLightColor, normalize(VertexNormal), CameraDirection, SpecularPower);
		SpecularColor = PhongSpecular(LightDirection, DiffuseLightColor, normalize(VertexNormal), CameraDirection, SpecularPower);
	}

	float FogDensity = 0.1f;
	float LOG2 = 1.442695f;
	float FogFactor = exp2(-FogDensity * FogDensity * PositionRelativeToCamera.z * PositionRelativeToCamera.z * LOG2);
	FogFactor = 1 - FogFactor;
	OutputColor = RGBAColor * (AmbientLightColor * InputColor) + DiffuseLight + SpecularColor;
	OutputColor = mix (OutputColor, FogColor, FogFactor);
	//OutputColor = vec4(0.0f, 0.5f, 0.0f, 1.0f);
}



#version 450 core
layout (location = 0) in vec3 Pos;
layout (location = 1) in vec2 UV;
layout (location = 2) in vec3 Normal;
layout (location = 3) in vec4 Color;

uniform mat4 WorldMatrix;
uniform mat4 ViewMatrix;
uniform mat4 ProjectionMatrix;



smooth out vec2 TextureCoordinates;
smooth out vec3 VertexNormal;
smooth out vec4 RGBAColor;
smooth out vec4 PositionRelativeToCamera;
out vec3 WorldSpacePosition;


void main()
{
	gl_Position = WorldMatrix * vec4(Pos, 1.0f);				//Apply object's world matrix.
	WorldSpacePosition = gl_Position.xyz;						//Save the position of the vertex in the 3D world just calculated. Convert to vec3 because it will be used with other vec3's.
	gl_Position = ViewMatrix * gl_Position;						//Apply the view matrix for the camera.
	PositionRelativeToCamera = gl_Position;
	gl_Position = ProjectionMatrix * gl_Position;				//Apply the Projection Matrix to project it on to a 2D plane.
	TextureCoordinates = UV;									//Pass through the texture coordinates to the fragment shader.
	VertexNormal = mat3(WorldMatrix) * Normal;					//Rotate the normal according to how the model is oriented in the 3D world.
	RGBAColor = Color;											//Pass through the color to the fragment shader.
}


[QUOTE=BBeck1;1283948]The light equation (Gouraud shading) is actually pretty simple and straight forward. Basically, the vector dot product of the surface normal and the direction to the light is the percentage of color you apply. …[/QUOTE]

Didn’t see this post until now, thanks a lot Beck. Appreciate the explanation.