Stoopid Newbie Question- Light Positions

Hiya,

very stupid question:
I’ve been working on a velvet surface effect, based on the ATI ‘Velvety’ shader.
It’s working very nicely, but I’d like to change it slightly, and I can’t work out quite how to do it.

Currently, I can rotate the object, but the lighting rotates with it (so what I’m effectively doing is orbiting the virtual camera around the whole mesh/lighting setup). What I’d like to do is rotate/scale the mesh without moving the light. I can’t work out why the code below isn’t letting me do that. I’m sure this is a REALLY basic error I’m making, so apologies in advance.

Here’s the VS code:

/////////////////////////////////////
////	     VELVET SHADER VS		////
/////////////////////////////////////

/*
	Translated from HLSL shader for VVVV
	vNoiseVelvety.fx
	by Desaxismundi 2008
	
	Velvet shader originally by ATI
	
	GLSL conversion toneburst 2008
*/

// Transformation matrices
mat4 MVIT = gl_ModelViewMatrixInverseTranspose;
mat4 MVI = gl_ModelViewMatrixInverse;
mat4 MV = gl_ModelViewMatrix;

// Varyings (sent to Fragment Shader)
varying vec3 LightVec;
varying vec3 WorldNormal;
varying vec3 WorldView;

// Uniforms
uniform vec3 lPos;			// Light position

//	Velvet vertex shader function
//	Takes normal as input and modifies varyings
void velvetVS(in vec3 n, in vec4 v)
{
	WorldNormal = (vec4(n,1.0) * MVIT).xyz;
	vec4 Po = vec4(v.xyz,1.0);
	vec3 Pw = (MV * Po).xyz;
	LightVec = normalize(lPos - Pw);
	WorldView = normalize(MVI[3].xyz - Pw);		
}

/////////////////////////////////////

uniform vec4 TT_0;
uniform vec4 TT_1;
uniform vec4 TT_2;
uniform vec4 TT_3;

void main()
{
	// Pre-transform matrix
	mat4 tt = mat4(TT_0,TT_1,TT_2,TT_3);
	
	// Pre-transform vertices
	vec4 vertex = tt * gl_Vertex;
	
	// Normal
	vec3 N = normalize(gl_NormalMatrix * gl_Normal);
	
	// Velvet vertex function
	velvetVS(N, vertex);
	
	// Transform vertex by modelview and projection matrices
	gl_Position = gl_ModelViewProjectionMatrix * vertex;
	
	// Forward texture coordinates after applying texture matrix
	gl_TexCoord[0] = gl_TextureMatrix[0] * gl_MultiTexCoord0;
}

And the Fragment Shader:

/////////////////////////////////////
////	     VELVET SHADER VS		////
/////////////////////////////////////

/*
	Translated from HLSL shader for VVVV
	vNoiseVelvety.fx
	by Desaxismundi 2008
	
	Velvet shader originally by ATI
	
	GLSL conversion toneburst 2008
*/

// Transformation matrices
mat4 MVIT = gl_ModelViewMatrixInverseTranspose;
mat4 MVI = gl_ModelViewMatrixInverse;
mat4 MV = gl_ModelViewMatrix;

// Varyings (sent to Fragment Shader)
varying vec3 LightVec;
varying vec3 WorldNormal;
varying vec3 WorldView;

// Uniforms
uniform vec3 lPos;			// Light position

//	Velvet vertex shader function
//	Takes normal as input and modifies varyings
void velvetVS(in vec3 n, in vec4 v)
{
	WorldNormal = (vec4(n,1.0) * MVIT).xyz;
	vec4 Po = vec4(v.xyz,1.0);
	vec3 Pw = (MV * Po).xyz;
	LightVec = normalize(lPos - Pw);
	WorldView = normalize(MVI[3].xyz - Pw);		
}

/////////////////////////////////////

uniform vec4 TT_0;
uniform vec4 TT_1;
uniform vec4 TT_2;
uniform vec4 TT_3;

void main()
{
	// Pre-transform matrix
	mat4 tt = mat4(TT_0,TT_1,TT_2,TT_3);
	
	// Pre-transform vertices
	vec4 vertex = tt * gl_Vertex;
	
	// Normal
	vec3 N = normalize(gl_NormalMatrix * gl_Normal);
	
	// Velvet vertex function
	velvetVS(N, vertex);
	
	// Transform vertex by modelview and projection matrices
	gl_Position = gl_ModelViewProjectionMatrix * vertex;
	
	// Forward texture coordinates after applying texture matrix
	gl_TexCoord[0] = gl_TextureMatrix[0] * gl_MultiTexCoord0;
}

Can anyone spot where I’m going wrong?

a|x

http://machinesdontcare.wordpress.com

I think your problem doesn’t come from your shader. You pass the light position in through a uniform variable, lPos, which is set by your application.
You need to change your application so that the light position is not transformed. I assume you set the GL light position after applying some transformations to the modelview matrix. In that case, just set the light position before applying those transformations…

Hi dletozeun,

thanks for getting back to me.
The odd thing is, I’m not manipulating the ModelViewMatrix, just transforming the position of the vertices of the mesh with a custom matrix. That’s why I’m confused that the light is also still moving…

a|x

I hadn’t read as far as the fragment shader until now, and I see that you have posted the vertex shader twice ^^. Can you post the fragment shader?

Hi dletozeun,

well spotted!
Very silly mistake.

Here’s the Fragment Shader code:


/////////////////////////////////////
////		VELVET SHADER FS		////
/////////////////////////////////////

/*
	Translated from HLSL shader for VVVV
	vNoiseVelvety.fx
	by Desaxismundi 2008
	
	Velvet shader originally by ATI
	
	GLSL conversion toneburst 2008
*/

// Lighting Controls
uniform vec4 lDiffColor;	// Diffuse colour
uniform vec4 lSpecColor;	// Specular colour
uniform vec4 lSubColor;		// Under colour
uniform float lRollOff;		// Edge rolloff

// Lighting varyings
varying vec3 LightVec;
varying vec3 WorldNormal;
varying vec3 WorldView;

// Texture input
uniform sampler2D texture;

//	Velvet Fragment Shader function
//	Input texture color
vec4 velvetFS()
{
	vec3 Ln = normalize(LightVec);
	vec3 Nn = normalize(WorldNormal);
	vec3 Vn = normalize(WorldView);
	vec3 Hn = normalize(Vn + Ln);
	float ldn = dot(Ln,Nn);
	float diffComp = max(0.0,ldn);
	vec3 diffContrib = (diffComp * lDiffColor).rgb;
	float subLamb = smoothstep(-lRollOff,1.0,ldn) - smoothstep(0.0,1.0,ldn);
	subLamb = max(0.0,subLamb);
	vec3 subContrib = subLamb * lSubColor.xyz;
	float vdn = 1.0-dot(Vn,Nn);
	vec3 vecColor = vec3(vdn,vdn,vdn);
	vec3 diffuseContrib = vec3((subContrib+diffContrib).xyz);
	vec3 specularContrib = vec3(vecColor*lSpecColor.rgb);
	vec3 result = specularContrib + diffuseContrib;
	return vec4(result,1.0);
}

/////////////////////////////////////

uniform vec4 tColor;	// Color transform

void main()
{
	gl_FragColor = velvetFS() * tColor;
}

Also note, a limitation in the development environment I’m using (Apple’s Quartz Composer) means I have to pass a mat4 into the vertex shader as 4x vec4s, and assemble the matrix in the shader itself (very annoying).

Thanks again,

a|x

Ok thanks.


vec4 Po = vec4(v.xyz,1.0);

I think it would be better to divide v’s x, y and z components by its w component, something like:
Po /= Po.w;


LightVec = normalize(lPos - Pw);

There is a problem here: Pw is in eye space, while lPos is in object space.


// Normal
vec3 N = normalize(gl_NormalMatrix * gl_Normal);

Your normal should be transformed in the same way as the vertex. In the code above you just go from the object-space coordinate system to the eye-space system. I think you should also multiply the normal by the inverse transpose of the ‘tt’ matrix.
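As a sketch of that idea, the inverse transpose of ‘tt’ could be passed in the same four-vec4 form as TT_0..TT_3 (the TTIT_* uniform names here are hypothetical, and the matrix would have to be computed by the application):

```glsl
// Hypothetical uniforms: the columns of inverse(transpose(tt)),
// computed on the CPU and passed in like TT_0..TT_3
uniform vec4 TTIT_0;
uniform vec4 TTIT_1;
uniform vec4 TTIT_2;
uniform vec4 TTIT_3;

vec3 transformNormal(vec3 n)
{
	mat4 ttIT = mat4(TTIT_0, TTIT_1, TTIT_2, TTIT_3);
	// w = 0.0: normals are directions, so translation is ignored
	vec3 nObj = (ttIT * vec4(n, 0.0)).xyz;
	// then object space -> eye space, as the original code already does
	return normalize(gl_NormalMatrix * nObj);
}
```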

Maybe all these anomalies are what is making your light move…

Hi again dletozeun,

thanks for getting back to me again.

So, for lPos, I should also multiply that by the ModelViewMatrix, so it’s in the same coordinate space as Pw?


vec3 lightPos = (MV * vec4(lPos,1.0)).xyz;
LightVec = normalize(lightPos - Pw);

I’ve decided not to transform the model’s vertices with the matrix ‘tt’.
Does this mean I no longer need to transform the normal as you suggest (given that it’s already multiplied with MVIT to create the ‘WorldNormal’ vec)?

Sorry for these really basic questions, and thanks again for your help and advice,

a|x

Hi,

If lPos is your light position in world space, multiplying it by the ModelViewMatrix would transform it into eye space, but it would also transform the light just like your vertices. If, however, you use a custom matrix, tt, to transform your vertices, then the ModelViewMatrix just contains the eye-space transformation, i.e. the view matrix.

But a built-in uniform, gl_LightSource[i].position (where i is the GL light number), is available in your shader, and according to the GLSL specification that light position is already in eye space.

Now, if you keep the vertices in object space, you don’t have to transform the normals, because they are given in object space too.
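For illustration, a minimal vertex-shader sketch along those lines (this assumes the application has enabled and positioned GL light 0):

```glsl
varying vec3 LightVec;

void main()
{
	// vertex position in eye space
	vec3 Pe = (gl_ModelViewMatrix * gl_Vertex).xyz;

	// gl_LightSource[0].position is already in eye space,
	// so no further transformation is needed
	LightVec = normalize(gl_LightSource[0].position.xyz - Pe);

	gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}
```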

Ah… OK.

I was trying to avoid using the builtin lighting stuff, just because of the added complexity. I will give it a go though.

Cheers again,

a|x

You are welcome. These built-in variables often help a lot. They can save you from doing additional computations in your shader, or from passing extra uniforms in from the application.

The situation is complicated a little by the fact I’m not writing an application directly, but creating realtime 3D effects using a modular application called Quartz Composer. I can use GLSL shader code in QC, but don’t have access to arbitrary OpenGL code. The setup for setting OpenGL lighting properties is a bit fiddly, requiring several nested macros. Having said that, using them does save extra maths in the shader, so I should probably do it anyway…

a|x

Ok, I understand. It depends on whether these shaders are intended to be used in a future OpenGL application… it is up to you! :slight_smile:

Hmm… I’m getting the feeling I’m going round in circles with this one…

I’m now using the OpenGL light position attribute, and it’s still acting weirdly. Here’s the code:

Vertex Shader:

/////////////////////////////////////
////	     VELVET SHADER VS		////
/////////////////////////////////////

/*
	Translated from HLSL shader for VVVV
	vNoiseVelvety.fx
	by Desaxismundi 2008
	
	Velvety shader originally by ATI
	
	GLSL conversion toneburst 2008
*/

// Varyings (sent to Fragment Shader)
varying vec3 LightVec;
varying vec3 WorldNormal;
varying vec3 WorldView;

// Uniforms
//uniform vec3 lPos;			// Light position

//	Velvet vertex shader function
//	Takes normal as input and modifies varyings
void velvetVS(in vec3 n, in vec4 v)
{
	// in v = vertex
	// in n = normal
	//WorldNormal = (vec4(n,1.0) * gl_ModelViewMatrixInverseTranspose).xyz;
	WorldNormal = gl_NormalMatrix * n;
	vec4 Po = vec4(v.xyz / v.w,1.0);
	vec3 Pw = (gl_ModelViewMatrix * Po).xyz;
	vec3 lPos = gl_LightSource[0].position.xyz;
	vec3 lightPos = (gl_ModelViewMatrix * vec4(lPos,1.0)).xyz;
	LightVec = normalize(lightPos - Pw);
	WorldView = normalize(gl_ModelViewMatrixInverse[3].xyz - Pw);		
}

/////////////////////////////////////

void main()
{	
	// Pre-transform vertices
	vec4 vertex = gl_Vertex;
	
	// Normal
	vec3 N = normalize(gl_NormalMatrix * gl_Normal);
	
	// Velvet vertex function
	velvetVS(N, vertex);
	
	// Transform vertex by modelview and projection matrices
	gl_Position = gl_ModelViewProjectionMatrix * vertex;
	
	// Forward texture coordinates after applying texture matrix
	gl_TexCoord[0] = gl_TextureMatrix[0] * gl_MultiTexCoord0;
}

Fragment Shader:

/////////////////////////////////////
////		VELVET SHADER FS		////
/////////////////////////////////////

/*
	Translated from HLSL shader for VVVV
	vNoiseVelvety.fx
	by Desaxismundi 2008
	
	Velvety shader originally by ATI
	
	GLSL conversion toneburst 2008
*/

// Lighting Controls
uniform vec4 lDiffColor;	// Diffuse colour
uniform vec4 lSpecColor;	// Specular colour
uniform vec4 lSubColor;		// Under colour
uniform float lRollOff;		// Edge rolloff

// Lighting varyings
varying vec3 LightVec;
varying vec3 WorldNormal;
varying vec3 WorldView;

// Texture input
uniform sampler2D texture;

//	Velvet Fragment Shader function
//	Input texture color
vec4 velvetFS()
{
	vec3 Ln = normalize(LightVec);
	vec3 Nn = normalize(WorldNormal);
	vec3 Vn = normalize(WorldView);
	vec3 Hn = normalize(Vn + Ln);
	float ldn = dot(Ln,Nn);
	float diffComp = max(0.0,ldn);
	vec3 diffContrib = (diffComp * lDiffColor).rgb;
	float subLamb = smoothstep(-lRollOff,1.0,ldn) - smoothstep(0.0,1.0,ldn);
	subLamb = max(0.0,subLamb);
	vec3 subContrib = subLamb * lSubColor.xyz;
	float vdn = 1.0-dot(Vn,Nn);
	vec3 vecColor = vec3(vdn,vdn,vdn);
	vec3 diffuseContrib = vec3((subContrib+diffContrib).xyz);
	vec3 specularContrib = vec3(vecColor*lSpecColor.rgb);
	vec3 result = specularContrib + diffuseContrib;
	return vec4(result,1.0);
}

/////////////////////////////////////

uniform vec4 tColor;	// Color transform

void main()
{
	gl_FragColor = velvetFS() * tColor;
}

I’m still getting really weird results.
Here are some screenshots of the shader applied to a sphere primitive. I’ve rotated the sphere on the X-axis by manipulating the gl_ModelViewProjectionMatrix (I say ‘I’; Quartz Composer does that for me, in fact).

0 degrees x-rotation:

90 degrees:

180 degrees:

270 degrees:

As you can see, the lighting changes completely as the sphere is rotated. I’m really not sure what’s going on…
To complicate matters more, I don’t really have many examples of what this shader is SUPPOSED to look like. I have managed to find these shots on Desaxismundi’s Flickr page, though:

http://farm2.static.flickr.com/1124/1423312675_fb80ece552.jpg?v=0
http://farm2.static.flickr.com/1225/1424196978_45ac47294c.jpg?v=0
http://farm2.static.flickr.com/1035/1423313089_6a81b9b1df.jpg?v=0
http://farm2.static.flickr.com/1286/1423312521_14fd0207c0.jpg?v=0

which give an idea of what it’s supposed to look like (though on a more complex mesh than my sphere).

Any advice on where I may still be going wrong much appreciated,

Thanks again,

a|x

According to the GLSL specification, gl_LightSource[0].position is already in eye space, so you don’t need to transform the light position by the modelview matrix. If you do, you will additionally transform the light position just like the vertex, and you don’t want that; you only want the transformation from object space to eye space.

So this:


vec3 lPos = gl_LightSource[0].position.xyz;
vec3 lightPos = (gl_ModelViewMatrix * vec4(lPos,1.0)).xyz;
LightVec = normalize(lightPos - Pw);

could be reduced as:


LightVec = normalize(gl_LightSource[0].position.xyz - Pw);

Some other things like


vec4 Po = vec4(v.xyz / v.w,1.0);

seems unnecessary. You don’t need the w-divide when your transformations are just rotations, translations, and scales, since those leave w at 1.0.

Hmm… OK.

I’ve made those changes, but I still get the same results as shown in my post above.
It’s really annoying that the lighting changes completely as you rotate the object, which really shouldn’t be the case with a static light-source and a sphere. It’s certainly not the case with other lighting setups I’ve been working on.

The perils of working on code you don’t understand…
This is a translation of some HLSL shader code I found in the NVIDIA Shader Library. I’ve had some success translating other lighting shaders from there, but for some reason this one has never worked as I assume it should. I don’t suppose you happen to know of any other velvet-like surface shaders, do you?

Cheers,

a|x

Sorry, I don’t know of any GLSL velvet shader…
To make it work, try to simplify your code as much as you can. The bug should be in another part of the shader, and I have seen some other unnecessary things in your code, like the computation of “WorldView” from “Po” in your vertex shader…

I think the problem is to do with the WorldView varying, in fact.

Below are some screenshots of the original HLSL shader running in FX Composer.
The 3 shots are of the same object viewed from different angles. In this example, the lighting changes because I’m moving the virtual camera in FX Composer’s preview window, while the light stays in the same location relative to the sphere. As you can see, though, the edge of the object always appears lighter. In fact, the mesh edge is always coloured according to the value of the SpecColor uniform. Don’t know if this makes anything clearer…



Sooo, I think calculating the value of WorldView correctly is probably the key to getting this thing to work, since WorldView is, I think, the inverse of the vector between the current view position and the vertex. What’s got me really confused is that GLSL doesn’t have a ‘View’ matrix in the way that HLSL does, as far as I can tell. It seems that in HLSL there are built-in matrices that define the position and orientation of the viewing camera. I know how to transform between ‘model’ and ‘world’ spaces, but I’m not sure how to deal with HLSL code that uses View matrices. I’m assuming that, since GLSL doesn’t have a concept of a camera or view matrix, the viewpoint is assumed to be in a fixed location. Am I correct in this?

Sorry to fire lots of questions at you, and thanks again for all your help,

a|x

What’s got me really confused is that GLSL doesn’t have a ‘View’ matrix in the way that HLSL does, as far as I can tell.

Yes, you are right; I understand your confusion. Just remember that the vertex coordinates you set in the application are in object space, and that the modelview matrix contains 2 matrices: model (object space -> world space) and view (world space -> eye space). You have to do all computations in the same space, otherwise they would not make any sense.
I advise you to check each line of your shader code, and each variable, to see whether the operands are always in the same space.

When you multiply a position in eye space by gl_ModelViewMatrixInverse, you come back to an object-space position, not a world-space one.

If you really need to do computations in world space, you should set your custom matrices on the GL modelview matrix and pass them to your shader as uniforms. But maybe it is possible to do all the computations in object space.
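As a sketch of keeping every operand in a single space (here eye space, assuming the application has set up GL light 0), the whole lighting setup reduces to:

```glsl
varying vec3 LightVec;
varying vec3 EyeNormal;
varying vec3 EyeView;

void main()
{
	// object space -> eye space (model and view in one step)
	vec3 Pe = (gl_ModelViewMatrix * gl_Vertex).xyz;

	// normals: object space -> eye space
	EyeNormal = normalize(gl_NormalMatrix * gl_Normal);

	// gl_LightSource[0].position is already in eye space
	LightVec = normalize(gl_LightSource[0].position.xyz - Pe);

	// the camera sits at the eye-space origin, so the view
	// vector is just the negated eye-space position
	EyeView = normalize(-Pe);

	gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}
```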

I finally got this to work by making an arbitrary camera position vec.

Here’s a simplified, vertex-shader-only version of the effect that seems to work fine:

// Lighting controls
uniform vec4 PrimaryColor;	// Diffuse
uniform vec4 FuzzColor;		// Specular
uniform vec4 UnderColor;	// SubColor
uniform float Rolloff;		// Edge-rolloff
uniform vec3 LightPosition;

// Matrices
mat4 WorldITXf = gl_ModelViewMatrixInverseTranspose;
mat4 WorldXf = gl_ModelViewMatrix;

// Velvet Vertex Shader function
vec4 VelvetyVS(in vec4 V, in vec3 N)
{
	vec3 Nn = normalize(gl_NormalMatrix * N).xyz;
	vec4 Po = V;
	vec3 Pw = (WorldXf * Po).xyz;
	vec3 Ln = normalize(LightPosition - Pw);
	float ldn = dot(Ln,Nn);
	float diffComp = max(0.0,ldn);
	vec4 diffContrib = diffComp * PrimaryColor;
	float subLamb = smoothstep(-Rolloff,1.0,ldn) - smoothstep(0.0,1.0,ldn);
	subLamb = max(0.0,subLamb);
	vec4 subContrib = subLamb * UnderColor;
	// Arbitrary camera position
	vec3 cameraPos = vec3(0.0,0.0,1.0);
	vec3 Vn = normalize(cameraPos-Pw);
	float vdn = 1.0-dot(Vn,Nn);
	vec4 vecColor = vec4(vdn,vdn,vdn,1.0);
	vec4 DiffColor = subContrib + diffContrib;
	vec4 SpecColor = vecColor * FuzzColor;
	
	return DiffColor + SpecColor;
}

void main()
{	
	//Transform vertex by modelview and projection matrices
	gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
	
	//Forward current color and texture coordinates after applying texture matrix
	gl_FrontColor = VelvetyVS(gl_Vertex,gl_Normal);
	gl_TexCoord[0] = gl_TextureMatrix[0] * gl_MultiTexCoord0;
}

Dunno if it’s technically correct, but it looks OK. Any advice on how I might make it more efficient is, of course, much appreciated.

Thanks for all your advice. I’m on such a steep learning curve here (considering I’m completely self-taught, am rubbish at maths, and I’m not a fulltime programmer).

Cheers,

a|x

Great! You are welcome.
Yes, one last piece of advice: take care with variable names. For example, Pw is the vertex position in eye space, whereas the name makes me think it is a position in world space, because of the “w”.

The thing with the camera position is weird. You give a camera position that seems to be in world space, while Pw is in eye space. If you want the ray between the camera and the vertex, just transform the vertex position into eye space, giving Pw. In fact, Pw is the vertex position in eye space, so it is already relative to the camera, which is at (0, 0, 0) in eye space.
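Following that observation, the arbitrary cameraPos in the final shader above could be dropped entirely; in eye space the view vector is just the negated eye-space vertex position:

```glsl
// Pw is the vertex position in eye space:
//   vec3 Pw = (gl_ModelViewMatrix * gl_Vertex).xyz;
// The camera sits at the eye-space origin, so:
vec3 Vn = normalize(-Pw);
```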