View Full Version : Skinning & Normals

08-10-2004, 04:18 AM
Is there an exact method for calculating normals when using multiple weights & transforms per vertex, or do I need to calculate the normals from the generated vertex data?

08-10-2004, 03:31 PM

08-10-2004, 04:02 PM
You transform the normals as vectors.

08-11-2004, 12:21 AM
I don't think so. The transpose of the inverse matrix * normal vector is only valid for rotations etc. and doesn't take into account the differences between two adjacent vertex weights.

E.g. let's say that one weight matrix is a scaling matrix in the x direction (unit in y and z) and the other weight matrix is a unit matrix.

Let's take a surface whose normals all point in the x direction. When blending between the scale matrix and the unit matrix I get only normals pointing in the x direction, but the weights can generate any slope, since the weights can vary between the vertices.

You need some kind of algorithm that also uses the adjacent weights in the calculation of the normal.
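The inverse-transpose rule mentioned above can be sketched as follows (plain Python, hypothetical helper names). It handles a single non-uniform scale correctly, but it says nothing about per-vertex weight variation, which is the complaint being raised here:

```python
def mat_vec(m, v):
    """Multiply a 3x3 matrix (row-major nested lists) by a 3-vector."""
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def normalize(v):
    length = sum(c * c for c in v) ** 0.5
    return [c / length for c in v]

def inverse_transpose_scale(s):
    """Inverse transpose of a diagonal scale matrix (enough for this demo)."""
    return [[1.0 / s[0], 0.0, 0.0],
            [0.0, 1.0 / s[1], 0.0],
            [0.0, 0.0, 1.0 / s[2]]]

# Non-uniform scale of 2 along x: a slanted normal (1, 1, 0) must tilt
# toward y, not just get scaled, so the inverse transpose shrinks its x part.
N = inverse_transpose_scale([2.0, 1.0, 1.0])
print(normalize(mat_vec(N, [1.0, 1.0, 0.0])))  # -> [0.447..., 0.894..., 0.0]
```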

Rob The Bloke
08-14-2004, 01:46 AM
dorbie is right. You can simply transform them by the matrices, sum the resulting normals and then normalise at the end. Alternatively, calculate the normals on the fly, since if you extend the animation system to support FFDs or blendshapes, it becomes too hectic to bother transforming them.

It also allows you to create very generic animation systems that also support NURBS surfaces etc. (since then a base class containing just a list of points can be deformed by any generic deformer)

08-15-2004, 09:10 AM
Well, I was looking for an algorithm that uses the weight dx/dy/dz locally. Just using the common method of interpolating and weighting the transformed normals gives incorrect normals, as I stated above. I think it should be possible to use a weight dx/dy/dz function and combine it with the common normal transform interpolation, so you can calculate a "correct" normal on the fly without recomputing the normals from the surface.

08-16-2004, 07:19 AM
You ... don't ... need ... to ... recompute ...
face normals ... on the fly ...

Interpolate vertex normals. Clear enough ?

08-17-2004, 03:00 AM
Hey Mr "SeskaPeel". I don't want to offend your "expert" advice, but you CAN NOT just interpolate the vertex normals when you have differences between two adjacent vertex weights, as the weights will introduce a slope.

I am looking for an algorithm that uses the local slope of the weight values to give me the "correct" normal.


08-17-2004, 08:07 AM
Well, I don't see the problem you're having with this; you should be able to weight the normals and normalize the result... unless the normals are too extreme going into the weighting, but that's a given. Your issue with scale suggests you're not weighting the normals prior to averaging them.

out_normal = normalize(normal1*weight1 + normal2*weight2);

or, with the bone matrices applied first:

out_normal = normalize((in_normal*matrix1)*weight1 + (in_normal*matrix2)*weight2);

I don't see how this could produce your problem with scale on the x axis, as the weight would blend between the normals. This is just an average; of course, some folks like to slerp with the weights instead of averaging (lerp), for the obvious reasons. Your scale is an extreme case and not the best example, BTW.
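The two pseudocode lines above can be turned into a small runnable sketch (plain Python, hypothetical names). Note what happens in the disputed case: with a pure x-scale and an x-only normal, the blend never leaves the x axis, which is exactly the behaviour the earlier poster is objecting to:

```python
def mat_vec(m, v):
    """Multiply a 3x3 matrix (row-major nested lists) by a 3-vector."""
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def normalize(v):
    length = sum(c * c for c in v) ** 0.5
    return [c / length for c in v]

def skin_normal(n, matrices, weights):
    """Transform the normal by each bone matrix, weight, sum, then normalize."""
    out = [0.0, 0.0, 0.0]
    for m, w in zip(matrices, weights):
        t = mat_vec(m, n)
        out = [o + w * c for o, c in zip(out, t)]
    return normalize(out)

identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
scale_x  = [[2.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]

# Half-and-half blend of identity and an x-scale on an x-only normal:
print(skin_normal([1.0, 0.0, 0.0], [identity, scale_x], [0.5, 0.5]))
# -> [1.0, 0.0, 0.0]: the blended normal stays on the x axis.
```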

I need pictures :-)

08-17-2004, 12:33 PM
I will try to make a picture later on. For now, let me try to explain it in words once again.

Let's assume a coordinate system with x to the right, y upwards and z towards the viewer (-z into the screen): a typical OpenGL coordinate system.

Let's take a generic geometry, e.g. a tube that extends along the y axis a certain distance (height) with a given radius r in the x,z plane.

Now all normals along the tube side have values (xn,0,zn). Let's just look at these normals and skip the normals on the bottom and top caps.

Now consider a matrix A that scales or translates in the x,z direction. Any transformed normal will still be in the direction (xn,0,zn), with a scaling or translation factor of the actual base coordinate where the normal begins.

Any linear combination between a transformed normal and the original normal still gives a normal in the (xn,0,zn) plane.

So if you now modify the surface to blend between the unit matrix and the A matrix, you will get a varying surface that evidently doesn't have normals in just the (xn,0,zn) direction.

E.g. let the matrix A be a scale-by-2 matrix. If you let the weights vary linearly from 0 to 1 from the bottom of the tube to the top, you will get a cone with radius r at the bottom and radius 2*r at the top. Now this new geometry has normals in the (xn,-sqrt(xn^2+zn^2),zn) direction, and these normals can never be interpolated out of the original ones without considering the dx/dy/dz operator of the weight function.
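The cone example can be checked numerically (plain Python sketch; the parametrisation r(y) = 1 + y for y in [0,1] is an assumed concrete instance of the linearly weighted scale-by-2 blend). The true surface normal, taken from the cross product of the surface tangents, picks up a negative y component that no blend of the original (xn,0,zn) normals can produce:

```python
import math

def cross(a, b):
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return [c / length for c in v]

# Unit-radius tube along y, radius blended toward a scale of 2 by w(y) = y,
# i.e. r(y) = 1 + y for y in [0, 1]: a cone.
def pos(theta, y):
    r = 1.0 + y
    return [r * math.cos(theta), y, r * math.sin(theta)]

def true_normal(theta, y, eps=1e-5):
    """Outward normal from the cross product of the two surface tangents."""
    d_theta = [(a - b) / (2 * eps)
               for a, b in zip(pos(theta + eps, y), pos(theta - eps, y))]
    d_y = [(a - b) / (2 * eps)
           for a, b in zip(pos(theta, y + eps), pos(theta, y - eps))]
    return normalize(cross(d_y, d_theta))

print(true_normal(0.0, 0.5))  # y component is negative, as claimed above
```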

You can also see this quite clearly if you go back and define the normal as the cross product between two adjacent coordinate vector differences. If you blend in the equation to linearly interpolate the new resulting coordinates as cn = w*m1*c + (1-w)*m2*c, and then express the normals as the cross product of two adjacent difference vectors (c1-c2)x(c1-c3), you will see that the weight factors blend into the actual equation.

In some cases, when you apply rotations (most apps use rotations to do skinning), the result will not be as obvious as in the scale and translate cases, but it will still be there, so my point is that most apps get faulty normals when they do skinning the "normal" way.

My engine gizmo3D has now added a feature where any generic surface can be blended into another geometry using an arbitrary set of control weights and "bone" transforms, and now I can clearly see that I need to express the normal calc using the weights to get a perfect result.

Now it is pretty simple to do a surface dx/dy/dz calc using the cross product and normalize, but my gut tells me that I can precompute a weight function dw/dx, dw/dy, dw/dz and use this as a constant in the normal linear interpolation.

It should be possible to do a simple linear combination of matrices, normals and the dw/dx etc. constants per vertex and locally get a correct interpolated normal without recomputing the entire surface...

08-17-2004, 01:16 PM
Ahh... now it's crystal clear. I don't think skinning implementations attempt to solve this. Offhand I'd say it requires knowledge of adjacent verts; there's no per-vert solution. In fact, even that can't work except for isolated individual primitives.

Imagine your cylinder with vertical divisions (rings) and scale weights; now imagine that each ring has a different weight, or even that alternate rings have alternating weights 0&1 then 1&0. There is no way, per vert or even per primitive, to shade this correctly in a vertex shader with a streaming architecture. Information about several adjacent primitives is required.

I don't think your problem has a solution as you envisage.

08-17-2004, 01:31 PM
Agreed, but I think there is a good-enough approximation where the differences in adjacent weights (considered constant for a certain set of weights; just recalc when the weights change) could be used to eliminate some basic visual artefacts yet still be fast to calc when only the bone transforms change.

08-17-2004, 01:35 PM
You will get the same problem when you use the cross product to calc normals.

If you use the weights in the cross product, you can see that you get a bunch of factors that can be expressed as the original linear weight transform plus a bunch of factors with weight slope values etc. ... just a thought...

08-23-2004, 06:39 PM
Tangent vectors should work by transforming with the same matrix you use for position (same linear spatial properties...). Thus if you have a tangent vector and a binormal, I think you should be able to skin those using your matrices and then cross them to get the normal.

I'm not sure if that is what you meant by calculating the cross of the normals, but it should work. Otherwise... you could try inflating the entire model by the normals, recording the weights for those positions, and then skinning the normals with those (like: skinned outer pos. - skinned inner pos.). Methinks though that you'd still have artifacts, but that *might* improve it...

I would try the tangent and binormal stuff as that should work, if I'm remembering my ray tracing class right....

Edit: this would require tangent and binormal storage (computed offline), but on the upside, it would only be 1 vector more than before, and it would work with normal-mapping-type schemes (if that's even applicable). Also, this assumes a linear R3 transform (i.e. no projection-like matrices), since you'd need to cross them...

08-23-2004, 07:38 PM
Yeah, so I worked out a bit of the math, and I think that n = cross(skinned tangent, skinned binormal) will work, but it will distort the normal quite a bit (by 1/det(A)^2, I think...), so it would be somewhat numerically unstable for high-distortion matrices. To combat this you could normalize the skinned normals individually before blending them, but it probably wouldn't be too bad if the determinants aren't too big... In either case you should at least normalize them afterwards...
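The tangent/binormal idea can be checked numerically with a quick sketch (plain Python; the shear-plus-scale matrix below is just a hypothetical bone transform for the demo). Crossing the transformed tangent and binormal agrees, after normalization, with applying the inverse transpose to the original normal:

```python
def cross(a, b):
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def mat_vec(m, v):
    """Multiply a 3x3 matrix (row-major nested lists) by a 3-vector."""
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def normalize(v):
    length = sum(c * c for c in v) ** 0.5
    return [c / length for c in v]

# Hypothetical bone transform mixing scale and shear.
M = [[2.0, 0.5, 0.0],
     [0.0, 1.0, 0.0],
     [0.0, 0.0, 1.0]]

tangent  = [0.0, 1.0, 0.0]
binormal = [0.0, 0.0, 1.0]
# Original normal = tangent x binormal = (1, 0, 0).

n = normalize(cross(mat_vec(M, tangent), mat_vec(M, binormal)))
print(n)  # -> [0.894..., -0.447..., 0.0], same as inverse-transpose(M) * (1,0,0)
```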

I can post the math I've done if anyone wants. Also, please let me know if it works :) .

08-24-2004, 04:35 AM
It will work if you express your tangent and binormal as (transformed vectors + a differential vector depending on the weight slope). If you can derive the weight slope vector as a function of only the current vertex (to keep the slope function constant when the neighbouring weights don't change), you will be able to calculate the normal as the cross product of a linear combination of the tangent and the weight-slope tangent, and of the binormal and the weight-slope binormal.

Just got it working.. Thanx..