Calculating Accurate Vertex Normals

The obvious way to compute vertex normals is to average the adjacent polygon face normals. But this doesn't always cut it. When my program loads a cube model (or anything else with perpendicular faces), the shading is wrong at the edges: sharp edges take on an incorrect beveled look.

One idea is to check whether the normals of two adjacent faces differ by more than a certain amount and, if they do, skip the averaging for the shared vertices and simply assign them the face normal. But I have no idea how to implement this.

My triangle class holds three pointers, one to each of its vertices, and each vertex maintains a list of array indices (not pointers) to its adjacent polygons. Vertices are shared between triangles.

After the triangle normals are computed I have this code:

for each vertex {
    reset normal

    for each adjacent triangle referenced {
        normal += adjacent triangle normal
    }

    normalize normal
}
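In concrete terms the loop looks roughly like this (Vec3, Triangle, and the adjacency container here are simplified stand-ins for my actual classes):

#include <cmath>
#include <vector>

struct Vec3 {
    float x, y, z;
    Vec3 operator+(const Vec3& o) const { return {x + o.x, y + o.y, z + o.z}; }
    float dot(const Vec3& o) const { return x * o.x + y * o.y + z * o.z; }
    Vec3 normalized() const {
        float len = std::sqrt(dot(*this));
        return len > 0.0f ? Vec3{x / len, y / len, z / len} : *this;
    }
};

struct Triangle {
    int v[3];      // indices of the three shared vertices
    Vec3 normal;   // face normal, computed beforehand
};

// vertexNormals[i] becomes the normalized sum of the face normals of
// every triangle adjacent to vertex i.
void averageNormals(const std::vector<Triangle>& tris,
                    const std::vector<std::vector<int>>& adjacent, // per vertex
                    std::vector<Vec3>& vertexNormals)
{
    for (size_t i = 0; i < vertexNormals.size(); ++i) {
        Vec3 sum{0, 0, 0};                // reset normal
        for (int t : adjacent[i])         // each adjacent triangle referenced
            sum = sum + tris[t].normal;
        vertexNormals[i] = sum.normalized();
    }
}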

Any ideas as to how I could adapt this code to calculate the normals accurately? Speed isn't too important, as this happens in a pre-processing stage before the scene is raytraced.

OpenGL doesn't suffer from this problem even with immediate mode rendering, so how on earth does it calculate vertex normals when the only data it has is the single polygon being passed between a glBegin() and glEnd() call?

Any help appreciated.

You pretty much have it right; there's nothing incorrect about your approach.

If you want to test before averaging, you could take the dot product of the normals to compare the angle between adjacent facets on your model, and only average faces that lie within some threshold angle of each other.
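A rough sketch of that, reusing the hypothetical Vec3/Triangle types from your post: pick a crease angle, and for each corner of each triangle average in only the adjacent face normals that fall within that angle of the triangle's own normal. Since the dot product of two unit normals is the cosine of the angle between them, the test is a single comparison. Note that this produces one normal per triangle corner rather than per vertex, which is exactly what lets a cube keep its hard edges; the function name and crease angle parameter are just illustrative.

#include <array>
#include <cmath>
#include <vector>

// Per-corner normals: a vertex on a sharp edge ends up with different
// normals in different triangles, so the edge stays crisp.
std::vector<std::array<Vec3, 3>> creaseAwareNormals(
        const std::vector<Triangle>& tris,
        const std::vector<std::vector<int>>& adjacent, // per-vertex tri indices
        float creaseAngleRadians)
{
    const float cosThreshold = std::cos(creaseAngleRadians);
    std::vector<std::array<Vec3, 3>> corners(tris.size());

    for (size_t t = 0; t < tris.size(); ++t) {
        for (int c = 0; c < 3; ++c) {
            Vec3 sum{0, 0, 0};
            // dot(a, b) == cos(angle) for unit normals, so any face more
            // than the crease angle away from this face is skipped. A face
            // always passes its own test, so an isolated sharp face simply
            // keeps its flat face normal.
            for (int adj : adjacent[tris[t].v[c]])
                if (tris[t].normal.dot(tris[adj].normal) >= cosThreshold)
                    sum = sum + tris[adj].normal;
            corners[t][c] = sum.normalized();
        }
    }
    return corners;
}

With a crease angle of, say, 45 degrees, a cube's faces (90 degrees apart) never smooth into each other, while gently curved surfaces still average exactly as before.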

OpenGL does not compute normals; the application sends them between the glBegin and glEnd calls. Applications either calculate normals in a way similar to what you describe, or they load the normals along with the model, a modelling tool having been used to calculate them and store them in the file.
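For example, in immediate mode the normal is just more per-vertex data you hand over yourself (the coordinate values here are illustrative):

// One flat-shaded triangle facing +Z: the application sets the current
// normal with glNormal3f, and each subsequent glVertex3f picks it up.
glBegin(GL_TRIANGLES);
    glNormal3f(0.0f, 0.0f, 1.0f);
    glVertex3f(-1.0f, -1.0f, 0.0f);
    glVertex3f( 1.0f, -1.0f, 0.0f);
    glVertex3f( 0.0f,  1.0f, 0.0f);
glEnd();

For smooth shading you would issue a different glNormal3f before each vertex, using the averaged normals you computed yourself.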

I’m moving this thread to the beginners forum.