How do I use Smoothing Groups?

I’m still having trouble generating accurate vertex normals for objects with perpendicular faces.

How do smoothing groups work? Are they a part of OpenGL or an informal programming method?

Please help.

They are not part of OpenGL; smoothing is an informal programming technique.

If the vertices of two adjacent faces share a normal, they will appear smooth shaded; if the vertices do not share normals, they will appear flat shaded, or at least there will be an apparent shading discontinuity at the edge.

Your criteria for sharing surface normals across an edge are entirely up to the application or modelling software. Some software, for example, will only share normals on an edge if the dot product of the adjacent face normals falls above some threshold. Otherwise each vertex gets a unique normal based on its face normal.

It’s very simple to get rudimentary smoothing with threshold-based normal averaging. This should not present you with much of a problem.

I’ve implemented smoothing based on vertex averaging, which works fine for spheres etc., but not for cubes or the caps of cylinders. I just can’t get anything to work effectively.

How do you do it? I suspect that simply taking the dot product of the current vertex normal with an adjacent polygon’s normal will not always produce consistent results, since it depends on the order in which the adjacent polys are visited. And comparing adjacent polys directly could be very messy to program; if the angle between the previous poly’s normal and the current poly’s normal is greater than 90 degrees, should you use the current or the previous normal for the vertex? :confused:

you can’t just blindly plow through your data! you need adjacency info. you need to know which vertices are shared, and, consequently, which normals are to be averaged. you can precalculate all this stuff ahead of time, if you want to.
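As a rough illustration (not from the original post), precomputing that adjacency could be as simple as mapping each shared vertex index to the faces that use it:

#include <map>
#include <vector>

// Illustrative sketch only: map each (shared) vertex index to the faces
// that reference it, so you know whose normals to average later.
struct Tri { unsigned v[3]; };   // three indices into the vertex array

std::map<unsigned, std::vector<unsigned> >
BuildVertexToFaceMap(const std::vector<Tri>& tris)
{
    std::map<unsigned, std::vector<unsigned> > adjacency;
    for (unsigned f = 0; f < tris.size(); ++f)
        for (int c = 0; c < 3; ++c)
            adjacency[tris[f].v[c]].push_back(f);   // vertex -> faces using it
    return adjacency;
}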

hope i helped!

:slight_smile:

Each triangle has 3 pointers to its vertices and each vertex has a list of pointers to the triangles using it. Vertices are shared. So, yes; the necessary data structures are in place. I just need to come up with a robust algorithm to determine which triangle normals should be used to average the vertex normal, and which should not.

That’s the hard part :frowning:

Originally posted by _new_horizon:
[b]Each triangle has 3 pointers to its vertices and each vertex has a list of pointers to the triangles using it. Vertices are shared. So, yes; the necessary data structures are in place. I just need to come up with a robust algorithm to determine which triangle normals should be used to average the vertex normal, and which should not.

That’s the hard part :frowning: [/b]
If you don’t want thresholding:

  1. Search the vertex array and eliminate duplicate vertices (a sketch of this welding step follows the list).

  2. Allocate memory for each vertex to store its normal.

  3. Allocate memory for each triangle to store its face normal.

  4. Compute the normal for each triangle and store it.

  5. Search for the triangles sharing vertex X and average their face normals.
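
Roughly, step 1 might look like this (the names and types are illustrative, not from the posted code; the remaining steps are essentially what the GenerateNormals function further down does):

#include <map>
#include <vector>

// Illustrative sketch of step 1 only: weld duplicate vertex positions so
// that faces meeting at the same point really do share one vertex (and
// therefore one averaged normal). Exact float comparison is used here for
// brevity; a real modelling app would normally weld within a small tolerance.
struct Pos
{
    float x, y, z;
    bool operator<(const Pos& o) const
    {
        if (x != o.x) return x < o.x;
        if (y != o.y) return y < o.y;
        return z < o.z;
    }
};

void WeldVertices(const std::vector<Pos>& in,
                  std::vector<Pos>& out,          // unique positions
                  std::vector<unsigned>& remap)   // old index -> new index
{
    std::map<Pos, unsigned> lookup;
    remap.resize(in.size());
    for (unsigned i = 0; i < in.size(); ++i)
    {
        std::map<Pos, unsigned>::const_iterator it = lookup.find(in[i]);
        if (it == lookup.end())
        {
            const unsigned new_index = (unsigned)out.size();
            lookup[in[i]] = new_index;
            remap[i] = new_index;
            out.push_back(in[i]);
        }
        else
        {
            remap[i] = it->second;
        }
    }
}

Once you have the remap table, rewrite your triangle indices through it so that all faces meeting at a point really index the same vertex, then carry on with steps 2 to 5.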

For thresholding, it is a little more complex.

The big question is: do you need thresholding because you are writing a modelling app, or because you just want to retain the hard edges that your artists have put into the model?

If it’s the latter, then simply convert your data to a vertex array. This gives you a quick way of generating normals accurately, since a vertex array element MUST share a vertex and normal.

  1. memset the normal array to zero.
  2. Run through the triangle index list in groups of 3 indices (a tri)
  3. Calculate the face normal and normalise*
  4. Add the normal to the three vertex normals as indicated by the face indices.
  5. Normalise all normals in the normal array (or enable GL_NORMALIZE / normalise within a vertex program)

[*If the mesh has similarly sized faces, you can skip normalising each face normal. This makes the calculation slightly inaccurate, but it’s not a hugely noticeable difference]

 

#include <math.h>   // for sqrt

// if 1, the face normal gets normalised when it is calculated.
// If adjacent polys have almost the same area, the per-vertex normalisation
// should still sort them out even without this.
#define GN_NORMALISE_PER_FACE 1

// if set to 1, the normals get normalised. use 0 if you enable GL_NORMALIZE to sort
// the normalisation for you.
#define GN_NORMALISE_PER_VERTEX 0


//--------------------------------------------------------------------------
/// \brief	This function generates the normal vectors for a mesh stored
///			within a vertex array. 
/// \param	num_indices	-	num indices within the vertex index array
/// \param	num_elems	-	num elements within the vertex arrays
/// \param	indices		-	the index array
/// \param	verts		-	the vertex array
/// \param	norms		-	the normal array
///
template
<
	typename TIndex,
	typename TData
>
void GenerateNormals(const unsigned int num_indices,
					 const TIndex num_elems,
					 TIndex* indices,
					 TData* verts,
					 TData* norms)
{
	// set all normals to 0,0,0
	{
		TData* n = norms;
		TData* end = n + num_elems*3;

		for( ; n != end; ++n )
		{
			*n = 0;
		}
	}

	// loop through all the faces in the mesh
	TIndex* it = indices;
	TIndex* end = it + num_indices;

	for( ; it != end; it+=3 )
	{

		// get the vertices for the triangle
		const TData* v1 = verts + *it * 3;
		const TData* v2 = verts + *(it+1) * 3;
		const TData* v3 = verts + *(it+2) * 3;

		// get the normals
		TData* n1 = norms + *it * 3;
		TData* n2 = norms + *(it+1) * 3;
		TData* n3 = norms + *(it+2) * 3;

		// calculate vector between v2 and v1
		TData e1[3] = {
			v1[0] - v2[0],
			v1[1] - v2[1],
			v1[2] - v2[2]
		};

		// calculate vector between v2 and v3
		TData e2[3] = {
			v3[0] - v2[0],
			v3[1] - v2[1],
			v3[2] - v2[2]
		};

		// cross product them
		TData e1_cross_e2[3] = {
			e2[1]*e1[2] - e2[2]*e1[1],
			e2[2]*e1[0] - e2[0]*e1[2],
			e2[0]*e1[1] - e2[1]*e1[0]
		};

		// might want to normalise the face normals
#if GN_NORMALISE_PER_FACE
		TData itt = 1.0f/((TData)sqrt( e1_cross_e2[0]*e1_cross_e2[0] + 
									  e1_cross_e2[1]*e1_cross_e2[1] + 
									  e1_cross_e2[2]*e1_cross_e2[2] ));

		e1_cross_e2[0] *= itt;
		e1_cross_e2[1] *= itt;
		e1_cross_e2[2] *= itt;
#endif

		// sum the face normal into all the vertex normals this face uses
		n1[0] += e1_cross_e2[0];
		n1[1] += e1_cross_e2[1];
		n1[2] += e1_cross_e2[2];

		n2[0] += e1_cross_e2[0];
		n2[1] += e1_cross_e2[1];
		n2[2] += e1_cross_e2[2];

		n3[0] += e1_cross_e2[0];
		n3[1] += e1_cross_e2[1];
		n3[2] += e1_cross_e2[2];
	}

	//	we will now have un-normalised versions of the normals. For lighting 
	//	to work correctly, we really need to normalise them. We could either
	//	use code like this, or just use glEnable(GL_NORMALIZE) 
	// 
#if GN_NORMALISE_PER_VERTEX
	{
		TData* n = norms;
		TData* end = n + num_elems*3;

		for( ; n != end; n+=3 )
		{

			TData it = 1.0f/((TData)sqrt( n[0] * n[0] + 
										  n[1] * n[1] + 
										  n[2] * n[2] ));
			n[0] *= it;
			n[1] *= it;
			n[2] *= it;
		}
	}
#endif
}
 

The problem with this method is that it creates shading seams along the UV boundaries (where vertices have been duplicated for different texture coords), so you need a fix to work around it.

As a pre-processing step, you need to build up sets of indices that use the same vertex position and normal, but do not use the same UV coord. This process can be sped up a bit if you create a mapping array that tells you which original vertex was used within the respective vertex array position. The code’s a bit too long to post up here, but I have a little example here that dynamically recalculates the normals and tangents for blend shapes.
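
As a rough illustration of that pre-processing step (the names here are hypothetical, not the poster’s code): once you have a mapping array from each vertex-array element back to the original vertex it was split from, you can sum the normals per original vertex and copy the result back to every duplicate, which removes the seam:

#include <vector>

// Illustrative sketch: fix normal seams caused by UV splits. 'original_of[i]'
// is assumed to map vertex-array element i back to the original (pre-split)
// vertex it came from. Normals are summed per original vertex and written
// back to every duplicate, so all copies end up with the same smooth normal.
void MergeSeamNormals(const std::vector<unsigned>& original_of,
                      float* norms,                 // 3 floats per element
                      unsigned num_original_verts)
{
    std::vector<float> summed(num_original_verts * 3, 0.0f);

    // accumulate the per-element normals into the original vertices
    for (unsigned i = 0; i < original_of.size(); ++i)
        for (int c = 0; c < 3; ++c)
            summed[original_of[i] * 3 + c] += norms[i * 3 + c];

    // write the summed (still un-normalised) normal back to each duplicate
    for (unsigned i = 0; i < original_of.size(); ++i)
        for (int c = 0; c < 3; ++c)
            norms[i * 3 + c] = summed[original_of[i] * 3 + c];
}

Re-normalise afterwards (or enable GL_NORMALIZE) as before.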

If you need the calculation for a modelling app, then you will need a slightly different approach involving an edge-based data structure.

 
#include <string.h>   // for memset

typedef unsigned short ushort;

struct Face {
    ushort VertIndices[3];
    ushort NormIndices[3];
    ushort UvIndices[3];

    // the face normal
    float FaceNormal[3];
};
struct Edge {
    // two pointers to the faces 
    // connected to this edge
    Face* Face0;
    Face* Face1;

    // indices to the two normals used by the
    // first face
    ushort NormalIndexF0_0; // normal at first point
    ushort NormalIndexF0_1; // normal at second point

    // indices to the two normals used by the
    // second face
    ushort NormalIndexF1_0; // normal at first point
    ushort NormalIndexF1_1;// normal at second point

    // the threshold angle for this edge. For 
    // speed reasons it's better to store the 
    // cosine of the angle. The angle should 
    // vary between 0 and 180 
    //
    float CosAngle;

    // a function to average the normals for this
    // edge. Simply pass in the normal array
    void AverageNormals(float *normal_array);
};

void Edge::AverageNormals(float *normal_array) {
    // get pointer to the two face normals
    float* n0 = Face0->FaceNormal;
    float* n1 = Face1->FaceNormal; 

    // get pointers to the vertex normals
    float* f0n0 = normal_array + 3*NormalIndexF0_0;
    float* f0n1 = normal_array + 3*NormalIndexF0_1;   
    float* f1n0 = normal_array + 3*NormalIndexF1_0;    
    float* f1n1 = normal_array + 3*NormalIndexF1_1;

    // first add the face normals to the normals 
    // that will be used to render those faces
    f0n0[0] += n0[0]; 
    f0n0[1] += n0[1]; 
    f0n0[2] += n0[2]; 
    f0n1[0] += n0[0]; 
    f0n1[1] += n0[1]; 
    f0n1[2] += n0[2];

    f1n0[0] += n1[0]; 
    f1n0[1] += n1[1]; 
    f1n0[2] += n1[2]; 
    f1n1[0] += n1[0]; 
    f1n1[1] += n1[1]; 
    f1n1[2] += n1[2];


    // dot product the two face normals. For this
    // to work properly, you must normalise the
    // face normals first.
    //
    float dot = n0[0]*n1[0] + n0[1]*n1[1] + n0[2]*n1[2];

    // compare the dot product result with the
    // one from this edge. If the two face 
    // normals are similar, then the dot product 
    // will return a value approaching 1. If 
    // opposite, then the dot value will be -1
    //
    // We can simply say then that if the dot 
    // value is greater than the CosAngle value
    // then the normals should be averaged. This can
    // be done by adding the face normals to the
    // vertex normals used by the other face
    //
    if( dot > CosAngle ) 
    {
        f0n0[0] += n1[0]; 
        f0n0[1] += n1[1]; 
        f0n0[2] += n1[2]; 
        f0n1[0] += n1[0]; 
        f0n1[1] += n1[1]; 
        f0n1[2] += n1[2];

        f1n0[0] += n0[0]; 
        f1n0[1] += n0[1]; 
        f1n0[2] += n0[2]; 
        f1n1[0] += n0[0]; 
        f1n1[1] += n0[1]; 
        f1n1[2] += n0[2];
    }
}

void AverageNormals(Edge* edges,int num_edges,float *normal_array,int num_norms) {
    
    // first set the vertex normal array to zero
    memset(normal_array,0,sizeof(float)*num_norms*3);

    // loop over all edges and average normals
    for(int i=0;i<num_edges;++i) {
        edges[i].AverageNormals(normal_array);
    }
}
 

hope that helps, rob

Originally posted by _new_horizon:
How do smoothing groups work? Are they a part of OpenGL or an informal programming method?
Smoothing groups are basically just sets of polys whose normals should be averaged together. To use them, build up a list of face indices (or pointers) for all faces in the group, then average the face normals only for polygons within that group.
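
As a minimal sketch of that (illustrative names only, assuming the face normals are already computed and the vertex normal array is zeroed first):

#include <vector>

// Illustrative sketch: average face normals only within one smoothing group.
// 'group' holds the indices of the faces in the group, 'face_normals' holds
// 3 floats per face, 'face_verts' holds the three vertex indices of each
// face, and 'vertex_normals' (3 floats per vertex, zeroed beforehand) is
// the accumulation target.
void AverageGroupNormals(const std::vector<unsigned>& group,
                         const float* face_normals,
                         const unsigned* face_verts,
                         float* vertex_normals)
{
    for (unsigned g = 0; g < group.size(); ++g)
    {
        const unsigned f = group[g];
        const float* fn = face_normals + f * 3;

        for (int k = 0; k < 3; ++k)   // the three verts of this face
        {
            float* vn = vertex_normals + face_verts[f * 3 + k] * 3;
            vn[0] += fn[0];
            vn[1] += fn[1];
            vn[2] += fn[2];           // re-normalise once all groups are done
        }
    }
}

Re-normalise the vertex normals once every group has been processed.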

Thanks, I’ll try that out. This is for a modelling program, but the algorithm must be able to generate accurate normals for surfaces read in from files where the geometry is not known beforehand; only the polygon/vertex data is available.