I think you’re missing the point somewhat. In OpenGL, smooth implies interpolation. Having the shade model set to GL_SMOOTH means nothing more than “compute the fragment color by interpolating the vertex colors.” If the shade model is set to GL_FLAT, the color of the last (provoking) vertex of the primitive is used as the color of every fragment.
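As a toy illustration of what the two shade models mean for one triangle (hypothetical helper names, not actual GL code; the real pipeline does this per fragment in hardware):

```c
typedef struct { float r, g, b; } Color;

/* GL_SMOOTH: barycentric interpolation of the three vertex colors,
   with weights w0 + w1 + w2 == 1 for the fragment's position. */
Color smooth_fragment(Color c0, Color c1, Color c2,
                      float w0, float w1, float w2) {
    Color out = {
        w0 * c0.r + w1 * c1.r + w2 * c2.r,
        w0 * c0.g + w1 * c1.g + w2 * c2.g,
        w0 * c0.b + w1 * c1.b + w2 * c2.b,
    };
    return out;
}

/* GL_FLAT: every fragment gets the last (provoking) vertex's color. */
Color flat_fragment(Color c0, Color c1, Color c2) {
    (void)c0; (void)c1;
    return c2;
}
```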
What you see in the above screenshot isn’t pure vertex coloring but a light source illuminating the objects. The difference is that the final color depends not only on the assigned vertex color but also on the color obtained from the lighting computation.
What you ask for is a flat shaded, illuminated sphere.
If you want per pixel lighting/shading with flat surfaces, you have basically two options.
First option: Use shaders. Fragment shaders give you per-pixel shading. This is the best option and gives the highest quality.
Second option: Subdivide the surface to get more vertices. This will make per-vertex lighting look a bit better. It is perhaps the simplest approach if your application currently uses legacy fixed-function OpenGL, but offers only a limited quality improvement.
Finally, it is also possible to use the legacy fixed-function texture combiner to perform a simple dot3 operation. This can give you a very limited form of per-pixel lighting, but the approach is no longer worth it: shaders are easier and far more powerful.
First option (shaders): Is it possible with shaders to get a flat normal per face while keeping smooth per-vertex color interpolation? If so, this could be the solution.
Second option (subdiv): Changing the geometry is not something I can or want to do.
Third option (use texture): Interesting, I will look into this.
Yes. You can compute normals to feed your shaders any way you want. You can also send multiple normals, say, both flat normals and smoothed normals, if your shader happens to have use for both.
Flat normals are easy to compute: compute each polygon’s normal, duplicate each vertex for every polygon that uses it, and copy the polygon normal to the duplicated vertices.
Also, I would not recommend the third option. And while the texture environment setup strictly speaking requires a dummy texture, it would only compute dot(L, N), where N comes from the vertex color and L is the constant color; the texture never actually needs to be sampled.
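To make the dot3 arithmetic concrete: the GL_DOT3_RGB combiner effectively remaps each color component from [0,1] back to [-1,1] and takes a dot product of the two resulting vectors. A plain-C sketch of that math (illustrative names, not actual texture-environment setup):

```c
/* Model of GL_DOT3_RGB: 4*((a-0.5)*(b-0.5) + ...) summed over RGB,
   which equals dot(2a-1, 2b-1); here a is the vertex color carrying
   the normal N and b is the constant color carrying the light L.
   The fixed-function result is clamped to [0,1]. */
float dot3_combiner(float nr, float ng, float nb,   /* vertex color = N */
                    float lr, float lg, float lb) { /* constant color = L */
    float nx = 2.0f*nr - 1.0f, ny = 2.0f*ng - 1.0f, nz = 2.0f*nb - 1.0f;
    float lx = 2.0f*lr - 1.0f, ly = 2.0f*lg - 1.0f, lz = 2.0f*lb - 1.0f;
    float d = nx*lx + ny*ly + nz*lz;
    if (d < 0.0f) d = 0.0f;
    if (d > 1.0f) d = 1.0f;
    return d;
}
```

For example, the normal (0,0,1) is encoded as the color (0.5, 0.5, 1.0), and lighting it with the same direction yields the maximum term 1.0.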
Using shaders you can, of course - like tksuoran explained. Use the face normal to compute uniform illumination for the whole face, and interpolate the vertex color just like any other value passed from the vertex to the fragment stage. Compute the illumination in the vertex shader and pass the resulting color to the fragment shader with flat (i.e. no) interpolation. Modulate the interpolated vertex color with that light color and you’re done.
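A CPU-side sketch of that shader math (in GLSL the light term would travel through a `flat`-qualified varying; all names here are illustrative):

```c
typedef struct { float x, y, z; } V3;

static float dotv(V3 a, V3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

/* Lambert term from the unit face normal and unit light direction,
   computed once per face and passed on without interpolation. */
float face_lambert(V3 face_normal, V3 light_dir) {
    float d = dotv(face_normal, light_dir);
    return d > 0.0f ? d : 0.0f;
}

/* Fragment stage: smoothly interpolated vertex color, modulated
   by the flat (per-face) light term. */
V3 shade(V3 interpolated_color, float lambert) {
    return (V3){ interpolated_color.x * lambert,
                 interpolated_color.y * lambert,
                 interpolated_color.z * lambert };
}
```

The result is exactly what the question asks for: faceted illumination with smooth color gradients across each face.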