glNormal3fv

Hi,
if a normal is defined for some vertex with glNormal3fv(n), and I want to change the defined normal or some of its components, can I do it (not using the data about n)?

Thanks

You can do whatever you want with your normals in a vertex shader.
What do you mean by ‘not using data about n’?
Do you have a VBO with vertices/normals and you don’t want to re-upload it to VRAM?
Or are you stuck with immediate mode, with your data not in any array (calling glVertex/glNormal with hardcoded values)?

Edit:
Of course, you are using immediate mode (glNormal3fv).
Store your vertex/normal data in an array, and then you can change it at any time.
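For illustration, a minimal sketch of that idea (the triangle data and the edited component below are just placeholders):

// Sketch: keep the vertex/normal data in your own arrays so you can edit
// a normal (or one of its components) before passing it to glNormal3fv.
#include <GL/gl.h>

GLfloat vertices[3][3] = { {0,0,0}, {1,0,0}, {0,1,0} };  // placeholder triangle
GLfloat normals [3][3] = { {0,0,1}, {0,0,1}, {0,0,1} };

void drawTriangle()
{
    // Change whatever you like before drawing, e.g. flip the z component
    // of the first vertex's normal:
    normals[0][2] = -normals[0][2];

    glBegin(GL_TRIANGLES);
    for (int i = 0; i < 3; ++i)
    {
        glNormal3fv(normals[i]);
        glVertex3fv(vertices[i]);
    }
    glEnd();
}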

I use GL_LIGHT_MODEL_TWO_SIDE to draw the mesh.
So OpenGL flips the normals when necessary. I want to know/use the actual (flipped) normal values that OpenGL uses.

It simply negates the normal components based on face winding: nx = -nx, ny = -ny, nz = -nz. Is this really your question?

This also has profound implications for dual lighting and for storing multiple per-color lighting results when you have smooth shading and two-sided lighting in hardware.

Normals are also possibly normalized and transformed to eyespace for lighting of course.

You’re not really giving us much to work with here. Unless your question is really this simple.

My question is what to do when the shading model is smooth shading!
In smooth shading, a normal per vertex has to be calculated (probably by averaging the normals of all the faces that share the vertex). But with two-sided lighting I don’t know when OpenGL flips the normal for a face and when it doesn’t, so the averaging will not give correct results…
So how is it possible to calculate a normal for each vertex in this case?

You don’t need to worry about this. OpenGL flips the normal, not you.

Unless you are a driver guy trying to implement OpenGL with two-sided lighting.

As an application developer, you send in your normals and then enable two-sided lighting. When that is done, OpenGL will flip the normal for you and apply two-sided lighting. It will also do clever things like double lighting and storing two results for interpolation purposes. It will also switch materials for you.
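For reference, the application side is only a few calls; a minimal sketch assuming the fixed-function pipeline, with placeholder material colors:

// Sketch: enable two-sided lighting; OpenGL then flips the normal for
// back-facing polygons and lights them with the GL_BACK material.
#include <GL/gl.h>

void setupTwoSidedLighting()
{
    glEnable(GL_LIGHTING);
    glEnable(GL_LIGHT0);
    glLightModeli(GL_LIGHT_MODEL_TWO_SIDE, GL_TRUE);

    // Placeholder materials: front faces red, back faces blue.
    const GLfloat front[] = { 1.0f, 0.0f, 0.0f, 1.0f };
    const GLfloat back[]  = { 0.0f, 0.0f, 1.0f, 1.0f };
    glMaterialfv(GL_FRONT, GL_DIFFUSE, front);
    glMaterialfv(GL_BACK,  GL_DIFFUSE, back);
}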

Now if this is slow on your card (and the card makers contrive to make it slow because they want pros to pay big bucks for graphics cards), then the poor man’s approach is to use face culling. You draw the object once with back-face culling, then you flip the normals and draw it again with front-face culling.
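A rough sketch of that two-pass trick; the single-triangle "mesh" and the drawMeshWithNormalSign() helper are made up for illustration:

// Sketch of the poor man's two-sided lighting: draw the mesh twice with
// opposite culling, flipping the normals for the second pass.
#include <GL/gl.h>

static const GLfloat verts[3][3] = { {0,0,0}, {1,0,0}, {0,1,0} };  // placeholder mesh
static const GLfloat normal[3]   = { 0, 0, 1 };

static void drawMeshWithNormalSign(GLfloat sign)
{
    glBegin(GL_TRIANGLES);
    glNormal3f(sign * normal[0], sign * normal[1], sign * normal[2]);
    for (int i = 0; i < 3; ++i)
        glVertex3fv(verts[i]);
    glEnd();
}

void drawTwoSided()
{
    glEnable(GL_CULL_FACE);

    glCullFace(GL_BACK);            // pass 1: keep front faces, original normals
    drawMeshWithNormalSign(+1.0f);

    glCullFace(GL_FRONT);           // pass 2: keep back faces, flipped normals
    drawMeshWithNormalSign(-1.0f);

    glDisable(GL_CULL_FACE);
}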

Now if you’re trying to implement two-sided lighting in shaders, you’d have to do double lighting (flipping the normals for the dot products), pass on both lighting results for interpolation, and then select one based on the face-winding flag available to you at the fragment shader stage. Not too tricky.
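A minimal sketch of a simpler variant of this: instead of interpolating two precomputed lighting results, the fragment shader just flips the interpolated normal using gl_FrontFacing and lights once. The vNormal/vLightDir varyings are assumed to come from a matching vertex shader:

// GLSL source as a C++ string constant; compile and link it as usual.
const char* twoSidedFragmentShader = R"GLSL(
#version 120
varying vec3 vNormal;    // assumed: eye-space normal from the vertex shader
varying vec3 vLightDir;  // assumed: eye-space light direction

void main()
{
    vec3 n = normalize(vNormal);
    if (!gl_FrontFacing)             // back-facing fragment: flip the normal
        n = -n;
    float diffuse = max(dot(n, normalize(vLightDir)), 0.0);
    gl_FragColor = vec4(vec3(diffuse), 1.0);
}
)GLSL";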

The model I am using has non-uniform winding!
This is the reason I use two-sided lighting.
The normals are computed for each face (though I actually don’t know if the normal direction is correct).
And in order to use SMOOTH shading I have to calculate a normal at each vertex!
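For what it’s worth, here is a minimal sketch of per-vertex normal averaging over indexed triangles (plain arrays assumed; with non-uniform winding the individual face normals will still point to either side, which is exactly the problem described above):

// Sketch: average the face normals into per-vertex normals for smooth shading.
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b)   { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
static Vec3 cross(Vec3 a, Vec3 b) { return { a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x }; }

// verts: one Vec3 per vertex; tris: three vertex indices per triangle.
std::vector<Vec3> computeVertexNormals(const std::vector<Vec3>& verts,
                                       const std::vector<unsigned>& tris)
{
    std::vector<Vec3> normals(verts.size(), Vec3{0, 0, 0});

    // Accumulate each (unnormalized) face normal onto its three vertices.
    for (size_t i = 0; i + 2 < tris.size(); i += 3)
    {
        const Vec3 a = verts[tris[i]], b = verts[tris[i + 1]], c = verts[tris[i + 2]];
        const Vec3 fn = cross(sub(b, a), sub(c, a));   // direction depends on winding!
        for (int k = 0; k < 3; ++k)
        {
            Vec3& n = normals[tris[i + k]];
            n.x += fn.x; n.y += fn.y; n.z += fn.z;
        }
    }

    // Normalize the accumulated sums.
    for (Vec3& n : normals)
    {
        const float len = std::sqrt(n.x*n.x + n.y*n.y + n.z*n.z);
        if (len > 0.0f) { n.x /= len; n.y /= len; n.z /= len; }
    }
    return normals;
}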