Normals and texture coords

How do you light stuff properly? I know you have to use normals, but the documentation I have on this is brief and cryptic. There's also generating texture co-ordinates - is this the same kind of thing? Help me please, because my objects look poor!
Simon Hall

Hi,

to use correct lighting you must specify a normal at each vertex. You can get really nice lighting if your object is centered on the origin and you use the (normalized) vertex positions themselves as the normals - this works for a sphere, for example.

Texture coordinates (glTexCoord) - what about them?

Edo

So do you mean that for every vertex I define, I define a normal using the same co-ords?

Texture co-ordinates - how do you calculate them? My objects are coloured using just the first pixel of the bitmap. Using glu objects I'm fine though…
Simon Hall

Hi, I'd just like to know what sort of object you want to light… Is it a simple OpenGL object like a gluSphere, or a complex object like a car from 3DSMAX?

If it's a simple object, the normals should be created correctly. If it's a complex object, you have to verify that the normals are oriented from the faces toward the exterior of the object and not the interior.

Hope that helps.

JC

Hi there,

I hope I can help… I'm pretty much an OpenGL newbie but have dabbled a little in 3D graphics programming generally, so maybe I can help solve this.

Normals are vectors that are ‘standing straight’ on a surface:

   ^  normal
   |
   |

------------ surface

Normals have by definition the unit length 1 (call glEnable(GL_NORMALIZE) or glEnable(GL_RESCALE_NORMAL) in your init function to preserve the unit length of the normals when scaling objects!)

Your lighting problems occur because the normal vectors are used for the lighting:

Let L be a vector pointing from the point P on the polygon being rasterized toward the light source, and let N be the normal vector at this specific point. The brightness of the light at this point is computed from the angle a between N and L - the smaller the angle, the more light is reflected (for diffuse lighting).

light source O
              \      ^
               \     | N
                \  a |
              L  \   |
                  \  |
             ------P----------- surface
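The "smaller angle, more light" rule is usually computed with a dot product rather than the angle itself: for unit vectors, N . L is the cosine of the angle a. A minimal sketch in plain C (function names are just for illustration):

```c
static float dot3(const float a[3], const float b[3])
{
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2];
}

/* Diffuse brightness at a point: the cosine of the angle between the
   unit normal N and the unit vector L from the point toward the light,
   clamped to 0 so surfaces facing away from the light stay dark. */
float diffuse(const float n[3], const float l[3])
{
    float d = dot3(n, l);
    return d > 0.0f ? d : 0.0f;
}
```

When N and L line up (light directly above the surface) the cosine is 1 and the point is fully lit; at 90 degrees or more it receives no diffuse light at all.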

For specular lighting, the angle s between the reflected light vector R and the vector E from P toward the eye point is used to determine the brightness of the specular highlight:

                              Eye
                  ^          ^
                  | N   ^   /
light source O    |    /R  /
              \   |   /   /
               \  |  / s /
              L \ | /   / E
                 \|/   /
            ------P----------- surface

I hope this crappy ASCII art helps a little :wink:

So, you have to specify normals for the surfaces. In OpenGL a normal has to be specified for every vertex, for example like this:

glBegin(GL_TRIANGLES);
glNormal3f(0.0, 0.0, 1.0);
glVertex3f(0.0, 0.0, 0.0);
glNormal3f(0.0, 0.0, 1.0);
glVertex3f(1.0, 0.0, 0.0);
glNormal3f(0.0, 0.0, 1.0);
glVertex3f(1.0, 1.0, 0.0);
glEnd();

This would specify the correct normals for a surface lying in the X/Y plane (they're pointing 'out of the screen', along the positive Z axis).

Since the normals are used for the lighting, you can achieve smooth edges on objects by bending the shared vertex normals toward the neighboring surfaces - in practice, by averaging the face normals that meet at a vertex. Imagine you're looking at 4 sides of a cube from the top:

sharp edges:           smooth edges:

  ^       ^              ^       ^
  |       |               \     /
<-xxxxxxxxx->              xxxxxxx
  x       x               x       x
  x       x               x       x
  x       x               x       x
<-xxxxxxxxx->              xxxxxxx
  |       |               /     \
  v       v              v       v

In the right picture, each arrow shows the direction of the single averaged normal used for a vertex that is shared by 2 quads.

There is a way to generate normals automatically for evaluated surfaces using OpenGL evaluators (glEnable(GL_AUTO_NORMAL)), but don't ask me how - I haven't figured that out yet.
Important: when you generate your normals, they have to have unit length (1.0)! If you're not sure about that, enable GL_NORMALIZE. Also, scaling your objects will deform the normals, because normals are transformed with the inverse transpose of the modelview matrix when you use glScalef()/glScaled(); enabling GL_RESCALE_NORMAL gets rid of that problem, too.

I hope I could contribute to getting rid of your problem

Hi again,

OK, I just realized you have to use a non-proportional font to read my first message.
And I forgot about your texture coordinate problem, so here goes:

Texture coordinates define which point on the texture represents which vertex on a polygon. For example, for mapping a 256x256 texture on a quad, each of the vertices of the quad could represent one of the corners of the texture, like this (monospaced font!):

your quad on screen:       the texture:

  A----------B             A----------B
  |          |             |          |
  |          |             |  o    o  |
  |          |             |    v     |
  |          |             |   \_/    |
  D----------C             D----------C

The texture coordinates have to be specified with the vertices, for example like this:

glBegin(GL_QUADS);
glTexCoord2f(0.0, 0.0);
glVertex3f(-1.0, -1.0, 0.0);
glTexCoord2f(1.0, 0.0);
glVertex3f(1.0, -1.0, 0.0);
glTexCoord2f(1.0, 1.0);
glVertex3f(1.0, 1.0, 0.0);
glTexCoord2f(0.0, 1.0);
glVertex3f(-1.0, 1.0, 0.0);
glEnd();

glTexCoord2f() has 2 parameters, the s and t texture coordinates. The first one, s, is the horizontal coordinate in your texture image, the second one, t, the vertical coordinate. s and t are not in pixels: they run from 0.0 to 1.0 across the whole image, whatever its size in pixels, so (1.0, 1.0) is always the corner opposite (0.0, 0.0).
You can achieve nice effects when you move the texture coordinates while your objects are animated. For example, by moving the texture coordinates toward the center of the image, the texture will zoom in on the mapped polygon.
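That zoom effect boils down to pulling each (s, t) pair toward the image center (0.5, 0.5) each frame - a little sketch, assuming the 0.0-1.0 coordinate range (function name is made up):

```c
/* Zoom the texture on a polygon by pulling an (s, t) coordinate toward
   the image center (0.5, 0.5): zoom = 1 leaves it unchanged, zoom = 2
   makes the polygon show only the middle quarter of the image. */
void zoom_texcoord(float st[2], float zoom)
{
    st[0] = 0.5f + (st[0] - 0.5f) / zoom;
    st[1] = 0.5f + (st[1] - 0.5f) / zoom;
}
```

Animating zoom from 1.0 upward while redrawing the quad gives a continuous zoom-in on the mapped texture.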

Without defining texture coordinates for each vertex, the current texture coordinate - (0, 0) by default - is used for every vertex, which smears the pixel at (0, 0) of your texture image all over the polygon, because all vertices of the polygon map to the same point in the texture image. That's where your problem is coming from!

There are other forms of the glTexCoord command that allow you to specify up to 2 more texture coordinates, r and q. From what I understand, r is used for 3D textures and q mainly for projective textures, which I also haven't quite figured out yet.