Lev

05-11-2001, 11:25 PM

Hello!

I'm trying to understand (and implement, but that's for later) how per-pixel lighting works, and I'm having some problems because so far I have found no tutorial which explains it from the basics.

What I've found out so far is:

the lighting equation OpenGL uses (in simplified form, considering only diffuse lighting, no specular or ambient):

color = attenuation * max((n dot l), 0) * lightcolor * materialcolor

where n is the surface normal and l is the light vector.
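To make that equation concrete, here is a minimal software sketch of the diffuse term (the vec3 type and helper names are my own, not an OpenGL API; n and l are assumed to be unit length):

```c
#include <math.h>

/* Minimal sketch of the diffuse lighting term, evaluated in software. */
typedef struct { float x, y, z; } vec3;

static float dot3(vec3 a, vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

/* color = attenuation * max(n . l, 0) * lightcolor * materialcolor,
   applied per color channel. */
static vec3 diffuse(float atten, vec3 n, vec3 l, vec3 lightcol, vec3 matcol)
{
    float ndotl = dot3(n, l);
    if (ndotl < 0.0f) ndotl = 0.0f;   /* the max(.., 0) self-shadowing clamp */
    vec3 c;
    c.x = atten * ndotl * lightcol.x * matcol.x;
    c.y = atten * ndotl * lightcol.y * matcol.y;
    c.z = atten * ndotl * lightcol.z * matcol.z;
    return c;
}
```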

Now, to do this stuff per texel we must have all the inputs for each texel. Attenuation should be no problem, it could be stored in a texture, as could the normal map (there are some problems there), but how the heck do you get a per-texel light vector? You can calculate it per vertex, but I really have no idea how to do it per texel.

Then there is the problem that graphics hardware cannot do this max/min operation, so how can I solve it using only multiplies and adds? As far as I understand, it's important for self-shadowing: if (n dot l) is < 0 then the surface is facing away from the light and therefore should not be lit.

Now, the problem with the normal map is that the normals must be transformed to do lighting in eye space, and currently only the GeForce3 features a texel matrix to do that.

Most papers therefore write that lighting must be done in surface-local space, which means every vertex must have a unique matrix (which I do not understand) that transforms the vertex to (0, 0, 0) with its normal pointing along (0, 0, 1). But then I do not understand how they can say the normal points along (0, 0, 1) if it is fetched from a texture. Anyway, the above is what I have understood so far, and hopefully someone can explain the rest of the concept.

Thanks in advance,

-Lev
