Just read your question.
But it should be impossible to do real per-pixel lighting (I think that's what you want to do) by passing the vertex position and light position to the combiners and then computing the correct lighting values there (what you currently do at the vertex level). The computations in the combiners are done with only 9 bits of precision, which is not enough for something this complex.
But there is a way of doing the lighting in one pass with Dot3, using only one extra texture.
I do this in my engine; it works the following way:
1. Per vertex, compute the light direction vector in local polygon coordinates (for the Dot3 bumpmapping).
2. Use the x and z coords of the transformed light to generate texture coordinates for the attenuation map (just a 2D texture).
3. Compute the z-distance from the light to the vertex, using the y component of the transformed light.
4. Encode the Dot3 light vector into the RGB part of the primary color.
5. Put the distance into the alpha part.
The code for this looks like this:
// precompute some values
float rangeMult = 1.0f / light->Range;
float rangeMultHalf = rangeMult * 0.5f;
D3DXVECTOR3 Diffuse;
float det;
// go through all vertices
for (int iV = 0; iV < Mesh->VertexCount; iV++)
{
    VertexFormatDot3 *vertex = &vfd[iV];
    D3DXVECTOR3 lightDir = lightCoord - vertex->point;
    // transform the light into the local coordinate system of the current vertex (polygon)
    Diffuse.x = -D3DXVec3Dot(&lightDir, &vertex->s);
    Diffuse.z = -D3DXVec3Dot(&lightDir, &vertex->t);
    Diffuse.y =  D3DXVec3Dot(&lightDir, &vertex->sXt);
    // compute texture coordinates for the 2D attenuation map
    uvTexs[iV].x = 0.5f + (Diffuse.x * rangeMultHalf);
    uvTexs[iV].y = 0.5f + (Diffuse.z * rangeMultHalf);
    // compute the remaining distance attenuation
    det = fMax(0.0f, 1.0f - (Diffuse.y * rangeMult));
    D3DXVec3Normalize(&Diffuse, &Diffuse);
    // transform it from [-1,1] to [0,1] so it can easily be expanded in the combiners
    Diffuse = (Diffuse + D3DXVECTOR3(1.0f, 1.0f, 1.0f)) * 0.5f;
    // put the light vector and the distance into the color value
    LightVector2Dword(&Mesh->colorVals[0][iV], &Diffuse, det);
}
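LightVector2Dword just packs the encoded vector and the distance into a vertex color. A minimal sketch of what such a helper might do (the exact signature and the 0xAARRGGBB byte layout are my assumptions, with a stand-in Vec3 instead of D3DXVECTOR3):

```cpp
#include <cstdint>

struct Vec3 { float x, y, z; };  // stand-in for D3DXVECTOR3

// Hypothetical sketch: pack a [0,1]-range vector into the RGB bytes of a
// 0xAARRGGBB color DWORD and the distance attenuation into the alpha byte.
void LightVector2Dword(uint32_t *color, const Vec3 *v, float det)
{
    uint32_t r = (uint32_t)(v->x * 255.0f + 0.5f);
    uint32_t g = (uint32_t)(v->y * 255.0f + 0.5f);
    uint32_t b = (uint32_t)(v->z * 255.0f + 0.5f);
    uint32_t a = (uint32_t)(det  * 255.0f + 0.5f);
    *color = (a << 24) | (r << 16) | (g << 8) | b;
}
```

The combiners then read the RGB part back as the light vector (via GL_EXPAND_NORMAL_NV) and the alpha as the attenuation factor.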
Now you have all the per-vertex information. In the combiners you do your normal bumpmapping and modulate it with the second texture (the attenuation map) and with the alpha value from the primary color. My combiner setup looks like this:
// the constant color holds the color of the light source
glCombinerParameterfvNV(GL_CONSTANT_COLOR0_NV, (float*)&c1);
// combiner 0: spare0 = expand(texture0) . expand(primary color)  (the Dot3 N.L)
glCombinerInputNV(GL_COMBINER0_NV, GL_RGB, GL_VARIABLE_A_NV, GL_TEXTURE0_ARB, GL_EXPAND_NORMAL_NV, GL_RGB);
glCombinerInputNV(GL_COMBINER0_NV, GL_RGB, GL_VARIABLE_B_NV, GL_PRIMARY_COLOR_NV, GL_EXPAND_NORMAL_NV, GL_RGB);
glCombinerInputNV(GL_COMBINER0_NV, GL_RGB, GL_VARIABLE_C_NV, GL_ZERO, GL_UNSIGNED_IDENTITY_NV, GL_RGB);
glCombinerInputNV(GL_COMBINER0_NV, GL_RGB, GL_VARIABLE_D_NV, GL_ZERO, GL_UNSIGNED_IDENTITY_NV, GL_RGB);
glCombinerOutputNV(GL_COMBINER0_NV, GL_RGB, GL_SPARE0_NV, GL_DISCARD_NV, GL_DISCARD_NV, GL_NONE, GL_NONE, GL_TRUE, GL_FALSE, GL_FALSE);
// combiner 1: spare1 = primary alpha (distance) * texture1 (attenuation map),
//             spare0 = spare0 (N.L) * constant color (light color)
glCombinerInputNV(GL_COMBINER1_NV, GL_RGB, GL_VARIABLE_A_NV, GL_PRIMARY_COLOR_NV, GL_UNSIGNED_IDENTITY_NV, GL_ALPHA);
glCombinerInputNV(GL_COMBINER1_NV, GL_RGB, GL_VARIABLE_B_NV, GL_TEXTURE1_ARB, GL_UNSIGNED_IDENTITY_NV, GL_RGB);
glCombinerInputNV(GL_COMBINER1_NV, GL_RGB, GL_VARIABLE_C_NV, GL_SPARE0_NV, GL_UNSIGNED_IDENTITY_NV, GL_RGB);
glCombinerInputNV(GL_COMBINER1_NV, GL_RGB, GL_VARIABLE_D_NV, GL_CONSTANT_COLOR0_NV, GL_UNSIGNED_IDENTITY_NV, GL_RGB);
glCombinerOutputNV(GL_COMBINER1_NV, GL_RGB, GL_SPARE1_NV, GL_SPARE0_NV, GL_DISCARD_NV, GL_NONE, GL_NONE, GL_FALSE, GL_FALSE, GL_FALSE);
// final combiner: fragment = spare1 * spare0
glFinalCombinerInputNV(GL_VARIABLE_A_NV, GL_SPARE1_NV, GL_UNSIGNED_IDENTITY_NV, GL_RGB);
glFinalCombinerInputNV(GL_VARIABLE_B_NV, GL_SPARE0_NV, GL_UNSIGNED_IDENTITY_NV, GL_RGB);
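Written out as plain math, that setup computes per fragment roughly the following. This is a CPU-side sketch with all values as floats in [0,1] (the function and variable names are mine, not part of the combiner API); it's also a handy way to sanity-check a combiner setup:

```cpp
#include <algorithm>
#include <cmath>

struct RGB { float r, g, b; };

// GL_EXPAND_NORMAL_NV maps [0,1] to [-1,1]
static float expandN(float x) { return 2.0f * x - 1.0f; }

// tex0    = bump texel, primary = encoded light vector (+ distance in alpha),
// tex1    = 2D attenuation map texel, lightColor = constant color 0.
RGB CombinerResult(RGB tex0, RGB primary, float primaryAlpha, RGB tex1, RGB lightColor)
{
    // combiner 0: spare0 = expand(tex0) dot expand(primary.rgb)  -> N.L
    float nDotL = expandN(tex0.r) * expandN(primary.r)
                + expandN(tex0.g) * expandN(primary.g)
                + expandN(tex0.b) * expandN(primary.b);
    // reading spare0 with GL_UNSIGNED_IDENTITY_NV in combiner 1 clamps negatives to 0
    nDotL = std::max(0.0f, nDotL);
    // combiner 1: spare0 = N.L * lightColor, spare1 = primaryAlpha * tex1
    RGB spare0 = { nDotL * lightColor.r, nDotL * lightColor.g, nDotL * lightColor.b };
    RGB spare1 = { primaryAlpha * tex1.r, primaryAlpha * tex1.g, primaryAlpha * tex1.b };
    // final combiner: fragment = spare1 * spare0
    return { spare1.r * spare0.r, spare1.g * spare0.g, spare1.b * spare0.b };
}
```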
Of course the bumpmapping is not of the highest quality, since you don't use a normal map, but I haven't had any problems with artifacts yet.
Lars
[This message has been edited by Lars (edited 03-25-2001).]