normal mapping with ARB_vertex_program

Assuming correctly generated tangents and binormals (why is it so hard to find sample code to generate them for a mesh?), this code should work, right? There shouldn't be anything weird happening when normal mapping a sphere, whatever its orientation?
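
For reference, here's a rough sketch of the kind of tangent/binormal generation I'm assuming: per-triangle accumulation from the UV gradients, then Gram-Schmidt against the vertex normal. The Vertex layout here is made up purely for illustration, so adapt it to however your mesh is actually stored:

#include <vector>
#include <cmath>

// Illustrative layout only -- not anyone's real mesh class.
struct Vec3 { float x, y, z; };

static Vec3  operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3  operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3  operator*(Vec3 a, float s){ return {a.x * s, a.y * s, a.z * s}; }
static float dot(Vec3 a, Vec3 b)       { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3  normalize(Vec3 a)
{
    float len = std::sqrt(dot(a, a));
    // Guard against unused/degenerate vertices producing NaNs.
    return len > 1e-8f ? a * (1.0f / len) : Vec3{0, 0, 1};
}

struct Vertex {
    Vec3  pos;
    Vec3  normal;
    float u, v;
    Vec3  tangent;    // fed to vertex.attrib[6]
    Vec3  binormal;   // fed to vertex.attrib[7]
};

// Per-triangle tangent/binormal accumulation, then orthogonalized against
// the vertex normal.  'indices' holds three entries per triangle.
void ComputeTangentSpace(std::vector<Vertex>& verts,
                         const std::vector<unsigned>& indices)
{
    std::vector<Vec3> tAcc(verts.size(), Vec3{0, 0, 0});
    std::vector<Vec3> bAcc(verts.size(), Vec3{0, 0, 0});

    for (size_t i = 0; i + 2 < indices.size(); i += 3) {
        const Vertex& v0 = verts[indices[i]];
        const Vertex& v1 = verts[indices[i + 1]];
        const Vertex& v2 = verts[indices[i + 2]];

        // Position and texture-space edge vectors.
        Vec3  e1  = v1.pos - v0.pos,  e2  = v2.pos - v0.pos;
        float du1 = v1.u - v0.u,      dv1 = v1.v - v0.v;
        float du2 = v2.u - v0.u,      dv2 = v2.v - v0.v;

        float det = du1 * dv2 - du2 * dv1;
        if (std::fabs(det) < 1e-8f) continue;     // degenerate UV mapping
        float r = 1.0f / det;

        Vec3 t = (e1 * dv2 - e2 * dv1) * r;       // direction of increasing u
        Vec3 b = (e2 * du1 - e1 * du2) * r;       // direction of increasing v

        for (int k = 0; k < 3; ++k) {
            tAcc[indices[i + k]] = tAcc[indices[i + k]] + t;
            bAcc[indices[i + k]] = bAcc[indices[i + k]] + b;
        }
    }

    for (size_t i = 0; i < verts.size(); ++i) {
        Vec3 n = verts[i].normal;
        Vec3 t = tAcc[i];
        Vec3 b = bAcc[i];
        // Make both vectors orthogonal to the normal; keeping the accumulated
        // binormal (rather than cross(n, t)) preserves mirrored-UV handedness.
        verts[i].tangent  = normalize(t - n * dot(n, t));
        verts[i].binormal = normalize(b - n * dot(n, b));
    }
}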

!!ARBvp1.0
OPTION ARB_position_invariant;
ATTRIB iNormal = vertex.normal;
ATTRIB iTangent = vertex.attrib[6];
ATTRIB iBinormal = vertex.attrib[7];
PARAM mvinv[4] = { state.matrix.modelview.inverse };
PARAM lightDir = state.light[0].position;
PARAM half = { 0.5, 0.5, 0.5, 0.5 };
TEMP color, osLight, tsLight;

OUTPUT oColor0 = result.color;

# Transform the light from eye to object space

DP3 osLight.x, mvinv[0], lightDir;
DP3 osLight.y, mvinv[1], lightDir;
DP3 osLight.z, mvinv[2], lightDir;

# Transform the light from object to tangent space

DP3 tsLight.x, osLight, iTangent;
DP3 tsLight.y, osLight, iBinormal;
DP3 tsLight.z, osLight, iNormal;

# Normalize the tangent-space light vector

DP3 color.w, tsLight, tsLight;
RSQ color.w, color.w;
MUL color.xyz, tsLight, color.w;

# Change range from [-1,1] to [0,1]

MAD oColor0, color, half, half;

END
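
In case it matters, this is roughly how the program gets loaded and how the tangent/binormal arrays end up in the generic attribs it reads (6 and 7). GLEW is just a stand-in for whatever gives you the ARB_vertex_program entry points, vpSource is the program text above, and Vertex is the struct from the sketch earlier:

#include <GL/glew.h>   // or any loader exposing the ARB_vertex_program entry points
#include <cstdio>
#include <cstring>

GLuint LoadVertexProgram(const char* vpSource)
{
    GLuint prog = 0;
    glGenProgramsARB(1, &prog);
    glBindProgramARB(GL_VERTEX_PROGRAM_ARB, prog);
    glProgramStringARB(GL_VERTEX_PROGRAM_ARB, GL_PROGRAM_FORMAT_ASCII_ARB,
                       (GLsizei)strlen(vpSource), vpSource);

    // A parse error leaves its character offset here; -1 means it assembled.
    GLint errPos = -1;
    glGetIntegerv(GL_PROGRAM_ERROR_POSITION_ARB, &errPos);
    if (errPos != -1)
        fprintf(stderr, "vertex program error at %d: %s\n", errPos,
                (const char*)glGetString(GL_PROGRAM_ERROR_STRING_ARB));
    return prog;
}

void BindTangentFrame(const Vertex* verts)
{
    glEnable(GL_VERTEX_PROGRAM_ARB);

    // The program reads tangent/binormal from generic attribs 6 and 7,
    // so the arrays have to be bound to those same slots.
    glEnableVertexAttribArrayARB(6);
    glVertexAttribPointerARB(6, 3, GL_FLOAT, GL_FALSE, sizeof(Vertex),
                             &verts[0].tangent);
    glEnableVertexAttribArrayARB(7);
    glVertexAttribPointerARB(7, 3, GL_FLOAT, GL_FALSE, sizeof(Vertex),
                             &verts[0].binormal);
}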

So if I had a fragment program (or the equivalent via register combiners, etc.), I should be able to feed it a constant (0,0,1) normal and turn my normal mapping into plain per-pixel lighting, right?
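
The plan, roughly, would be to dot the normal-map texel against the interpolated primary colour using the fixed-function DOT3 combine mode (ARB_texture_env_dot3 / GL 1.3) rather than a full fragment program. Both inputs are range-compressed the same way, so a flat (128,128,255) texel should reduce to plain per-pixel diffuse. A sketch of the setup, where normalMapTex is just a placeholder for the normal-map texture object:

#include <GL/glew.h>   // or any header exposing GL 1.3 / ARB_texture_env_dot3

// The vertex program above has already packed the tangent-space light
// vector into result.color, so the primary colour arrives range-compressed,
// which is exactly what DOT3 expects.
void SetupDot3Lighting(GLuint normalMapTex)
{
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, normalMapTex);
    glEnable(GL_TEXTURE_2D);

    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE);

    // RGB result = 4 * dot(texel - 0.5, primary_color - 0.5)
    glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB,  GL_DOT3_RGB);
    glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE0_RGB,  GL_TEXTURE);
    glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND0_RGB, GL_SRC_COLOR);
    glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE1_RGB,  GL_PRIMARY_COLOR);
    glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND1_RGB, GL_SRC_COLOR);
}

With a flat normal map bound, the output should match straight per-vertex diffuse, so any difference at that point would seem to implicate the tangent/binormal data rather than the maths.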