Normalization in Reg. Combiners vs. Cubemaps

Hi

I just got normalization in register combiners working. The paper I got the formula from says that one condition is: “The angle between all pairs of the original per-vertex vectors is no more than 40° (or so).” Further on it says: “For models of reasonable tessellation (and/or reasonable distance to the light and viewer) #2 holds.”

I find this quite strange. If I have good tessellation, I don't need to normalize the vector at all, because the difference is only minor. However, if my data is NOT tessellated, I do need normalization, and that is exactly the case where it doesn't work properly. I tested it and was extremely disappointed with the result: on one big polygon it was only slightly better than no normalization at all.
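For context, the approximation in question is presumably the usual register-combiner trick: one Newton-Raphson style refinement, v' = v * (3 - v·v) / 2, which is only a good substitute for real normalization while the interpolated vector is still close to unit length. A quick standalone C sketch (my own, not from the paper) shows how the error grows with the angle between the per-vertex vectors:

    #include <math.h>
    #include <stdio.h>

    /* One-step Newton-Raphson style normalization, as done in the
       register combiners: v' = v * (3 - dot(v, v)) / 2.
       This only approximates normalize(v) when |v| is already near 1. */
    static void rc_normalize(const float v[3], float out[3])
    {
        float d = v[0]*v[0] + v[1]*v[1] + v[2]*v[2];
        float s = (3.0f - d) * 0.5f;
        out[0] = v[0] * s;
        out[1] = v[1] * s;
        out[2] = v[2] * s;
    }

    int main(void)
    {
        /* Interpolate halfway between two unit vectors separated by
           'angle' degrees (the worst case across a polygon) and check
           how far the approximated length is from 1.0. */
        for (int angle = 10; angle <= 90; angle += 10) {
            float a  = (float)angle * 3.14159265f / 180.0f;
            float v0[3] = { cosf(-a * 0.5f), sinf(-a * 0.5f), 0.0f };
            float v1[3] = { cosf( a * 0.5f), sinf( a * 0.5f), 0.0f };
            float mid[3] = { (v0[0]+v1[0])*0.5f,
                             (v0[1]+v1[1])*0.5f,
                             (v0[2]+v1[2])*0.5f };
            float n[3];
            rc_normalize(mid, n);
            float len = sqrtf(n[0]*n[0] + n[1]*n[1] + n[2]*n[2]);
            printf("angle %2d deg: |approx| = %.4f (exact would be 1.0)\n",
                   angle, len);
        }
        return 0;
    }

At 40° the approximated vector is still within about half a percent of unit length, but by 90° it is more than 10% short, which is why the paper's angle condition matters for big, poorly tessellated polygons.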

In my test app, cubemaps and register combiners were equally fast. So I'd like to know: is it worth tessellating my data a bit and using register combiners, or should I stick with my cubemap?

Jan.

I’d go for cubemaps, and avoid the extra work in the combiners.

The question comes down to this: can you afford a second texture access? If so, use a cubemap. If not, use the RC normalization.
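For comparison, a normalization cubemap simply stores normalize(direction) in its texels, remapped from [-1, 1] to [0, 1], so the single lookup returns an already-normalized vector no matter how short the interpolated input has become. A rough, untested C sketch of filling one face (+X shown; the other five faces only change how s and t map to the direction, per the orientation table in the GL spec; the enum names are the core GL 1.3 ones, the ARB-suffixed equivalents work the same):

    #include <GL/gl.h>
    #include <math.h>
    #include <stdlib.h>

    /* Fill the +X face of a normalization cubemap: each texel stores
       normalize(dir) remapped from [-1, 1] to [0, 255] in RGB. */
    static void fill_pos_x_face(int size)
    {
        unsigned char *data = malloc((size_t)size * size * 3);
        for (int y = 0; y < size; ++y) {
            for (int x = 0; x < size; ++x) {
                /* Map the texel centre to [-1, 1]. */
                float s = 2.0f * (x + 0.5f) / size - 1.0f;
                float t = 2.0f * (y + 0.5f) / size - 1.0f;
                /* Direction for the +X face. */
                float dir[3] = { 1.0f, -t, -s };
                float len = sqrtf(dir[0]*dir[0] + dir[1]*dir[1] + dir[2]*dir[2]);
                unsigned char *p = data + ((size_t)y * size + x) * 3;
                for (int i = 0; i < 3; ++i)
                    p[i] = (unsigned char)(255.0f * (dir[i] / len * 0.5f + 0.5f));
            }
        }
        glTexImage2D(GL_TEXTURE_CUBE_MAP_POSITIVE_X, 0, GL_RGB8,
                     size, size, 0, GL_RGB, GL_UNSIGNED_BYTE, data);
        free(data);
    }

The cost is that extra texture access per fragment; the benefit is that accuracy no longer depends on tessellation at all.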