View Full Version : Normalization in Reg. Combiners vs. Cubemaps

07-01-2003, 04:05 AM

I just got normalizing in register combiners working. The paper I got the formula from says that one condition is: "The angle between all pairs of the original per-vertex vectors is
no more than 40° (or so)." And further on it says: "For models of reasonable tessellation (and/or reasonable distance to the light and viewer) #2 holds"

I think this is quite strange. If I have a good tessellation, I don't need to normalize the vector at all, because the difference is only minor. However, if I DON'T have my data tessellated, I need normalization; but in exactly that case it doesn't work properly. I tested it, and I was extremely disappointed with the normalization: on one big poly it was only slightly better than without normalization.

In my test app, cubemaps and register combiners were equally fast. So I'd like to know: is it worth tessellating my data a bit and using register combiners, or should I stick with my cubemap?


07-01-2003, 06:00 AM
I'd go for cubemaps, and avoid the extra work in the combiners.

07-01-2003, 09:15 AM
The question comes down to this: can you afford a second texture access? If so, use a cubemap. If not, use the RC normalization.