I was thinking about what would be the fastest way to get the average color of the screen. I would use it to normalize light intensity and prevent clamping at 1. This would simulate how the iris deals with the vastly different intensities of light between, say, outside at noon and a room lit by candles.
I would give my lights a pure color value (100 percent saturation), and then a separate intensity value. The sun would probably be 1000 while a candle would be around 10.
An example of how this would look: if I had a candle inside a building, the engine would scale its intensity so that it would affect the walls. But if I went outside, its intensity would scale to almost 0, resulting in the expected behavior of a flashlight not having any visible effect during the day.
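Rough sketch of what I mean (the struct layout and function name are just placeholders I made up, not anything final):

```cpp
// A light stores a fully saturated color plus an absolute intensity:
// ~1000 for the sun, ~10 for a candle.
struct Light {
    float r, g, b;     // pure color, each component in [0,1]
    float intensity;   // absolute intensity
};

// The color actually handed to the renderer is the pure color scaled by the
// light's intensity relative to the current "eye" adaptation level. With
// adaptation near 1000 (outdoors) a candle contributes ~0.01 and is
// effectively invisible; indoors, adaptation drops and the candle lights the
// walls again.
void GetEffectiveColor(const Light &l, float adaptation, float out[3])
{
    float scale = l.intensity / adaptation;
    out[0] = l.r * scale;
    out[1] = l.g * scale;
    out[2] = l.b * scale;
}
```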
Let's say I had 3 lights in a scene. One light has an intensity of 100, while the other two have an intensity of 50 each. So if they all contributed 100 percent, a surface would reflect an intensity of 200. If I wanted to have this entire dynamic range, then I would map 200 to RGB = 1,1,1.
However, it is not very likely that all of these lights will shine on one surface at full intensity, so mapping the brightest possibility to RGB = 1,1,1 would result in a very dim scene. So I am guessing that I probably want to find the average intensity on the screen and map that to RGB = .5,.5,.5 instead.
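Just to make the mapping concrete, something like this (purely illustrative; anything more than twice the average clamps to white):

```cpp
// Map the measured average intensity to mid-grey. A surface that receives
// exactly the average displays as RGB = 0.5,0.5,0.5; e.g. with an average
// of 40, a surface receiving 100 clamps at 1 and one receiving 20 shows 0.25.
float DisplayValue(float surfaceIntensity, float averageIntensity)
{
    float scale = 0.5f / averageIntensity;
    float v = surfaceIntensity * scale;
    return v > 1.0f ? 1.0f : v;   // clamp at white
}
```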
I think the first frame would need to be rendered mapping the brightest possible color to 1,1,1; then, by mapping the average to .5,.5,.5 on subsequent frames, a feedback loop results that keeps the brightness adjusted properly.
By only doing it every second or half second, you save processing time, but you also simulate the slowness of the human iris when moving from bright to dark areas. For example, if you stared at the sun and then looked back at the ground, things would momentarily be too dark to see. You could make this shift gradual by interpolating between samples.
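Something like this is what I have in mind for the gradual shift (the names and the exponential blend are just my guess at a reasonable shape, not a real iris model):

```cpp
#include <cmath>

// Sample the screen average every half second and ease the adaptation level
// toward it instead of snapping, so bright-to-dark transitions take a moment.
float currentAdaptation = 1.0f;   // first frame: brightest possible maps to 1

void UpdateAdaptation(float measuredAverage, float dt)
{
    const float adaptSpeed = 2.0f;                    // higher = faster "iris"
    float t = 1.0f - std::exp(-adaptSpeed * dt);      // framerate-independent blend
    currentAdaptation += (measuredAverage - currentAdaptation) * t;
}
```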
So, any ideas on how to quickly average the screen?
My first idea was to render a small version of the screen (say 160x120, about 80 KB at 4 bytes per pixel) and then download that to the CPU to be averaged. I would average R, G, and B separately, then get an intensity value by combining the average R, G, and B (weighted properly, of course).
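A rough sketch of that readback idea, assuming a plain glReadPixels of the downscaled render (AverageScreenIntensity is a made-up helper name, and the 0.299/0.587/0.114 weights are just the usual luma weights):

```cpp
#include <GL/gl.h>
#include <vector>

// Read a small region of the framebuffer back to the CPU and average it.
// w and h would be the size of the downscaled render, e.g. 160x120.
float AverageScreenIntensity(int w, int h)
{
    std::vector<unsigned char> buf(w * h * 3);
    glReadPixels(0, 0, w, h, GL_RGB, GL_UNSIGNED_BYTE, &buf[0]);

    double sumR = 0, sumG = 0, sumB = 0;
    int n = w * h;
    for (int i = 0; i < n; ++i) {
        sumR += buf[i * 3 + 0];
        sumG += buf[i * 3 + 1];
        sumB += buf[i * 3 + 2];
    }
    float avgR = float(sumR / n) / 255.0f;
    float avgG = float(sumG / n) / 255.0f;
    float avgB = float(sumB / n) / 255.0f;

    // Weighted combination of the three channel averages.
    return 0.299f * avgR + 0.587f * avgG + 0.114f * avgB;
}
```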
Then I thought that maybe the automatic mipmapping extension could be used. Would it be possible to render the screen and copy it into a texture with automatic mipmapping, without having to download the screen to the CPU? It seems like it would be faster, and I could retrieve the 1x1 mip level as my average RGB.
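A sketch of that, assuming the SGIS_generate_mipmap extension and a 256x256 power-of-two copy (whether drivers actually regenerate the mipmap chain on glCopyTexImage2D, and how fast the tiny readback is, is exactly what I don't know):

```cpp
#include <GL/gl.h>

#ifndef GL_GENERATE_MIPMAP_SGIS
#define GL_GENERATE_MIPMAP_SGIS 0x8191   // from the SGIS_generate_mipmap extension
#endif

// Copy the framebuffer into a texture with automatic mipmap generation
// enabled, then read back only the 1x1 level. For a 256x256 texture the
// 1x1 level is level 8, and its single texel is the box-filtered average.
void AverageViaMipmap(GLuint tex, float avg[3])
{
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_GENERATE_MIPMAP_SGIS, GL_TRUE);

    // Grab the lower-left 256x256 of the framebuffer into level 0.
    glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 0, 0, 256, 256, 0);

    // Read back just the 1x1 mip level (3 bytes).
    unsigned char texel[3];
    glGetTexImage(GL_TEXTURE_2D, 8, GL_RGB, GL_UNSIGNED_BYTE, texel);

    avg[0] = texel[0] / 255.0f;
    avg[1] = texel[1] / 255.0f;
    avg[2] = texel[2] / 255.0f;
}
```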
Something tells me that downloading a 4-byte 1x1 texture will be the same speed as an 80k texture due to overhead.
Edit: I was imagining a bright white room, like the staging room in The Matrix. In my system this room could never be completely white, only 50%, because that would be the average. The problem is that, as stated above, the system assumes the light intensities are purely relative and arbitrary (it doesn't matter if the sun is 1000 or 10000, as long as something 100 times dimmer is 10 or 100).
So, to solve that, there needs to be an upper threshold where the eye can no longer compensate and things start to overbrighten. There needs to be a low threshold as well, so that scenes can actually become black.
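Something like this for the thresholds (the numbers are completely made up):

```cpp
// Clamp the adaptation level so the "eye" cannot compensate forever.
// Above the upper limit the scene over-brightens (the white room really
// reads as white); below the lower limit the scene fades to black.
float ClampAdaptation(float measuredAverage)
{
    const float minAdaptation = 1.0f;      // darker than this: scene goes black
    const float maxAdaptation = 2000.0f;   // brighter than this: scene overbrights
    if (measuredAverage < minAdaptation) return minAdaptation;
    if (measuredAverage > maxAdaptation) return maxAdaptation;
    return measuredAverage;
}
```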