HDR on older cards, how?

I’m trying to implement HDR on older cards (GeForce 6 series and up, at least), but I’m finding they don’t seem to support the GL_RGB16F format for FBOs (software emulation kicks in and everything gets very slow). I tried using GL_RGB16 instead and normalizing the output to 0…1, but somehow that format is also really slow…

What is the recommended way to implement HDR on an older video card?

Regardless of the bits per component you use, you can store color/K in the texture and have the tone-mapping filter multiply by K before applying the tone operator. (K is a constant you choose: 2, 3, 4, etc.)
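
For example, here is a minimal sketch of setting up such a scaled render target, assuming a current GL context with GLEW and GL_EXT_framebuffer_object available (the sizes and the helper name are placeholders of mine):

```c
/* Sketch: an ordinary fixed-point render target used as a scaled HDR buffer.
 * Assumes a current GL context and GL_EXT_framebuffer_object support;
 * WIDTH and HEIGHT are placeholders you pick yourself. */
#include <GL/glew.h>

#define WIDTH  1024
#define HEIGHT 768

GLuint create_scaled_hdr_target(GLuint *tex_out)
{
    GLuint tex, fbo;

    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    /* Plain GL_RGBA8: fast everywhere, but only holds 0..1, so the
     * lighting shaders must write color / K instead of color. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, WIDTH, HEIGHT, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    glGenFramebuffersEXT(1, &fbo);
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
    glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                              GL_TEXTURE_2D, tex, 0);
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);

    /* In the tone-mapping pass, sample this texture and multiply by K
     * to recover the HDR value before applying the tone operator. */
    *tex_out = tex;
    return fbo;
}
```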

This may require modifying the shaders that draw the light contributions.
If you render one pass per light with additive blending enabled, you can keep your old shaders as they are: just change the blend function to (‘constant_alpha’, ‘one’) and set the constant alpha to 1/K, as in the sketch below.
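
A minimal sketch of that blend setup, assuming glBlendColor is available (core since GL 1.4; K is the same scale constant as above):

```c
/* Sketch: additive light accumulation scaled by 1/K via the blend unit,
 * so the per-light shaders can stay unchanged. */
#include <GL/glew.h>

static const float K = 4.0f;  /* placeholder: pick 2, 3, 4, ... */

void begin_light_accumulation(void)
{
    glEnable(GL_BLEND);
    /* dst = src * constant_alpha + dst * 1, i.e. each light's
     * contribution is divided by K as it is written. */
    glBlendFunc(GL_CONSTANT_ALPHA, GL_ONE);
    glBlendColor(0.0f, 0.0f, 0.0f, 1.0f / K);
}
```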

Also have a look at LogLuv and RGBM encodings (particularly useful for lightmaps; see the RGBM sketch below), and at “light indexed deferred rendering”.
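
For RGBM, the idea is to store a shared per-texel multiplier in alpha so an 8-bit texture can hold values above 1. A minimal CPU-side encoder sketch, e.g. for baking lightmaps; the names and the maxRange parameter are placeholders of mine, not from any particular library:

```c
/* Sketch: RGBM encoding -- store color / (M * maxRange) in RGB and the
 * shared multiplier M in alpha. Decode in the shader as
 * rgb * a * maxRange. maxRange (e.g. 6.0) is a placeholder choice. */
#include <math.h>

typedef struct { unsigned char r, g, b, m; } RGBM8;

static unsigned char to_byte(float x)
{
    if (x < 0.0f) x = 0.0f;
    if (x > 1.0f) x = 1.0f;
    return (unsigned char)(x * 255.0f + 0.5f);
}

RGBM8 encode_rgbm(float r, float g, float b, float maxRange)
{
    float maxc = fmaxf(fmaxf(r, g), fmaxf(b, 1e-6f));
    float m = maxc / maxRange;
    if (m > 1.0f) m = 1.0f;          /* colors above maxRange are clipped */
    m = ceilf(m * 255.0f) / 255.0f;  /* round M up to the next 8-bit step */

    RGBM8 out;
    out.r = to_byte(r / (m * maxRange));
    out.g = to_byte(g / (m * maxRange));
    out.b = to_byte(b / (m * maxRange));
    out.m = to_byte(m);
    return out;
}
```

Rounding M up to the next representable 8-bit step keeps the stored RGB within 0…1, so the decode (rgb * a * maxRange) never overshoots the original value.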