View Full Version : HDR in older cards, how?
06-01-2010, 01:51 AM
I'm trying to implement HDR on older cards (GeForce 6 series and up, at least), but they don't seem to support the GL_RGB16F format for FBOs (software emulation kicks in and everything becomes very slow). I tried using GL_RGB16 instead and normalizing the output to 0..1, but somehow that format is also really slow.
What is the recommended way to implement HDR on an older video card?
06-01-2010, 07:55 AM
Regardless of the bits per component you use, you can simply store color/K in the texture, then have the tone-mapping filter multiply the fetched value by K before applying the tone curve. (K is a constant of your choosing: 2, 3, 4, etc.)
This may require modifying the shaders that compute each light's contribution.
If you render one pass per light with additive blending enabled, you can keep your existing shaders unchanged: just change the blend function to (constant_alpha, one) and set the constant alpha to 1/K.
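The scale-by-K trick above can be sketched numerically, independent of GL. This is a minimal illustration, not real rendering code: K = 4 is an arbitrary example value, and the 1/K blend factor stands in for setting GL_CONSTANT_ALPHA via glBlendColor with a (GL_CONSTANT_ALPHA, GL_ONE) blend function.

```python
# Each light's contribution is scaled by 1/K as it is blended into a
# normalized (0..1) render target; the tone-mapping pass multiplies the
# fetched value by K to recover the HDR range.

K = 4.0  # assumed example constant; pick it to cover your scene's brightness

def store(light_contribution):
    """What blending with (constant_alpha=1/K, one) effectively accumulates."""
    return light_contribution / K

def fetch_hdr(stored):
    """Tone-mapping pass: undo the scale before applying the tone curve."""
    return stored * K

# Three lights whose sum (3.5) would clamp in a plain 0..1 target.
lights = [1.5, 0.8, 1.2]
accumulated = sum(store(c) for c in lights)  # 0.875 -- stays below 1.0
recovered = fetch_hdr(accumulated)           # 3.5 -- the true HDR sum
```

The cost is precision: with an 8-bit target you effectively get 8 bits spread over the range 0..K, so K should be kept as small as the scene allows.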
06-02-2010, 03:24 AM
Also have a look at logLUV (http://realtimecollisiondetection.net/blog/?p=15) and RGBM (http://graphicrants.blogspot.com/2009/04/rgbm-color-encoding.html) (particularly useful for lightmaps). And "light indexed deferred rendering (http://lightindexed-deferredrender.googlecode.com/files/LightIndexedDeferredLighting1.1.pdf)".
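For reference, RGBM packs an HDR color into a normal 8-bit RGBA value by storing a shared multiplier M in alpha. The sketch below follows the convention from the linked post; the range constant 6.0 and the 8-bit quantization of M are assumptions for illustration, not a fixed part of any format.

```python
import math

RGBM_RANGE = 6.0  # assumed maximum multiplier, per the common convention

def rgbm_encode(r, g, b):
    """Encode linear HDR RGB (0..RGBM_RANGE) into RGBM, all channels in 0..1."""
    m = max(r, g, b, 1e-6) / RGBM_RANGE
    m = min(m, 1.0)
    # Round M up to an 8-bit step, as storing it in an alpha byte would,
    # so the RGB channels never exceed 1.0.
    m = math.ceil(m * 255.0) / 255.0
    scale = m * RGBM_RANGE
    return (r / scale, g / scale, b / scale, m)

def rgbm_decode(r, g, b, a):
    """Recover linear HDR RGB: multiply by the shared alpha and the range."""
    return (r * a * RGBM_RANGE, g * a * RGBM_RANGE, b * a * RGBM_RANGE)
```

Because decoding is a single multiply, it is cheap even in the limited shaders of GeForce 6-class hardware; the trade-off is that RGBM cannot be blended additively the way the scale-by-K scheme can.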
- For available texture formats, see