HDR loading/displaying



supagu
06-04-2005, 05:39 AM
I'm trying to implement HDR textures in my engine.

I'm using the code found in the nvidia sdk to load the HDR image.

So it seems the HDR pixels are of type RGBE (red, green, blue, E = exponent?).

So what I've tried is to display it straight as an RGBA texture; it has some resemblance to the initial image.

But what's the proper way to convert this data and create a float texture from it, using glTexImage2D?

(Also, what's this NVIDIA extension HILO (GL_HILO_NV)?)

stanlylee
06-04-2005, 09:32 AM
You should find help for the glTexImage2D function in MSDN or in the Blue Book.

supagu
06-04-2005, 07:18 PM
glTexImage2D isn't the problem; the problem is converting from RGBE to a float texture (or at least a 16-bit float texture).

Do I just go and create the texture by converting from R, G, B, E to R*E, G*E, B*E?

Overmind
06-05-2005, 05:13 AM
It's more like R*2^(E-128), G*2^(E-128), B*2^(E-128), but I don't know the exact numbers and scaling factors...

knackered
06-05-2005, 11:02 AM
I haven't experimented with HDR yet (hoping to find time soon), but I've assumed thus far that the conversion can happen in the pixel shader: you store the HDR texture as R, G, B, exponent, then in the pixel shader do the calculation to convert this into a floating-point triplet and base your bloom/glare/exposure calculations on that. I'm not entirely sure why you'd ever need floating-point texture formats either; surely this level of precision would never be noticed?
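For what it's worth, a minimal sketch of that per-fragment decode in GLSL (embedded here as a C++ string constant; the sampler name is hypothetical, it assumes the Radiance convention of storing exponent + 128 in the alpha byte, and it ignores the small 255/256 mantissa scale factor):

// Sketch only: decode an RGBE texel (uploaded as a plain RGBA8 texture)
// into linear floating-point RGB.
const char* rgbeDecodeFragmentShader =
    "uniform sampler2D rgbeTex;\n"
    "void main()\n"
    "{\n"
    "    vec4 rgbe = texture2D(rgbeTex, gl_TexCoord[0].xy);\n"
    "    // alpha holds (exponent + 128) / 255 after normalization\n"
    "    float scale = exp2(rgbe.a * 255.0 - 128.0);\n"
    "    gl_FragColor = vec4(rgbe.rgb * scale, 1.0);\n"
    "}\n";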

def
06-05-2005, 11:56 AM
Check out this page:
http://www.graphics.cornell.edu/%7Ebjw/rgbe.html
There is source included for reading the RGBE format. Check out the function rgbe2float() if you want to do the conversion yourself in a fragment program.
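For reference, the conversion rgbe2float() performs is essentially the following (a sketch of that linked source; the extra bias of 8 is there because the mantissas are stored as 0..255 bytes):

#include <cmath>

// Decode one RGBE pixel into three floats, Radiance-style.
// An exponent byte of 0 encodes black.
void rgbe2float(const unsigned char rgbe[4], float rgb[3])
{
    if (rgbe[3] == 0) {
        rgb[0] = rgb[1] = rgb[2] = 0.0f;
        return;
    }
    float scale = std::ldexp(1.0f, rgbe[3] - (128 + 8)); // 2^(E-136)
    rgb[0] = rgbe[0] * scale;
    rgb[1] = rgbe[1] * scale;
    rgb[2] = rgbe[2] * scale;
}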

knackered
06-05-2005, 02:04 PM
I believe there's a format called ".hdr", which I assume stores the pixels as 3 single-precision floats, in which case I imagine it would be fairly straightforward to write a converter from ".hdr" to ".rgbe". Or is this being naive?

stanlylee
06-05-2005, 08:37 PM
How about to use OpenEXR?

NitroGL
06-05-2005, 08:41 PM
Originally posted by knackered:
I believe there's a format called ".hdr", which I assume stores the pixels as 3 single-precision floats, in which case I imagine it would be fairly straightforward to write a converter from ".hdr" to ".rgbe". Or is this being naive?

.hdr files are RLE-encoded RGBE images.

supagu
06-05-2005, 09:02 PM
Yeah, I found more info last night: you simply upload your texture as RGBE (exponent in the alpha channel) and use a shader to decode the data into floats.

knackered
06-06-2005, 12:04 AM
Originally posted by NitroGL:
.hdr files are RLE-encoded RGBE images.

Really? Great to see another proliferation of file formats... why agree on one when 10 can keep people in work for years?

dorbie
06-06-2005, 01:52 AM
http://www.openexr.com/

capedica
06-06-2005, 04:25 AM
I'm interested in OpenEXR too. Any ideas how to integrate it with OpenGL and GLSL? For example, I'd like to:
- load an .EXR image and build a texture;
- use it inside a shader.
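Not a full answer, but a minimal sketch of the usual approach with the OpenEXR C++ library, assuming ARB_texture_float and ARB_half_float_pixel are available (the function name here is made up for illustration): read the file into OpenEXR's 16-bit "half" RGBA with Imf::RgbaInputFile and upload it directly, since "half" matches GL_HALF_FLOAT_ARB. In GLSL the texture then samples as ordinary floats, so no decode shader is needed.

#include <ImfRgbaFile.h>
#include <ImfArray.h>
#include <GL/gl.h>
#include <GL/glext.h>

// Sketch: read an .exr into half-float RGBA and upload it as a
// 16-bit float texture.
GLuint loadExrTexture(const char* filename)
{
    Imf::RgbaInputFile file(filename);
    Imath::Box2i dw = file.dataWindow();
    int width  = dw.max.x - dw.min.x + 1;
    int height = dw.max.y - dw.min.y + 1;

    Imf::Array2D<Imf::Rgba> pixels(height, width);
    file.setFrameBuffer(&pixels[0][0] - dw.min.x - dw.min.y * width, 1, width);
    file.readPixels(dw.min.y, dw.max.y);

    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA16F_ARB, width, height, 0,
                 GL_RGBA, GL_HALF_FLOAT_ARB, &pixels[0][0]);
    return tex;
}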

supagu
06-06-2005, 03:21 PM
If you download the NVIDIA SDK, they have an OpenGL HDR sample with some .hdr loading code.

I too am in the process of integrating some HDR shaders into my engine, but I'm using CgFX.

NitroGL
06-06-2005, 04:47 PM
Originally posted by knackered:
Really? Great to see another proliferation of file formats... why agree on one when 10 can keep people in work for years?

Yeah, though the RLE encoding is kind of odd. It only encodes per scanline, not the max run length. Or something like that anyway.

Personally I use RGBE-encoded TGA images, with the TGA RLE encoding. Works pretty well, at least I think so.

execom_rt
06-06-2005, 04:59 PM
You can use the HDR Radiance format (http://www.graphics.cornell.edu/online/formats/rgbe/)

There is a function to load the file to 32-bit floats (3 x 32-bit): RGBE_ReadPixels_RLE.

You can also upload the texture in that format and OpenGL will convert it to 16-bit float for you (if you specify a 16-bit internal format).

texture format: GL_RGB
internal format: GL_RGB32F_ARB (works on ATI and nVidia)
texture type: GL_FLOAT

This requires GL_ARB_texture_float or GL_ATI_texture_float.

You cannot use texture filtering (important).

It's the easiest way.
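Putting that together, the load-and-upload path might look roughly like this (a sketch using the rgbe.c loader from the page linked above; the function name is made up, and GL_RGB32F_ARB is assumed to come from glext.h):

#include <cstdio>
#include <vector>
#include <GL/gl.h>
#include <GL/glext.h>
#include "rgbe.h" // loader from the Radiance format page

// Sketch: read a Radiance .hdr file into 3 x 32-bit floats and hand
// them to glTexImage2D as a float texture.
GLuint loadHdrTexture(const char* filename)
{
    FILE* fp = std::fopen(filename, "rb");
    if (!fp) return 0;

    int width = 0, height = 0;
    RGBE_ReadHeader(fp, &width, &height, NULL);

    std::vector<float> pixels(width * height * 3);
    RGBE_ReadPixels_RLE(fp, &pixels[0], width, height);
    std::fclose(fp);

    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    // No filtering, as noted above: float textures can't be filtered here.
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB32F_ARB, width, height, 0,
                 GL_RGB, GL_FLOAT, &pixels[0]);
    return tex;
}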