HDR loading/displaying

I’m trying to implement HDR textures in my engine.

I’m using the code found in the NVIDIA SDK to load the HDR image.

So it seems the HDR pixels are of type RGBE (red, green, blue, E = exponent?).

So what I’ve tried is to display it straight as an RGBA texture; it has some resemblance to the initial image.

But what’s the proper way to convert this data and create a float texture from it, using glTexImage2D?

(Also, what’s this NVIDIA extension HILO (GL_HILO_NV)?)

You should find help on the glTexImage2D function in MSDN or in the Blue Book.

glTexImage2D isn’t the problem; the problem is converting from RGBE to a float texture (or at least a 16-bit float texture).

So do I just go and create the texture, converting from R,G,B,E to R*E, G*E, B*E?

It’s more like R*2^(E-128), G*2^(E-128), B*2^(E-128), but I don’t know the exact numbers and scaling factors…

I haven’t experimented with HDR yet, hoping to find time soon, but I’ve assumed thus far that the conversion can happen in the pixel shader: you store the HDR texture as r,g,b,exponent, and then in the pixel shader do the calculation to convert this into a floating-point triplet and do your bloom/glare/exposure calculations based on that. I’m not entirely sure why you’d ever need floating-point texture formats either; surely this level of precision would never be noticed?
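For what it’s worth, here is a minimal sketch of that in-shader decode in GLSL, assuming the RGBE data was uploaded as an ordinary GL_RGBA8 texture and that the scale really is 2^(E-128) as suggested above (the sampler name and the tone-mapping step are just placeholders):

    uniform sampler2D rgbeTex;   // RGBE image uploaded as a plain RGBA8 texture

    void main()
    {
        vec4 rgbe = texture2D(rgbeTex, gl_TexCoord[0].st);
        // alpha holds the shared exponent byte: undo the /255 and the bias of 128
        float scale = exp2(rgbe.a * 255.0 - 128.0);
        vec3 hdr = rgbe.rgb * scale;   // ignores the small 255-vs-256 normalisation difference
        // exposure / bloom / tone mapping would go here before writing out
        gl_FragColor = vec4(hdr, 1.0);
    }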

Check out this page:
http://www.graphics.cornell.edu/%7Ebjw/rgbe.html
There is source included for reading RGBE format. Check out the function rgbe2float() to do the conversion yourself in a fragment program.
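For reference, the CPU-side decode in that code boils down to something like this (a from-memory sketch of rgbe2float() from the rgbe.c on that page; check the link for the exact source):

    #include <math.h>

    /* Convert one RGBE pixel (4 bytes) to three floats.
       The fourth byte is a shared exponent; 0 means the pixel is black. */
    void rgbe2float(float *red, float *green, float *blue, unsigned char rgbe[4])
    {
        if (rgbe[3]) {
            /* 128 is the exponent bias; the extra 8 undoes the 0..255 mantissa range */
            float f = (float)ldexp(1.0, rgbe[3] - (int)(128 + 8));
            *red   = rgbe[0] * f;
            *green = rgbe[1] * f;
            *blue  = rgbe[2] * f;
        } else {
            *red = *green = *blue = 0.0f;
        }
    }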

I believe there’s a format called “.hdr”, which I assume stores the pixels as 3 single precision floats, in which case I imagine it would be fairly straightforward to write a converter from “.hdr” to “.rgbe”, or is this being naive?

How about using OpenEXR?

Originally posted by knackered:
I believe there’s a format called “.hdr”, which I assume stores the pixels as 3 single precision floats, in which case I imagine it would be fairly straightforward to write a converter from “.hdr” to “.rgbe”, or is this being naive?
.hdr files are RLE-encoded RGBE images.

Yeah, I found more info last night: you simply upload your texture as RGBE (exponent in the alpha channel) and use a shader to decode the data into floats.
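If it helps, the upload side of that is just an ordinary 8-bit RGBA texture; a rough sketch (assuming `data` already holds the raw RGBE bytes and `width`/`height` are known):

    glBindTexture(GL_TEXTURE_2D, tex);
    /* the exponent travels in alpha, so filtering the RGBE data directly gives
       wrong results; keep it at GL_NEAREST and decode per-texel in the shader */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, data);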

Originally posted by NitroGL:
.hdr files are RLE-encoded RGBE images.
Really? Great to see another proliferation of file formats… why agree on one when 10 can keep people in work for years.

http://www.openexr.com/

I’m interested in OpenEXR too. Any ideas how to integrate it with OpenGL and GLSL? For example I’d like to:

  • load an .EXR image and build a texture;
  • use it inside a shader
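A rough sketch of those two steps using the OpenEXR C++ RGBA interface (Imf::RgbaInputFile), assuming the driver exposes GL_ARB_texture_float and GL_ARB_half_float_pixel; otherwise the half values would have to be expanded to plain floats before upload:

    #include <ImfRgbaFile.h>
    #include <ImfArray.h>

    // read the whole image as half-float RGBA (standard OpenEXR RGBA-interface usage)
    Imf::RgbaInputFile file("image.exr");
    Imath::Box2i dw = file.dataWindow();
    int w = dw.max.x - dw.min.x + 1;
    int h = dw.max.y - dw.min.y + 1;
    Imf::Array2D<Imf::Rgba> pixels(h, w);
    file.setFrameBuffer(&pixels[0][0] - dw.min.x - dw.min.y * w, 1, w);
    file.readPixels(dw.min.y, dw.max.y);

    // upload as a 16-bit float texture; sample it like any other texture in GLSL
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA16F_ARB, w, h, 0,
                 GL_RGBA, GL_HALF_FLOAT_ARB, &pixels[0][0]);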

If you d/l the NVIDIA SDK, they have an OpenGL HDR sample with some .hdr loading code.

I too am in the process of integrating some HDR shaders into my engine, but I’m using CgFX.

Originally posted by knackered:
Really? Great to see another proliferation of file formats… why agree on one when 10 can keep people in work for years.
Yeah, though the RLE encoding is kind of odd. It only encodes per scanline, not the max run length. Or something like that anyway.

Personally I use RGBE encoded TGA images, with the TGA RLE encoding. Works pretty well, at least I think so.

You can use the HDR Radiance format

There is a function to load the file to 32-bit floats (3 × 32 bits per pixel): RGBE_ReadPixels_RLE.

You can also upload the texture in that format and OpenGL will convert it to 16-bit float for you (if you specify a 16-bit internal format).

texture format: GL_RGB
internal format: GL_RGB32F_ARB (works on ATI and nVidia)
texture type: GL_FLOAT

This requires GL_ARB_texture_float or GL_ATI_texture_float.

You cannot use texture filtering (important).

It’s the easier way.
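Putting that together, roughly (this assumes the RGBE_ReadHeader / RGBE_ReadPixels_RLE signatures from the rgbe.c linked earlier, and leaves out all error checking):

    #include <stdio.h>
    #include <stdlib.h>
    #include "rgbe.h"   /* the Radiance loader from the Cornell page above */

    FILE *fp = fopen("probe.hdr", "rb");
    int width, height;
    RGBE_ReadHeader(fp, &width, &height, NULL);

    /* RGBE_ReadPixels_RLE decodes to 3 x 32-bit floats per pixel */
    float *data = (float *)malloc(sizeof(float) * 3 * width * height);
    RGBE_ReadPixels_RLE(fp, data, width, height);
    fclose(fp);

    glBindTexture(GL_TEXTURE_2D, tex);
    /* no filtering, as noted above */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    /* the driver converts the incoming floats to the requested internal format */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB32F_ARB, width, height, 0,
                 GL_RGB, GL_FLOAT, data);
    free(data);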