View Full Version : HDR with RenderMonkey

Zak McKrakem
01-04-2006, 03:49 AM
Does anyone know if it is possible to use RenderMonkey (1.6) to do HDR?
For example, I load an HDR cubemap stored in a DDS image; RenderMonkey reads it, and in the properties it correctly says: Format: 64bit A16B16G16R16F.
But when I sample the texture in a fragment program, the values come back clamped to [0,1].
Does anyone know if there is any way to control the clamping?


01-10-2006, 09:46 PM
Did you apply gamma correction?


01-11-2006, 01:28 AM
I had a similar problem in OpenGL on an ATI Radeon 9800 (at least): texture float values got clamped, especially when a fragment shader was writing into the texture (or reading from it).

But on DirectX it does not clamp (so it is not a hardware limitation; probably a specification limitation?).

I was trying to port an HDR tone mapping pass to OpenGL the other day using GLSL + FBO.

One of the steps is to compute the pixel luminance of the original image by averaging some texels, doing a luminance conversion, and finally taking the log of the result (since luminance values follow an exponential curve).

After downsampling the luminance of the original image to a 1x1 texture (using an FBO, since I need to render the intermediate results back and forth to textures), I wanted to compute luminance_final = exp(luminance_log) in order to get the correct value.
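The downsample-then-exp step described above is effectively computing the log-average (geometric mean) luminance of the image. A minimal CPU sketch of the same math, assuming linear RGB input and Rec. 709 luma weights (the function names and the small delta term are my own illustration, not RenderMonkey or GLSL code):

```python
import math

# Rec. 709 luma weights; a common choice for the luminance conversion step
LUM_WEIGHTS = (0.2126, 0.7152, 0.0722)

def luminance(rgb):
    """Convert a linear RGB texel to a scalar luminance value."""
    return sum(w * c for w, c in zip(LUM_WEIGHTS, rgb))

def log_average_luminance(pixels, delta=1e-4):
    """Average log(luminance) over all texels, then exp() the result.
    This mirrors the shader passes: log per texel, downsample to 1x1,
    then a final exp. delta avoids log(0) on black texels."""
    log_sum = sum(math.log(delta + luminance(p)) for p in pixels)
    return math.exp(log_sum / len(pixels))
```

The log/exp pair makes the average a geometric mean, so a handful of extremely bright texels do not dominate the adapted luminance.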

It didn't work on OpenGL/ATI.

When writing log(texel) to the texture, the value was apparently clamped to 1.0, so the exp(texel) pass used an incorrect value.
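A quick numeric illustration of why that clamp breaks the result (assuming, as described, that the intermediate render target clamps stored values to [0,1]):

```python
import math

hdr_luminance = 8.0                # an HDR texel well above 1.0
stored = math.log(hdr_luminance)   # ~2.079, written to the intermediate texture
clamped = min(stored, 1.0)         # the clamping path stores at most 1.0

recovered_ok = math.exp(stored)    # unclamped path: recovers 8.0
recovered_bad = math.exp(clamped)  # clamped path: e^1 ~ 2.718 instead of 8.0
```

So any texel brighter than e (about 2.72) collapses to the same value after the round trip, which badly skews the average luminance.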

I spent a lot of time trying to understand why it didn't work, because the same shader code in HLSL worked flawlessly (as usual) in DirectX 9.0c.

My workaround was to skip the log/exp step in OpenGL. The result is not perfectly correct, but I really needed my HDR tone mapping working.
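Skipping log/exp means averaging luminance in linear space, i.e. an arithmetic mean instead of a geometric mean, which is why the result is only approximately correct: a few very bright texels dominate the average. A hypothetical comparison (the scene values are made up for illustration):

```python
import math

# a mostly dark scene with one very bright texel
luminances = [0.1] * 99 + [100.0]

# workaround path: plain average, pulled up to ~1.1 by the single bright texel
arithmetic_mean = sum(luminances) / len(luminances)

# intended path: log-average stays near the typical (dark) luminance, ~0.107
log_average = math.exp(sum(math.log(l) for l in luminances) / len(luminances))
```

With the arithmetic mean the tone mapper adapts to the bright outlier and crushes the dark bulk of the image, which matches the "not perfectly correct" result described.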