HDR with RenderMonkey

Does anyone know if it is possible to use RenderMonkey (1.6) to do HDR?
For example, I load an HDR cubemap stored in a DDS file; RenderMonkey reads it, and in the properties it correctly says: Format: 64bit A16B16G16R16F.
But when I sample the texture in a fragment program, the values come back clamped to [0,1].
Does anyone know if there is any way to control the clamping?
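For reference, the fragment program does nothing unusual; here is a minimal GLSL sketch of the kind of lookup I mean (the sampler and varying names are just placeholders):

```glsl
// Minimal repro sketch; hdrEnvMap / reflectDir are placeholder names.
uniform samplerCube hdrEnvMap;  // the 64-bit A16B16G16R16F cubemap
varying vec3 reflectDir;

void main()
{
    vec3 hdr = textureCube(hdrEnvMap, reflectDir).rgb;
    // Expected: values above 1.0; what I actually get is clamped to [0,1].
    gl_FragColor = vec4(hdr, 1.0);
}
```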

Thanks.

Did you apply gamma correction?

Something like:
pow(color, 0.55)
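For example, a minimal sketch of applying it at the end of a fragment shader (the exponent 0.55, roughly 1/1.8, is just the value above; the sampler and varying names are placeholders):

```glsl
// Gamma-encode the linear color before display; names are placeholders.
uniform sampler2D sceneTex;
varying vec2 uv;

void main()
{
    vec3 linearColor = texture2D(sceneTex, uv).rgb;
    // pow with ~1/gamma brightens the image for a non-linear display.
    gl_FragColor = vec4(pow(linearColor, vec3(0.55)), 1.0);
}
```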

I had a similar problem in OpenGL on an ATI Radeon 9800 (at least): float texture values got clamped, especially when a fragment shader was writing into the texture (or reading from it).

But on DirectX it does not happen (so it is not a hardware limitation; probably a specification limitation?).

I was trying to port an HDR tone-mapping pipeline to OpenGL the other day using GLSL + FBO.

One of the steps is to compute the pixel luminance of the original image by averaging some texels, doing a luminance conversion, and finally taking the log of the result (since luminance values span an exponential curve).
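A sketch of that pass, assuming GLSL and placeholder names; the small epsilon keeps log() defined for black texels:

```glsl
// Log-luminance pass: convert a scene texel to log(luminance).
uniform sampler2D sceneTex;   // placeholder name
varying vec2 uv;

void main()
{
    vec3 c = texture2D(sceneTex, uv).rgb;
    // Standard Rec. 601 luma weights for the luminance conversion.
    float lum = dot(c, vec3(0.299, 0.587, 0.114));
    // Take the log; the epsilon avoids log(0) on black texels.
    gl_FragColor = vec4(log(lum + 0.0001));
}
```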

After downsampling the log-luminance of the original image to a 1×1 texture (using an FBO, since I need to render the result back and forth between textures), I wanted to compute luminance_final = exp(luminance_log) in order to get the correct value.
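The final step then looks roughly like this (again a sketch with placeholder names); it is the input to exp() here that came back wrong:

```glsl
// Recover the linear average luminance from the 1x1 log-average texture.
uniform sampler2D logLumTex;  // placeholder: the downsampled 1x1 result

void main()
{
    float logAvg = texture2D(logLumTex, vec2(0.5, 0.5)).r;
    // On ATI/OpenGL the stored log value was clamped to 1.0,
    // so exp() here produced an incorrect average luminance.
    float avgLum = exp(logAvg);
    gl_FragColor = vec4(avgLum);
}
```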

It didn’t work on OpenGL/ATI.

When writing the ‘log(texel)’ to the texture, the value was apparently clamped to 1.0, so the exp(texel) phase used an incorrect value.

I’ve spent a lot of time trying to understand why it didn’t work, because the same shader code in HLSL worked flawlessly (as usual) in DirectX 9.0c.

My workaround was to skip the log/exp step in OpenGL; the result was not ‘perfectly’ correct, but I really needed my HDR tone mapping working.
