HDR Texture Format Issues

Hello,

I added HDR rendering to my game, but I cannot find an HDR texture format that works across all systems without producing artifacts. This format:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB16F, m_width, m_height, 0, GL_RGB, GL_FLOAT, NULL);

Gives me strange black, pixelated box artifacts around various areas of the screen. It also doesn't downsample properly when I compute the scene luminance, so everything ends up far too bright.
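For context, the setup around that call looks roughly like this (m_texture is just a placeholder name for the render-target handle; I've spelled out the filter/wrap state explicitly, since the GL defaults expect mipmaps):

glBindTexture(GL_TEXTURE_2D, m_texture); /* m_texture: placeholder name for the HDR target */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB16F, m_width, m_height, 0, GL_RGB, GL_FLOAT, NULL);
/* Explicit filter/wrap state; the default min filter (GL_NEAREST_MIPMAP_LINEAR)
   expects mipmaps, and sampling an incomplete texture returns black. */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);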

I was able to get rid of these artifacts on my system by changing the format to:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB16, m_width, m_height, 0, GL_RGB, GL_FLOAT, NULL);

From what I have read, that isn't even an HDR format: GL_RGB16 is a normalized integer format, so values are clamped to [0, 1]. But for some reason it works on my machine.

Neither of these formats runs on the other machines I have tested, only on my own.

What could be causing the artifacts? What format should I use?

Thanks for any help you can offer.

Maybe you are exceeding the range of RGB16F; a 16-bit float maxes out around 65504. Try GL_RGB32F.

I tried that one too, but it still gave me the artifacts.

Either you are still exceeding the limits of a 32-bit float, or you have a problem in your shader; perhaps you are generating a NaN somehow. There's no way to know from here.
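One way to narrow it down is to read the float texture back on the CPU and scan for NaN/Inf values. A rough sketch (assumes the texture is bound, w and h are its dimensions, and <math.h>/<stdlib.h> are included):

float *pixels = malloc(w * h * 3 * sizeof(float));
glGetTexImage(GL_TEXTURE_2D, 0, GL_RGB, GL_FLOAT, pixels);
for (int i = 0; i < w * h * 3; ++i) {
    if (isnan(pixels[i]) || isinf(pixels[i])) {
        /* bad texel found; i / 3 is the pixel index */
    }
}
free(pixels);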

I figured out why it wasn't working on other people's machines (a GLSL extension issue), but it still only works with the formats that don't have an F on them; with the float formats I still get the artifacts. It can't be a lack of precision or a shader error, since the same shaders work at the same precision with the non-F versions (which they shouldn't, from what I have read, so I would rather get the F versions working as well).
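For reference, a minimal extension check along these lines would at least rule out missing float-texture support on a given machine (a sketch assuming a pre-3.0 context; on 3.0+ you would enumerate extensions with glGetStringi instead; needs <string.h>):

const char *exts = (const char *)glGetString(GL_EXTENSIONS);
if (exts && strstr(exts, "GL_ARB_texture_float")) {
    /* float formats like GL_RGB16F / GL_RGB32F should be available */
} else {
    /* fall back to a fixed-point format such as GL_RGB16 */
}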