GL_RGB10 render target



k_szczech
04-28-2006, 04:45 AM
I'm using this format for fake HDR on SM2.0 cards.
10-bit precision gives a 0-1023 color range, so I render everything 4 times darker and multiply the color by 4 in the final stage.
Unfortunately this leads to a loss of color precision - I get the same precision as with the GL_RGB8 format.
Hardware: GeForce 7800 GT and Radeon X850.
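
Here's roughly what I'm doing (simplified sketch, not my actual code - the final multiply is shown with fixed-function GL_RGB_SCALE, but a fragment shader works just as well; renderSceneDarkened and drawFullscreenQuad stand in for app-specific drawing):

/* Assumes an FBO-capable context and the EXT_framebuffer_object
   entry points already loaded (e.g. via GLEW). */
GLuint fbo, colorTex;
int width = 1024, height = 768;          /* example size */

/* 10-bit color render target */
glGenTextures(1, &colorTex);
glBindTexture(GL_TEXTURE_2D, colorTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB10, width, height, 0,
             GL_RGB, GL_UNSIGNED_BYTE, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

glGenFramebuffersEXT(1, &fbo);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                          GL_TEXTURE_2D, colorTex, 0);

/* Pass 1: draw the scene with all colors/lights scaled by 0.25,
   so values up to 4.0 still fit into the 0..1 range of the target. */
renderSceneDarkened();                   /* app-specific */

/* Pass 2: back to the window, draw a fullscreen quad and
   multiply the texture color by 4 (texture_env_combine). */
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);
glBindTexture(GL_TEXTURE_2D, colorTex);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE);
glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB, GL_REPLACE);
glTexEnvf(GL_TEXTURE_ENV, GL_RGB_SCALE, 4.0f);
drawFullscreenQuad();                    /* app-specific */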

Any hints?

Roderic (Ingenu)
04-28-2006, 05:02 AM
You mean like this:
http://download.nvidia.com/developer/OpenGL_Texture_Formats/nv_ogl_texture_formats.pdf
?
;)

k_szczech
04-28-2006, 08:45 AM
Yeah, I have that PDF too. It explains why the GeForce loses precision, but I'm more interested in Radeons, since on GeForce I have true HDR support anyway.
I've read the description of ATI's 'Toy Shop' demo - they use the RGB10_A2 format for HDR, so I guess the Radeon X1k series supports it.
Well, I'll just leave it at that - my application will use the RGB10 format, and if the GPU supports it, color precision will be fine.
Thanks.

Trenki
04-28-2006, 09:19 AM
My Radeon X800 XT seems to support RGB10_A2 textures without precision loss.

Creating a texture with this format and then querying the actual component sizes (GL_TEXTURE_RED_SIZE, ...) with glGetTexLevelParameter yields 10 for RGB and 2 for alpha.
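
Roughly what I did (sketch from memory; needs <stdio.h> and the usual GL headers):

GLuint tex;
GLint r, g, b, a;

glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB10_A2, 256, 256, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);

glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_RED_SIZE,   &r);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_GREEN_SIZE, &g);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_BLUE_SIZE,  &b);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_ALPHA_SIZE, &a);

/* prints "R10 G10 B10 A2" here on the X800 XT */
printf("R%d G%d B%d A%d\n", r, g, b, a);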

Relic
04-28-2006, 09:59 AM
The question is whether you can bind that as a render target and actually render to it in this format.

Trenki
04-28-2006, 10:22 AM
Binding the RGB10_A2 texture as a render target worked as well.
glCheckFramebufferStatusEXT returns GL_FRAMEBUFFER_COMPLETE_EXT.
It even returns GL_FRAMEBUFFER_COMPLETE_EXT for RGBA16 textures.
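
The test was basically this (sketch; assumes the EXT_framebuffer_object entry points are loaded and tex is the RGB10_A2 texture from my previous post):

GLuint fbo;
glGenFramebuffersEXT(1, &fbo);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                          GL_TEXTURE_2D, tex, 0);

GLenum status = glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT);
if (status == GL_FRAMEBUFFER_COMPLETE_EXT)
    printf("RGB10_A2 render target is framebuffer complete\n");
else
    printf("incomplete, status = 0x%x\n", status);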