View Full Version : 16-Bit FBO on GeForce6800 GT

09-26-2006, 12:29 AM
I am currently writing a simulation that needs offscreen rendering with 16-bit precision per color channel. I was originally using Mesa3D, but that only rendered in software and was therefore too slow. I now have an NVIDIA GeForce 6800 GT running under Windows 2000 SP4 on a dual-Xeon system, with NVIDIA ForceWare 81.98 drivers.
I have created an FBO and attached a renderbuffer. Can I set the pixel format of the RBO to 16 bits per channel? I can set it to RGBA32F and use floating-point arithmetic; this produces correct results but runs slowly, probably because the scene contains many transparent objects, which means alpha blending on floats. If I try to set the pixel format to RGBA16 it appears to be accepted, but querying the state shows only 8 bits per color channel, and the rendering is consistent with that (i.e. incorrect), although it does run much faster.
Any help or suggestions would be appreciated. Note that I have tried updating the drivers to 91.47 (with no effect on the above), but that is another story...
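
For reference, the setup described above amounts to something like the following sketch. It assumes a valid GL context with the EXT_framebuffer_object entry points loaded, and `width`/`height` defined elsewhere; the renderbuffer size query at the end is what reveals whether the driver actually honored the requested format:

```c
/* Sketch of the FBO + renderbuffer setup described in the post.
   Assumes a current GL context and EXT_framebuffer_object available. */
GLuint fbo, rbo;
glGenFramebuffersEXT(1, &fbo);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);

glGenRenderbuffersEXT(1, &rbo);
glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, rbo);
/* GL_RGBA16 may be silently demoted to 8 bits per channel on this
   hardware; GL_RGBA16F_ARB is the native half-float alternative. */
glRenderbufferStorageEXT(GL_RENDERBUFFER_EXT, GL_RGBA16F_ARB, width, height);

glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                             GL_RENDERBUFFER_EXT, rbo);

if (glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT) !=
    GL_FRAMEBUFFER_COMPLETE_EXT) {
    /* format is not renderable on this driver/hardware combination */
}

/* Ask the driver how many bits it really allocated per channel: */
GLint redBits;
glGetRenderbufferParameterivEXT(GL_RENDERBUFFER_EXT,
                                GL_RENDERBUFFER_RED_SIZE_EXT, &redBits);
```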

09-26-2006, 03:09 AM
GL_RGBA16F_ARB should work well on a GF6800; I use it all the time.

09-26-2006, 08:22 AM
- Right, alpha blending is not supported on FP32 formats.
- RGBA16 is not a native format. Check this table: http://developer.nvidia.com/object/nv_ogl_texture_formats.html
- RGBA16F does not offer 16 bits of precision: it is 1 sign bit, 5 exponent bits, 10 mantissa bits (1s5e10m), whereas 32-bit floats are 1s8e23m.
- A full list of supported color formats can be found in the NVIDIA GPU Programming Guide http://developer.nvidia.com/object/gpu_programming_guide.html
Check the "Render" column in the "Texture Format" tables.

09-26-2006, 10:24 PM
Guys, thanks for your comments. I have indeed tried the GL_RGBA16F_ARB format (knowing about the lower precision) and it is approximately as fast as the incorrect GL_RGBA16 format, while being one or two orders of magnitude quicker than GL_RGBA32F_ARB. So at the moment I am going with RGBA16F.
'Relic', from your comments, are you saying that even for an FBO, RGBA16 still gets precision-substituted under OpenGL 2.0? Will it ever be supported?

09-27-2006, 06:59 AM
NVIDIA does not support native 16-bit integer textures (aka RGBA16). ATI, on the contrary, does support them. I have no idea what the spec says about it, however. I don't expect NVIDIA to add support for such textures, as their usefulness is doubtful.