16-Bit FBO on GeForce 6800 GT

I am currently writing a simulation that needs offscreen rendering with 16-bit precision per color channel. I was originally using Mesa3D, but that only rendered in software and so was too slow. Instead I now have an NVIDIA GeForce 6800 GT running under Windows 2000 SP4 on a dual Xeon system. The graphics driver is NVIDIA ForceWare 81.98.
I have created an FBO and attached a renderbuffer. Can I set the pixel format of the RBO to 16 bits per channel? I can set it to RGBA32F and use floating-point arithmetic. This produces correct results but runs slowly, probably because the scene contains a lot of transparent objects, which means alpha blending on floats. If I try to set the pixel format to RGBA16 it appears to be accepted, but querying the state shows only 8 bits per color channel, and the rendering is consistent with that (i.e. it is not correct), although it does run much faster.
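For reference, here is a minimal sketch of the kind of setup I mean, using the EXT_framebuffer_object entry points; the function name and dimensions are illustrative only, and it assumes a current GL context with the extension loaded (e.g. via GLEW). The query at the end is where I see only 8 bits reported:

```c
#include <GL/glew.h>
#include <stdio.h>

void create_offscreen_target(int width, int height)
{
    GLuint fbo, rbo;
    GLint  redBits = 0;

    glGenFramebuffersEXT(1, &fbo);
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);

    glGenRenderbuffersEXT(1, &rbo);
    glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, rbo);
    /* Request 16 fixed-point bits per channel; the driver may silently
       substitute a lower-precision internal format. */
    glRenderbufferStorageEXT(GL_RENDERBUFFER_EXT, GL_RGBA16, width, height);

    glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                                 GL_RENDERBUFFER_EXT, rbo);

    if (glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT) != GL_FRAMEBUFFER_COMPLETE_EXT)
        fprintf(stderr, "FBO incomplete\n");

    /* Query what was actually allocated - this reports only 8 bits. */
    glGetRenderbufferParameterivEXT(GL_RENDERBUFFER_EXT,
                                    GL_RENDERBUFFER_RED_SIZE_EXT, &redBits);
    printf("red bits per channel: %d\n", redBits);
}
```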
Any help or suggestions would be appreciated. Note that I have tried updating the drivers to 91.47 (with no effect on the above), but that is another story…
Roy

GL_RGBA16F_ARB should work well on a GF6800; I use it all the time.
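If it helps, relative to the RGBA16 sketch above the only change is the internal format passed to glRenderbufferStorageEXT (GL_RGBA16F_ARB comes from ARB_texture_float):

```c
/* Half-float color buffer instead of 16-bit fixed point;
   the GF6800 can blend FP16 targets. */
glRenderbufferStorageEXT(GL_RENDERBUFFER_EXT, GL_RGBA16F_ARB, width, height);
```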

Guys, thanks for your comments. I have indeed tried the GL_RGBA16F_ARB format (knowing the lower precision) and it runs at approximately the same speed as the incorrect GL_RGBA16 format, while being one to two orders of magnitude faster than GL_RGBA32F_ARB. So at the moment I am going with RGBA16F.
‘Relic’, from your comments, are you saying that even for an FBO, RGBA16 is still precision-substituted under OpenGL 2.0? Will it ever be supported?

NVIDIA does not support native 16-bit integer textures (aka RGBA16). ATI, on the contrary, supports them. I have no idea what the spec says about it, however. I don’t expect NVIDIA to add support for such textures, as their usefulness is doubtful.