16-bit unsigned texture

Hello all,
I can’t find a way to define a 16-bit unsigned texture without losing precision. With any internal format I try (GL_ALPHA16, GL_LUMINANCE16, etc.), the hardware downscales the texels to 8-bit storage, so when the fragment shader reads them I get no more than 256 distinct values.
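For reference, this is roughly what I’m doing (the width, height and data pointer are just placeholders here, and the filtering settings don’t matter for the precision issue):

    #include <GL/gl.h>

    /* Upload 16-bit unsigned luminance data. */
    GLuint create_lum16_texture(GLsizei width, GLsizei height, const GLushort *data)
    {
        GLuint tex;
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
        /* GL_LUMINANCE16 is only a request; the driver may still store 8 bits. */
        glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE16, width, height, 0,
                     GL_LUMINANCE, GL_UNSIGNED_SHORT, data);
        return tex;
    }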
I thought that maybe by defining it as a depth texture I could get somewhere (depth values keep more bits), but I am not sure this is the right direction.
Any ideas will be highly appreciated.
Thanks,
Yossi

I noticed a similar problem on GeForce FX hardware. Which hardware are you using?

I am using the 3DLabs Wildcat VP990 Pro card, based on the P10 processor.
Yossi

The P10 only supports 8 bits per component.
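Incidentally, you can check what the driver actually allocated (as opposed to what you asked for) with a texture level query, something along these lines:

    #include <stdio.h>
    #include <GL/gl.h>

    /* After glTexImage2D with GL_LUMINANCE16, ask how many bits were really stored. */
    GLint lumBits = 0;
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_LUMINANCE_SIZE, &lumBits);
    printf("luminance bits per texel: %d\n", lumBits);  /* expect 8 on the P10 */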

Thanks,
Will the P20 support 16 bits per component?
From what I’ve read, it will support floating-point components, so I guess there will be no accuracy problem related to integer math.
By the way, does that mean computations through all the pipeline stages will be carried out in floating point, and converted to 32 bits (8 bits per component) only just before writing to the framebuffer?
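If it does expose floating-point texture formats (through something like the ARB_texture_float extension, I’m only guessing here), I suppose the texture definition would then look something like this, with the data kept as floats throughout:

    /* Hypothetical: requires a driver that exposes ARB_texture_float. */
    #include <GL/gl.h>
    #include <GL/glext.h>

    void upload_float_texture(GLsizei width, GLsizei height, const GLfloat *data)
    {
        glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE32F_ARB, width, height, 0,
                     GL_LUMINANCE, GL_FLOAT, data);
    }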

Thanks again,
Yossi
