16 bits unsigned texture



yossi
06-02-2004, 04:03 AM
Hello all,
I can't find a way to define a 16-bit unsigned texture without losing precision. With any internal format I try (GL_ALPHA16, GL_LUMINANCE16, etc.), the hardware downconverts the texels to 8 bits per component, so the fragment shader reads no more than 256 distinct values.
I thought that maybe by defining it as a depth texture I could get more bits per value, but I am not sure this is the right direction.
Any ideas would be highly appreciated.
Thanks,
Yossi
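
A minimal sketch of the check being discussed, assuming a current OpenGL context; the helper name check_luminance16 and its parameters are illustrative, not from the thread. It requests a GL_LUMINANCE16 texture and then asks the driver how many luminance bits it actually allocated, which is where the 8-bit downconversion shows up.

#include <GL/gl.h>
#include <stdio.h>

/* Request a 16-bit luminance texture and report how many bits the
 * driver really stored for mip level 0. Requires a current GL context. */
void check_luminance16(const GLushort *data, int w, int h)
{
    GLuint tex;
    GLint lumBits = 0;

    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);

    /* Ask for 16 bits per texel; the driver may silently pick fewer. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE16, w, h, 0,
                 GL_LUMINANCE, GL_UNSIGNED_SHORT, data);

    /* On hardware limited to 8 bits per component this prints 8. */
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0,
                             GL_TEXTURE_LUMINANCE_SIZE, &lumBits);
    printf("luminance bits actually stored: %d\n", lumBits);
}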

Corrail
06-02-2004, 04:37 AM
I noticed a similar problem on GeForce FX hardware. Which hardware are you using?

yossi
06-02-2004, 04:44 AM
I am using the 3Dlabs Wildcat VP990 Pro card, based on the P10 processor.
Yossi

3DlabsDevRel
06-09-2004, 09:25 AM
The P10 only supports 8 bits per component.

yossi
06-13-2004, 08:29 PM
Thanks,
Will the P20 support 16 bits per component?
From what I've read it will support floating-point components, so I guess there will be no accuracy problems related to integer math.
By the way, does that mean that computations through all the pipeline stages are carried out in floating point and only converted to 32 bits (8 bits per component) just before writing to the framebuffer?

Thanks again,
Yossi
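
A rough illustration of the conversion Yossi is asking about, not how any particular driver implements it: values computed in floating point are quantized to 8 bits per component when written to a fixed-point (RGBA8) framebuffer, which is where the 256-level limit comes from. The helper to_8bit below is hypothetical.

#include <stdio.h>

/* Quantize a floating-point color channel in [0,1] to 8 bits, the way a
 * typical fixed-point framebuffer write does (clamp, scale, round). */
static unsigned char to_8bit(float f)
{
    if (f < 0.0f) f = 0.0f;
    if (f > 1.0f) f = 1.0f;
    return (unsigned char)(f * 255.0f + 0.5f);
}

int main(void)
{
    /* Two source values that differ by one 16-bit step... */
    float a = 1000.0f / 65535.0f;
    float b = 1001.0f / 65535.0f;

    /* ...collapse to the same 8-bit value after the framebuffer write. */
    printf("%u %u\n", to_8bit(a), to_8bit(b));
    return 0;
}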