16-bit grayscale on a 3D card?

Hello,

I need 16-bit grayscale textures on a normal 3D card.

I take it that no “normal” gaming card can do 16-bit grayscale? OpenGL seems to support 16-bit grayscale textures (I think?), but the cards only accept 8 bits per RGB channel and work with 8 bits internally. Am I correct?

I have an ATi Radeon 9700 Pro at the moment, but switching to (inferior?) GeForce cards wouldn’t help anything, I guess.

Thanks for any input,

Andru

I thought the Radeon 9700 supported deeper textures, e.g. 16 bits per component and floating-point???

Internal precision on the Radeon 8500/9000 is 12 bits per channel (my own estimate), around 10 bits per channel on the GeForce 2 series, and it should be much better on the 9700.

Have you tried an internal format of GL_LUMINANCE16? You can query the actual bit depth the driver allocated with glGetTexLevelParameteriv (e.g. GL_TEXTURE_LUMINANCE_SIZE).

Note that unless you get a proper frame buffer depth, texture depth won’t do much good anyway.
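Something along these lines (untested, just a sketch; width, height and the data pointer are placeholders) requests GL_LUMINANCE16 and prints how many luminance bits the driver actually gave you:

#include <GL/gl.h>
#include <stdio.h>

void create_lum16_texture(GLsizei width, GLsizei height, const GLushort *pixels)
{
    GLuint tex;
    GLint lum_bits = 0;

    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);

    /* Ask for 16 bits of luminance; the driver may silently fall back to 8. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE16, width, height, 0,
                 GL_LUMINANCE, GL_UNSIGNED_SHORT, pixels);

    /* Query what was actually allocated for mip level 0. */
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_LUMINANCE_SIZE,
                             &lum_bits);
    printf("luminance bits: %d\n", lum_bits);
}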

Hi,

On GeForce3+ you can use HILO textures to get 16 bits per channel. If a lookup table is needed, it can be encoded in a texture (allowing a 12-bit lookup table).
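A rough sketch of how the upload looks, assuming the enum names from the GL_NV_texture_shader extension (GL_HILO16_NV as the internal format, GL_HILO_NV as the pixel format, via glext.h); actually sampling the texture still has to go through the NV texture-shader path:

#include <GL/gl.h>
#include <GL/glext.h>

void create_hilo_texture(GLsizei width, GLsizei height, const GLushort *hilo_pairs)
{
    GLuint tex;

    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);

    /* Two unsigned 16-bit components (HI, LO) per texel. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_HILO16_NV, width, height, 0,
                 GL_HILO_NV, GL_UNSIGNED_SHORT, hilo_pairs);
}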

Note that whenever you blend to the frame buffer the high precision is lost…

– Niels

The 9700 does support high-depth texture formats. To get a 16-bit grayscale texture, just request LUMINANCE16. The initial driver release didn’t expose it, but I believe the latest drivers do.

The normal (non-float) high-bit-depth textures behave the same as the ones you are used to, except that they do not support border color. The float textures have additional caveats; for example, filter modes other than nearest or nearest-mip-nearest will punt to SW. The float textures are in the next public release, I believe.
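For what it’s worth, here is a sketch of the float path with the filtering restriction respected, assuming the enum names from ATI_texture_float (GL_LUMINANCE_FLOAT32_ATI via glext.h; the names may differ in whatever driver release actually exposes it):

#include <GL/gl.h>
#include <GL/glext.h>

void create_float_lum_texture(GLsizei width, GLsizei height, const GLfloat *pixels)
{
    GLuint tex;

    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);

    /* Nearest filtering only, so sampling stays in hardware. */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

    /* Single-channel 32-bit float texture. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE_FLOAT32_ATI, width, height, 0,
                 GL_LUMINANCE, GL_FLOAT, pixels);
}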

-Evan