16 bits per channel texture?

Hi,

I’m trying to create a texture with 16 bits per channel,
using the GL_RGB16 format, but the actual texture
displayed by OpenGL only uses the least significant byte of every pixel…
i.e. if I have pixels defined as
r1r2, g1g2, b1b2
the texture displayed is
00r2, 00g2, 00b2

The glTexImage2D call I use is:

glTexImage2D(
GL_TEXTURE_2D,     /* target */
0,                 /* mip level */
GL_RGB16,          /* requested internal format: 16 bits per channel */
w, h,
0,                 /* border */
GL_RGB,            /* source data format */
GL_UNSIGNED_SHORT, /* source data type: one 16-bit word per channel */
Image->Data);

Any idea on this one?

Thanks a lot
Francois

That seems quite peculiar. Worst case (that is, the driver giving you RGB8 instead of 16), the driver should still be taking the most significant bits, not the least.

Two questions. One, are there any endian issues here? That is, is your data somehow big endian when it should be little endian? Are you perhaps on a Mac?

Two: what hardware and drivers are you using? Like I said, this behavior is wrong even on an implementation that can’t handle RGB16, but it would be a good idea to verify whether your hardware can handle RGB16 at all.
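You can actually ask GL what it stored after your upload; something like this sketch, reusing the w, h and Image->Data from your post:

GLint internalFormat = 0, redBits = 0;
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB16, w, h, 0,
             GL_RGB, GL_UNSIGNED_SHORT, Image->Data);
/* what did the driver really allocate? */
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_INTERNAL_FORMAT, &internalFormat);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_RED_SIZE, &redBits);
/* if redBits comes back as 8, the driver silently fell back to RGB8 */

And for question one, you can let GL swap the bytes for you during unpacking and see whether the image changes:

glPixelStorei(GL_UNPACK_SWAP_BYTES, GL_TRUE);  /* swap the bytes of each 16-bit component while unpacking */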

Well, concerning the big/little endian issue, I thought of that one, but after some time trying to debug my code I hardcoded values in my buffer.
I put 0xFF in the MSB and 0x00 in the LSB:
the image is black.
If I put 0x00 in the MSB and 0xFF in the LSB, then it’s full white.
It looks like the MSB is never used…
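Roughly, the hardcoded test just fills the whole buffer with the same 16-bit value, something like:

unsigned short *p = (unsigned short *)Image->Data;
for (int i = 0; i < w * h * 3; ++i)
    p[i] = 0xFF00;   /* MSB = 0xFF, LSB = 0x00 -> the image comes out black          */
                     /* with 0x00FF (MSB = 0x00, LSB = 0xFF) it comes out full white */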

Concerning the hardware, I’m working on a laptop
with a 6800 Go, but I tried it on other machines
and got exactly the same results…

RGBA16 isn’t supported on GeForce cards; you have to use HILO textures instead. RGBA16 is supported on Radeons and maybe other cards.
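If I remember the NV_texture_shader enums correctly, a HILO upload looks something like this (note it only gives you two 16-bit channels, HI and LO, not three):

glTexImage2D(
GL_TEXTURE_2D,
0,
GL_HILO16_NV,      /* two 16-bit unsigned channels */
w, h,
0,
GL_HILO_NV,        /* source data is (hi, lo) pairs */
GL_UNSIGNED_SHORT,
hiloData);         /* hypothetical buffer with 2 shorts per texel */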

HILO textures?
Never heard of that…
Got any links with info?

Thanks!

EDIT: Got it to work, but by using a DWORD for each channel instead of a WORD…
16 bits and NVIDIA are definitely not friends :slight_smile:
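For reference, the call that works here is basically the original one with every channel widened to 32 bits, roughly:

glTexImage2D(
GL_TEXTURE_2D,
0,
GL_RGB16,
w, h,
0,
GL_RGB,
GL_UNSIGNED_INT,   /* DWORDs instead of WORDs */
Image32->Data);    /* same image, each channel expanded to 32 bits (name just for illustration) */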

You should have a look at the NVIDIA texture formats for OpenGL doc: http://developer.nvidia.com/object/nv_ogl_texture_formats.html

Cheers.