View Full Version : 16bits per channel texture ?
09-02-2005, 03:23 PM
I'm trying to create a 16-bits-per-channel texture
using the GL_RGB16 format, but the texture
actually displayed by OpenGL only uses the least significant byte of every pixel...
i.e. if I have pixels defined by
the texture displayed is
The glTexImage2D call I use is:
Any idea on this one?
Thanks a lot
09-02-2005, 05:52 PM
That seems quite peculiar. Worst case (that is, the driver giving you RGB8 instead of RGB16), the driver should still be taking the most significant bits, not the least.
Two questions. One, are there any endian issues here? That is, is your data somehow big-endian when it should be little-endian? Are you perhaps on a Mac?
Two: what hardware and drivers are you using? Like I said, this behavior is wrong even on an implementation that can't handle RGB16, but it would be a good idea to verify whether your hardware can handle RGB16 at all.
09-02-2005, 09:39 PM
Well, concerning the big/little-endian issue, I thought of that one, but after some time trying to debug my code, I hardcoded values in my buffer.
If I put 0xFF in the MSB and 0x00 in the LSB,
the image is black;
if I put 0x00 in the MSB and 0xFF in the LSB, then it's full white.
It looks like the MSB is never used...
Concerning the hardware, I'm working on a laptop
with a GeForce 6800 Go, but I tried it on other machines
with exactly the same results...
09-03-2005, 12:28 AM
RGBA16 isn't supported on GeForce; you must use HILO textures instead. RGBA16 is supported on Radeons and maybe other cards.
09-03-2005, 12:42 AM
HILO textures?
Never heard of those...
Got any links with info?
EDIT: got it to work, but using a DWORD for each channel instead of a WORD...
16 bits and NVidia are definitely not friends :)
09-05-2005, 12:16 AM
You should have a look at the NVidia texture formats for OpenGL doc: http://developer.nvidia.com/object/nv_ogl_texture_formats.html
Powered by vBulletin® Version 4.2.3 Copyright © 2017 vBulletin Solutions, Inc. All rights reserved.