12-bit texture

hi everybody. i am using a 12-bit texture, but it doesn't work. my code is as below:
glTexImage2D ( GL_TEXTURE_2D,
               0,                  /* level */
               GL_LUMINANCE12,     /* internal format */
               IMG_SIZE,
               IMG_SIZE,
               0,                  /* border */
               GL_LUMINANCE,       /* external format */
               GL_UNSIGNED_SHORT,  /* type: 16-bit words holding the 12-bit values */
               image12bits );
image12bits is a 12-bit grey-value texture. if i change GL_LUMINANCE12 to GL_RGBA and GL_LUMINANCE to GL_RGBA, and use an RGBA texture, it works. is there anything else we should note when using a 12-bit texture?
thank you all!!
alex

i think the bit depth of the image has to be a power-of-2 value, 16 or 32

Hm, I don’t know, because some hardware accepts non-power-of-two textures, but it depends =/
(i don’t remember exactly)

thank you guys.
i think a 12-bit texture is possible, otherwise there would not be LUMINANCE12 or INTENSITY12. for a 12-bit texture, two bytes correspond to one pixel and the first four bits are not used. the damned 12-bit texture really gets on my nerves :(

Originally posted by <guyinhell>:
thank you guys.
i think a 12-bit texture is possible, otherwise there would not be LUMINANCE12 or INTENSITY12. for a 12-bit texture, two bytes correspond to one pixel and the first four bits are not used. the damned 12-bit texture really gets on my nerves :(

GL_LUMINANCE12 is the internal format only, not the external format. The external format, i.e. the format of the data you pass to OpenGL, described by the last two parameters before the pointer, cannot build that layout for you. It supports lots of different channel types, sizes and orders, but 12-bit luminance packed into 16-bit data is not one of them, sorry.

Your best bet would probably be to unpack the data yourself into a format OpenGL supports, and then hand that data to OpenGL. To preserve the information, you could expand the data to 16-bit luminance and upload it with GL_LUMINANCE as the format and GL_UNSIGNED_SHORT as the type.
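
For example (a minimal sketch, not tested; image16bits and size are placeholder names for a buffer that already holds the expanded 16-bit luminance values and the texture dimension):

#include <GL/gl.h>

/* sketch: upload 16-bit luminance data that has already been expanded from 12 bits */
void upload_luminance16(const GLushort *image16bits, GLsizei size)
{
    glPixelStorei(GL_UNPACK_ALIGNMENT, 2);   /* rows of GLushorts are only guaranteed 2-byte aligned */
    glTexImage2D(GL_TEXTURE_2D,
                 0,                          /* level */
                 GL_LUMINANCE16,             /* internal format: 16-bit luminance */
                 size, size,                 /* width, height */
                 0,                          /* border */
                 GL_LUMINANCE,               /* external format */
                 GL_UNSIGNED_SHORT,          /* external type */
                 image16bits);
}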

What Bob said ^^^.
And make sure that the 12 bits are in the most significant bits. I bet most hardware doesn’t support 12 bits natively and uses 16 internally.
Uhmm, yes, check this out: http://download.nvidia.com/developer/OpenGL_Texture_Formats/nv_ogl_texture_formats.pdf
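
Something like this for the expansion (a rough sketch, assuming the 12 useful bits currently sit in the low bits of each 16-bit word, as you described):

#include <GL/gl.h>
#include <stddef.h>

/* sketch: move the 12-bit values into the most significant bits of each GLushort */
void expand_12_to_16(const GLushort *src, GLushort *dst, size_t pixel_count)
{
    for (size_t i = 0; i < pixel_count; ++i)
        dst[i] = (GLushort)(src[i] << 4);    /* 0..4095 becomes 0..65520 */
}

A plain shift by 4 maps 0..4095 onto 0..65520; if you want the full 0..65535 range you can OR the top 4 bits back in (dst[i] = (src[i] << 4) | (src[i] >> 8)).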

so there is no automatic way to do this 12-bit-in-16-bit to real 16-bit values conversion? (if i try it like this i just get a black image)
doing it myself would take a lot of time for the large number of images i may need to process (medical images, CT/MRI)