View Full Version : 8/16 bit images as textures (2D,3D) or drawPixels

05-17-2005, 11:12 AM
I'm having trouble with 12-bit grayscale images (stored in memory as 16-bit), specifically [I would probably have a similar issue with other variants].
The image is displayed either all black or all white. Is there a way to tell OpenGL that the unpack format is GL_LUMINANCE12 instead of just GL_LUMINANCE? [Putting GL_LUMINANCE12 there makes the app crash, I guess because that is an invalid value for the input format.]
Also, is there a way to change the range of values being displayed while the image data is already in graphics memory (as a texture)? [I would like to specify a range that is then mapped to the 256 visible shades.]

05-18-2005, 03:30 AM
For 16-bit data, pass GL_UNSIGNED_SHORT as the type parameter of glTexImage2D.
The internal format = GL_INTENSITY16
The format = GL_LUMINANCE (GL_INTENSITY is only valid as an internal format, not as the pixel format)

glTexImage2D(GL_TEXTURE_2D, 0, GL_INTENSITY16, width, height, 0, GL_LUMINANCE, GL_UNSIGNED_SHORT, mypixels);

05-18-2005, 05:23 AM
But is there no way to create a texture directly from a 12-bit image packed into 16 bits?

Right now I'm converting every pixel from 12-bit to 16-bit with np = p << 4;
however, glMinmax is giving me a strange result for the maximum pixel value (around 100 million, which can't be true).
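The 12-to-16-bit expansion described above can be sketched as a small standalone helper, assuming the 12-bit sample sits in the low bits of each 16-bit word (the function name is illustrative, not from the thread):

```c
#include <stdint.h>
#include <stddef.h>

/* Expand 12-bit samples (stored in the low 12 bits of a 16-bit word)
 * to full-range 16-bit by shifting left 4 bits, i.e. np = p << 4.
 * The low 4 bits of the result are left at zero. */
static void expand_12_to_16(uint16_t *pixels, size_t count)
{
    for (size_t i = 0; i < count; ++i)
        pixels[i] = (uint16_t)(pixels[i] << 4);
}
```

An alternative that avoids touching the pixels on the CPU is to let the pixel-transfer pipeline do the scaling at upload time, e.g. calling glPixelTransferf with GL_RED_SCALE, GL_GREEN_SCALE and GL_BLUE_SCALE set to 16.0 before glTexImage2D (luminance data is scaled per component during transfer), so the 12-bit values are stretched to the full range as they are uploaded.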

Also, what would be a good way to change the range of grayscale values that are displayed from that texture? (I want to be able to change it dynamically on the graphics hardware instead of manipulating the pixel data and recreating the texture.)
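Not an answer from the thread, but for context: remapping the displayed range is a window/level operation, which on 2005-era hardware could be applied per fragment (fragment program/GLSL with the window bounds as uniforms) to a texture already resident in graphics memory. The scalar mapping such a shader would implement is just a linear rescale with clamping; a minimal sketch, with illustrative names:

```c
/* Map a raw grayscale value v into [0,1] so that [lo,hi] covers the
 * whole visible range; values outside the window clamp to black/white.
 * This is the per-pixel function a fragment shader (or pixel-transfer
 * scale/bias at upload time) would apply. */
static float window_level(float v, float lo, float hi)
{
    float t = (v - lo) / (hi - lo);
    if (t < 0.0f) t = 0.0f;
    if (t > 1.0f) t = 1.0f;
    return t;
}
```

The same mapping can be expressed for glPixelTransfer as scale = 1/(hi-lo) and bias = -lo/(hi-lo) via GL_RED_SCALE/GL_RED_BIAS (and the green/blue equivalents), but that only acts during pixel transfer, so changing the window that way means re-uploading the texture; a fragment shader applies it at draw time.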