Creating 16-bit (565) textures

How do you create a texture from data that is already stored in 16-bit 565 format?

You have only two options: you can convert it to a format that can be used, or, if you are using OpenGL 1.2 or later, you can load it directly by passing GL_RGB as the format and GL_UNSIGNED_SHORT_5_6_5 or GL_UNSIGNED_SHORT_5_6_5_REV as the type. Be careful what you use for the internalformat in this case as well, since core OpenGL has no internal format that matches 565, assuming no extension is available. You will either have to use an internalformat of GL_RGB8 (i.e. 888, which wastes memory) or GL_RGB5 (i.e. 555, which loses the low green bit). If you instead set the internalformat to the generic GL_RGB, the implementation may choose an internal 565 format on its own, but that behavior varies among implementations. If you want to leave no doubt that the texture is stored without degradation, use GL_RGB8 for the internalformat.
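For example, here's a minimal sketch of the direct-load path (the function name and the unpack-alignment setting are my own, assuming pixels points to width*height packed 565 texels):

#include <GL/gl.h>

/* Upload packed 565 data directly; requires OpenGL 1.2 or later. */
void upload_565_texture(GLsizei width, GLsizei height, const GLushort *pixels)
{
    /* Rows of GLushorts are 2-byte aligned, so relax the default
       4-byte unpack alignment in case width is odd. */
    glPixelStorei(GL_UNPACK_ALIGNMENT, 2);

    /* GL_RGB8 internalformat guarantees no degradation; GL_RGB5
       would save memory but drop the low green bit. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, width, height, 0,
                 GL_RGB, GL_UNSIGNED_SHORT_5_6_5, pixels);
}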


Anyone know what format Q3 used for 16-bit textures?

Nutty

For 16-bit textures Quake 3 used GL_RGBA4 and GL_RGB5 for the internalformat. For the format it used GL_RGBA, and for the type it used GL_UNSIGNED_BYTE. That means Quake3 converted 16-bit images to 32-bit images after loading them from the file; in other words, when loading 16-bit images and using 16-bit textures, Quake3 does an unnecessary conversion from 16 to 32 bits. I think that was done for the sake of simplicity, since on some systems Quake3 must adjust the image data directly for gamma compensation before passing it on to OpenGL for texture creation.
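Just as a sketch of the path described above (this isn't Quake3's actual code; upload_565_as_rgba8 and the bit-replication scheme are illustrative):

#include <GL/gl.h>
#include <stdlib.h>

/* Expand packed 565 texels to 8-bit RGBA, then upload the 32-bit
   image with a 16-bit internal format, as described above. */
void upload_565_as_rgba8(GLsizei width, GLsizei height, const GLushort *src)
{
    unsigned char *rgba = malloc((size_t)width * height * 4);
    if (!rgba)
        return;

    for (int i = 0; i < width * height; i++) {
        GLushort p = src[i];
        unsigned r = (p >> 11) & 0x1F;   /* 5 bits */
        unsigned g = (p >> 5)  & 0x3F;   /* 6 bits */
        unsigned b =  p        & 0x1F;   /* 5 bits */
        /* Replicate the high bits into the low bits so 0x1F maps to 0xFF. */
        rgba[i * 4 + 0] = (unsigned char)((r << 3) | (r >> 2));
        rgba[i * 4 + 1] = (unsigned char)((g << 2) | (g >> 4));
        rgba[i * 4 + 2] = (unsigned char)((b << 3) | (b >> 2));
        rgba[i * 4 + 3] = 0xFF;
        /* Per-texel gamma compensation could be applied here, which is
           presumably why Quake3 keeps the image in 8-bit form. */
    }

    /* 16-bit internal format, 32-bit client data. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB5, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, rgba);
    free(rgba);
}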
