How to use 16-bit textures?

I use
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB5_A1, w, h, 0, GL_RGBA, GL_UNSIGNED_BYTE, pimage);

The pimage data is in r5-g5-b5-a1 format, with the alpha bit as the lowest bit,
but the colors come out wrong.
I changed the pimage layout to a1-r5-g5-b5 and other orderings, but that didn't help either.

Did I do something wrong?

I use glaux to initialize the display:
auxInitDisplayMode (AUX_DOUBLE | AUX_RGB | AUX_DEPTH);

I want to specify a 16-bit mode, but I couldn't find an option for it. Could this be the reason?

Can anyone help me? Thank you.

I think you passed two arguments in the wrong order. The third argument is the internal format, that is, the format the texture will have once it has been uploaded. The seventh argument is the format the data is stored in before uploading. So try swapping the third and seventh arguments. That way OpenGL will look for RGB5_A1 data when uploading, and will store it as RGBA.
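
For reference, a minimal sketch of an upload that keeps the packed 5-5-5-1 client data and describes the packing with the type argument instead; this assumes the driver supports OpenGL 1.2 or the EXT_packed_pixels extension, which is where GL_UNSIGNED_SHORT_5_5_5_1 comes from (it puts the alpha bit in the lowest bit, matching the layout described in the question):

glTexImage2D(GL_TEXTURE_2D, 0,
             GL_RGB5_A1,                 /* internal format: how the driver stores the texture */
             w, h, 0,
             GL_RGBA,                    /* client data format                                 */
             GL_UNSIGNED_SHORT_5_5_5_1,  /* client data type: packed 16-bit, alpha in bit 0    */
             pimage);                    /* pointer to the packed pixel data                   */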

Originally posted by Bob:
I think you passed two arguments in the wrong order. The third argument is the internal format, that is, the format the texture will have once it has been uploaded. The seventh argument is the format the data is stored in before uploading. So try swapping the third and seventh arguments. That way OpenGL will look for RGB5_A1 data when uploading, and will store it as RGBA.

Thank you.
But I've found the mistake. It turns out you don't need to provide pimage in RGB5_A1 format at all: just provide ordinary 32-bit RGBA data, with glaux also initialized for 32-bit. OpenGL will then convert it to the 16-bit RGB5_A1 internal format and store that as the texture in video memory. The result is that the texture uses less memory and the frame rate is higher, but image quality is lower.
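
A minimal sketch of that approach, assuming pimage now points to ordinary 8-bit-per-channel RGBA data:

/* display mode init stays the same as above (glaux gives a 32-bit mode here) */
glTexImage2D(GL_TEXTURE_2D, 0,
             GL_RGB5_A1,        /* requested internal format: 16-bit 5-5-5-1            */
             w, h, 0,
             GL_RGBA,           /* client data format: plain RGBA                       */
             GL_UNSIGNED_BYTE,  /* 8 bits per channel, 32 bits per pixel                */
             pimage);           /* the driver converts this to 5-5-5-1 when storing it  */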