Reading the pixel format of a texture

It sounds silly, but it seems like there is no way to determine the pixel format and the data type of the pixel data of the currently selected texture.

The data type of the pixel data should be retrieved with, for example,
glGetTexLevelParameteriv(…, …, GL_TEXTURE_RED_TYPE, …);

Although it works for some types (e.g. GL_FLOAT), it doesn’t work for GL_UNSIGNED_BYTE (which is the most common type).

glGetTexLevelParameteriv() returns 0x8C17, meaning GL_UNSIGNED_NORMALIZED, instead of 0x1401 (GL_UNSIGNED_BYTE), so the result cannot be passed on to other GL functions as a pixel transfer type.
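One workaround is to combine the component-type query with the component-size query (GL_TEXTURE_RED_SIZE) and map the pair back to a transfer type. The helper below is a minimal sketch, not anything from the GL API itself; `transfer_type_from_component` is a hypothetical name, and the enum values are the standard ones from `<GL/gl.h>`, copied here so the sketch stands alone:

```c
/* Standard enum values copied from <GL/gl.h> so this compiles standalone. */
#define GL_UNSIGNED_BYTE        0x1401
#define GL_UNSIGNED_SHORT       0x1403
#define GL_FLOAT                0x1406
#define GL_HALF_FLOAT           0x140B
#define GL_UNSIGNED_NORMALIZED  0x8C17

/* Hypothetical helper: given the GL_TEXTURE_RED_TYPE result and the
   GL_TEXTURE_RED_SIZE result (bits per component), guess a pixel
   transfer type suitable for glGetTexImage. */
static unsigned transfer_type_from_component(unsigned type, int size_bits)
{
    if (type == GL_FLOAT)                /* float textures */
        return size_bits == 16 ? GL_HALF_FLOAT : GL_FLOAT;
    if (type == GL_UNSIGNED_NORMALIZED)  /* e.g. GL_RGBA8, GL_RGBA16 */
        return size_bits <= 8 ? GL_UNSIGNED_BYTE : GL_UNSIGNED_SHORT;
    return type;                         /* integer types come back as-is */
}
```

With an 8-bit normalized component this yields GL_UNSIGNED_BYTE, which is the value the queries above refuse to report directly.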

The pixel data format doesn’t seem to have an appropriate symbolic name to pass to glGetTexLevelParameter*(), unless I’ve missed something.

What you want is GL_TEXTURE_INTERNAL_FORMAT. This will return the internal format (the third parameter of glTexImage*) used by the texture.
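Note that this query returns the (possibly sized) internal format, not a transfer format/type pair, so a read-back call still needs one derived from it. A minimal sketch, assuming only a couple of common formats; `base_format_of` is a hypothetical helper, and the enum values are the standard ones from `<GL/gl.h>`:

```c
/* Standard enum values copied from <GL/gl.h> so this compiles standalone. */
#define GL_RGB    0x1907
#define GL_RGBA   0x1908
#define GL_RGB8   0x8051
#define GL_RGBA8  0x8058

/* Hypothetical helper: map a few common internal formats (as returned
   by the GL_TEXTURE_INTERNAL_FORMAT query) to a matching base format
   usable as the transfer format of glGetTexImage. */
static unsigned base_format_of(unsigned internal_format)
{
    switch (internal_format) {
    case GL_RGB:  case GL_RGB8:  return GL_RGB;
    case GL_RGBA: case GL_RGBA8: return GL_RGBA;
    default:                     return 0; /* not handled in this sketch */
    }
}
```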

Nope! I need the 7th and 8th parameters of glTexImage*(). I’m already reading the internal format, but it doesn’t contain the pixel format I need.
For example, internal format is GL_RGB, but pixel format is GL_BGR_EXT/GL_UNSIGNED_BYTE.

(note: the forum is acting up, so I can’t use quote blocks.)

“but pixel format is GL_BGR_EXT/GL_UNSIGNED_BYTE.”

That is the pixel transfer format. It describes the format of the data that you gave glTexImage*. All this does is tell OpenGL how to read your data. This information is not associated with the texture object in any permanent way, nor does it affect the internal storage of the data.

Or, to put it another way, the texture object neither knows nor cares that you happened to upload the texture with GL_BGR rather than GL_RGB.
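To make the point concrete: with GL_BGR input destined for a GL_RGB internal format, the pixel transfer path simply reorders the components during upload, so the stored texel is identical to one uploaded as GL_RGB. This is an illustration of that conversion, not the driver’s actual code:

```c
/* Illustration only: what the upload path conceptually does with
   GL_BGR data going into a GL_RGB texture. After this reordering,
   the texture cannot tell which transfer format was used. */
static void bgr_to_rgb(unsigned char *pixels, int texel_count)
{
    for (int i = 0; i < texel_count; ++i) {
        unsigned char tmp = pixels[3 * i];      /* save B */
        pixels[3 * i]     = pixels[3 * i + 2];  /* R into slot 0 */
        pixels[3 * i + 2] = tmp;                /* B into slot 2 */
    }
}
```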

Clear! Thanks!
Texturing functions actually convert data to/from the specified format, and the internal format is not affected by that. I had totally forgotten that. Thank you for the clarification!