On the one hand, the DDS format supports 16-bit pixel layouts such as 1:5:5:5 (A1R5G5B5), 0:5:5:5 (X1R5G5B5), 5:6:5 (R5G6B5), and 4:4:4:4 (A4R4G4B4);
on the other hand, OpenGL offers corresponding internal formats like GL_RGB5_A1, GL_RGB5, and GL_RGBA4.
So why can't we upload, say, 16-bit 1:5:5:5 data directly into a GL_RGB5_A1 texture? Instead we have to expand the data to GL_RGBA made of GL_UNSIGNED_BYTEs, which the driver then converts back down to the internal GL_RGB5_A1 format.
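To make the complaint concrete, here is a sketch of the CPU-side expansion step being described: unpacking one D3D-style A1R5G5B5 texel into four GL_UNSIGNED_BYTE components (the function name and bit-replication scheme are illustrative, not taken from any particular loader).

```c
#include <stdint.h>

/* Expand one A1R5G5B5 texel (1:5:5:5) to four RGBA8 bytes.
   The high bits of each 5-bit channel are replicated into the low
   bits so that 0x1F maps to 0xFF rather than 0xF8. */
static void a1r5g5b5_to_rgba8(uint16_t src, uint8_t out[4])
{
    uint8_t r = (src >> 10) & 0x1F;
    uint8_t g = (src >> 5)  & 0x1F;
    uint8_t b =  src        & 0x1F;

    out[0] = (uint8_t)((r << 3) | (r >> 2));  /* 5 bits -> 8 bits */
    out[1] = (uint8_t)((g << 3) | (g >> 2));
    out[2] = (uint8_t)((b << 3) | (b >> 2));
    out[3] = (src & 0x8000) ? 0xFF : 0x00;    /* 1-bit alpha */
}
```

All of this per-texel work (and the driver's conversion back to 16 bits) is what the question hopes to avoid by handing the packed 16-bit data to OpenGL as-is.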