Failed usage of GL_COMPRESSED_RGB

I ran a simple test: I replaced GL_RGB with GL_COMPRESSED_RGB and GL_RGBA8 with GL_COMPRESSED_RGBA (OpenGL 4.2).

The result was that GPU memory consumption increased! I know the result will not be the same on all targets, but I didn’t expect that. Using GL_RGB5, however, improved GPU memory use as well as rendering speed.

Are there any guidelines (other than testing on each target and choosing based on the results)?

Edit: mental arithmetic is an unreliable method. I was wrong; it was an improvement in both cases. Any views on guidelines are still welcome, though.

Well, first I would try specifying an actual specific compressed internal format, such as GL_COMPRESSED_RGBA_S3TC_DXT5_EXT. Otherwise the implementation is free to choose an arbitrary compressed format, or even a non-compressed one.
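To illustrate, here is a sketch of that (assuming a current GL context with GL_EXT_texture_compression_s3tc support; the texture dimensions and pixel data are placeholders). The query afterwards tells you whether the driver actually honored the compression request:

```c
#include <GL/gl.h>
#include <GL/glext.h>

/* Request a specific compressed format instead of the generic
 * GL_COMPRESSED_RGBA, then ask the driver what it really stored. */
void upload_compressed(const void *pixels, GLsizei w, GLsizei h)
{
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RGBA_S3TC_DXT5_EXT,
                 w, h, 0, GL_RGBA, GL_UNSIGNED_BYTE, pixels);

    GLint compressed = GL_FALSE, internal = 0;
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0,
                             GL_TEXTURE_COMPRESSED, &compressed);
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0,
                             GL_TEXTURE_INTERNAL_FORMAT, &internal);
    /* compressed == GL_TRUE and internal == GL_COMPRESSED_RGBA_S3TC_DXT5_EXT
     * only if the driver honored the request. */
}
```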

Well that makes total sense. :slight_smile:

Do you know if the Catalyst does that? Maybe Ilian can give some info on the NVIDIA side.

In which particular case could that happen?

I wasn’t talking about any particular implementation, just about the fact that the spec allows this (see GL_ARB_texture_compression).

In general, the rule is to use an explicit internal format (a specific compressed internal format for compressed textures or a sized internal format for uncompressed textures).
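For the uncompressed side of that rule, the difference is just one argument (a fragment, assuming `w`, `h`, and `data` are defined; the same caveat applies: with the unsized form the driver picks the real storage format for you):

```c
/* Unsized: the driver chooses the actual storage format. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA,  w, h, 0, GL_RGBA, GL_UNSIGNED_BYTE, data);
/* Sized: explicitly request 8 bits per channel. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0, GL_RGBA, GL_UNSIGNED_BYTE, data);
```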