DXT compressed textures

I am having some trouble loading compressed textures. I am following the source code which can be found here: http://www.codesampler.com/oglsrc/oglsrc_4.htm#ogl_dds_texture_loader

for( int i = 0; i < bitmaps->mipmap; ++i )
{
    // Clamp so the smallest mip levels are never reported as 0x0.
    if( nWidth  == 0 ) nWidth  = 1;
    if( nHeight == 0 ) nHeight = 1;

    // DXT stores 4x4 texel blocks of nBlockSize bytes each.
    nSize = ((nWidth + 3) / 4) * ((nHeight + 3) / 4) * nBlockSize;

    ((PFNGLCOMPRESSEDTEXIMAGE2DARBPROC)wglGetProcAddress("glCompressedTexImage2D"))
        (GL_TEXTURE_2D, i, GL_COMPRESSED_RGBA_S3TC_DXT1_EXT, nWidth, nHeight,
         0, nSize, (unsigned char*)data + nOffset);

    nOffset += nSize;

    nWidth  = nWidth  / 2;
    nHeight = nHeight / 2;
}

The first time it runs, glGetError returns 1280 (invalid enum); then, for the rest of the mipmaps, it returns 0.

Are you sure that this is the line that causes the error? Are you sure it isn’t a previous line that caused it?

Did you check that your GL driver actually supports GL_ARB_texture_compression?

And further that it supports EXT_texture_compression_s3tc.

To Alfonse’s comment, make sure that no GL error is present before you make this call. If there is, all bets are off – go fix that first.

Do you have a texture created and bound before you make this call?
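For example, a minimal sketch (g_textureID is just an illustrative name, not from the loader above):

GLuint g_textureID = 0;

// Drain any error left over from earlier calls, so a later glGetError
// reports this upload and not something older.
while( glGetError() != GL_NO_ERROR )
    ;

glGenTextures( 1, &g_textureID );
glBindTexture( GL_TEXTURE_2D, g_textureID );
// ... glTexParameter* and the glCompressedTexImage2D loop go here ...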

glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_NEAREST);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

It seems that was the problem: when I changed GL_LINEAR_MIPMAP_NEAREST to GL_LINEAR, the texture showed up, so that appears to have been the cause of the error. I don’t see why it would be; if someone could explain it for me, that would be helpful.

If you are using GL_LINEAR_MIPMAP_NEAREST, then you are using mipmapping, and you need to load all mipmap levels from GL_TEXTURE_BASE_LEVEL (initial value 0) to GL_TEXTURE_MAX_LEVEL (initial value 1000), or until you reach a 1x1 pixel image; otherwise your texture won’t be mipmap complete and won’t display correctly. Changing the filter to GL_LINEAR disables mipmapping, so only level 0 is used, and it no longer matters whether the texture is mipmap complete.
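To make “down to 1x1” concrete, here is a small sketch (a hypothetical helper, not part of the loader above) that counts the levels in a full chain; a 256x256 base image, for example, needs 9 levels:

int fullMipCount( int w, int h )
{
    // Count levels until both dimensions have shrunk to 1, halving and
    // clamping at 1 each step, just as the upload loop does.
    int levels = 1;
    while( w > 1 || h > 1 )
    {
        w = ( w > 1 ) ? w / 2 : 1;
        h = ( h > 1 ) ? h / 2 : 1;
        ++levels;
    }
    return levels;   // 256x256 -> 9 levels: 256, 128, 64, 32, 16, 8, 4, 2, 1
}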

If your texture file doesn’t include all the mipmap images down to 1x1, you will either need to generate these missing levels somehow, or call:

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAX_LEVEL, bitmaps->mipmap - 1); // not sure what bitmaps->mipmap represents, but it might need special handling if (bitmaps->mipmap == 0) is allowed
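For example, a sketch of how that might look, assuming bitmaps->mipmap is the number of mipmap levels stored in the file (that interpretation is a guess, as noted above):

// Guard the 0 case mentioned above, then restrict sampling to the levels
// that were actually uploaded.
int levelCount = ( bitmaps->mipmap > 0 ) ? bitmaps->mipmap : 1;
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_BASE_LEVEL, 0 );
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAX_LEVEL, levelCount - 1 );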

Did you check for errors before the function was called? If your glCompressedTexImage2D call was succeeding, which it must have been if the texture is visible, then the error is otherwise unexplained.

ps. I wouldn’t call wglGetProcAddress("glCompressedTexImage2D") every single time you want the glCompressedTexImage2D entry point; call it once and store the value somewhere. It probably won’t have much effect when it comes to loading texture data, but there’s no point performing unnecessary work.
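For instance, a sketch of caching the entry point once (pglCompressedTexImage2D and pData are just illustrative names):

// Fetch the entry point once, then reuse the stored pointer.
static PFNGLCOMPRESSEDTEXIMAGE2DARBPROC pglCompressedTexImage2D = NULL;

if( !pglCompressedTexImage2D )
    pglCompressedTexImage2D = (PFNGLCOMPRESSEDTEXIMAGE2DARBPROC)
        wglGetProcAddress( "glCompressedTexImage2D" );

// later, inside the mipmap loop:
pglCompressedTexImage2D( GL_TEXTURE_2D, i, GL_COMPRESSED_RGBA_S3TC_DXT1_EXT,
                         nWidth, nHeight, 0, nSize, pData );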

Adding that did the trick.

Also, I know my card and drivers support the compression, but I couldn’t find the function to check whether the card and driver support it.

glGetString( GL_EXTENSIONS );

Would that be the function to use, and then search the result for a string such as “ARB_texture_compression”?

Yes, but see also http://www.opengl.org/wiki/GlGetString , because there are some situations you have to be careful of, such as incorrectly finding an extension because its name happens to be part of another extension’s name:

eg: searching for “GL_ARB_fragment_program” might match inside “GL_ARB_fragment_program_shadow”, or “GL_EXT_swap_control” might match inside “WGL_EXT_swap_control”. You have to check that the start of the sub-string you find either starts the whole extension string or is preceded by a space, and that the end of the sub-string is either the end of the extension string or is followed by a space. With OpenGL 3.1+ it is recommended to use glGetStringi instead of glGetString.
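For illustration, a sketch of that kind of boundary-safe search (hasExtension is a hypothetical helper, not a standard function):

#include <string.h>

// Accept a match only when it is bounded by spaces or by the start/end of
// the extension string, to avoid false positives on longer names.
bool hasExtension( const char* extList, const char* name )
{
    if( !extList || !name || !*name )
        return false;

    const size_t nameLen = strlen( name );
    const char*  pos     = extList;

    while( ( pos = strstr( pos, name ) ) != NULL )
    {
        const bool startOk = ( pos == extList ) || ( pos[-1] == ' ' );
        const char after   = pos[nameLen];
        const bool endOk   = ( after == ' ' ) || ( after == '\0' );

        if( startOk && endOk )
            return true;

        pos += nameLen;
    }
    return false;
}

// usage:
// hasExtension( (const char*)glGetString( GL_EXTENSIONS ),
//               "GL_EXT_texture_compression_s3tc" );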

Or you could avoid all this and just use an extension loading library.

EXT_texture_compression_s3tc is almost guaranteed to be supported on your card; it’s ancient. Use GLEW to manage OpenGL extensions. You also have to make sure that everything is set up correctly before a texture is usable: all mipmap levels must be loaded correctly and texture filtering must be set correctly. You can have a look at the OpenGL Registry; it contains a list of all the extensions and explains the conditions for using each one.
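For example, a minimal sketch with GLEW (glewInit must run after a GL context has been created; the function name is just illustrative):

#include <GL/glew.h>

// Returns true when both the generic ARB compression mechanism and the
// S3TC formats are reported by the driver.
bool haveS3tcSupport()
{
    if( glewInit() != GLEW_OK )
        return false;

    return GLEW_ARB_texture_compression && GLEW_EXT_texture_compression_s3tc;
}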

It is not as widely available as one would like; depending on the platform you are on, there can be licensing concerns. It is ancient and widely supported on Windows, but it is patented, not freely licensed, and intentionally avoided by some vendors on some platforms.

It’s a shame really; the technology behind this texture compression was published as Color Cell Compression and Block Truncation Coding, and was even developed in parallel for texture compression at companies like SGI.

But some companies sidestep implementing it on all platforms because of S3 licensing fees / patent infringement.

The problem with the filter you selected is that mipmap filters are only applicable to minification; with magnification it makes no sense to request a mipmap filter.
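In code terms, a short sketch:

glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_NEAREST ); // valid: minification may use mipmaps
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );                // valid: magnification accepts only GL_NEAREST or GL_LINEAR
// Setting GL_TEXTURE_MAG_FILTER to GL_LINEAR_MIPMAP_NEAREST would raise GL_INVALID_ENUM (1280).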

What platforms have support for s3tc and what are the licensing fees / patent infringements?

Here’s a start:

http://feedback.wildfiregames.com/report/opengl/feature/GL_EXT_texture_compression_s3tc

“Does support” is the green table at the top; see the red table at the bottom for “does not support”.

So pretty much anything ATI, NVidia, or Intel on PC supports it.