Generic texture compression

Hello, from what I have read about the texture compression extension, it is possible to specify that the texture image should be compressed with GL_COMPRESSED_RGB_ARB when calling glTexImage2D, and the driver will then select the compression format that is best on that system.

When I test whether the texture has been compressed or not, the GL always reports that it has not been compressed.

This is the most relevant part of the code:

/*Creation of GL context */

const GLubyte *s;
int ntu;

s=glGetString(GL_EXTENSIONS);

/* The extensions string contains the compression extension */

SDL_Surface *textura;
unsigned int t_id;

textura=SDL_LoadBMP("garabato2.bmp");

glGenTextures(1,&t_id);
glBindTexture(GL_TEXTURE_2D,t_id);

glTexImage2D(GL_TEXTURE_2D,0,GL_COMPRESSED_RGB_ARB,textura->w,textura->h,0,GL_RGB,GL_UNSIGNED_BYTE,textura->pixels);

glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MIN_FILTER,GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MAG_FILTER,GL_NEAREST);

int tamanho;

glGetTexLevelParameteriv(GL_TEXTURE_2D,0,GL_TEXTURE_COMPRESSED_ARB,&tamanho);

/* Always returns 0: not compressed */

glGetTexLevelParameteriv(GL_TEXTURE_2D,0,GL_TEXTURE_COMPRESSED_IMAGE_SIZE_ARB,&tamanho);


/* Returns a large negative number */

glEnable(GL_TEXTURE_2D);

/* Rest of the program: drawing, etc... */

There's an excellent paper from NVidia about texture compression that describes the steps that have to be taken. I did this a few months ago, and I'm not sure your code is 100% correct (I only took a quick look, so I could be wrong). Even if it is, maybe your card does not support compressing the textures itself (but can still use precompressed textures). This happened to me when I tried to compress my textures on a GeForce 5200 in a laptop: while it could use precompressed textures, it couldn't compress them. Also, you do understand that you have to build an application that compresses the textures before you use them, don't you?
Furthermore, look up S3TC, which seems to be the prevalent method for compression.
Go here
to download the paper from NVidia.

Originally posted by dvm:
There's an excellent paper from NVidia about texture compression that describes the steps that have to be taken. I did this a few months ago, and I'm not sure your code is 100% correct (I only took a quick look, so I could be wrong). Even if it is, maybe your card does not support compressing the textures itself (but can still use precompressed textures). This happened to me when I tried to compress my textures on a GeForce 5200 in a laptop: while it could use precompressed textures, it couldn't compress them. Also, you do understand that you have to build an application that compresses the textures before you use them, don't you?
Furthermore, look up S3TC, which seems to be the prevalent method for compression.
Go here
to download the paper from NVidia.

Hello!

Thank you for the paper suggestion. That was precisely the document that first encouraged me to use texture compression, and I found it easier to read than the official extension specification. Coincidentally, I'm also programming on a laptop, but with an ATI card instead. The drivers I currently use expose the texture compression extension, but not the S3TC one. Maybe that is why compression fails (the drivers provide no concrete compressed format), but if so, it seems a bit silly to implement the former without any other extension that could make use of it.

Anyway, I'd like to point out that, according to nVidia's paper (page 2),

There are two different approaches to performing the compression of the texture bitmaps. The first method uses
OpenGL to effectively compress the textures.

My understanding here is that you can ask the GL to compress the texture for you.

Also from page 2 of the paper

The ARB_texture_compression extension allows an uncompressed texture to be
compressed on the fly through the glTexImage2D call by setting its <internalFormat>
parameter accordingly. This can be done in one of two ways: use a generic compressed
internal format from Table 1 or use an explicit internal format like one offered by the
S3TC extension listed in Table 2. Basically, the “compressed internal format” works just
like a texture with “base internal format” except that the data is compressed.

I think the meaning here is that you can tell GL that you want a texture to be compressed without actually specifying what kind of compression you want, only what kind of data is being compressed.
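If I read that right, the two options would look roughly like this (just a sketch of my understanding, reusing the textura surface from my code above; the S3TC token is only valid when GL_EXT_texture_compression_s3tc is advertised):

/* Generic: let the driver pick whatever compressed format it prefers */
glTexImage2D(GL_TEXTURE_2D,0,GL_COMPRESSED_RGB_ARB,textura->w,textura->h,0,GL_RGB,GL_UNSIGNED_BYTE,textura->pixels);

/* Explicit: request a concrete S3TC/DXT1 format, which requires the
   GL_EXT_texture_compression_s3tc extension to be present */
glTexImage2D(GL_TEXTURE_2D,0,GL_COMPRESSED_RGB_S3TC_DXT1_EXT,textura->w,textura->h,0,GL_RGB,GL_UNSIGNED_BYTE,textura->pixels);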

Then, if a generic compressed internal format was used, query OpenGL for the actual
<internalFormat> that has been automatically selected by OpenGL. To do so, one calls
glGetTexLevelParameteriv again with <pname> set to
GL_TEXTURE_INTERNAL_FORMAT.

From this I understand that GL can choose an appropriate method of compression. Maybe it is the lack of compressed formats on my machine that leaves GL with no way to compress the data. I don't know what the behaviour would be if I supplied already-compressed data, because I would still lack the extension for the format that was used to compress the image in the first place.
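In case it helps anyone else reading this, my understanding of that query is something like the following (only a sketch; formato is just a name I made up for the result):

GLint formato=0;

/* After the glTexImage2D call with GL_COMPRESSED_RGB_ARB, ask which
   internal format the driver actually picked for level 0 */
glGetTexLevelParameteriv(GL_TEXTURE_2D,0,GL_TEXTURE_INTERNAL_FORMAT,&formato);

/* If compression really happened, this should be a concrete compressed
   token (e.g. GL_COMPRESSED_RGB_S3TC_DXT1_EXT) rather than plain GL_RGB */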

I’ll test the same program with a driver that supports compressed textures and s3tc to see if generic compression from uncompressed data (that is, using GL_COMPRESSED_RGB_ARB as image internal format) is really supported.

Again, thanks for the tips.

I have just tested the same program above with the same machine but different drivers. These drivers support GL_EXT_texture_compression_s3tc in addition to GL_ARB_texture_compression.

The program ran successfully and the report said the texture was compressed. So my conclusion is that the first drivers were unable to compress the texture because they lacked another extension providing a specific compression format, even though you don't have to specify one to tell GL to compress the texture. In the latter case, the GL drivers choose the best compression format available.
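In case it is useful, this is roughly the sanity check I could add before relying on the generic format (just a quick sketch using strstr on the extensions string, so it needs string.h and stdio.h; hardly robust, but enough for testing, and the variable names are mine):

const char *ext=(const char *)glGetString(GL_EXTENSIONS);

int tiene_arb = strstr(ext,"GL_ARB_texture_compression")!=NULL;
int tiene_s3tc = strstr(ext,"GL_EXT_texture_compression_s3tc")!=NULL;

/* GL_COMPRESSED_RGB_ARB only seems to result in real compression when the
   driver also exposes at least one concrete compressed format (S3TC here) */
if(!tiene_arb || !tiene_s3tc)
    printf("No usable compressed format: expect the texture to stay uncompressed\n");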

That seems pertinent.

As for the compression format, I'd suggest you try it for yourself. You'll see differences in texture size, quality, etc., and then you can decide for yourself. Maybe they do the compression in software, or the new drivers have enabled the extension; I don't know. In my case, I had created a separate program to create the compressed textures in my own format (image size, bpp, etc.) and then dumped the whole thing to disk. When I tried to use this program on the aforementioned laptop, though it gave me the same enumeration of compression formats, it failed to compress the texture. From what I've read in the paper, it would be wise to do the compression once (maybe on the first run of the application) and use the compressed textures from there on. Good luck!
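For anyone finding this thread later, the compress-once idea could look roughly like this (just a sketch of the approach, not my actual tool; it assumes the ARB extension and its glGetCompressedTexImageARB / glCompressedTexImage2DARB entry points are available, and it needs stdlib.h for malloc):

/* First run: after glTexImage2D with GL_COMPRESSED_RGB_ARB has succeeded,
   read the compressed bits back and dump them to disk in your own format */
GLint comprimida=0,formato=0,tam=0;

glGetTexLevelParameteriv(GL_TEXTURE_2D,0,GL_TEXTURE_COMPRESSED_ARB,&comprimida);
if(comprimida)
{
    glGetTexLevelParameteriv(GL_TEXTURE_2D,0,GL_TEXTURE_INTERNAL_FORMAT,&formato);
    glGetTexLevelParameteriv(GL_TEXTURE_2D,0,GL_TEXTURE_COMPRESSED_IMAGE_SIZE_ARB,&tam);

    void *datos=malloc(tam);
    glGetCompressedTexImageARB(GL_TEXTURE_2D,0,datos);
    /* ...write formato, the image width/height and the tam bytes of datos to a file... */
    free(datos);
}

/* Later runs: read those values back from the file and upload directly,
   so no recompression is needed at load time:
   glCompressedTexImage2DARB(GL_TEXTURE_2D,0,formato,w,h,0,tam,datos); */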