On compressed textures again: s3tc3 vs s3tc5

I would like to ask someone to tell me the difference between the various s3tc texture formats, since I’m missing something about them.

Here’s what I understood:
s3tc1: designed for RGB data; the best of all the formats for that. Don’t use it if you have alpha, since it will go black.
s3tc3: compresses color like s3tc1 but adds an uncompressed alpha channel.
s3tc5: compresses color like s3tc1 and adds a compressed alpha channel.

Sure, if I’ve got RGB I’ll go for s3tc1.
If I’ve got RGBA, I have two choices: compressed or uncompressed alpha.
Now, I have a paper which says that both s3tc3 and s3tc5 take the same amount of memory, which somewhat confuses me… I’d guess that when the space taken is the same, the compressed format works better, so I would always go for s3tc5. However, this doesn’t make sense: if s3tc5 were always better, there would be no s3tc3 (unless s3tc5 simply came later, thanks to some technological advance).

So, is s3tc5 always better than s3tc3?

DXT1 allows for 1-bit alpha (completely transparent or completely opaque).

DXT3 uses uncompressed 4-bit alpha. That’s only 16 different levels of transparency, but if that’s OK for your image, it’ll be accurate.

DXT5 uses interpolation between two 8-bit alpha values. This gives you more accuracy across the image, but doesn’t deal well with large alpha differences within a small area.

Basically, DXT5 is the most difficult to compress, the most general, and the most accurate for all but the most specific of images. If you have one of those special cases, though, one of the other formats will probably be a better choice.
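To clear up the memory question from above: S3TC compresses independent 4×4 texel blocks. DXT1 spends 8 bytes per block (4 bits per texel), while DXT3 and DXT5 both spend 16 bytes per block (the same 8 bytes of DXT1-style color data plus 8 bytes of alpha), which is why the two alpha formats always take exactly the same amount of memory. A quick sketch of the size arithmetic in C (the helper name is mine, not part of any API):

    #include <stddef.h>

    /* Bytes needed for one mip level: 8 bytes/block for DXT1,
       16 bytes/block for DXT3 and DXT5. */
    size_t s3tc_size(int w, int h, int bytes_per_block)
    {
        int blocks_x = (w + 3) / 4;   /* round up to whole 4x4 blocks */
        int blocks_y = (h + 3) / 4;
        return (size_t)blocks_x * blocks_y * bytes_per_block;
    }

So a 256×256 texture costs 32 KB as DXT1 and 64 KB as either DXT3 or DXT5, against 256 KB as uncompressed RGBA8.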

Ok, so dxt5 will look better in >90% of the cases. Thank you, a very accurate answer!

Yeah, thanks, I didn’t know the difference either.

You can look at this program, which shows the differences with a nice picture: http://opengl.nutty.org/extensions/s3tc.zip

How do I use these formats anyway? I only know of GL_COMPRESSED_RGBA_ARB as the third parameter of glTexImage2D(…).

Jan

Doesn’t DXT1 get decompressed to 16-bit color on NVidia cards from the Geforce 4 and earlier?
http://udn.epicgames.com/pub/Content/TextureComparison/

I know that ATI cards, at least as far back as the Radeon 8500, keep the texture at 32-bit, but I’m not sure about the GeforceFX.

Originally posted by JanHH:
How do I use these formats anyway? I only know of GL_COMPRESSED_RGBA_ARB as the third parameter of glTexImage2D(…).

Jan

Yeah, I would like to know that too!

How do I use these formats anyway? I only know of GL_COMPRESSED_RGBA_ARB as the third parameter of glTexImage2D(…).
Jan

As far as I know, you just need to pass GL_COMPRESSED_RGB_S3TC_DXTn_EXT as the internal format; otherwise it’s the same call.
Little note: s3tc does not support texture borders. Who needs those anyway?
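For what it’s worth, here’s a minimal sketch of the whole upload path, assuming a w×h RGBA8 image in pixels and a driver that exposes EXT_texture_compression_s3tc (untested; the function name is my own):

    #include <GL/gl.h>
    #include <GL/glext.h>

    GLuint upload_dxt5(const GLubyte *pixels, GLsizei w, GLsizei h)
    {
        GLuint tex;
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);

        /* Same call as with GL_COMPRESSED_RGBA_ARB; only the internal format
           now names the exact S3TC variant. The driver compresses on upload. */
        glTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RGBA_S3TC_DXT5_EXT,
                     w, h, 0, GL_RGBA, GL_UNSIGNED_BYTE, pixels);

        /* Optional sanity check from ARB_texture_compression. */
        GLint compressed = 0;
        glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_COMPRESSED_ARB,
                                 &compressed);
        return tex;
    }

If you already have pre-compressed DXT data (from a .dds file, say), glCompressedTexImage2DARB from ARB_texture_compression takes it directly instead.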

Doesn’t DXT1 get decompressed to 16-bit color on NVidia cards from the Geforce 4 and earlier?

DXTC works OK on the Geforce3 and better; they greatly improved it. Not sure about the Geforce4 MX, however. I always thought DXTC was bad on NV1x and good enough on NV2x. I have a Geforce2 and a Geforce4, and I can tell compressed textures look much better on the NV25. I don’t think they can improve it much further, given DXTC’s limitations.

So are “texture compression” and “S3 texture compression” (it was a big thing in game magazines when it was new) the same thing, or aren’t they?

Not exactly the same.
S3TC is one specific algorithm for doing texture compression. Other algorithms may be used.

When I think of texture compression, I think of “making the texture’s footprint smaller”.
When I think of S3TC, I think of “a lossy algorithm for compressing textures”.
Yes, it’s almost the same thing.
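As an aside, if you want to see which compressed formats your driver offers beyond S3TC, you can enumerate them. A rough sketch, assuming GL 1.3 or ARB_texture_compression (the function name is my own):

    #include <GL/gl.h>
    #include <GL/glext.h>
    #include <stdio.h>
    #include <stdlib.h>

    void list_compressed_formats(void)
    {
        GLint n = 0;
        glGetIntegerv(GL_NUM_COMPRESSED_TEXTURE_FORMATS, &n);
        if (n <= 0)
            return;

        GLint *formats = malloc(n * sizeof(GLint));
        glGetIntegerv(GL_COMPRESSED_TEXTURE_FORMATS, formats);

        /* Prints raw enum values; compare them against the tokens in glext.h. */
        for (GLint i = 0; i < n; ++i)
            printf("compressed format: 0x%04X\n", (unsigned)formats[i]);
        free(formats);
    }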

So, what should you do now if you want your textures to be compressed

a) as much as possible
b) with keeping the quality as good as possible
c) with a good trade-off between these two?

Jan