about s3tc and texture compression

Greetings, fellow developers. I’m trying to find information about the s3ct algorithm that OpenGL uses to compress textures. It’s still not clear to me whether s3ct is a company or an algorithm name. I’ve searched Google but haven’t had any luck so far. Is there an official site where I can get more information?
Also, if someone happens to know: how come ATI cards seem to support only s3ct and not the default compressed modes (e.g. GL_COMPRESSED_RGBA_ARB)? Is it a matter of popularity? I have no clue about NVIDIA cards, as I haven’t tested my app on them.
Also, although my program works correctly on a desktop ATI 9600, I couldn’t compress a texture on a laptop with an ATI 9700. I could, however, read the file back. Is this because the manufacturer is trying to save space and assumes the end user won’t need to compress textures himself on a laptop?
Sorry for making a lot of questions and thanks for taking the time to read this.

S3TC stands for “S3 Texture Compression”. S3 is a company; S3TC is the name of this compression method.

I don’t know about “default” compression modes. AFAIK those generic tokens are just a framework to allow compressed textures in general; S3TC is the only concrete compression method the card actually supports.

Jan.

Well, no wonder I couldn’t get any results on Google, since I searched for s3ct :frowning:
Thanks for clearing up the acronym!

This compression scheme also goes by the name DXTC (DXT1–DXT5) and is used in Direct3D as well. DDS image files can store images in this compressed format, so you might want to take a look at the DDS file format. And - shameless plug - I have written a little DDS viewer (it can view/save DDS files) which you can get (for free, with source) at http://www.amnoid.de/ddsview/ (the DirectX SDK includes a DDS tool too) :stuck_out_tongue:

So the texture is actually compressed in video memory? And this comes without a performance cost? Or does it just save memory?

Or is the texture just decompressed on the hardware side? And if so, all at once or just as needed?

Read this:

http://developer.nvidia.com/attach/6585

Most compression methods in use are variations on the S3TC compression scheme.

S3TC is itself an interpolated color cell compression system (the original color cell compression was a simple lookup with no interpolation). Each block stores two explicit color values; additional colors are implicitly interpolated between them, and each pixel selects from the resulting palette with a 2-bit index. This is at the core of most of these schemes.

Before this hit the PC scene, SGI had internal prior art designed into hardware that was never released, doing exactly the same thing. It’s still used in software form in their Vizserver product, where they call it “interpolated color cell”.

Variations on the S3 encoding exist, for example to support a separate black value while allowing multiple additional colors, or to support multiple types of cell encoding (ATI’s system).

S3TC, or variations on it, is explicitly specified in the relevant extension documents and is either used under license or worked around by deviating from the S3 I.P. The calls in the compression extensions are a fairly generic mechanism: the chosen compression method is indicated by the tokens available to, and used by, the application. The tokens make all the difference, but the API doesn’t change; it’s also fairly obvious which calls carry data that would change and which need no change at all. Whichever you use, make sure you have hardware support for the compression format you specify.

Thanks for the help, everybody. Now that I’m searching for the correct keyword I’ve found quite a few articles, some even describing the algorithm itself.
To michagl: yes, the texture resides in texture memory on your GPU in compressed form the whole time; it gets decompressed only when needed. By design, decompression can be done quite fast, and since it’s done in hardware… you get the point :wink:
To dorbie: nice to know about these variations, I had no idea. As for the NVIDIA paper, I had found it some time ago, and it’s what I based my code for compressing and loading textures on. The paper describes it pretty clearly.
Right now I’m enumerating the device to see which compression formats are supported (only RGB-S3TC and RGBA-S3TC-DXT1/3/5). I don’t know of a modern card that doesn’t support them, as they seem like pretty basic stuff, at least for decompression. But I’ll be sure to keep a set of uncompressed textures around, though my fps will probably drop dramatically. Thanks again, everybody!

Can somebody here tell me whether modern cards all support 3D texture compression, i.e. compression of volume textures?