ATI's normal map compression

So, the other day I finally got my X800XT PE, and while looking at the extension list I happened to spot GL_ATI_texture_compression_3dc; GLInfo2 also lists an ‘unknown’ texture compression format.

Now, this is all very nice, but I can’t find any information on the extension, such as what token to use to upload the texture, or any other notes.

So, is there an extension spec I’m missing? (Google doesn’t give me much, nor does a look around the ATI site.)

This is what you need:

#define GL_COMPRESSED_LUMINANCE_ALPHA_3DC_ATI 0x8837

It’s built on GL_ARB_texture_compression and works in every way like, for instance, GL_EXT_texture_compression_s3tc. You can export 3Dc textures with Compressonator.
http://www.ati.com/developer/compressonator.html
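
For reference, uploading a precompressed 3Dc texture goes through the standard GL_ARB_texture_compression entry point. A minimal sketch, assuming the glCompressedTexImage2DARB pointer has already been fetched through your extension loader; upload3DcLevel is just an illustrative helper name (3Dc packs two BC4-style channels at 16 bytes per 4x4 texel block):

#define GL_COMPRESSED_LUMINANCE_ALPHA_3DC_ATI 0x8837 /* from above */

/* Upload one mip level of precompressed 3Dc data.
   3Dc stores 16 bytes per 4x4 texel block. */
void upload3DcLevel(GLint level, GLsizei w, GLsizei h, const void *blocks)
{
    GLsizei size = ((w + 3) / 4) * ((h + 3) / 4) * 16;
    glCompressedTexImage2DARB(GL_TEXTURE_2D, level,
                              GL_COMPRESSED_LUMINANCE_ALPHA_3DC_ATI,
                              w, h, 0, size, blocks);
}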

There’s also a 3Dc sample in our SDK.
http://www.ati.com/developer/radeonSDK.html
(There should be a new one there very soon; this one is from last July, but it has the sample as well.)

ah, thanks :slight_smile:

One other question, how does this extension interact with GLSL, if it does at all?

Just like any other texture format, compressed or not. As you can see, it’s a two-component luminance+alpha format, so you have to compute z yourself in the shader.
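
In GLSL that boils down to fetching the two stored components and reconstructing z from the unit-length constraint. A minimal sketch, with the shader source as a C string; bumpMap and fetchNormal are placeholder names, and reading .ga matches the green+alpha convention mentioned below (for luminance+alpha, luminance replicates into g):

const char *unpack3DcNormal =
    "uniform sampler2D bumpMap;\n"
    "vec3 fetchNormal(vec2 texCoord)\n"
    "{\n"
    "    vec3 n;\n"
    "    /* x from luminance (replicated into g), y from alpha */\n"
    "    n.xy = texture2D(bumpMap, texCoord).ga * 2.0 - 1.0;\n"
    "    /* reconstruct z from x*x + y*y + z*z = 1 */\n"
    "    n.z = sqrt(max(0.0, 1.0 - dot(n.xy, n.xy)));\n"
    "    return n;\n"
    "}\n";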

We are reading x and y from green and alpha, meaning that our shader is also fully compatible with xRBG-shuffled DXT5 textures (with a blue weight of 0) if 3Dc is not available.

The formats are converted with the Compressonator command line like this (its documentation is pretty limited…):

TheCompressonator -MipMaps -convert file.tga file.3dc ATI2
TheCompressonator -MipMaps -convert file.tga file.dds xRBG +blue 0.0

ah, cheers

The reason I asked is that I’m sure I read something about an extra instruction in the DX shaders for doing that automatically, so I was kinda wondering if there was going to be some kind of GLSL extension to do the same trick (of course, if it’s not an automatic thing in hardware this wouldn’t make any sense beyond doing it the way you outlined above…).

Originally posted by PsychoLns:
We are reading x and y from green and alpha, meaning that our shader is also fully compatible with xRBG-shuffled DXT5 textures (with a blue weight of 0) if 3Dc is not available.
Another option (if you don’t want to include both 3Dc and DXT5 textures) is simply decoding the 3Dc texture into GL_LUMINANCE8_ALPHA8 for cards that don’t support 3Dc. It is pretty straightforward and 100% equivalent in all cases, even when 3Dc is used for purposes other than normal maps.
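
A sketch of that decode, assuming each 16-byte 3Dc block is two 8-byte channel blocks with the same layout as the DXT5 alpha block; the x-before-y channel order is an assumption (verify against your exporter), and decode3Dc / decodeChannelBlock are illustrative names:

#include <stdint.h>

/* Decode one 8-byte BC4-style channel block into out[16] (row-major 4x4).
   Palette rules are the same as the DXT5 alpha block. */
static void decodeChannelBlock(const uint8_t *b, uint8_t out[16])
{
    uint8_t p[8];
    uint8_t c0 = b[0], c1 = b[1];
    uint64_t bits = 0;
    int i;
    p[0] = c0;
    p[1] = c1;
    if (c0 > c1) {
        for (i = 2; i < 8; i++)
            p[i] = (uint8_t)(((8 - i) * c0 + (i - 1) * c1) / 7);
    } else {
        for (i = 2; i < 6; i++)
            p[i] = (uint8_t)(((6 - i) * c0 + (i - 1) * c1) / 5);
        p[6] = 0;
        p[7] = 255;
    }
    /* 16 3-bit indices packed little-endian in bytes 2..7 */
    for (i = 0; i < 6; i++)
        bits |= (uint64_t)b[2 + i] << (8 * i);
    for (i = 0; i < 16; i++)
        out[i] = p[(bits >> (3 * i)) & 7];
}

/* Decode a w x h 3Dc image into interleaved luminance+alpha pairs,
   suitable for glTexImage2D(..., GL_LUMINANCE8_ALPHA8, ...,
   GL_LUMINANCE_ALPHA, GL_UNSIGNED_BYTE, dst). */
void decode3Dc(const uint8_t *src, uint8_t *dst, int w, int h)
{
    int bx, by, x, y;
    for (by = 0; by < h; by += 4) {
        for (bx = 0; bx < w; bx += 4, src += 16) {
            uint8_t xs[16], ys[16];
            /* Assumed channel order: x first, then y. */
            decodeChannelBlock(src, xs);
            decodeChannelBlock(src + 8, ys);
            for (y = 0; y < 4 && by + y < h; y++) {
                for (x = 0; x < 4 && bx + x < w; x++) {
                    uint8_t *d = dst + 2 * ((by + y) * w + bx + x);
                    d[0] = xs[4 * y + x]; /* luminance = x */
                    d[1] = ys[4 * y + x]; /* alpha     = y */
                }
            }
        }
    }
}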

Originally posted by bobvodka:
ah, cheers

The reason I asked is that I’m sure I read something about an extra instruction in the DX shaders for doing that automatically, so I was kinda wondering if there was going to be some kind of GLSL extension to do the same trick (of course, if it’s not an automatic thing in hardware this wouldn’t make any sense beyond doing it the way you outlined above…).
There’s no magic instruction to compute the last component, either in GL or DX. But if you’re already normalizing the bump map normal, you can consider 3Dc essentially free, as it takes three instructions to compute the last component, which is exactly the same cost as a normalize.
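
To make that cost argument concrete, here are the two alternatives (GLSL fragments as C strings; instruction counts are approximate and compiler/hardware dependent):

/* reconstruction: dot + subtract + sqrt, roughly three instructions */
const char *reconstructZ = "n.z = sqrt(max(0.0, 1.0 - dot(n.xy, n.xy)));";

/* renormalization: dot + inverse sqrt + multiply, also roughly three */
const char *renormalize = "n = normalize(n);";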

hmmm, I must have dreamt it or something, heh, cheers for clearing it up :slight_smile:

You are probably thinking of the (uncompressed) NVIDIA-specific two-component (in memory) texture format CxV8U8, which reads as a normalized three-component format and is faster (on GeForce FX and below) than doing the calculation in the shader.

So does the texture get compressed for you when it’s uploaded, like S3TC? Or do you have to compress the texture to a file with The Compressonator (or the ATI_compress library) and then load it into the card already compressed?

Well, you could try and see if it works; I actually don’t know if the driver will compress it for you. Either way, it’s of course preferable to store a precompressed texture, since it’s faster to upload and will likely give better quality anyway.
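
One way to “try and see” is the standard GL_ARB_texture_compression query: upload raw luminance+alpha pixels with the compressed internal format, then ask whether the driver actually compressed the level. A sketch, assuming the token define from above (tryDriverSide3Dc is an illustrative name):

#ifndef GL_TEXTURE_COMPRESSED_ARB
#define GL_TEXTURE_COMPRESSED_ARB 0x86A1
#endif

/* Ask the driver to compress raw two-component data on upload,
   then check whether it actually did. */
int tryDriverSide3Dc(const void *pixels, int w, int h)
{
    GLint compressed = GL_FALSE;
    glTexImage2D(GL_TEXTURE_2D, 0,
                 GL_COMPRESSED_LUMINANCE_ALPHA_3DC_ATI,
                 w, h, 0, GL_LUMINANCE_ALPHA, GL_UNSIGNED_BYTE, pixels);
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0,
                             GL_TEXTURE_COMPRESSED_ARB, &compressed);
    return compressed == GL_TRUE;
}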