one/two channel float rendering on ATI R300 or better

I just tried the GL_ATI_r_rg extension last night, and it works on both my Radeon 9800 Pro (R360 core) and my Radeon X1600 Pro. It allows one-channel (R) and two-channel (RG) floating-point (16-bit and 32-bit) textures and render targets. I have only tried it with FBOs; it may also work with P-buffers.

The GL_ATI_r_rg extension has no spec (that I know of), but the tokens are available in glATI.h, which can be found in various places. Here are the tokens:

#define GL_R_FLOAT32_ATI 0x8838
#define GL_RG_FLOAT32_ATI 0x8839
#define GL_R_FLOAT16_ATI 0x883A
#define GL_RG_FLOAT16_ATI 0x883B
#define GL_RG_ATI 0x883C

Also, this extension is most likely not advertised in the extension string. If the tokens are accepted (no errors are generated) when you specify the texture image, it will probably work. As I wrote above, it works on my 9800 Pro and my X1600 Pro.

GL_LUMINANCE appears to work as the format parameter when specifying texture images with the GL_R_FLOATxx_ATI internal formats. GL_RG_ATI is used as the format for the GL_RG_FLOATxx_ATI internal formats.
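
For reference, here is a minimal sketch of how I would expect the texture specification and the error check to look, based on the above. The 64x64 size, the NEAREST filtering, and the GL_FLOAT data type are just my assumptions (I'm not posting my exact code), and the usual GL headers plus glATI.h are assumed to be included:

GLuint tex[2];
glGenTextures(2, tex);

/* one-channel 32-bit float texture; GL_LUMINANCE as the external format */
glBindTexture(GL_TEXTURE_2D, tex[0]);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_R_FLOAT32_ATI, 64, 64, 0,
             GL_LUMINANCE, GL_FLOAT, NULL);

/* two-channel 16-bit float texture; GL_RG_ATI as the external format */
glBindTexture(GL_TEXTURE_2D, tex[1]);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RG_FLOAT16_ATI, 64, 64, 0,
             GL_RG_ATI, GL_FLOAT, NULL);

/* as described above: if no GL error is generated here (assuming the
   error flag was clear beforehand), the formats will probably work */
if (glGetError() != GL_NO_ERROR)
{
    /* the driver rejected the tokens; fall back to something else */
}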

Some things I discovered when experimenting with the GL_RG_ATI format:

glColorMask(GL_FALSE, GL_TRUE, GL_TRUE, GL_FALSE);
for rendering only to the R channel

glColorMask(GL_TRUE, GL_FALSE, GL_FALSE, GL_TRUE);
for rendering only to the G channel

That is probably because it is stored internally as BGRA, but I'm not sure.
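
For reference, this is roughly how I set up the render target with GL_EXT_framebuffer_object (which is what I used). The names and the tex[1] texture from the earlier sketch are placeholders, and the swapped glColorMask reflects what I observed, not anything documented:

GLuint fbo;
glGenFramebuffersEXT(1, &fbo);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);

/* attach the GL_RG_FLOAT16_ATI texture created in the earlier sketch */
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                          GL_TEXTURE_2D, tex[1], 0);

if (glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT) !=
    GL_FRAMEBUFFER_COMPLETE_EXT)
{
    /* not supported on this driver/card; fall back */
}

/* render only into the R channel of the RG target
   (note the swapped mask, as described above) */
glColorMask(GL_FALSE, GL_TRUE, GL_TRUE, GL_FALSE);
/* ... draw ... */

/* restore the mask afterwards */
glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);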

A texture fetch returns 0.0 in the blue channel and 1.0 for alpha.

To disable color rendering when using the 16-bit one-channel format, disable RED and BLUE with glColorMask(); it seems those are the only two that matter, though disabling all four obviously works as well. To disable color rendering when using the 32-bit one-channel format, you have to disable all four.
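
In mask form, what I ended up with looks roughly like this (again, just my observations, not documented behavior):

/* no-color-write pass on a GL_R_FLOAT16_ATI target:
   in my tests only the red and blue masks mattered */
glColorMask(GL_FALSE, GL_TRUE, GL_TRUE, GL_FALSE);

/* on a GL_R_FLOAT32_ATI target all four have to be off
   (turning all four off obviously works in both cases) */
glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);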

Although GL_LUMINANCE is used as the format for the one-channel formats, the green and blue channels come back as 0.0 rather than replicating the red value the way a normal luminance texture would.

I have no clue about support for anti-aliasing, blending, or texture filtering. Naturally, blending and filtering could only be supported on cards that support floating-point blending and filtering in the first place.

I haven't found anything about this extension on the internet, so I thought I would share what I found. Sorry if any of this is wrong; I've only been experimenting with it for less than 24 hours. I'm pretty excited about this though! Enjoy!