I modified the “Render to a Texture” tutorial at GameTutorials.com to render the depth buffer to the texture instead of the colour buffer. For good measure, I decided to incorporate the ARB_depth_texture extension. Normally the program ran at more than 700 fps, but after my modifications I got 8 fps. After a bunch of fooling around I found that turning off FSAA restored the correct framerate of >700 fps. Does anyone know why this occurs?
I’m using a Radeon 9700 Pro with the Catalyst 3.0a drivers, BTW...
It could be the framebuffer format vs. the texture format used. If they are incompatible, there might be some expensive swizzle going on, perhaps even in software on the CPU.
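One way to reduce the chance of a format-conversion slow path is to request a depth texture whose internal format matches the context’s depth buffer before copying into it. A minimal sketch along those lines, using the ARB_depth_texture tokens (the variable names and the assumption of a 24-bit depth buffer are mine, not from the thread):

```c
/* Sketch: depth texture matching a 24-bit depth buffer (assumed). */
GLuint depthTex;
glGenTextures(1, &depthTex);
glBindTexture(GL_TEXTURE_2D, depthTex);
/* Ask for DEPTH_COMPONENT24 to match the framebuffer's depth bits;
 * a mismatch (e.g. 16-bit texture vs. 24-bit buffer) can force the
 * driver onto a conversion path during the copy. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT24_ARB,
             width, height, 0, GL_DEPTH_COMPONENT,
             GL_UNSIGNED_INT, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

/* ...render the scene, then copy the depth buffer into the texture: */
glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, width, height);
```

Note that FSAA complicates this further: with multisampling enabled the depth buffer is multisampled, so the copy may require a per-pixel resolve on top of any format conversion, which could explain why disabling FSAA restores the framerate.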
Well, it’s up on the developer site, but it has been leaked for almost as long as it’s been up there; a quick Google search for “ati 6275” gave me several links.