Copying the depth buffer back and forth from aux buffer
I tried using glDrawPixels and glReadPixels to store and retrieve the depth buffer but couldn't quite get it to work. Then I found that the card I'm working on supports auxiliary buffers, which should avoid the memory transfer, so I'm trying that route. I think I can save the depth info by setting glDrawBuffer(GL_AUX1) and calling glCopyPixels with GL_DEPTH, but I can't figure out how to draw the depth info I just saved from the aux buffer back into the depth buffer. glDrawBuffer doesn't accept the depth buffer as a parameter, and glCopyPixels only designates a source for the copy. How do I set the depth buffer as the destination? Any ideas?
I finally have glReadPixels and glDrawPixels working properly to store and retrieve the depth buffer. But does anyone know a way to do this using glCopyPixels and an auxiliary buffer so it doesn't suffer from the memory transfer off the card?
glCopyPixels(0, 0, Width, Height, GL_DEPTH);
I can copy the depth info from the depth buffer into the auxiliary buffer, but how do I copy that back into the depth buffer? glDrawBuffer only takes color buffers as arguments - how do I point it to the depth buffer?
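For reference, the working glReadPixels/glDrawPixels round trip mentioned above might look like the following sketch. It assumes a current GL context; the names `width`, `height`, and `depth_copy` are illustrative, not from the original posts.

```c
#include <GL/gl.h>

/* Save the depth buffer into client memory (depth_copy must hold
 * width * height floats). This is the transfer off the card that
 * the poster would like to avoid. */
void save_depth(int width, int height, GLfloat *depth_copy)
{
    glReadPixels(0, 0, width, height,
                 GL_DEPTH_COMPONENT, GL_FLOAT, depth_copy);
}

/* Write the saved values back into the depth buffer. Note that
 * glDrawPixels with GL_DEPTH_COMPONENT targets the depth buffer
 * directly, so no glDrawBuffer call is involved; depth writes must
 * be enabled and the raster position set first. */
void restore_depth(int width, int height, const GLfloat *depth_copy)
{
    glDepthMask(GL_TRUE);
    glRasterPos2i(0, 0);   /* bottom-left corner of the region */
    glDrawPixels(width, height,
                 GL_DEPTH_COMPONENT, GL_FLOAT, depth_copy);
}
```

The key asymmetry the thread runs into: glDrawPixels can select the depth buffer as a destination via its format argument, while glCopyPixels has no way to name the depth buffer as a destination when copying out of a color (AUX) buffer.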
What do you really want to do? In my opinion, it makes no sense to want to write back to the depth buffer.
It's necessary in order to implement Goldfeather's algorithm for CSG (constructive solid geometry). Basically, surfaces are rendered using the stencil and depth buffers in several passes and then accumulated at the end. Each step essentially requires at least two depth buffers: one for the current surface rendering and one to store the accumulated depths. It works now using glReadPixels/glDrawPixels, but this requires transferring data off the card. Ideally this could be optimized so auxiliary buffers could be used and everything performed on the card. It seems straightforward to write into the depth buffer from off the card, so why can't it be done from an aux buffer? It looks like the data can be stored on the card but not retrieved, and that makes no sense to me. Seems like an obviously needed feature...
why can't it be done from an aux buffer?

Because an AUX buffer is a color buffer with the same color depth as the primary.
Let's say you have a depth buffer with one 24-bit component and an AUX color buffer with four channels, RGBA, each 8 bits.
OpenGL does not split one component input into three component output bitwise for you.
There is an extension (NV_copy_depth_to_color) which can copy from depth to RGBA, but not from RGBA to depth, so it does not help in this case.
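To make the bit-layout mismatch concrete, here is the kind of bitwise split glCopyPixels will not perform for you: packing one 24-bit depth component into three 8-bit color channels and back. The function names are illustrative only.

```c
#include <stdint.h>

/* Split a 24-bit depth value across three 8-bit channels -- the
 * component-wise rearrangement OpenGL does NOT do during a
 * glCopyPixels between a depth buffer and an RGBA8 AUX buffer. */
void pack_depth24(uint32_t depth, uint8_t rgb[3])
{
    rgb[0] = (depth >> 16) & 0xFF;  /* high byte -> R */
    rgb[1] = (depth >> 8)  & 0xFF;  /* mid byte  -> G */
    rgb[2] = depth & 0xFF;          /* low byte  -> B */
}

/* Reassemble the 24-bit value from the three channels. */
uint32_t unpack_depth24(const uint8_t rgb[3])
{
    return ((uint32_t)rgb[0] << 16) | ((uint32_t)rgb[1] << 8) | rgb[2];
}
```

If the fixed-function pipeline offered this both ways, an AUX buffer could stand in for a second depth buffer; since it only exists one way (depth to color), the AUX-buffer route is a dead end here.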
If you just need to switch between different sets of depth buffer data, without ever touching the data other than by rendering to it, use a buffer-region extension (e.g. WGL_ARB_buffer_region), which can quickly save and restore pixel data.
Beware of the pixel ownership test! Any pixel data overlapped by other windows or moved off the desktop is undefined.
If you really need the depth data, you can only access multiple depth values if you store the data in depth textures.
Search for examples using texture internalFormat GL_DEPTH_COMPONENT24.
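A minimal sketch of the depth-texture route suggested above: keep each extra depth "buffer" as a GL_DEPTH_COMPONENT24 texture and capture the current depth buffer into it with glCopyTexSubImage2D, so the data never leaves the card. This assumes a current context supporting ARB_depth_texture (core since OpenGL 1.4); all names are illustrative.

```c
#include <GL/gl.h>

/* Allocate a depth texture sized to the viewport. */
GLuint create_depth_texture(int width, int height)
{
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT24,
                 width, height, 0,
                 GL_DEPTH_COMPONENT, GL_UNSIGNED_INT, NULL);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    return tex;
}

/* Copy the current depth buffer into the texture, entirely on-card. */
void grab_depth(GLuint tex, int width, int height)
{
    glBindTexture(GL_TEXTURE_2D, tex);
    glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, width, height);
}
```

Getting the saved values back into the depth buffer then means rendering with the depth texture (for example comparing against it in a pass) rather than a direct buffer-to-buffer copy, which fits the multi-pass structure of Goldfeather's algorithm.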
Thanks Relic! I neglected to mention that I'm trying to run this on Linux, but I think I've found the equivalent GLX extensions. For now though, to do it right, I'm looking into depth textures. Thanks again for the pointers...