View Full Version : ATI and depth buffer readback
11-28-2007, 04:00 AM
For some reason, I have to use glReadPixels on a PBuffer to save depthbuffer values. This works fine on nVidia cards (using GL_UNSIGNED_INT as the type parameter). But on our ATI X1900 card it takes _seconds_ to read the data back! Using GL_FLOAT helps a little, but it is still slow.
Is there any "hardware friendly" type/format that I can use on ATI cards? I do not need to process the data on the CPU in any way; I just have to read it back and restore the depthbuffer later. Are there other peculiarities I have to take care of? (Once, I read that on ATI it is useful to have PBuffer widths that are a multiple of 8... is that still true... has it ever been true??)
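For reference, the readback in question amounts to something like the following minimal sketch (the context setup is omitted and the function name and dimensions are illustrative, not from the original post):

```c
/* Minimal sketch of the depth readback described above.
 * Assumes the PBuffer's GL context is current; error handling omitted. */
#include <GL/gl.h>
#include <stdlib.h>

GLuint *read_depth_buffer(int width, int height)
{
    GLuint *depth = malloc((size_t)width * (size_t)height * sizeof(GLuint));
    if (!depth)
        return NULL;
    /* Full-frame depth readback; this is the call that stalls the
     * pipeline and, on some drivers, hits a slow conversion path. */
    glReadPixels(0, 0, width, height,
                 GL_DEPTH_COMPONENT, GL_UNSIGNED_INT, depth);
    return depth;
}
```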
11-28-2007, 10:39 AM
I'm not sure why this would be so slow on ATI compared to Nvidia, but if you don't process the data on the CPU, you shouldn't read it back in the first place. Find a way to copy and restore the data on the GPU instead. This should be much faster.
I've never done this myself, but it should be possible to copy the depth buffer into a depth texture, or you could use FBOs and a depth renderbuffer. PBOs (Pixel Buffer Objects) might also work for this, but they are probably unsupported on older hardware.
[ www.trenki.net (http://www.trenki.net) | vector_math (3d math library) (http://www.trenki.net/content/view/16/36/) | software renderer (http://www.trenki.net/content/view/18/38/) ]
11-28-2007, 01:09 PM
I've never used this extension myself, but WGL_ARB_buffer_region (http://www.opengl.org/registry/specs/ARB/wgl_buffer_region.txt) sounds like what you need if you want to copy and restore a buffer.
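For completeness, the extension works roughly as sketched below. This is not tested code: the function pointers must be fetched via wglGetProcAddress (loading is omitted here), and the dimensions and function name are placeholders:

```c
/* Sketch of WGL_ARB_buffer_region usage (Windows only).
 * PFN typedefs come from wglext.h; the entry points are assumed to
 * have been loaded via wglGetProcAddress. Error handling omitted. */
#include <windows.h>
#include <GL/gl.h>
#include <GL/wglext.h>

static PFNWGLCREATEBUFFERREGIONARBPROC  wglCreateBufferRegionARB;
static PFNWGLSAVEBUFFERREGIONARBPROC    wglSaveBufferRegionARB;
static PFNWGLRESTOREBUFFERREGIONARBPROC wglRestoreBufferRegionARB;
static PFNWGLDELETEBUFFERREGIONARBPROC  wglDeleteBufferRegionARB;

void save_and_restore_depth(HDC hdc, int width, int height)
{
    /* A region object covering the depth buffer of this drawable. */
    HANDLE region = wglCreateBufferRegionARB(hdc, 0,
                                             WGL_DEPTH_BUFFER_BIT_ARB);

    wglSaveBufferRegionARB(region, 0, 0, width, height);  /* snapshot */
    /* ... render over the depth buffer ... */
    wglRestoreBufferRegionARB(region, 0, 0, width, height, 0, 0);
    wglDeleteBufferRegionARB(region);
}
```

Note that the save and restore both operate on the same drawable the region was created for.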
11-28-2007, 02:01 PM
You can also use glCopyTexSubImage2D; it should be way faster than glReadPixels. I am using it myself. It still hurts the framerate, but not as badly.
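A rough sketch of that approach, assuming ARB_depth_texture support (the function names and the choice of GL_DEPTH_COMPONENT24 are illustrative assumptions, not from the post):

```c
/* Sketch: copying the current depth buffer into a depth texture,
 * entirely on the GPU. Assumes ARB_depth_texture is available;
 * GL_DEPTH_COMPONENT24 is defined in glext.h. */
#include <GL/gl.h>
#include <GL/glext.h>

GLuint create_depth_copy_texture(int width, int height)
{
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    /* Allocate storage once, matching the depth buffer's precision. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT24, width, height,
                 0, GL_DEPTH_COMPONENT, GL_UNSIGNED_INT, NULL);
    return tex;
}

void copy_depth_to_texture(GLuint tex, int width, int height)
{
    glBindTexture(GL_TEXTURE_2D, tex);
    /* The copy stays on the GPU -- no CPU round trip as with glReadPixels. */
    glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, width, height);
}
```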
11-28-2007, 03:29 PM
I should have mentioned that I use the ReadPixels/DrawPixels pair to copy the contents of the PBuffer back to the framebuffer. WGL_ARB_buffer_region does not seem to work this way.
glCopyTexSubImage2D implies that I have to keep an extra depth texture around. Also, how am I supposed to "blit" the contents of that texture into the depthbuffer of the main framebuffer without using shaders? Directly using PBuffer+RTT is probably not going to work either, since the PBuffer uses multisampling.
FBO is still not an option on ATI... AFAIR, 24bit depth buffered FBOs are still not supported, let alone EXT_framebuffer_multisample and EXT_framebuffer_blit.
PBO might be an option, but I don't know how widely it is supported on ATI yet. And if I am indeed hitting a software fallback path (requesting the depthbuffer as either GL_UNSIGNED_INT or GL_FLOAT additionally requires a multisample resolve here), I doubt that a PBO would help... it might even make things worse.
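For what it's worth, the PBO variant of the readback would look roughly like this sketch (using the ARB_pixel_buffer_object names of the time; entry points are assumed to be loaded, and the function name and sizes are illustrative). The glReadPixels call itself returns without waiting; any stall moves to the later map:

```c
/* Sketch: asynchronous depth readback via ARB_pixel_buffer_object.
 * With a PIXEL_PACK buffer bound, glReadPixels writes into the buffer
 * object and the last argument is a byte offset, not a pointer. */
#include <GL/gl.h>
#include <GL/glext.h>
#include <stddef.h>

void read_depth_async(GLuint pbo, int width, int height)
{
    size_t size = (size_t)width * (size_t)height * sizeof(GLuint);

    glBindBuffer(GL_PIXEL_PACK_BUFFER_ARB, pbo);
    glBufferData(GL_PIXEL_PACK_BUFFER_ARB, size, NULL, GL_STREAM_READ_ARB);
    glReadPixels(0, 0, width, height,
                 GL_DEPTH_COMPONENT, GL_UNSIGNED_INT, (void *)0);

    /* ... do other work; the map below is where any wait happens ... */
    const GLuint *depth = glMapBuffer(GL_PIXEL_PACK_BUFFER_ARB,
                                      GL_READ_ONLY_ARB);
    /* ... use depth ... */
    (void)depth;
    glUnmapBuffer(GL_PIXEL_PACK_BUFFER_ARB);
    glBindBuffer(GL_PIXEL_PACK_BUFFER_ARB, 0);
}
```

As the poster suspects, this only pays off if the driver services the transfer in hardware; a software fallback path stalls either way.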
11-29-2007, 02:04 PM
FBO is still not an option on ATI... AFAIR, 24bit depth buffered FBOs are still not supported
24bit depth buffered FBOs have been supported from the beginning. If you also need it as a texture, it should be supported on X1900 and X1600, but the X1800 can only sample 16bit depth textures.
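A quick way to check this on a given driver is to request the attachment and test for completeness; a sketch, assuming EXT_framebuffer_object is exposed and its entry points are loaded (the function name is illustrative):

```c
/* Sketch: a 24-bit depth attachment via EXT_framebuffer_object.
 * If glCheckFramebufferStatusEXT reports complete, the driver
 * accepts a 24-bit depth renderbuffer. */
#include <GL/gl.h>
#include <GL/glext.h>

GLuint create_depth_fbo(int width, int height)
{
    GLuint fbo, rb;

    glGenFramebuffersEXT(1, &fbo);
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);

    glGenRenderbuffersEXT(1, &rb);
    glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, rb);
    glRenderbufferStorageEXT(GL_RENDERBUFFER_EXT, GL_DEPTH_COMPONENT24,
                             width, height);
    glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT,
                                 GL_RENDERBUFFER_EXT, rb);

    if (glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT)
            != GL_FRAMEBUFFER_COMPLETE_EXT)
        return 0;  /* this format/attachment combination is unsupported */
    return fbo;
}
```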
Powered by vBulletin® Version 4.2.3 Copyright © 2016 vBulletin Solutions, Inc. All rights reserved.