Copying the depth buffer

I have a Windows OpenGL program with two pairs of rendering and device contexts. The first pair is for a static scene that I render only once and keep in its device context. On every frame, I BitBlt the first device context to the second and then render the dynamic objects in the second context. However, BitBlt copies only the color buffer; the depth buffer from the first device context never makes it into the second. Is there a way to copy the depth buffer from one context to another?

You can’t rely on BitBlt even for copying the color buffer from one context to the back buffer of another: unless your 3D card has special support for it, GDI can’t draw to the back buffer.

  1. You can use glReadPixels and glDrawPixels (with format GL_DEPTH_COMPONENT for the depth buffer).
  2. You can use GL_KTX_buffer_region extension (if available).

[GL_KTX_buffer_region](http://www.west.net/~brittain/3dsmax2.htm#OpenGL%20Buffer%20Region%20Extension)
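Option 1 can be sketched roughly like this. This is a hedged sketch, not a drop-in solution: it assumes both contexts have the same size and a compatible pixel format, and the context/DC handle names are hypothetical. Note that when glDrawPixels writes GL_DEPTH_COMPONENT data, it still generates color fragments, so the color buffer is masked off during the copy.

```c
/* Sketch: copy the depth buffer between two OpenGL rendering contexts
 * using glReadPixels/glDrawPixels. Assumes both contexts are the same
 * size; the HDC/HGLRC names are hypothetical. */
#include <windows.h>
#include <GL/gl.h>
#include <stdlib.h>

void CopyDepthBuffer(HDC hdcSrc, HGLRC hglrcSrc,
                     HDC hdcDst, HGLRC hglrcDst,
                     int width, int height)
{
    GLfloat *depth = malloc((size_t)width * height * sizeof(GLfloat));
    if (!depth)
        return;

    /* Read the depth buffer from the static scene's context. */
    wglMakeCurrent(hdcSrc, hglrcSrc);
    glReadPixels(0, 0, width, height, GL_DEPTH_COMPONENT, GL_FLOAT, depth);

    /* Write it into the second context before drawing dynamic objects. */
    wglMakeCurrent(hdcDst, hglrcDst);
    glRasterPos2i(0, 0);                 /* raster position must be valid */
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE); /* depth only   */
    glDepthFunc(GL_ALWAYS);              /* let every depth value through */
    glDrawPixels(width, height, GL_DEPTH_COMPONENT, GL_FLOAT, depth);
    glDepthFunc(GL_LESS);                /* restore the usual defaults    */
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);

    free(depth);
}
```

Be warned that reading back and re-drawing a full-screen depth buffer this way is slow on most drivers of this era; that is why the buffer-region extension exists.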

[This message has been edited by Serge K (edited 06-09-2000).]

Hmmm…my program does successfully copy the color buffer from the first device context to the second. Could it be because I don’t use double buffering? When I’m finished rendering I actually BitBlt the second device context to a DirectDraw surface and then Blt that to a back buffer surface. (I’m not going to get flamed for mentioning DirectX on an OpenGL discussion board, am I?)

> When I’m finished rendering I actually BitBlt the second device context to a DirectDraw surface and then Blt that to a back buffer surface.

I’m afraid it doesn’t always work; it depends on the driver, etc.
Does it work with the software implementation?
Have you tried it on Windows NT 4?
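One way to tell whether you are running on Microsoft’s generic software implementation (which reports “GDI Generic” as its renderer) is to query the implementation strings while a context is current. A minimal sketch:

```c
/* Print the OpenGL implementation strings; a rendering context must be
 * current when this is called. Microsoft's software implementation
 * reports "GDI Generic" as GL_RENDERER. */
#include <stdio.h>
#include <GL/gl.h>

void PrintGLInfo(void)
{
    printf("Vendor:   %s\n", (const char *)glGetString(GL_VENDOR));
    printf("Renderer: %s\n", (const char *)glGetString(GL_RENDERER));
    printf("Version:  %s\n", (const char *)glGetString(GL_VERSION));
}
```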

I’m running it in Windows NT 4.0 SP5. I have a FireGL 1000 video card, if that makes any difference.

What do you mean by a software implementation?