glCopyTexImage2D problem
02-12-2005, 10:05 AM
I use glCopyTexImage2D to copy the depth buffer to a texture:
glCopyTexImage2D(GL_TEXTURE_RECTANGLE_NV, 0, GL_DEPTH_COMPONENT , 0, 0, m_nCurrentW, m_nCurrentH, 0);
However, I've found that memory usage keeps increasing. Is there anything I'm doing wrong?
Also, how do I use glCopyTexSubImage2D to read the depth buffer? Thanks!
02-14-2005, 04:48 AM
If you're creating a new texture object (glGenTextures()) each time before copying your data with glCopyTexSubImage2D, then you'll be wasting memory. Create the texture object once and then just bind the same one before each copy. Note also that glCopyTexImage2D reallocates the texture's storage on every call, while glCopyTexSubImage2D copies into existing storage, so the latter is the better choice for per-frame updates.
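A minimal sketch of that pattern (reusing the GL_TEXTURE_RECTANGLE_NV target and the m_nCurrentW/m_nCurrentH sizes from the original post; depthTex is a hypothetical name introduced here):

```c
/* One-time setup: create the texture object and allocate its storage once. */
GLuint depthTex;
glGenTextures(1, &depthTex);
glBindTexture(GL_TEXTURE_RECTANGLE_NV, depthTex);
/* This first glCopyTexImage2D call allocates the texture storage. */
glCopyTexImage2D(GL_TEXTURE_RECTANGLE_NV, 0, GL_DEPTH_COMPONENT,
                 0, 0, m_nCurrentW, m_nCurrentH, 0);

/* Per frame: bind the SAME texture and update it in place.
   glCopyTexSubImage2D reuses the existing storage, so memory stays flat. */
glBindTexture(GL_TEXTURE_RECTANGLE_NV, depthTex);
glCopyTexSubImage2D(GL_TEXTURE_RECTANGLE_NV, 0, 0, 0,
                    0, 0, m_nCurrentW, m_nCurrentH);
```

If the window is resized, only then do you need another glCopyTexImage2D call to reallocate the storage at the new dimensions.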
In order to use the depth buffer as a texture using glCopyTexImage you'll need to read the frame buffer into memory first using glReadPixels using the GL_DEPTH_COMPONENT enumeration. This is because glCopyTexImage expects the texture data to be in system memory.
An alternative (which saves you the data-transfer overhead - which is significant, by the way) is to render the scene to a pbuffer (or a frame buffer object - which I'm not sure is available yet) and write a fragment shader that sets the fragment color's RGB value to that fragment's z-depth. In the end you get a grayscale image that is a representation of your depth buffer - you can then bind the frame buffer object or the pbuffer as a texture.
02-14-2005, 05:33 AM
Originally posted by Aeluned:
In order to use the depth buffer as a texture using glCopyTexImage you'll need to read the frame buffer into memory first using glReadPixels using the GL_DEPTH_COMPONENT enumeration. This is because glCopyTexImage expects the texture data to be in system memory.
That's wrong. Spec says:
"The image is taken from the framebuffer exactly as if these arguments were passed to
CopyPixels with argument type set to COLOR or DEPTH, depending on internalformat,..."
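In other words, glCopyTexImage2D reads straight from the framebuffer, so no glReadPixels round trip through system memory is needed. A sketch of the direct copy (assuming the current framebuffer has a depth buffer; depthTex, width and height are hypothetical names, not from the original post):

```c
/* Copy the framebuffer's depth values directly into the bound texture.
   The GL_DEPTH_COMPONENT internal format selects the DEPTH path of
   CopyPixels, exactly as the spec passage above describes. */
glBindTexture(GL_TEXTURE_2D, depthTex);
glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT,
                 0, 0, width, height, 0);
```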
02-14-2005, 06:17 AM
Oops, that's right...
I stand corrected.