Depth-stencil texture gets corrupted after reading



somboon
06-16-2010, 03:03 AM
I have a strange problem with reading from a depth-stencil texture.

I tried to add stencil testing to my already working deferred rendering engine,
so I created an FBO with the following packed depth-stencil attachment.




glGenTextures(1, &depthBufferTexID);
glBindTexture(GL_TEXTURE_2D, depthBufferTexID);

/*1*/glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH24_STENCIL8, width, height, 0, GL_DEPTH_STENCIL, GL_UNSIGNED_INT_24_8, NULL);

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);

/*2*/glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_STENCIL_ATTACHMENT, GL_TEXTURE_2D, depthBufferTexID, 0);




I attached the depth-stencil as a texture instead of a renderbuffer
so that I can read directly from the depth texture in the lighting pass (to reconstruct the vertex position, etc.).
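
For reference, the C-side binding in the lighting pass looks roughly like this (a minimal sketch with made-up names such as lightingProgram and "u_depthTex", not the exact code from my engine):

/* Bind the depth-stencil texture as a plain sampler for the lighting shader. */
glUseProgram(lightingProgram);
glActiveTexture(GL_TEXTURE3);
glBindTexture(GL_TEXTURE_2D, depthBufferTexID);
glUniform1i(glGetUniformLocation(lightingProgram, "u_depthTex"), 3);
/* Sampling raw depth values needs GL_TEXTURE_COMPARE_MODE = GL_NONE (the default). */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_COMPARE_MODE, GL_NONE);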

Actually, the only thing I changed in my whole source code is the lines marked /*1*/ and /*2*/.
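
A quick completeness check right after /*2*/ would look something like this (a minimal sketch; it assumes the FBO is still bound to GL_FRAMEBUFFER and that <stdio.h> is included):

GLenum status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
if (status != GL_FRAMEBUFFER_COMPLETE)
    printf("FBO incomplete: 0x%04X\n", status);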

But after reading from the depth-stencil texture (to reconstruct the vertex position in the lighting pass), the depth-stencil texture appears to be corrupted.

Any rendering performed with depth testing enabled after the depth-stencil texture read appears to have weird holes in it; it is very hard to describe (picture below).

// corrupted picture
http://img21.imageshack.us/img21/8784/corrupts.jpg

If I comment out the texture-reading pass (the deferred lighting pass on the background geometry), the subsequent rendering passes appear correct.

// correct-behavior picture
http://img690.imageshack.us/img690/2927/notcorrupt.jpg

I don't think there is anything in the rendering process that could cause this. If I remove the lines /*1*/ and /*2*/ and go back to using a depth-only attachment, everything works again.

Note that in the whole process I don't perform a single stencil-related operation.

Has anyone ever encountered something like this?

Tested on an ATI HD4670 with the Catalyst 10.5 driver.


Thanks in advance.

somboon
06-18-2010, 09:23 AM
****UPDATE****

Just tried with the ATI Catalyst 10.6 driver and it is still the same: weird holes on any object rendered after the depth-stencil texture read.

My brother's GeForce 9800 GT doesn't have this problem; any objects rendered after the problematic depth-stencil texture read are drawn correctly.

I guess I could avoid this by making a copy of the depth buffer (using blitting) just for reading, or by using my own depth format, but I would like to know the cause of the problem since this is very weird.
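
By the blitting idea I mean something along these lines (a minimal sketch; readFBO, copyFBO and the shared width/height are hypothetical, and copyFBO would have a second depth texture attached that only the lighting pass samples):

/* Copy the depth buffer into a second FBO whose depth attachment is the
   texture that gets sampled, so the original depth-stencil is never read
   and written in the same pass. */
glBindFramebuffer(GL_READ_FRAMEBUFFER, readFBO);
glBindFramebuffer(GL_DRAW_FRAMEBUFFER, copyFBO);
glBlitFramebuffer(0, 0, width, height,
                  0, 0, width, height,
                  GL_DEPTH_BUFFER_BIT, GL_NEAREST);
glBindFramebuffer(GL_FRAMEBUFFER, readFBO); /* back to the original FBO */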

somboon
06-20-2010, 05:05 AM
****UPDATE****

I decided to continue by writing linear Z into a separate buffer (GL_R32F format), using that instead of the old depth-stencil texture, and attaching the depth-stencil as a renderbuffer instead.

This way I can use stencil testing and depth reading in the FBO without any errors on ATI.

But I would still like to know what causes the error after reading the depth-stencil texture on ATI, since using a separate float buffer to store Z/depth is a little slower than the depth-texture solution.
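
For anyone interested, the replacement setup is roughly this (a minimal sketch with made-up names; it assumes the FBO is already bound and that GL_COLOR_ATTACHMENT1 happens to be the free slot):

/* Linear Z goes into an R32F color attachment that the lighting pass samples,
   and the depth-stencil becomes a plain renderbuffer that is never read. */
GLuint linearZTex, depthStencilRB;

glGenTextures(1, &linearZTex);
glBindTexture(GL_TEXTURE_2D, linearZTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_R32F, width, height, 0, GL_RED, GL_FLOAT, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT1, GL_TEXTURE_2D, linearZTex, 0);

glGenRenderbuffers(1, &depthStencilRB);
glBindRenderbuffer(GL_RENDERBUFFER, depthStencilRB);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH24_STENCIL8, width, height);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_STENCIL_ATTACHMENT, GL_RENDERBUFFER, depthStencilRB);

The geometry-pass fragment shader then writes the linear depth into that color attachment.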

frank li
06-21-2010, 09:43 PM
We can't reproduce the problem right now.
Could you please narrow down the problem and paste the complete code for us?
Do you use lighting in your application?

somboon
06-21-2010, 10:30 PM
Thanks for your reply. I will try to create a test application.

It might take a long time, though, since I already modified that part of the application to use the linear-Z method and also have a deadline for my main job.

I think this one may be (again) my own stupid mistake, and yes, I use shaders for lighting (both the deferred and forward parts; no mixing with fixed-function code).

Thanks :)
