Deferred Shading, MRT and Depth Test

Hello,

I’m currently implementing a deferred shader with multiple render targets for albedo, normal and position. The depth test fails as soon as I set the “alpha” value for the normal or position texture to zero. How can I tell OpenGL to use only the albedo texture for its depth tests? Or am I thinking in the wrong direction?

The depth test is enabled while the geometry is rendered (glEnable(GL_DEPTH_TEST)) and disabled during the deferred shading pass.
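At frame level the two passes look roughly like this (ids[gbuffer], renderGeometryToMRT and renderDeferredLighting are placeholders for my own FBO id and functions):

glBindFramebuffer(GL_FRAMEBUFFER, ids[gbuffer]);   // G-buffer FBO with the three MRTs
glEnable(GL_DEPTH_TEST);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
renderGeometryToMRT();                             // geometry pass

glBindFramebuffer(GL_FRAMEBUFFER, 0);              // back to the default framebuffer
glDisable(GL_DEPTH_TEST);
renderDeferredLighting();                          // fullscreen pass sampling the MRTs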

My setup for the depth buffer looks like this:
// create a 24-bit depth renderbuffer matching the framebuffer dimensions
glGenRenderbuffers(1, &ids[depthbuffer]);
glBindRenderbuffer(GL_RENDERBUFFER, ids[depthbuffer]);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT24, dim[0], dim[1]);
// attach it as the FBO's depth attachment
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                          GL_RENDERBUFFER, ids[depthbuffer]);
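For completeness, the color attachments and draw buffers are set up along these lines (a sketch; ids[albedo], ids[normal] and ids[position] stand in for my texture handles):

glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, ids[albedo], 0);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT1, GL_TEXTURE_2D, ids[normal], 0);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT2, GL_TEXTURE_2D, ids[position], 0);

GLenum bufs[3] = { GL_COLOR_ATTACHMENT0, GL_COLOR_ATTACHMENT1, GL_COLOR_ATTACHMENT2 };
glDrawBuffers(3, bufs);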

and my fragment shader that fills the MRTs:
lo_fragcolor1 = texture(u_texture, gs_texcoord);            // albedo
lo_fragcolor2 = vec4(normalize(gs_normal) * 0.5 + 0.5, 0);  // normal remapped to [0,1], alpha = 0
lo_fragcolor3 = vec4(gs_position, 0);                       // position, alpha = 0
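The corresponding output declarations, assuming layout qualifiers matching the draw-buffer order (they could equally be bound with glBindFragDataLocation):

layout(location = 0) out vec4 lo_fragcolor1; // albedo
layout(location = 1) out vec4 lo_fragcolor2; // encoded normal
layout(location = 2) out vec4 lo_fragcolor3; // position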

Any ideas?
Saski

What makes you think that the depth test is failing?

Hi,
Because objects appear partially transparent, as if the alpha channels of the MRTs were being blended together somehow. Maybe I'm confusing that with a depth test failure.

Saski

Problem solved! Explicitly calling glDisable(GL_ALPHA_TEST) before rendering the geometry to the MRTs did the job :slight_smile:
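For anyone who lands here: GL_ALPHA_TEST is fixed-function state (compatibility profile only; it was removed from the core profile), and when enabled it discards fragments whose alpha fails the configured glAlphaFunc comparison, which seems to be what the zero alpha written to the MRTs was triggering. The geometry pass now starts like this:

glDisable(GL_ALPHA_TEST); // make sure the fixed-function alpha test can't discard G-buffer fragments
glEnable(GL_DEPTH_TEST);
// ... render geometry to the MRTs ...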

Saski