FBO and just a depth texture?

With FBOs, are you still forced to have a color attachment if you only want a depth texture for shadow mapping? I am not getting any values in the depth texture when only a depth texture is attached to the FBO.


glGenFramebuffers(1, &id);
glBindFramebuffer(GL_FRAMEBUFFER, id);

// Create the depth texture that backs the FBO
glGenTextures(1, &texture);
glBindTexture(GL_TEXTURE_2D, texture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, useMipmaps ? GL_LINEAR_MIPMAP_LINEAR : GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_DEPTH_TEXTURE_MODE, GL_INTENSITY);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_COMPARE_MODE, GL_COMPARE_R_TO_TEXTURE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_COMPARE_FUNC, GL_LEQUAL);
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT32, width, height, 0, GL_DEPTH_COMPONENT, GL_FLOAT, 0);
if (useMipmaps)
	glGenerateMipmap(GL_TEXTURE_2D);

// Depth-only FBO: no color buffer is written or read
glDrawBuffer(GL_NONE);
glReadBuffer(GL_NONE);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_TEXTURE_2D, texture, 0);
CheckFBOStatus();
glBindFramebuffer(GL_FRAMEBUFFER, 0);
ErrorCheck();

You do not need a color attachment. I am using code very similar to the one shown in the extension specification and it works. You might try another depth format; I am using GL_DEPTH_COMPONENT16. I am also not using mipmaps on the shadowmap texture.
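Roughly like this (a sketch only, reusing your texture/width/height variables; the filter settings are just what I happen to use):

/* Sketch: 16-bit depth texture without mipmaps */
glBindTexture(GL_TEXTURE_2D, texture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);   /* no mipmapped filter */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_COMPARE_MODE, GL_COMPARE_R_TO_TEXTURE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_COMPARE_FUNC, GL_LEQUAL);
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT16, width, height, 0,
             GL_DEPTH_COMPONENT, GL_UNSIGNED_SHORT, 0);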

Sigh, I've been at this for two days. I have a texture array version working fine with no color attachments or render buffers.

This is bugging the hell out of me.

I tried the 16-bit depth format and no mipmapping, and still nothing. I am using GLSL to show the values, with sampler2DShadow and shadow2DProj. Is there a state I am missing on the CPU side?

I am calling this when I go to FFP mode to try and see the depth map… glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_COMPARE_MODE, GL_NONE);

I call glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_COMPARE_MODE, GL_COMPARE_R_TO_TEXTURE); when I use the shader.
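For reference, here is roughly what I am doing on the CPU side for the shader pass (a sketch; shadowProgram and the "shadowMap" uniform name are just placeholders for my actual names):

/* Sketch of the CPU-side state for the shader pass */
glUseProgram(shadowProgram);
glActiveTexture(GL_TEXTURE1);                     /* unit the sampler2DShadow reads from */
glBindTexture(GL_TEXTURE_2D, texture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_COMPARE_MODE, GL_COMPARE_R_TO_TEXTURE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_COMPARE_FUNC, GL_LEQUAL);
glUniform1i(glGetUniformLocation(shadowProgram, "shadowMap"), 1);  /* unit index, not texture id */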

Thanks,

My FBO binding code follows. One additional difference is that I left the texture in the default GL_LUMINANCE depth texture mode.


glBindFramebufferEXT( GL_FRAMEBUFFER_EXT, framebuffer_object ) ;
/* Explicitly detach any color attachment; only the depth texture is used. */
glFramebufferRenderbufferEXT( GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT, GL_RENDERBUFFER_EXT, 0 ) ;
glFramebufferTexture2DEXT( GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT, GL_TEXTURE_2D, texture, 0 ) ;
/* No color buffer to draw to or read from. */
glDrawBuffer( GL_NONE ) ;
glReadBuffer( GL_NONE ) ;

By “I am using GLSL to show the values”, do you mean that you are attempting to show the result of the depth comparison? If yes, is the depth in the depth component of the texture coordinate calculated properly?
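What I mean by calculated properly is roughly the following setup (a sketch only; lightProjection and lightModelview stand for the matrices you used for the depth pass, and I am assuming your vertex shader multiplies gl_Vertex by gl_TextureMatrix[1] before calling shadow2DProj):

/* Sketch: texture matrix that maps object-space positions into shadowmap space,
 * so the comparison depth ends up in [0,1]. */
static const GLdouble bias[16] = {
    0.5, 0.0, 0.0, 0.0,
    0.0, 0.5, 0.0, 0.0,
    0.0, 0.0, 0.5, 0.0,
    0.5, 0.5, 0.5, 1.0      /* column-major: scales/offsets clip space into [0,1] */
};

glActiveTexture(GL_TEXTURE1);       /* unit whose texture matrix the vertex shader reads */
glMatrixMode(GL_TEXTURE);
glLoadMatrixd(bias);
glMultMatrixd(lightProjection);     /* projection used when rendering the shadowmap */
glMultMatrixd(lightModelview);      /* modelview used when rendering the shadowmap */
glMatrixMode(GL_MODELVIEW);
glActiveTexture(GL_TEXTURE0);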

From what I can tell, yes. I even dropped this code into the OpenGL SuperBible sample they have for non-shader FBO shadowmap rendering, and nothing…

AFAIK GLUT supports 32-bit depth buffers, right? These examples are running on GLUT.

You can try the following Nvidia example; it appears that it supports FBOs.

About the depth buffers in GLUT: it seems that GLUT asks for a 32-bit depth format for the main framebuffer, but what it actually gets depends on the hardware, the driver, and the combination of other parameters (e.g. the presence of a stencil buffer). What is supported as a shadowmap texture is a different thing. For example, if I remember correctly, ATI hardware before the X1600 only supports sampling from 16-bit depth textures.
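If you want to see what depth precision you actually got for the main framebuffer, you can query it (a sketch; assumes <stdio.h> is included):

/* Sketch: query the depth precision of the currently bound (main) framebuffer */
GLint depthBits = 0;
glGetIntegerv(GL_DEPTH_BITS, &depthBits);
printf("depth buffer bits: %d\n", depthBits);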

Thanks, Komat. I don’t know what I did, but they are working now… :frowning: :slight_smile: