8-bit depth attachment on nvidia ?!

I use an FBO depth attachment in order to read the depth buffer in a pixel program, but the depth texture seems to have only 8-bit precision, not 24 or 32. How can I get a true depth texture?
Video card: NVIDIA GeForce 6800 GT, drivers 78.01

FBO creation fragment:

glGenTextures( 1, &depth_texture );
glBindTexture( GL_TEXTURE_2D, depth_texture );
glTexImage2D( GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT24, W, H, 0, GL_DEPTH_COMPONENT, GL_FLOAT, NULL );

glGenRenderbuffersEXT( 1, &depth_buffer );
glBindRenderbufferEXT( GL_RENDERBUFFER_EXT, depth_buffer );
glRenderbufferStorageEXT( GL_RENDERBUFFER_EXT, GL_DEPTH_COMPONENT24, W, H );

glFramebufferTexture2DEXT( GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT, GL_TEXTURE_2D, depth_texture, 0 );

You don’t need the depth renderbuffer if you are attaching a depth texture as the depth attachment.
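A depth-texture-only setup along those lines might look like this (a sketch, assuming the same EXT_framebuffer_object entry points as in the fragment above; `W`, `H`, and the error handling are placeholders). The GL_NEAREST filters matter because the default mipmapped minification filter leaves the texture incomplete when only level 0 is defined:

```c
GLuint fbo, depth_texture;

glGenTextures(1, &depth_texture);
glBindTexture(GL_TEXTURE_2D, depth_texture);
/* Request a true 24-bit depth format; no renderbuffer is needed. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT24, W, H, 0,
             GL_DEPTH_COMPONENT, GL_FLOAT, NULL);
/* The default min filter uses mipmaps; with only level 0 the texture
   would be incomplete, so force non-mipmapped filtering. */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

glGenFramebuffersEXT(1, &fbo);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT,
                          GL_TEXTURE_2D, depth_texture, 0);
/* With no color attachment, disable color reads/writes for this FBO,
   or the framebuffer will be reported incomplete. */
glDrawBuffer(GL_NONE);
glReadBuffer(GL_NONE);

if (glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT)
        != GL_FRAMEBUFFER_COMPLETE_EXT) {
    /* handle error */
}
```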

How do you know you’re only getting 8-bit precision? I have similar code that seems to be giving the correct 24-bit precision.

Do you see a difference when switching from the fixed-function pipeline to the programmable one?
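One way to check what the driver actually allocated is to query the texture’s depth size directly (a sketch, assuming ARB_depth_texture, which defines the GL_TEXTURE_DEPTH_SIZE_ARB query; `depth_texture` is the handle from the original fragment):

```c
GLint depth_bits = 0;
glBindTexture(GL_TEXTURE_2D, depth_texture);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0,
                         GL_TEXTURE_DEPTH_SIZE_ARB, &depth_bits);
printf("depth texture precision: %d bits\n", depth_bits);
```

If this reports 24, the texture itself is fine and the quantization is happening later, for example when the depth value is visualized by writing it into an 8-bit-per-channel color buffer.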

This topic was automatically closed 183 days after the last reply. New replies are no longer allowed.