FBO and 24bit depth buffer support on ATI?



Mars_999
11-08-2005, 09:25 PM
I just installed the 5.10 drivers on my x700 Mobility card and still no 24-bit depth support. Is/will this ever see the light of day on ATI for FBOs?

Mars_999
11-09-2005, 10:57 AM
On my 6800GT, when I set up FBOs I can use this


glRenderbufferStorageEXT(GL_RENDERBUFFER_EXT, GL_DEPTH_COMPONENT24, texWidth, texHeight);

With ATI I have to use GL_DEPTH_COMPONENT16 or the generic GL_DEPTH_COMPONENT, and my depth map does look better with 24 bits on NVIDIA...

execom_rt
11-09-2005, 01:37 PM
There is only a 16-bit depth buffer in FBOs on ATI because, as far as I know, it's emulated.

If a 24-bit depth buffer were available, the emulation would be too expensive.

Humus
11-09-2005, 06:16 PM
For a regular depth buffer you can have 24 bits. For a depth texture (for instance, for shadow mapping) you can only have 16-bit textures; this is a hardware limitation. The X1600 supports 24-bit depth textures, though.

Mars_999
11-10-2005, 03:56 PM
Originally posted by Humus:
For a regular depth buffer you can have 24 bits. For a depth texture (for instance, for shadow mapping) you can only have 16-bit textures; this is a hardware limitation. The X1600 supports 24-bit depth textures, though.

Ok, I am using this depth map for my water reflections, not shadowing, but same difference... So what about the X1800 XT? Does that support 24-bit? Thanks

Humus
11-10-2005, 07:20 PM
No. The X1600 is a newer revision of the chip and has a couple of features that the X1800 doesn't have: 24-bit depth texture fetch and Fetch4.

execom_rt
11-11-2005, 10:20 AM
The best solution is to try to create a 24-bit depth buffer and, if that fails, fall back to the 16-bit depth component, hoping that the FBO won't be emulated by the ATI OpenGL driver (it shouldn't be, since there is an 'unsupported' status flag).

If Humus can confirm that FBOs never create emulated rendering contexts on ATI, that would be a good thing.

Humus
11-12-2005, 09:24 AM
We only support formats that are actually supported by the hardware.

Mars_999
11-13-2005, 05:28 PM
What error should I look for if glRenderbufferStorageEXT fails? I know the function returns void, so how else would one know whether it succeeded, so I can fall back from 24-bit to 16-bit? I looked through the extension spec and can't find anything. Thanks

zed
11-14-2005, 03:08 PM
i assume this is for shadowmaps?
in that case why do u need 24bit depth?

the only case where i can see a real necessity is point lights covering a huge area.

Humus
11-14-2005, 05:19 PM
Originally posted by Mars_999:
What error should I look for if glRenderbufferStorageEXT fails? I know the function returns void, so how else would one know whether it succeeded, so I can fall back from 24-bit to 16-bit? I looked through the extension spec and can't find anything. Thanks

glCheckFramebufferStatusEXT() is what you're looking for.

Mars_999
11-15-2005, 03:41 PM
Originally posted by zed:
i assume this is for shadowmaps?
in that case why do u need 24bit depth?

the only case where i can see a real necessity is point lights covering a huge area.

I am using it for rendering water, and there is a noticeable difference between 16-bit and 24-bit on my 6800GT. Thanks Humus, I will look into it.

Andrew Lauritzen
11-20-2005, 09:33 AM
Depending on exactly what you're doing, you may be able to just render depth to an fp32 texture and read from that. It may be slightly more expensive than reading from the depth buffer directly, but probably not by much (we haven't had any performance problems doing shadow mapping with floating-point textures, for example).

Of course your hardware would have to support fp32 textures, but I think everything from the Radeon 9500 up does (please correct me if I'm wrong).

Mars_999
11-21-2005, 04:01 PM
Hmmm, I tried 16 vs. 24 on the 6800GT and to my surprise the depth map was no different? So I am assuming that NVIDIA isn't doing 24-bit either, but the driver falls back to 16 if you select 24??