supported depth buffer sizes

I’m looking for a method to discover which depth buffer sizes (4, 8, 16, 32) a video card supports. Ideally, I’d like a uniform (cross-hardware, cross-OS) method of doing so.

I’d like to stress that I’m NOT looking for the size of the current context’s depth buffer (something like glGetIntegerv(GL_DEPTH_BITS, &depth)) but rather all of the sizes the hardware will support.

DescribePixelFormat() is what you’re looking for.
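For example, a rough sketch of enumerating the formats on Windows (my own sketch, assuming the screen DC is good enough for enumeration; the return value of DescribePixelFormat is the highest pixel format index the driver exposes):

#include <windows.h>
#include <stdio.h>

int main(void)
{
    HDC hdc = GetDC(NULL);            /* screen DC is enough for enumeration */
    PIXELFORMATDESCRIPTOR pfd;
    int count, i;

    /* The return value is the highest pixel format index the driver supports. */
    count = DescribePixelFormat(hdc, 1, sizeof(pfd), &pfd);

    for (i = 1; i <= count; i++) {
        DescribePixelFormat(hdc, i, sizeof(pfd), &pfd);

        /* Only OpenGL-capable formats are interesting here. */
        if (pfd.dwFlags & PFD_SUPPORT_OPENGL)
            printf("format %3d: %2d depth bits, %2d color bits\n",
                   i, pfd.cDepthBits, pfd.cColorBits);
    }

    ReleaseDC(NULL, hdc);
    return 0;
}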

As far as I can tell, this is a Windows-only method. Any other suggestions?

I don’t know if it’s what you’re looking for, but you could do a loop with different depth buffer sizes to get all the depth buffer sizes that are supported on a card. I mean something like this:

for i := 1 to 4 do
begin
  Create render context with depth = i*8
  Get depth buffer's size (glGetIntegerv(GL_DEPTH_BITS, &depth))
  Delete render context
end

I think that this is the easiest way of getting all supported depth buffer sizes independent of the OS.
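A rough sketch of that loop with GLUT (my assumption, using glutInitDisplayString from GLUT 3.7+; the GLUT_DISPLAY_MODE_POSSIBLE check guards against sizes the implementation cannot provide at all):

#include <stdio.h>
#include <GL/glut.h>

int main(int argc, char **argv)
{
    int i;
    char mode[32];

    glutInit(&argc, argv);

    for (i = 1; i <= 4; i++) {
        GLint depth = 0;
        int win;

        /* Create render context with depth = i*8. */
        sprintf(mode, "rgba depth=%d double", i * 8);
        glutInitDisplayString(mode);

        /* Skip sizes the implementation cannot provide at all. */
        if (!glutGet(GLUT_DISPLAY_MODE_POSSIBLE)) {
            printf("%2d bits: not available\n", i * 8);
            continue;
        }

        win = glutCreateWindow("depth probe");

        /* Get depth buffer's size. */
        glGetIntegerv(GL_DEPTH_BITS, &depth);
        printf("requested %2d bits, got %d bits\n", i * 8, (int)depth);

        /* Delete render context. */
        glutDestroyWindow(win);
    }
    return 0;
}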


There is no direct solution for your problem. The Win32 and X11 APIs are very different, and there is no cross-OS (display manager) method to do that.

Maybe SDL or another cross-platform OpenGL resource allocator has functions that return some kind of pixel format or visual.
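For instance, a rough SDL sketch (my assumption, SDL 1.2 style; SDL_GL_GetAttribute reports what was actually allocated, which may differ from the requested size):

#include <SDL.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    int sizes[] = { 16, 24, 32 };
    int i;

    for (i = 0; i < 3; i++) {
        if (SDL_Init(SDL_INIT_VIDEO) < 0)
            return 1;

        /* Ask for an OpenGL surface with the desired depth buffer size. */
        SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, sizes[i]);

        if (SDL_SetVideoMode(64, 64, 0, SDL_OPENGL) != NULL) {
            int actual = 0;
            SDL_GL_GetAttribute(SDL_GL_DEPTH_SIZE, &actual);
            printf("requested %d depth bits, got %d\n", sizes[i], actual);
        } else {
            printf("%d depth bits: no matching visual/pixel format\n", sizes[i]);
        }

        SDL_Quit();   /* tear down so the next request starts from scratch */
    }
    return 0;
}

Note that SDL wants main to have the (int, char *[]) signature because of its SDL_main wrapper on some platforms.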

If you want to do that yourself, you’ll have to write some OS-specific code like:
#ifdef WIN32

#else

#endif
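On the X11 side, that branch could look roughly like this (a sketch on my part, using GLX 1.2 calls; the Win32 branch would use DescribePixelFormat as mentioned above):

#include <stdio.h>
#include <X11/Xlib.h>
#include <GL/glx.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    int sizes[] = { 16, 24, 32 };
    int i;

    if (!dpy)
        return 1;

    for (i = 0; i < 3; i++) {
        /* Ask for an RGBA visual with at least sizes[i] bits of depth. */
        int attribs[] = { GLX_RGBA, GLX_DEPTH_SIZE, sizes[i], None };
        XVisualInfo *vi = glXChooseVisual(dpy, DefaultScreen(dpy), attribs);

        if (vi) {
            int actual = 0;
            glXGetConfig(dpy, vi, GLX_DEPTH_SIZE, &actual);
            printf("requested %d depth bits, visual offers %d\n", sizes[i], actual);
            XFree(vi);
        } else {
            printf("no visual with %d depth bits\n", sizes[i]);
        }
    }

    XCloseDisplay(dpy);
    return 0;
}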

You could use the following calls in a loop to test different depth sizes.

glutInitDisplayString("rgba depth=16 double");
if (!glutGet(GLUT_DISPLAY_MODE_POSSIBLE)) ....
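Put in a loop, that could look like this (a sketch, assuming GLUT 3.7+ or freeglut, which provide glutInitDisplayString and GLUT_DISPLAY_MODE_POSSIBLE; no windows need to be created at all):

#include <stdio.h>
#include <GL/glut.h>

int main(int argc, char **argv)
{
    int sizes[] = { 4, 8, 16, 24, 32 };
    int i;
    char mode[32];

    glutInit(&argc, argv);

    for (i = 0; i < 5; i++) {
        /* Describe the desired frame buffer, then ask GLUT whether it is possible. */
        sprintf(mode, "rgba depth=%d double", sizes[i]);
        glutInitDisplayString(mode);

        printf("%2d-bit depth buffer: %s\n", sizes[i],
               glutGet(GLUT_DISPLAY_MODE_POSSIBLE) ? "supported" : "not supported");
    }
    return 0;
}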