View Full Version : supported depth buffer sizes

08-24-2003, 05:29 AM
I'm looking for a method to discover what depth buffer sizes (4, 8, 16, 32) a video card supports. Ideally, I'd like a uniform (cross-hardware, cross-os) method of doing so.

I'd like to stress that I'm NOT looking for the size of the current context's depth buffer (something like glGetIntegerv(GL_DEPTH_BITS, &depth)) but rather all of the sizes the hardware will support.

08-24-2003, 06:32 AM
DescribePixelFormat() is what you're looking for.
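A sketch of how that might look, assuming you already have a device context (e.g. from GetDC()): DescribePixelFormat() returns the number of pixel formats the DC supports, and filling a PIXELFORMATDESCRIPTOR for each one exposes its depth size in the cDepthBits field. The non-Windows branch below is just a stub so the file compiles anywhere; collect_depth_sizes is my own helper name, not part of the API.

```c
#ifdef _WIN32
#include <windows.h>

/* Walk every pixel format on the DC and gather the distinct depth
   sizes of the OpenGL-capable ones into `out`; returns how many
   distinct sizes were found. */
int collect_depth_sizes(HDC hdc, int out[], int max)
{
    PIXELFORMATDESCRIPTOR pfd;
    /* When queried, DescribePixelFormat() returns the highest
       pixel-format index; indices start at 1. */
    int count = DescribePixelFormat(hdc, 1, sizeof(pfd), &pfd);
    int n = 0;
    for (int i = 1; i <= count; ++i) {
        DescribePixelFormat(hdc, i, sizeof(pfd), &pfd);
        if (!(pfd.dwFlags & PFD_SUPPORT_OPENGL) || pfd.cDepthBits == 0)
            continue;
        int seen = 0;
        for (int j = 0; j < n; ++j)
            if (out[j] == (int)pfd.cDepthBits)
                seen = 1;
        if (!seen && n < max)
            out[n++] = pfd.cDepthBits;
    }
    return n;
}
#else
/* Non-Windows stub: DescribePixelFormat() only exists on Win32. */
int collect_depth_sizes(void *hdc, int out[], int max)
{
    (void)hdc; (void)out; (void)max;
    return 0;
}
#endif
```

Call it with a DC from GetDC(hwnd) (or GetDC(NULL) for the desktop) and print the returned sizes.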

08-25-2003, 11:33 AM
As far as I can tell this is a Windows-only method. Any other suggestions?

08-25-2003, 01:03 PM
Don't know if it's what you're looking for, but you could run a loop over different depth buffer sizes to find all the sizes a card supports. I mean something like this:

for i = 1 to 4
    create a render context with depth = i*8
    query the granted size with glGetIntegerv(GL_DEPTH_BITS, &depth)
    delete the render context

I think this is the easiest way of getting all supported depth buffer sizes independent of the OS. (Note that a driver may round a request up or down, so the granted size is what counts, not the requested one.)

[This message has been edited by PanzerSchreck (edited 08-25-2003).]
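The loop above might be fleshed out like this. create_context_with_depth() is a hypothetical placeholder for the real, platform-specific work (creating a context with the requested depth size and reading back glGetIntegerv(GL_DEPTH_BITS, ...)); the stub here merely simulates a driver that rounds requests up to 16 or 24 bits, so the probing logic itself can run anywhere.

```c
/* Simulated stand-in for real context creation; a real version would
   create the context, read GL_DEPTH_BITS, and return that value,
   or -1 on failure. This stub pretends the card grants 16 or 24 bits. */
static int create_context_with_depth(int requested_bits)
{
    if (requested_bits <= 16) return 16;
    if (requested_bits <= 24) return 24;
    return -1; /* 32-bit request refused */
}

/* Probe 8..32 bits in steps of 8 and record each distinct granted
   size into `out`; returns the number of distinct sizes found. */
int probe_depth_sizes(int out[], int max)
{
    int n = 0;
    for (int i = 1; i <= 4; ++i) {
        int got = create_context_with_depth(i * 8);
        if (got <= 0)
            continue;
        int seen = 0;
        for (int j = 0; j < n; ++j)
            if (out[j] == got)
                seen = 1;
        if (!seen && n < max)
            out[n++] = got;
        /* real code would delete the render context here */
    }
    return n;
}
```

Deduplicating on the granted size matters because, as noted above, requesting 8 bits and requesting 16 bits may both yield the same 16-bit buffer.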

08-25-2003, 10:46 PM
There is no direct solution to your problem. The Win32 and X11 APIs are very different, and there is no cross-OS (display manager) method to do this.

Maybe SDL or another cross-platform OpenGL resource allocator has functions that return some kind of pixel format or visual. (SDL, for instance, lets you request a minimum depth size with SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, n) before creating the window and read back what was actually granted with SDL_GL_GetAttribute.)

If you want to do it yourself, you'll have to write some OS-specific code like:
#ifdef WIN32
08-26-2003, 12:40 AM
You could use a call like the following in a loop to test different depth sizes.

glutInitDisplayString("rgba depth=16 double");
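To turn that call into a probe loop, you'd build one display string per candidate depth size and hand each to glutInitDisplayString(); GLUT can then report whether the mode is achievable via glutGet(GLUT_DISPLAY_MODE_POSSIBLE), without opening a window. The helper below only formats the strings, so the sketch compiles without GLUT; the GLUT calls themselves are shown in the usage comment, and depth_mode_string is my own name, not a GLUT function.

```c
#include <stdio.h>

/* Build a GLUT display string requesting the given depth size,
   e.g. "rgba depth=16 double"; returns the string length. */
int depth_mode_string(char *buf, size_t cap, int depth_bits)
{
    return snprintf(buf, cap, "rgba depth=%d double", depth_bits);
}

/* Usage in a real GLUT program (after glutInit):
 *
 *     char mode[64];
 *     for (int bits = 8; bits <= 32; bits += 8) {
 *         depth_mode_string(mode, sizeof mode, bits);
 *         glutInitDisplayString(mode);
 *         if (glutGet(GLUT_DISPLAY_MODE_POSSIBLE))
 *             printf("depth=%d is supported\n", bits);
 *     }
 */
```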