View Full Version : supported depth buffer sizes



hijinx
08-24-2003, 06:29 AM
I'm looking for a method to discover which depth buffer sizes (4, 8, 16, 32) a video card supports. Ideally, I'd like a uniform (cross-hardware, cross-OS) method of doing so.

I'd like to stress that I'm NOT looking for the size of the current context's depth buffer (something like glGetIntegerv(GL_DEPTH_BITS, &depth)) but rather all of the sizes the hardware will support.

AdrianD
08-24-2003, 07:32 AM
DescribePixelFormat() is what you're looking for.
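For reference, a minimal Win32 sketch of that approach: call DescribePixelFormat() once to learn how many pixel formats the device supports, then walk them all and read the depth bits of each OpenGL-capable one. (This assumes you already have a valid HDC from somewhere, e.g. GetDC() on a window.)

```c
#include <windows.h>
#include <stdio.h>

/* Print the depth buffer size of every OpenGL-capable pixel format
 * on the given device context. */
void list_depth_sizes(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd;
    /* DescribePixelFormat returns the total number of pixel formats
     * supported by the device, regardless of the index passed in. */
    int count = DescribePixelFormat(hdc, 1, sizeof(pfd), &pfd);
    int i;
    for (i = 1; i <= count; ++i) {
        DescribePixelFormat(hdc, i, sizeof(pfd), &pfd);
        if (pfd.dwFlags & PFD_SUPPORT_OPENGL)
            printf("format %d: %d depth bits\n", i, pfd.cDepthBits);
    }
}
```

Note that software-only formats show up in the list too; check PFD_GENERIC_FORMAT in dwFlags if you only want accelerated ones.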

hijinx
08-25-2003, 12:33 PM
As far as I can tell this is a Windows-only method. Any other suggestions?

PanzerSchreck
08-25-2003, 02:03 PM
Don't know if it's what you're looking for, but you could do a loop with different depth buffer sizes to get all depth buffer sizes that are supported on a card. I mean something like this:


for i := 1 to 4 do
begin
  Create render context with depth = i*8
  Get depth buffer's size (glGetIntegerv(GL_DEPTH_BITS, &depth))
  Delete render context
end

I think that this is the easiest way of getting all supported depth buffer sizes independent of the OS.

[This message has been edited by PanzerSchreck (edited 08-25-2003).]

errno
08-25-2003, 11:46 PM
There is no direct solution to your problem. The Win32 and X11 APIs are very different, and there is no cross-OS (display manager) method to do that.

Maybe SDL or another cross-platform OpenGL resource allocator has some functions which return some kind of pixel format or visual.

If you want to do that yourself, you'll have to write some OS-specific code like:
#ifdef WIN32
...
#else
...
#endif
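On the X11 side of that #ifdef, the probing could look like this sketch, assuming an open Display* (e.g. from XOpenDisplay(NULL)):

```c
#include <X11/Xlib.h>
#include <GL/glx.h>

/* Ask glXChooseVisual whether a visual with at least 'bits' of depth
 * buffer exists on the default screen.  Returns 1 if so, 0 if not. */
int depth_supported(Display *dpy, int bits)
{
    int attribs[] = { GLX_RGBA, GLX_DEPTH_SIZE, bits, None };
    XVisualInfo *vi = glXChooseVisual(dpy, DefaultScreen(dpy), attribs);
    if (vi) {
        XFree(vi);
        return 1;
    }
    return 0;
}
```

Keep in mind that GLX_DEPTH_SIZE is a minimum, not an exact request; use glXGetConfig() on the returned visual to read the depth size you actually got.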

Adrian
08-26-2003, 01:40 AM
You could use the following calls in a loop to test different depth sizes.

glutInitDisplayString("rgba depth=16 double");
if (!glutGet(GLUT_DISPLAY_MODE_POSSIBLE)) ....
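Put together, that GLUT probe might look like the sketch below. It assumes glutInit() has already been called, and glutInitDisplayString() requires GLUT 3.7 or freeglut.

```c
#include <stdio.h>
#include <GL/glut.h>

/* Probe which depth buffer sizes GLUT can satisfy.
 * glutInit(&argc, argv) must already have been called. */
void probe_depth_sizes(void)
{
    char spec[64];
    int bits;
    for (bits = 8; bits <= 32; bits += 8) {
        sprintf(spec, "rgba depth=%d double", bits);
        glutInitDisplayString(spec);
        if (glutGet(GLUT_DISPLAY_MODE_POSSIBLE))
            printf("depth=%d is possible\n", bits);
    }
}
```

This has the advantage of being cross-platform, at the cost of pulling in GLUT just for the query.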