I’m looking for a method to discover which depth buffer sizes (4, 8, 16, 32) a video card supports. Ideally, I’d like a uniform (cross-hardware, cross-OS) way of doing so.
I’d like to stress that I’m NOT looking for the size of the current context’s depth buffer (something like glGetIntegerv(GL_DEPTH_BITS, &depth)) but rather all of the sizes the hardware will support.
I don’t know if it’s exactly what you’re looking for, but you could loop over different depth buffer sizes to find all the sizes a card supports. I mean something like this:
for i := 1 to 4 do
begin
  Create a render context requesting a depth buffer of i*8 bits
  Query the depth buffer size you actually got (glGetIntegerv(GL_DEPTH_BITS, &depth))
  Delete the render context
end
I think this is the easiest way of getting all supported depth buffer sizes independent of the OS.
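To make that loop concrete, here’s a minimal sketch in C. It assumes GLFW 3 for the cross-platform context creation (GLFW isn’t part of the suggestion above; any context-creation API such as WGL, GLX or SDL would work the same way):

#include <stdio.h>
#include <GLFW/glfw3.h>

int main(void)
{
    if (!glfwInit())
        return 1;

    /* Try requesting depth buffers of 8, 16, 24 and 32 bits. */
    for (int i = 1; i <= 4; i++)
    {
        glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE);  /* hidden window; we only need the context */
        glfwWindowHint(GLFW_DEPTH_BITS, i * 8);

        GLFWwindow *win = glfwCreateWindow(64, 64, "depth probe", NULL, NULL);
        if (!win)
        {
            printf("%2d-bit depth: context creation failed\n", i * 8);
            continue;
        }

        glfwMakeContextCurrent(win);

        GLint depth = 0;
        glGetIntegerv(GL_DEPTH_BITS, &depth);      /* the size we actually got */
        printf("requested %2d bits, got %d bits\n", i * 8, depth);

        glfwDestroyWindow(win);
    }

    glfwTerminate();
    return 0;
}

One caveat: the requested size is only a hint, and the driver is free to hand back a context with more depth bits than you asked for, so the value returned by glGetIntegerv is the one to trust, not the value you requested.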