View Full Version : GF 440MX - only 16-bit Z-buffer?
11-24-2003, 10:01 PM
Recently I've been getting terrible Z-fighting in my game on a PC with a GeForce 440 MX video card, and I found out that the reason is a 16-bit Z-buffer. I've tried to set a 24-bit or 32-bit Z-buffer, but without any success. So my question is very simple.
Does the GeForce 440 MX support more than a 16-bit Z-buffer? And if so, how can I enable it?
All(?) consumer cards use the same bit depth for color and for (depth + stencil). If you have a 16-bit desktop and just create a normal window, you only get 16-bit color and a 16-bit depth buffer in your OpenGL render context. If you have a 32-bit desktop, you will get 32-bit color and a 24-bit depth buffer plus an 8-bit stencil buffer.
11-24-2003, 10:35 PM
It is a common problem on GeForce 1 through GeForce 4 cards.
Request a stencil buffer and you will get a 24-bit depth buffer plus an 8-bit stencil buffer.
If you request no stencil buffer, you may get a 16-bit depth buffer, no matter how many depth bits you request in your pixel format or which color depth your desktop uses.
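A minimal Win32 sketch of that advice (configuration only, not a complete program; the function name and the window's `HDC` are assumptions for illustration): request 24 depth bits together with 8 stencil bits so the driver is steered toward a 24/8 format rather than a 16-bit fallback.

```c
#include <windows.h>

/* Sketch: fill a PIXELFORMATDESCRIPTOR that asks for a stencil buffer
   alongside a 24-bit depth buffer, then apply the chosen format. */
static int set_depth24_stencil8_format(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd;
    ZeroMemory(&pfd, sizeof(pfd));
    pfd.nSize        = sizeof(pfd);
    pfd.nVersion     = 1;
    pfd.dwFlags      = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType   = PFD_TYPE_RGBA;
    pfd.cColorBits   = 32;
    pfd.cDepthBits   = 24;  /* ask for 24 rather than 32 */
    pfd.cStencilBits = 8;   /* the stencil request is the key part */

    int format = ChoosePixelFormat(hdc, &pfd);
    if (format == 0)
        return 0;
    return SetPixelFormat(hdc, format, &pfd);
}
```

Note that this still goes through `ChoosePixelFormat`, which the later replies in this thread recommend avoiding in favor of manual enumeration.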
11-24-2003, 11:32 PM
Hm, looks like I've solved the issue. I always requested a 32-bit Z-buffer, and that works just fine on every card except the GF 440 MX. Now I just ask for a 24-bit Z-buffer (instead of 32-bit), and in 32-bit color depth everything works fine. In 16-bit color depth I still always get a 16-bit buffer and Z-fighting :(
On Windows, try not to use ChoosePixelFormat; I think it makes too many mistakes. You can use DescribePixelFormat to get all the information about a format, and then decide for yourself whether it fits.
11-25-2003, 04:10 AM
>>Request a stencil buffer and you will get a 24-bit depth buffer plus an 8-bit stencil buffer.<<
Don't do that just to work around a problem in your own or Microsoft's pixel format selection.
Just enumerate pixel formats with DescribePixelFormat and choose one yourself, or use the wgl variants, which match more exactly.
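A hedged Win32 sketch of that enumeration approach (the function name and selection criteria are illustrative assumptions, not from the thread): walk every format index with `DescribePixelFormat` and pick the first hardware-accelerated RGBA format with at least 24 depth bits.

```c
#include <windows.h>

/* Sketch: enumerate all pixel formats on the device context and return
   the index of the first accelerated, double-buffered RGBA format with
   >= 24 depth bits, or 0 if none is found. */
static int find_depth24_format(HDC hdc)
{
    /* With a NULL descriptor, DescribePixelFormat returns the number of
       available formats; indices are 1-based. */
    int count = DescribePixelFormat(hdc, 1, sizeof(PIXELFORMATDESCRIPTOR), NULL);
    for (int i = 1; i <= count; ++i) {
        PIXELFORMATDESCRIPTOR pfd;
        DescribePixelFormat(hdc, i, sizeof(pfd), &pfd);
        if ((pfd.dwFlags & PFD_DRAW_TO_WINDOW) &&
            (pfd.dwFlags & PFD_SUPPORT_OPENGL) &&
            (pfd.dwFlags & PFD_DOUBLEBUFFER) &&
            !(pfd.dwFlags & PFD_GENERIC_FORMAT) &&  /* skip software fallback */
            pfd.iPixelType == PFD_TYPE_RGBA &&
            pfd.cDepthBits >= 24)
            return i;  /* pass this index to SetPixelFormat */
    }
    return 0;
}
```

The "wgl variants" mentioned above refer to WGL_ARB_pixel_format (wglChoosePixelFormatARB), which lets you specify exact minimum attributes once an extension-capable context is available.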
Powered by vBulletin® Version 4.2.2 Copyright © 2014 vBulletin Solutions, Inc. All rights reserved.