Hi!
We’re now coding a very large scenario and triangles sometimes “cross” each other (think of a terrain and a huge quad to create the sea). When this happens in the distance, we get an awful sawtooth pattern due to z-buffer inaccuracy. We solved this a while back with some NVIDIA drivers which seemed to work with a 32-bit z-buffer, and the problem was gone.
Now those drivers are gone too, and we’re asking if there is some software method to enforce the z-buffer depth. We are setting the PIXELFORMATDESCRIPTOR’s depth bits to 32, but that does not seem to do the trick.
Suppose you want the range 1 through 100000 (which is too much range for a typical Z buffer to cover with any degree of accuracy). You can either give up on the “1” and move the near plane out to 10 or 100 to increase resolution, or you can draw twice.
In general, NVIDIA chips support 16-bit z in 16-bit color resolutions, and 24-bit z plus 8-bit stencil in 32-bit color resolutions.
Enumerate all the pixel formats with DescribePixelFormat() and check the desired fields in the available PIXELFORMATDESCRIPTORs.
Be careful not to end up with a PFD_GENERIC_FORMAT (unaccelerated software rendering).
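The enumeration might look roughly like this (a Windows-only sketch; the function name and the printed fields are mine, and `hdc` is assumed to be a valid device context for your window):

```c
#include <windows.h>
#include <stdio.h>

/* Enumerate every pixel format of a device context and print the
   color/depth/stencil bits of the hardware-accelerated OpenGL ones. */
void ListDepthFormats(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd;
    /* DescribePixelFormat returns the number of available formats. */
    int count = DescribePixelFormat(hdc, 1, sizeof(pfd), NULL);

    for (int i = 1; i <= count; ++i) {
        DescribePixelFormat(hdc, i, sizeof(pfd), &pfd);

        /* Skip PFD_GENERIC_FORMAT (Microsoft's software renderer),
           unless PFD_GENERIC_ACCELERATED is also set. */
        if ((pfd.dwFlags & PFD_GENERIC_FORMAT) &&
            !(pfd.dwFlags & PFD_GENERIC_ACCELERATED))
            continue;

        if (pfd.dwFlags & PFD_SUPPORT_OPENGL)
            printf("format %d: color %d, depth %d, stencil %d\n",
                   i, pfd.cColorBits, pfd.cDepthBits, pfd.cStencilBits);
    }
}
```

If no format with cDepthBits of 32 shows up in this list, the hardware simply doesn’t offer one, and asking for 32 in the descriptor will only get you the closest match.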
But as said, the best method to avoid z-fighting (“bleeding”) is to make the ratio of zFar/zNear as small as possible.
In most cases you won’t draw the whole depth of your landscape. So, e.g., if the landscape is 1000000 x 1000000 but your viewing depth should only be 1000, then set your far clipping plane to 1000.
My rule of thumb is: Keep zFar/zNear well below the maximum number of discrete z values you can distinguish with the number of depth bits.
Which means: don’t set zFar and zNear to arbitrary values, but enclose your world as tightly as possible. Increase zNear where possible.