How to avoid 16-bit Z-buffer artefacts?

I coded a screensaver with OpenGL. It uses a 32-bit color buffer and Z-buffer if available. On my NVidia card everything looks fine, but on a Matrox G400 it looks ugly: all edges have a ‘sawtooth’ shape… I can reproduce the artefacts on my Riva TNT1 by running in a window on a 16-bit desktop, so I think the 16-bit Z-buffer is to blame. The newest Matrox OpenGL driver has a checkbox ‘use 32Bit Zbuffer’, but it makes no difference…
Is there a way to avoid these artefacts? Please help.
thanx in advance
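
For reference, a minimal Win32 sketch of requesting a deep Z-buffer and checking what the driver actually granted (SetupPixelFormat is a hypothetical helper; ChoosePixelFormat only returns the closest match, so on a 16-bit desktop you may get a 16-bit Z-buffer no matter what you ask for):

#include <windows.h>
#include <GL/gl.h>

/* Ask for 24 depth bits and report how many we really got. */
int SetupPixelFormat(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd;
    int format;

    ZeroMemory(&pfd, sizeof(pfd));
    pfd.nSize      = sizeof(pfd);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 32;   /* requested, not guaranteed */
    pfd.cDepthBits = 24;   /* requested, not guaranteed */

    format = ChoosePixelFormat(hdc, &pfd);
    if (format == 0 || !SetPixelFormat(hdc, format, &pfd))
        return 0;

    /* see what the driver really granted */
    DescribePixelFormat(hdc, format, sizeof(pfd), &pfd);
    return pfd.cDepthBits;   /* e.g. 16 on a 16-bit desktop */
}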

Set your near plane as far away as possible. I’ve found that 1/1000 of your far plane works well. So, if your far plane is at 20,000, put your near plane at 20.
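
A sketch of that, assuming a standard gluPerspective projection (the 60-degree field of view is just an example value):

#include <windows.h>
#include <GL/gl.h>
#include <GL/glu.h>

void SetProjection(double aspect)
{
    const double zFar  = 20000.0;
    const double zNear = zFar / 1000.0;   /* near at 1/1000 of far = 20.0 */

    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective(60.0, aspect, zNear, zFar);
    glMatrixMode(GL_MODELVIEW);
}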

-BwB

Not sure if my theory is right, but I think that if you set the near plane to 1/65536 of the far plane (65536 being the number of distinct values in 16 bits), you get 1 unit of precision, so 1/6553.6 of the far plane would give you 0.1 units of precision. Units are whatever you are passing to OpenGL.

The Z-buffer isn’t linear: the closer you get to the near plane, the more precise it gets, so if you make the near plane very small you pack most of the precision right in front of the camera and lose it everywhere else.
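
To make that concrete: with the usual perspective projection, window depth is z_win = f*(z - n) / (z*(f - n)), which is hyperbolic in the eye-space distance z, so one depth-buffer step covers an eye-space range that grows with z squared and shrinks as the near plane n moves out. A small sketch (the plane positions and sample distances are just illustrative):

#include <stdio.h>

/* Eye-space distance covered by one depth-buffer step at distance z,
   i.e. 1/2^bits divided by d(z_win)/dz = f*n / ((f - n) * z * z). */
static double StepSize(double n, double f, double z, int bits)
{
    double steps = (double)(1 << bits);   /* 65536 for a 16-bit Z-buffer */
    return ((f - n) * z * z) / (f * n * steps);
}

int main(void)
{
    const double f = 20000.0;
    const double dist[] = { 100.0, 1000.0, 10000.0, 20000.0 };
    int i;

    for (i = 0; i < 4; i++)
        printf("z = %8.0f   step with near=0.3: %9.2f   with near=20: %8.4f\n",
               dist[i], StepSize(0.3, f, dist[i], 16),
               StepSize(20.0, f, dist[i], 16));
    return 0;
}

With these numbers, moving the near plane from 0.3 to 20 makes every depth step roughly 67 times finer.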

BwB: Thanx, you are right, the artefacts disappear when I move the near plane towards the far plane.

Thanx to coco and blaze too…