PDA

View Full Version : Stitching problem (inaccurate z test in OpenGL)



Fastian
11-29-2001, 08:25 PM
Hi,

I played Facing Worlds (CTF) in Unreal Tournament and it looked great in D3D on my TNT and GeForce 2 MX. Then I experimented and switched the driver to OpenGL in UT. The next time I played Facing Worlds, it looked very bad. The opposite castle showed strange triangle edges; sometimes the back polygon appeared in front and vice versa. When I came close to that castle, the visual glitches started to disappear. Thinking that might be UT's bad OpenGL implementation, I ignored it. Then one day I was playing Quake 3: Team Arena and noticed the exact same problem in some of the maps, i.e. when I looked at a structure from far away, the problem appeared. I also downloaded the Facing Worlds conversion for Quake 3, and it showed the same stitching problems, only a lot worse.

The question here is: is OpenGL not capable of accurate depth tests?
I know about glPolygonOffset, and I don't think John Carmack or the UT people are careless enough to ignore it, but these visual glitches are still there. What might be the problem? Could it be a driver issue? The problem appears on both cards, i.e. TNT and GF2 MX.

Fastian

Bob
11-29-2001, 11:00 PM
It's your depth buffer that doesn't have enough precision to give you precise depth tests.

You say you look at a structure from far away, so I assume this map has some large open areas. Games like UT and Q3A are designed primarily as indoor games and have problems with outdoor environments. It's because of the way perspective projection works: you have more depth precision close to the viewpoint and less precision further away. If you can see far enough, you lose too much precision, and these problems occur. It's a mathematical problem, and the only way to solve it (well, not solve it, but make it appear less often) is to make the view volume smaller or increase the precision of the depth buffer.

So, if you run your games in 16-bit color, you get a 16-bit depth buffer on these cards. Try setting the color depth to 32-bit, which will give you a 24-bit depth buffer. This will increase the precision and maybe "eliminate" the problem.

Rob The Bloke
11-30-2001, 03:05 AM
The problem is that GF2 MXs have 16-bit depth buffers. Compare the card with a GF2 GTS or better and the problem isn't there. I've seen it a lot on these cards.

Shag
11-30-2001, 03:44 AM
I think Bob is correct. Q3A uses a BSP to render polys in back-to-front order for indoor scenes, but I believe another method may be used for large-scale outdoor scenes.

Rob, the GeForce2 MX 400 does have a 24-bit depth buffer - I'm playing with one at the moment. But I must say it's very disappointing. Slower than my GeForce DDR in many cases.

Bob
11-30-2001, 04:20 AM
It's probably slower because you are fill-rate limited. The only (major) difference between the MX and the GTS is clock speed. That's why the GTS performs better: it runs at a higher frequency.

And yes, the MX can use a 24-bit depth buffer, in the same way the GTS can use a 16-bit depth buffer. So the problem is not MX-specific; it's there on the GTS as well.

WhatEver
12-01-2001, 08:35 PM
Set your color depth to 32-bit; that'll fix it right up.

WhatEver
12-01-2001, 08:36 PM
Maybe I should read Bob's whole post :)