Depth Buffer: clipping gets very bad when polys are far away

08-18-2000, 01:59 PM
I'm having a problem where polygons that intersect each other create a very jagged line where they intersect, rather than a smooth one, when they are far away from the camera. I thought of changing the depth buffer to 32 bits rather than what I assume is the default of 16, but any attempt I made wouldn't change the bit depth (perhaps only 16 is supported by my computer?). I did find out about glDepthRange(), but changing it only made my problem worse (more jagged).

Anyone have any other ideas on how I could fix this, or some code to change the depth buffer bit depth that you know works?

08-18-2000, 02:15 PM
If you can, try moving your near clipping plane farther out.

(In your glOrtho/glFrustum/gluPerspective call)

08-18-2000, 02:21 PM
Well, that helps. I knew about resizing my clipping planes before though, and I didn't want to use it because I still want to be able to get up close and very far away from things. I should have mentioned that. Thanks for the help though (and man, you replied fast! I put this up just a couple minutes ago).

08-19-2000, 01:50 AM
Well, nothing is perfect, not even the depth buffer. If you want good precision at both the near and far plane, the only way to succeed is to either use a small near-to-far ratio, or a depthbuffer with mode bits. A good near:far ratio for a 16-bit depth buffer is not more than 1e4 (ratio = far_plane/near_plane). If you have a bigger ratio, the jaggies will get bigger and bigger. The only way to solve it is to place the far plane as close as possible, and the near plane as far away as possible. No other way.

08-19-2000, 10:12 AM
Darn, I was afraid of that. But I keep thinking about games like Q3, Half-Life, etc. that don't seem to have this problem (well, I have noticed it in Half-Life when things get extremely far away, but that's rare. That has to do with BSP trees, doesn't it?). What do you mean by a depthbuffer with mode bits though?

[This message has been edited by Daecho (edited 08-19-2000).]

08-19-2000, 10:25 AM
It was a typo, it was supposed to say 'more bits' :D

Indoor games like Q3A and HL use a far plane that is close to the camera, since you generally don't see that far indoors. And you said HL had this problem in some rare cases; the situation you mentioned makes sense to me. Far-away objects require that the far plane be further away, and if you want to keep the near plane at the same position, you get a greater near:far ratio, which in turn generates nasty artifacts further away.

08-19-2000, 10:32 AM
So, in other words, dynamically adjust the far plane to adapt to the environment (i.e. set it just beyond the farthest thing you can see)?

A second thought: say you had a door in a room, and the outside was only a small part of the screen. Would it be possible to simply do a two-pass render, first rendering the outside with the appropriate range, then drawing the inside over that?

08-19-2000, 10:34 AM
Ok, so a depth buffer with more bits CAN help? That's what I was attempting to pursue in the first place, but I couldn't get it to work. I tried to make it 32-bit (instead of 16). Is it something special that has to be supported by your hardware, or can everyone change it?
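For reference, on Windows the depth size is part of the pixel format, so the place to ask for more bits is the PIXELFORMATDESCRIPTOR. A hedged sketch (the device context `hdc` and the surrounding window setup are assumed; `cDepthBits` is only a request, and the driver is free to hand back 16 bits regardless):

```c
#include <windows.h>

/* Request a deep depth buffer and report what the driver actually gives. */
int choose_deep_depth_format(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd;
    ZeroMemory(&pfd, sizeof(pfd));
    pfd.nSize      = sizeof(pfd);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 24;
    pfd.cDepthBits = 32;   /* ask for 32; 24 is more widely supported */

    int fmt = ChoosePixelFormat(hdc, &pfd);
    if (fmt == 0)
        return 0;

    /* See what we actually got -- a 16-bit-only card shows up here. */
    DescribePixelFormat(hdc, fmt, sizeof(pfd), &pfd);
    return pfd.cDepthBits;  /* may be 16 even though we asked for 32 */
}
```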

08-19-2000, 02:16 PM
Naerbnic: Yes, rendering in two passes can be a solution. But you have to recalculate the entire projection matrix every frame, and I'm not sure that cost is worth the image quality you gain.

Daecho: A depthbuffer with more bits WILL help! Maybe your system doesn't support a 32-bit depthbuffer, and that's why you can't find any suitable pixelformat.

08-20-2000, 11:26 AM
One word: Dang.

Yeah, I used that glGetIntegerv(GL_DEPTH_BITS, &bitsSupported) call, and 16 is the only number it lists that makes sense (the other numbers are just gibberish). Well, thanks for helping out, guys. I just realized I don't know why I'm pursuing this problem, 'cause the game I'm making right now is a side-scroller and I'll never see it. :) I get sidetracked too much.

David Eckel

08-20-2000, 01:56 PM
Quake doesn't have that problem because it uses a BSP. With a BSP, you get perfect back-to-front order.

08-21-2000, 02:32 AM
Quake uses a BSP for static geometry. AFAIK the BSP doesn't help with moving objects (monsters, players, etc.), because recompiling it per frame would be far too slow, so the Z-buffer is still needed.

I remember seeing depth-precision artifacts with monsters in Q1.