View Full Version : Depth Buffer Bit
02-17-2004, 01:36 PM
My program displays a 3D model, which works fine on my development machine, but when I transfer the program to another machine I get weird graphic behaviour (clipping and such). I changed the OpenGL settings on that machine to set the depth buffer to 24 bits instead of 16 and it worked fine (Intel graphics). I obviously don't want to have to do this on every machine. Is this a problem with depth testing? I have enabled GL_DEPTH_TEST; what else do I have to do? Any ideas?
02-17-2004, 02:07 PM
I have some thoughts, but it's hard to say without a screenshot. Could you post a link to a screenshot?
02-17-2004, 02:38 PM
Sorry, at this point I am unable to do that. :(
My 3D model is made up of separate parts. It is the parts behind other parts that interfere with each other: I get sawtooth interference where they overlap, and I can see parts of the hidden model that I should not see. As I said, this does not occur when I change to a 24-bit depth buffer. Would you know how to change that within the program? (I don't exactly know what this setting does.)
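The depth buffer size is requested when the OpenGL context is created, so where it goes depends on the windowing code. As a sketch, assuming a Win32/WGL context (GLUT, SDL, and other toolkits have their own equivalents), it is set in the PIXELFORMATDESCRIPTOR; `hdc` below stands in for the window's device context:

```c
/* Sketch, assuming Win32/WGL context creation. Note that cDepthBits is
   only a request -- the driver may still return a smaller depth buffer,
   which is why fixing the clip planes is the more portable solution. */
PIXELFORMATDESCRIPTOR pfd = {0};
pfd.nSize      = sizeof(pfd);
pfd.nVersion   = 1;
pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
pfd.iPixelType = PFD_TYPE_RGBA;
pfd.cColorBits = 32;
pfd.cDepthBits = 24;   /* ask for 24 depth bits instead of 16 */

int format = ChoosePixelFormat(hdc, &pfd);
SetPixelFormat(hdc, format, &pfd);
```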
[This message has been edited by Corremn (edited 02-17-2004).]
02-17-2004, 03:41 PM
You are losing depth buffer precision. The best way to fix this is to move your near clip plane farther out, or the far clip plane inward. Keep in mind that changes to the near plane will be much more significant.
02-17-2004, 04:54 PM
Yep, thanks, that was easy. :)
Powered by vBulletin® Version 4.2.2 Copyright © 2016 vBulletin Solutions, Inc. All rights reserved.