
View Full Version : Depth Buffer Bit



Corremn
02-17-2004, 02:36 PM
My program displays a 3D model, which works fine on my development machine, but when I transfer the program to another machine I get weird graphic behaviour (clipping and such). I changed the OpenGL settings on that machine to set the depth buffer to 24 bits instead of 16 and it worked fine (Intel graphics). I obviously don't want to have to do this for every machine. Is this a problem with depth testing? I have enabled GL_DEPTH_TEST; what else do I have to do? Any ideas?

endash
02-17-2004, 03:07 PM
I have some thoughts, but it's hard to say without a screenshot. Could you post a link to a screenshot?

Corremn
02-17-2004, 03:38 PM
Sorry, at this point I am unable to do that. :(
My 3D model is made up of separate parts, and it is the parts behind other parts that interfere with each other. I get sawtooth interference where they overlap, and I can see parts of the hidden model that I should not see. As I said, this does not occur when I change to a 24-bit depth buffer. Would you know how to change that within the program? (I don't exactly know what this setting does.)


[This message has been edited by Corremn (edited 02-17-2004).]

chowe6685
02-17-2004, 04:41 PM
You are losing depth buffer precision. The best way to fix this is to either move your near clip plane out or bring the far clip plane in. Keep in mind that changes to the near plane are much more significant.

Corremn
02-17-2004, 05:54 PM
Yep, thanks, that was easy. :)