View Full Version : Stencil buffer problem ?
I have a big problem.
I cannot get the stencil buffer to work.
When I draw an object, OpenGL displays it in the frame buffer whether I call "glStencilFun(GL_NEVER,..." or "glStencilFun(GL_ALWAYS,..." before it.
What is my mistake?
Thank you in advance for your help.
06-30-2005, 09:35 PM
I assume you have enabled the stencil test with
glEnable(GL_STENCIL_TEST);? :D
06-30-2005, 10:41 PM
First take a step back and check if you have requested and selected a pixelformat with stencil bits.
glGetIntegerv(GL_STENCIL_BITS, &stencilBits) will tell you if you actually got some.
You should have cleared the stencil buffer with glClear(GL_STENCIL_BUFFER_BIT) before rendering into it.
You need to enable glEnable(GL_STENCIL_TEST) to get any data rendered into the stencil buffer.
Then set glStencilFunc(), glStencilOp(), and glStencilMask() the way you need them.
Render your stuff into the stencil buffer.
If you don't want this to occur in the color or depth buffer, use glColorMask() and/or glDepthMask() to disable writes to the respective buffer.
Enable the write masks again, change your stencil funcs and render stuff which should be tested somehow against the previous stencil data.
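A minimal sketch of the sequence described above, assuming a context that was actually created with stencil planes (the `drawMaskShape()` and `drawScene()` helpers are placeholders for your own rendering code, not real API calls):

```c
/* Sketch of the two-pass stencil workflow from Relic's post.
   Assumes the pixel format was requested WITH stencil bits
   (e.g. GLUT_STENCIL in glutInitDisplayMode, or cStencilBits
   in the PIXELFORMATDESCRIPTOR on Windows). */

GLint stencilBits = 0;
glGetIntegerv(GL_STENCIL_BITS, &stencilBits);
if (stencilBits == 0) {
    /* No stencil planes in this pixel format: the stencil test
       silently does nothing, which matches the symptom above. */
}

glClearStencil(0);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT | GL_STENCIL_BUFFER_BIT);
glEnable(GL_STENCIL_TEST);

/* Pass 1: write 1 into the stencil buffer wherever the mask shape
   is rasterized, without touching the color or depth buffers. */
glStencilFunc(GL_ALWAYS, 1, 0xFF);
glStencilOp(GL_KEEP, GL_KEEP, GL_REPLACE);
glStencilMask(0xFF);
glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
glDepthMask(GL_FALSE);
drawMaskShape();   /* hypothetical helper: renders the stencil mask */

/* Pass 2: re-enable color/depth writes and draw only where the
   stencil buffer equals 1. */
glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
glDepthMask(GL_TRUE);
glStencilFunc(GL_EQUAL, 1, 0xFF);
glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
drawScene();       /* hypothetical helper: renders the clipped scene */
```

Note that none of this has any effect unless glEnable(GL_STENCIL_TEST) is called and the framebuffer actually has stencil bits; with GL_NEVER or GL_ALWAYS behaving identically, as in the original question, a missing enable or a zero-stencil pixel format is the usual culprit.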
07-01-2005, 01:03 AM
You wrote "glStencilFun(GL_NEVER,..." :confused:
Don't forget the c from "function": it is
"glStencilFunc", as Relic wrote.