Stencil test doesn't work on some cards

I’m trying to do a simple stencil test to clip some data. It works on my development system (a laptop with an ATI Mobility X300) but does not clip on an Nvidia Quadro FX 550. I looked at various tutorials and thought I had it right (until I tried the other system).

Here’s what I’m doing (the comments were there to help me as I worked through this, and are not necessarily accurate):


Gl.glClear(Gl.GL_STENCIL_BUFFER_BIT);
// don't draw things to the screen for now
Gl.glColorMask(Gl.GL_FALSE, Gl.GL_FALSE, Gl.GL_FALSE, Gl.GL_FALSE);
Gl.glEnable(Gl.GL_STENCIL_TEST);    // Enable Stencil Test

// now we set up the stencil. First the test always passes,
// and the reference value and mask are both 1
Gl.glStencilFunc(Gl.GL_ALWAYS, 1, 1);
// write the reference value into the stencil buffer wherever we draw
Gl.glStencilOp(Gl.GL_REPLACE, Gl.GL_REPLACE, Gl.GL_REPLACE);
DrawClippingShape();    // the shape to clip against
Gl.glColorMask(Gl.GL_TRUE, Gl.GL_TRUE, Gl.GL_TRUE, Gl.GL_TRUE);
Gl.glStencilFunc(Gl.GL_EQUAL, 1, 1);    // now only draw where the stencil is 1
Gl.glStencilOp(Gl.GL_KEEP, Gl.GL_KEEP, Gl.GL_KEEP);    // and leave the stencil alone

I’m not doing much other setup for this. At startup, I do:


Gl.glClearStencil(0);   // clear the stencil buffer to 0s

So what have I missed?

Perplexedly,

Steve Bass

Make sure the pixel format you actually got has some stencil bits.

I’m not sure I understand what you mean.

I’ve added calls to draw the clipping shape to my draw function (so I can be sure I am drawing where I expect). This works on both systems.

The drawing code is pretty simple: I’m drawing an ellipsoid. After setting up the environment, this is about all I do:


Gl.glPushMatrix();
Gl.glTranslatef(xStart + xRadius, yStart + yRadius, zStart + zRadius);
Gl.glScalef(1.0f, yRadius / xRadius, zRadius / xRadius);
Glu.gluSphere(quadric, (double)xRadius, 50, 50);
Gl.glPopMatrix();

When called from the draw routine, this shows up as expected. On the ATI system it also clips to the stencil. On the Nvidia systems (including a laptop with a Quadro FX 350M) it doesn’t clip to the stencil.

Thanks,

Steve

I meant try calling glGetIntegerv(GL_STENCIL_BITS, &bits) and see if you get a non-zero value back. Your pixel format may not have any stencil bits. If you get 8 back, then the problem is elsewhere. If you get zero, then check your ChoosePixelFormat() code.
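In Tao syntax (which is what your snippets look like), a sketch of that check might look like the following — it has to run after the GL context is created and made current, or you’ll just get back zero or garbage:

```
// query how many stencil bits the current context actually has
int[] stencilBits = new int[1];
Gl.glGetIntegerv(Gl.GL_STENCIL_BITS, stencilBits);
Console.WriteLine("Stencil bits: " + stencilBits[0]);   // 0 means no stencil buffer at all
```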

That pointed me in the right direction. StencilBits was being set to 0 in the designer code in Visual Studio. The ATI cards ignored that and gave me a stencil buffer anyway, so I thought I had done everything right.
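For anyone hitting the same thing: if you’re using Tao’s SimpleOpenGlControl (which is what the designer generates for me — the control name below is just the designer’s default, so adjust to taste), the fix is to request the stencil bits before the context is created:

```
// configuration that must happen before the GL context exists:
// the designer default for StencilBits is 0, which is why Nvidia
// drivers handed back a format with no stencil buffer
simpleOpenGlControl1.StencilBits = 8;
simpleOpenGlControl1.DepthBits = 24;    // be explicit about depth while you're at it
simpleOpenGlControl1.InitializeContexts();
```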

Thanks for the insight. It got me to look in the right places and solve the problem.

Steve

Just for clarity: you may end up with a better pixel format than you asked for. The driver doesn’t have to expose pixel formats without stencil. I don’t know about earlier hardware, but I just checked on my HD 2900, and there is indeed no pixel format without stencil.