GeForce2 16-bit display issues

I get a severe slowdown on a GeForce2 with a 16-bit display (it works fine on a GeForce2 in 32-bit, and on a GeForce3 in both 16-bit and 32-bit).

I use the following PixelFormat:

dwFlags = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER | PFD_SWAP_EXCHANGE
iPixelType = PFD_TYPE_RGBA
cColorBits = 16
cRedBits = 5
cRedShift = 11
cGreenBits = 6
cGreenShift = 5
cBlueBits = 5
cBlueShift = 0
cAlphaBits = 0
cAlphaShift = 0
cAccumBits = 64
cAccumRedBits = 16
cAccumGreenBits = 16
cAccumBlueBits = 16
cAccumAlphaBits = 16
cDepthBits = 16
cStencilBits = 0
cAuxBuffers = 0

With the same PixelFormat, some other OpenGL apps run fine.

Any ideas? Am I using something that is only supported in 32-bit mode on the GeForce2?

Indeed, I was using some glStencil* functions.

I thought that stencil buffer operations were ignored when the pixel format has cStencilBits = 0.
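
For reference, a minimal way to check what the current context actually has and to skip the stencil path when there are no stencil planes (a sketch; the helper names are made up, not real app code):

[code]
/* Sketch: query the real stencil depth once, after the GL context is
 * current, and only touch stencil state when planes actually exist. */
#include <windows.h>
#include <GL/gl.h>

static GLint g_stencilBits = 0;

void InitStencilInfo(void)            /* call once after wglMakeCurrent */
{
    glGetIntegerv(GL_STENCIL_BITS, &g_stencilBits);
}

void DrawWithOptionalStencil(void)
{
    if (g_stencilBits > 0) {
        glEnable(GL_STENCIL_TEST);
        glClear(GL_STENCIL_BUFFER_BIT);
        glStencilFunc(GL_ALWAYS, 1, 0xFF);
        glStencilOp(GL_KEEP, GL_KEEP, GL_REPLACE);
        /* ... stencil-based pass ... */
        glDisable(GL_STENCIL_TEST);
    } else {
        /* ... fallback path without stencil ... */
    }
}
[/code]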

The format you request is not always the format you get. To see what you really ended up with, you need to use DescribePixelFormat. Also, since the stencil buffer is a core component of OpenGL, you should expect some level of support to be emulated in software when there is no hardware support. So use DescribePixelFormat on the pixel format you selected and look at the number of stencil bits; I bet it will not be 0.
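
A minimal sketch of that check (illustrative names; `hdc` is assumed to be the window's device context and `iFormat` the index returned by ChoosePixelFormat):

[code]
#include <windows.h>
#include <stdio.h>

void DumpActualFormat(HDC hdc, int iFormat)
{
    PIXELFORMATDESCRIPTOR pfd;
    DescribePixelFormat(hdc, iFormat, sizeof(pfd), &pfd);

    printf("color bits:   %d\n", pfd.cColorBits);
    printf("depth bits:   %d\n", pfd.cDepthBits);
    printf("stencil bits: %d\n", pfd.cStencilBits);

    /* PFD_GENERIC_FORMAT set and PFD_GENERIC_ACCELERATED clear means
     * the Microsoft software renderer, i.e. no hardware acceleration. */
    if ((pfd.dwFlags & PFD_GENERIC_FORMAT) &&
        !(pfd.dwFlags & PFD_GENERIC_ACCELERATED))
        printf("WARNING: not hardware accelerated\n");
}
[/code]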

You have a GeForce2, and I notice that you are trying to get a 64-bit accumulation buffer. That is not supported in hardware, so you should just set those bits to 0. It’s kind of silly to even try to use the accumulation buffer on cards like these, unless you just want to learn it for whatever purpose.

-SirKnight

I didn’t post the pixel format I requested, but the pixel format I got! (See the first post; it’s not a copy of C/C++ code.)
My pixel format request depends on the current display mode (16/32-bit), and if the result comes back GENERIC_ACCELERATED, I change some parameters to get an accelerated format.
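
Roughly, that fallback looks like this (a simplified sketch of the idea, not the exact code):

[code]
#include <windows.h>

/* Full ICD acceleration: neither generic flag is set. */
static BOOL IsICDAccelerated(const PIXELFORMATDESCRIPTOR *pfd)
{
    return !(pfd->dwFlags & (PFD_GENERIC_FORMAT | PFD_GENERIC_ACCELERATED));
}

int PickAcceleratedFormat(HDC hdc, PIXELFORMATDESCRIPTOR *pfd)
{
    PIXELFORMATDESCRIPTOR got;
    int iFormat = ChoosePixelFormat(hdc, pfd);

    DescribePixelFormat(hdc, iFormat, sizeof(got), &got);
    if (!IsICDAccelerated(&got)) {
        /* Relax the request and try again, e.g. drop the accum
         * buffer and settle for a 16-bit depth buffer. */
        pfd->cAccumBits = 0;
        pfd->cDepthBits = 16;
        iFormat = ChoosePixelFormat(hdc, pfd);
    }
    return iFormat;
}
[/code]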

Originally posted by SirKnight:
[b]use DescribePixelFormat on the pixel format you selected and look at the number of stencil bits; I bet it will not be 0.[/b]
The pixel format I posted is the result of DescribePixelFormat, and the stencil bits are 0.

Originally posted by SirKnight:
[b]You have a GeForce2, and I notice that you are trying to get a 64-bit accumulation buffer. That is not supported in hardware, so you should just set those bits to 0.[/b]
I didn’t request it, but all the pixel formats have a 64-bit accum buffer!
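
That is easy to verify by enumerating every format the driver exposes (a sketch; `hdc` is assumed):

[code]
#include <windows.h>
#include <stdio.h>

void ListAccumBits(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd;
    int i, n;

    /* DescribePixelFormat returns the maximum pixel format index. */
    n = DescribePixelFormat(hdc, 1, sizeof(pfd), &pfd);
    for (i = 1; i <= n; ++i) {
        DescribePixelFormat(hdc, i, sizeof(pfd), &pfd);
        printf("format %3d: cAccumBits = %d\n", i, pfd.cAccumBits);
    }
}
[/code]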


Can you post the requested pixel format?

Originally posted by SirKnight:
[b]You have a GeForce2, and I notice that you are trying to get a 64-bit accumulation buffer. That is not supported in hardware, so you should just set those bits to 0. It’s kind of silly to even try to use the accumulation buffer on cards like these, unless you just want to learn it for whatever purpose.

-SirKnight[/b]

I believe that even if you don’t ask for it, you get it. I’m not sure why; perhaps because there is plenty of RAM in our PCs.

Don’t worry, memory is not reserved for the accumulation buffer until you make some accum-related calls.

Bottom line, it’s irrelevant.

V-man

Opla… you should always do your stencil work in a pbuffer if at all possible, because otherwise the fill rate might “kill” you later. If you use it for stencil shadows, that will in any case be far faster.
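
For reference, a minimal sketch of creating such a pbuffer with the WGL_ARB_pixel_format and WGL_ARB_pbuffer extensions (extension checks and error handling omitted; the function name is made up, and whether a 24-bit pbuffer is available under a 16-bit desktop depends on the driver):

[code]
#include <windows.h>
#include <GL/gl.h>
#include <GL/wglext.h>   /* WGL_*_ARB tokens and PFNWGL* typedefs */

HPBUFFERARB CreateStencilPbuffer(HDC hdc, int w, int h)
{
    PFNWGLCHOOSEPIXELFORMATARBPROC wglChoosePixelFormatARB =
        (PFNWGLCHOOSEPIXELFORMATARBPROC)wglGetProcAddress("wglChoosePixelFormatARB");
    PFNWGLCREATEPBUFFERARBPROC wglCreatePbufferARB =
        (PFNWGLCREATEPBUFFERARBPROC)wglGetProcAddress("wglCreatePbufferARB");

    /* Ask for stencil planes in the pbuffer, not in the window. */
    const int attribs[] = {
        WGL_DRAW_TO_PBUFFER_ARB, GL_TRUE,
        WGL_SUPPORT_OPENGL_ARB,  GL_TRUE,
        WGL_PIXEL_TYPE_ARB,      WGL_TYPE_RGBA_ARB,
        WGL_COLOR_BITS_ARB,      24,
        WGL_DEPTH_BITS_ARB,      24,
        WGL_STENCIL_BITS_ARB,    8,
        0
    };
    const int pbAttribs[] = { 0 };
    int iFormat;
    UINT nFormats;

    wglChoosePixelFormatARB(hdc, attribs, NULL, 1, &iFormat, &nFormats);
    return wglCreatePbufferARB(hdc, iFormat, w, h, pbAttribs);
}
[/code]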

BlackJack