32 vs 16 bit speed on GF2/MX400

We’ve noticed a strange problem with GF2/MX400 cards. When the Control Panel | Display settings are set to 32 bit color, the 3D rendering is smooth and very fast.

However, when 16 bit color is selected in the Control Panel, the 3D display gets bogged down in full-screen use with complex sets of objects. A HW/ICD pixel format is still being used, but nothing like this happens with 32 bit color.

We’ve tried this with 3 different GF2/MX cards on different machines and the results are identical: 16 bit mode is much slower than 32 bit mode, but only in full screen with lots of objects.

Anyone know anything about this?

Thanks, Chris.

Do you use the stencil buffer by any chance?

LG

Yep. Is that the cause?

GF2 and earlier cards can’t handle the stencil buffer in hardware at 16 bit color depth, so you get kicked into software rendering.
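If the stencil is the culprit, the usual workaround is to only request stencil bits when the desktop is deep enough to keep them in hardware. A rough Win32/C sketch, untested (the function name is just illustrative, error handling trimmed, hdc is assumed to be your window’s DC):

    #include <windows.h>

    int ChooseFormatForDesktopDepth(HDC hdc)
    {
        PIXELFORMATDESCRIPTOR pfd;
        ZeroMemory(&pfd, sizeof(pfd));
        pfd.nSize      = sizeof(pfd);
        pfd.nVersion   = 1;
        pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
        pfd.iPixelType = PFD_TYPE_RGBA;

        int desktopBits  = GetDeviceCaps(hdc, BITSPIXEL);  /* 16 or 32 */
        pfd.cColorBits   = (BYTE)desktopBits;
        pfd.cDepthBits   = 16;
        /* The key line: skip the stencil buffer on 16 bit desktops
           so GF2-class cards stay on the hardware path. */
        pfd.cStencilBits = (desktopBits <= 16) ? 0 : 8;

        return ChoosePixelFormat(hdc, &pfd);
    }

With 0 stencil bits at 16 bpp the driver should stay on the hardware path; anything that actually needs the stencil (shadows, masking) obviously has to be disabled or done another way in that mode.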

Then why does the PixelFormat list show modes with Color=16, Depth=16, Stencil=8 and ICD accel? The boards list the stencil modes as having HW accel.

The fact that the pixelformat reports 8 bits of stencil does not imply that it is supported by the hardware. This has been discussed many times on this board (even to the point of explaining exactly why the OpenGL implementation must behave this way for these cards). Also, reread the spec for the pixelformat descriptor a little more carefully and you’ll see that it never says a format is supported by hardware, only that it is supported by the hardware OR a device driver. In this case, the format is supported by the device driver, but not by the hardware.
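You can see the distinction by walking the pixel format list yourself: the generic flags only tell you who implements a format (the vendor’s ICD, an MCD, or Microsoft’s generic software renderer), not which feature combinations actually stay in hardware at run time. A quick C sketch along those lines (DumpPixelFormats is an illustrative name; hdc is a window DC):

    #include <windows.h>
    #include <stdio.h>

    void DumpPixelFormats(HDC hdc)
    {
        PIXELFORMATDESCRIPTOR pfd;
        /* With a NULL descriptor, DescribePixelFormat returns the
           number of formats this DC supports. */
        int count = DescribePixelFormat(hdc, 1, sizeof(pfd), NULL);
        int i;

        for (i = 1; i <= count; i++) {
            const char *who;

            if (!DescribePixelFormat(hdc, i, sizeof(pfd), &pfd))
                continue;

            if (!(pfd.dwFlags & PFD_GENERIC_FORMAT))
                who = "ICD";       /* vendor driver */
            else if (pfd.dwFlags & PFD_GENERIC_ACCELERATED)
                who = "MCD";
            else
                who = "software";  /* Microsoft generic */

            printf("%3d: color=%2d depth=%2d stencil=%d  %s\n",
                   i, pfd.cColorBits, pfd.cDepthBits,
                   pfd.cStencilBits, who);
        }
    }

Note that a format tagged ICD here can still fall back to the driver’s own internal software path for combinations the chip can’t do, which is exactly the 16 bit color + stencil case above.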
