We’ve noticed a strange problem occurring with the GF2/MX400 cards. When the Control Panel | Display settings are set to 32-bit color, the rendering speed of the 3D view is smooth and very fast.
However, when 16-bit color is selected in the Control Panel, the 3D display gets bogged down in full-screen mode with complex sets of objects. A HW/ICD pixel format is still being used. Nothing like this happens when using 32-bit color.
We’ve tried this with 3 different GF2/MX cards on different machines and the results are identical. The 16-bit mode is much slower than the 32-bit mode, but only in full screen with lots of objects.
The fact that the pixel format shows 8 bits of stencil does not imply that it is supported in hardware. This has been discussed many times on this board (even to the point of explaining exactly why the OpenGL implementation must behave this way on these cards). Also, reread the spec for the pixel format descriptor a little more carefully and you’ll see that it doesn’t indicate that a format is supported by hardware, but rather that it is supported by the hardware OR a device driver. In this case, the format is supported by the device driver, but not the hardware.
[This message has been edited by DFrey (edited 03-11-2002).]