View Full Version : 32 vs 16 bit speed on GF2/MX400

03-08-2002, 11:25 PM
We've noticed a strange problem with the GF2/MX400 cards. When the Control Panel | Display settings are set to 32 bit color, the 3D rendering is smooth and very fast.

However, when 16 bit color is selected in Control Panel, the 3D display gets bogged down in full-screen use with complex sets of objects, even though a HW/ICD pixel format is still being used. Nothing like this happens in 32 bit color.

We've tried this with 3 different GF2/MX cards on different machines and the results are identical. The 16 bit mode is much slower than the 32 bit mode, but only in full screen with lots of objects.

Anyone know anything about this?

Thanks, Chris.

03-11-2002, 02:48 AM
Do you use the stencil buffer by any chance?


LG

03-11-2002, 04:37 AM
Yep. Does the cause have something to do with that?

03-11-2002, 04:41 AM
GF2 and earlier can't handle the stencil buffer in hardware at 16 bit color depth. The hardware only supports stencil packed together with a 24-bit depth buffer (the 32-bit D24S8 format), which is only available in 32 bit mode, so you get kicked into software rendering.

03-11-2002, 02:24 PM
Then why does the PixelFormat list show modes with Color=16, Depth=16, Stencil=8 and ICD accel? The boards list the stencil modes as having HW accel.

03-11-2002, 02:51 PM
The fact that the pixel format reports 8 bits of stencil does not imply that it is supported by hardware. This has been discussed many times on this board (even to the point of explaining exactly why the OpenGL implementation must behave this way for these cards).

Also, reread the spec for the pixel format descriptor a little more carefully and you'll see that it never explicitly indicates that a format is supported by hardware, but rather that it is supported by either the hardware or a device driver. In this case, the format is supported by the device driver, but not the hardware.

[This message has been edited by DFrey (edited 03-11-2002).]