While not quite consistent with how ARB_multisample is specified, NVIDIA drivers use the SwapBuffers operation as a trigger for downsampling the multisample sample buffers (other operations, such as glReadPixels, also trigger a downsample).
I believe the 6200 doesn’t support floating-point blending or texture filtering, so your problem may be related to that: resolving multisample buffers down to single-sample buffers is essentially a weighted interpolation scheme.
The OP is using the backbuffer, and the backbuffer’s depth buffer is never in a float format; it is either a 16-bit or 24-bit fixed-point (integer) format.
When you call glReadPixels(…, GL_FLOAT, …), the driver reads back the integer values, converts them to float format on the CPU, and then hands the result to you.
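To make that concrete, here is a minimal sketch (the function name and window setup are my own, not from this thread) of reading a depth value back as a float. Even though the depth buffer is stored as 16- or 24-bit fixed point, requesting GL_FLOAT makes the driver hand you a normalized value in [0, 1]:

```c
#include <GL/gl.h>
#include <stdio.h>

/* Hypothetical helper: read one depth value from the default framebuffer.
 * Assumes a GL context is current and rendering has completed. */
void read_depth_at(int x, int y)
{
    GLfloat depth = 0.0f;

    /* The driver reads the fixed-point depth value and converts it
     * to a normalized float on the CPU before returning it. */
    glReadPixels(x, y, 1, 1, GL_DEPTH_COMPONENT, GL_FLOAT, &depth);

    printf("depth at (%d, %d) = %f\n", x, y, depth);
}
```

Note that the conversion happens regardless of the buffer’s storage format; asking for GL_FLOAT does not mean the depth buffer itself is float.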