I tested this with both the Nvidia GT200 and G80/G90 series (Forceware driver releases 177.35 and 177.41). If the rendering is done without FBOs, i.e. multisampling is controlled either by the Nvidia control panel or by selecting a pixel format with a multisample buffer (ARB_multisample extension), then the AA resolve is gamma correct. The Nvidia control panel setting "Antialiasing - Gamma correction" must be set to "on", though.
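For reference, the non-FBO path selects a multisampled pixel format along these lines. This is only a sketch: it assumes wglChoosePixelFormatARB has already been loaded from a dummy context, and the attribute values are the usual ones from WGL_ARB_pixel_format / WGL_ARB_multisample; error handling is omitted.

```c
/* Sketch: request an 8x multisampled pixel format via
   WGL_ARB_pixel_format + WGL_ARB_multisample.
   Assumes wglChoosePixelFormatARB was obtained via wglGetProcAddress. */
BOOL choose_msaa_pixel_format(HDC hdc)
{
    const int attribs[] = {
        WGL_DRAW_TO_WINDOW_ARB, GL_TRUE,
        WGL_SUPPORT_OPENGL_ARB, GL_TRUE,
        WGL_DOUBLE_BUFFER_ARB,  GL_TRUE,
        WGL_PIXEL_TYPE_ARB,     WGL_TYPE_RGBA_ARB,
        WGL_COLOR_BITS_ARB,     32,
        WGL_SAMPLE_BUFFERS_ARB, 1,   /* ask for a multisample buffer */
        WGL_SAMPLES_ARB,        8,   /* 8 samples */
        0
    };
    int format;
    UINT count;
    PIXELFORMATDESCRIPTOR pfd = { sizeof(pfd) };

    if (!wglChoosePixelFormatARB(hdc, attribs, NULL, 1, &format, &count)
        || count == 0)
        return FALSE;
    return SetPixelFormat(hdc, format, &pfd);
}
```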

Gamma correct means each MSAA sample is converted to linear space (de-gammatized) before the resolve and converted back to gamma space afterwards. A gamma-correct resolve results in much better AA quality, as shown by the following screenshots of my test application (at 8xQ, i.e. 8x pure MSAA, no CSAA):

Gamma correct star (no FBOs):

Gamma incorrect star (FBOs):

The gamma-incorrect version looks as if it were outlined in black...

The theory behind this is well explained here:

In addition, here is a test that magnifies the MSAA gradient seen on the edge of a white triangle on a black background. The middle band is the raw MSAA gradient, the upper band is the expected gamma-correct resolve result, and the bottom band is the expected result without gamma correction.

Gamma correct gradient (no FBOs):

Gamma incorrect gradient (FBOs):

From the following specification note it seems the API is all there...

Excerpt from "EXT_framebuffer_sRGB":

"17) How does this extension interact with multisampling?

RESOLVED: There are no explicit interactions. However, arguably
if the color samples for multisampling are sRGB encoded, the
samples should be linearized before being "resolved" for display
and then reconverted to sRGB if the output device expects sRGB
encoded color components.

This is really a video scan-out issue and beyond the scope
of this extension which is focused on the rendering issues.
However some implementation advice is provided:

The implementation sufficiently aware of the gamma correction
configured for the display device could decide to perform an
sRGB-correct multisample resolve. Whether this occurs or not
could be determined by a control panel setting or inferred by
the application's use of this extension."

As a result, I would expect the following to work in order to get a gamma-correct AA resolve:

- Render to an FBO with the render buffer's internal format set to GL_SRGB8_ALPHA8_EXT (EXT_texture_sRGB; not supported for now, the FBO is reported as incomplete);
- Enable GL_FRAMEBUFFER_SRGB_EXT for the destination FBO (EXT_framebuffer_sRGB);
- Perform the blit/resolve (EXT_framebuffer_blit).
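The steps above, sketched as code. This is untested by definition, since the sRGB multisample storage call is exactly what currently leaves the FBO incomplete; names and sizes are placeholders, and it assumes the EXT_framebuffer_object / EXT_framebuffer_multisample / EXT_framebuffer_blit entry points are loaded:

```c
/* Sketch only; error checks omitted. */
void create_srgb_msaa_fbo(GLuint *fbo, GLuint *rb, int width, int height)
{
    glGenFramebuffersEXT(1, fbo);
    glGenRenderbuffersEXT(1, rb);
    glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, *rb);

    /* 8x multisampled sRGB color storage (EXT_texture_sRGB format).
       This is the call that currently yields an incomplete FBO. */
    glRenderbufferStorageMultisampleEXT(GL_RENDERBUFFER_EXT, 8,
                                        GL_SRGB8_ALPHA8_EXT,
                                        width, height);

    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, *fbo);
    glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT,
                                 GL_COLOR_ATTACHMENT0_EXT,
                                 GL_RENDERBUFFER_EXT, *rb);
}

/* After rendering into msaaFbo: enable sRGB updates for the
   destination (EXT_framebuffer_sRGB) and blit to resolve. */
void resolve_srgb(GLuint msaaFbo, GLuint resolveFbo, int width, int height)
{
    glBindFramebufferEXT(GL_READ_FRAMEBUFFER_EXT, msaaFbo);
    glBindFramebufferEXT(GL_DRAW_FRAMEBUFFER_EXT, resolveFbo);
    glEnable(GL_FRAMEBUFFER_SRGB_EXT);
    glBlitFramebufferEXT(0, 0, width, height, 0, 0, width, height,
                         GL_COLOR_BUFFER_BIT, GL_NEAREST);
    glDisable(GL_FRAMEBUFFER_SRGB_EXT);
}
```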

However, it would be nice to be able to render to higher-precision formats (such as RGB10_A2, R11F_G11F_B10F_EXT or GL_RGBA_FLOAT16) and still get gamma-correct AA. So it could work like this:

- Enable GL_FRAMEBUFFER_SRGB_EXT for the source FBO (EXT_framebuffer_sRGB);
- Enable GL_FRAMEBUFFER_SRGB_EXT for the destination FBO (EXT_framebuffer_sRGB);
- Perform blit/resolve function.
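This higher-precision variant is hypothetical on my side: it assumes the driver would honor GL_FRAMEBUFFER_SRGB_EXT during glBlitFramebufferEXT even when the source storage is not an sRGB format. A sketch of what I mean (placeholders again, entry points assumed loaded):

```c
/* Hypothetical: MSAA render buffer with high-precision storage
   (EXT_packed_float), resolved with the sRGB enable active. */
void resolve_hdr_srgb(GLuint msaaRb, GLuint msaaFbo, GLuint resolveFbo,
                      int width, int height)
{
    glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, msaaRb);
    glRenderbufferStorageMultisampleEXT(GL_RENDERBUFFER_EXT, 8,
                                        GL_R11F_G11F_B10F_EXT,
                                        width, height);
    /* ... attach to msaaFbo, render the scene, then: */

    glBindFramebufferEXT(GL_READ_FRAMEBUFFER_EXT, msaaFbo);
    glBindFramebufferEXT(GL_DRAW_FRAMEBUFFER_EXT, resolveFbo);
    glEnable(GL_FRAMEBUFFER_SRGB_EXT);  /* intended: sRGB-correct resolve */
    glBlitFramebufferEXT(0, 0, width, height, 0, 0, width, height,
                         GL_COLOR_BUFFER_BIT, GL_NEAREST);
    glDisable(GL_FRAMEBUFFER_SRGB_EXT);
}
```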

Is there a plan to fix this on Nvidia's side? Does it work on other vendors' implementations?