alpha-to-coverage on NV 182.08

I noticed that with the latest GeForce drivers (version 182.08), alpha-to-coverage results in much more transparent surfaces, and the dithering pattern has changed.

There was a bug before 182.08 (I’m not sure in exactly which version, but a recent one) that caused incorrect rasterization on the triangle edges between connected alpha-blended (non-multisampled) polygons (e.g. a full-screen quad) drawn in front of geometry using alpha-to-coverage. I guess that was a flaw of the old alpha-to-coverage method and they replaced it, hence the different ‘look’.

But the new dither pattern isn’t as nice. :-/ It looks closer to the DirectX pattern now, and is more pronounced.

Anyway, I’m wondering: is the new rasterization correct (that is, more correct than the old)? I guess the ‘amount’ of transparency should match regular alpha blending as closely as possible?

Is there anything in place that gives me control over this pattern? I see some new extensions. :)

I’ve noticed that the driver setting “Antialiasing - Gamma correction”, when enabled, applies gamma correction to the alpha value used in alpha-to-coverage. To get proper blending, I have to disable this driver setting or apply the inverse of whatever gamma-correction function is being used.

In my opinion this driver setting should not apply to alpha. I don’t see the point, and it’s a real pain since I can’t tell whether the alpha will be gamma corrected without running some tests.

Disabling “Antialiasing - Gamma correction” indeed fixes the problem.

I also cannot think of a situation where this would be desirable, my guess is that it is a bug. Also, it didn’t happen with the previous driver version.

Thank you very much, I wouldn’t have found this without your post. :)

Image to illustrate:

Can someone from nVidia tell us if this is supposed to happen?

remdul,

Which antialiasing setting are you using? 8xQ? 16xQ? I looked at this and I did not find any behavior difference between 177.41, 180.48 and 182.08 releases.

The 2x2 dither pattern is still the same, which means you always get: MSAA samples * 2 * 2 transparency levels (e.g. 32 levels at 8x). What has been suboptimal since the introduction of the CSAA modes, I think, is the 2x2 dither pattern itself (16xQ: 8 MSAA samples among 16 coverage samples). I don’t think it’s against the spec though:

"This specification does not require a specific algorithm for converting an alpha value to a temporary coverage value. It is intended that the number of 1’s in the temporary coverage be proportional to the alpha value, with all 1’s corresponding to he maximum alpha value, and all 0’s corresponding to an alpha alue of 0. "

I looked at the screenshot and I think the difference you see is the correct effect that “Antialiasing - Gamma correction” should have. The alpha value is not gamma corrected or converted; rather, the samples that are written are treated as gamma corrected, like the whole framebuffer, at the MSAA resolve stage (when the MSAA buffer is converted to displayable pixels).

What would be nice to add is an API to enable/disable “Antialiasing - Gamma correction”. This parameter is forced off when you use FBOs; see: http://www.opengl.org/discussion_boards/…2491#Post242491
The fact is, only the application really knows whether its rendered content is gamma corrected, so it’s a bad idea for the driver to make a guess.

Cub

When I first observed this problem, the alpha value was noticeably altered when “Antialiasing - Gamma correction” was enabled. The steps between the levels of alpha blending in alpha-to-coverage are easy to see in my case, and I noticed far fewer levels of transparency when this driver setting was enabled (the distribution was shifted so that most pixels ended up in only one or two levels, rather than an even spread).

If I insert this into my shader just before writing alpha, I get the correct original distribution of transparency levels:

    // pre-distort alpha so the driver's gamma correction cancels out
    alpha = pow(alpha, 2.2);

So, I think the problem we’re seeing would be in addition to the problems you observed with FBOs.

“Which antialiasing setting are you using? 8xQ? 16xQ? I looked at this and I did not find any behavior difference between 177.41, 180.48 and 182.08 releases.”

This was 8x, specified by the application (not forced in the driver).

I was most likely wrong about the pattern being different; it just looks very different due to the non-linear alpha distribution when AA gamma correction is enabled.

My guess is that during driver installation the default setting changed from ‘Off’ to ‘On’, or the functionality of this feature was reversed in the driver. I did not alter this setting at any time. I usually check my settings briefly before and after installation, but I must have missed this if it really changed (which I doubt).

“This specification does not require a specific algorithm for converting an alpha value to a temporary coverage value. It is intended that the number of 1’s in the temporary coverage be proportional to the alpha value, with all 1’s corresponding to the maximum alpha value, and all 0’s corresponding to an alpha value of 0.”

I guess that means what I’m seeing is intended (or at least allowed by the spec). :(

Bummer, ’cos I was hoping to use alpha-to-coverage extensively for dense foliage. With this gamma-correction option, players can just turn it on and look straight through the vegetation. Back to ugly, aliased alpha testing then. :(

I might try those alternatives though, thanks.

And I still don’t get what this thing is good for. Can someone explain?

This is done to get more correct antialiasing. ATI has done it for years, but it is more recent for NVIDIA. All that time, NVIDIA’s antialiased lines seemed slightly bigger than ATI’s or non-AA lines.
The theory is that linearly averaging color values doesn’t work, since the screen’s light response is non-linear: averaging a white and a black sample stores 0.5, but a display with a 2.2 response shows that as only about 22% luminance (0.5^2.2), so the averaged edge comes out too dark unless the averaging is done in linear space.

However, I have no idea what the correct behaviour for alpha-to-coverage should be…

Thanks.

I checked whether lines were drawn differently with this option enabled/disabled. There is indeed some minor difference: lines are thicker when it is ‘on’ and thinner when ‘off’. Lines look nicer when it is ‘on’ (similar to GL_LINE_SMOOTH lines with glLineWidth(1.3)).
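For reference, the smooth-line look I’m comparing against is just standard coverage-based line smoothing; a minimal sketch, assuming a current GL context (the 1.3 width is simply what happened to match visually):

    /* GL_LINE_SMOOTH computes per-pixel coverage and feeds it into
     * blending, so it needs blending enabled to have any effect. */
    glEnable(GL_LINE_SMOOTH);
    glHint(GL_LINE_SMOOTH_HINT, GL_NICEST);
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    glLineWidth(1.3f);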

Of course, line AA has little to do with alpha-to-coverage other than that it (ab)uses the same mechanism. I guess nVidia added their gamma correction and forgot about alpha-to-coverage.

I’m pretty sure the correct behavior would be the closest possible match to GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA blending, in which case AA gamma correction throws it off (too transparent).
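To make the comparison concrete, here is a sketch of the two paths, assuming a current GL context (the draw calls themselves are omitted):

    /* Path 1: classic alpha blending -- the reference result. */
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    /* ...draw foliage back to front... */

    /* Path 2: alpha-to-coverage -- order independent, needs MSAA, and
     * should approximate path 1 as closely as the sample count allows. */
    glDisable(GL_BLEND);
    glEnable(GL_SAMPLE_ALPHA_TO_COVERAGE);
    /* ...draw foliage in any order... */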

To be more precise, whether they get thicker or thinner depends on whether it is white lines on a black background or the opposite :)

And you are right about alpha-to-coverage: it should mimic classic blending, so this is indeed a regression.

Here’s some screenshots…

White quad, alpha gradient [0, 1.0] @ 8x MSAA (gamma correction off):
http://cdn-2-service.phanfare.com/images…45e7111799f7d_4

White quad, alpha gradient [0, 1.0] @ 8x MSAA (gamma correction on):
http://cdn-2-service.phanfare.com/images…8b70309c36361_4

Notice there is no band shift between the two screenshots, so the difference comes only from the fact that pixels are put back in “gamma space” after the fragments are averaged (the resolve stage), which is correct. The alpha value itself is not being converted.

Here is what happens when gamma correction is on:
each fragment is linearized, the fragments are averaged, and the result is put back in gamma space (the resulting pixel). In my screenshot, linearization does nothing since the color is 1.0. The quad’s white fragments and the black clear-color fragments are then averaged, and the end result (< 1.0) is boosted when put back in gamma space.
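As a sketch of that resolve, assuming a simple 2.2 power law rather than the exact sRGB curve:

    #include <math.h>

    /* Gamma-correct resolve of one pixel's MSAA samples. With 8x MSAA and
     * alpha = 0.5, alpha-to-coverage leaves 4 of 8 samples white: the
     * linear average is 0.5, and the re-encoded result is
     * pow(0.5, 1.0/2.2) ~= 0.73 -- brighter than the 0.5 a plain linear
     * average would produce, which is exactly the boost described above. */
    float resolve(const float samples[], int n)
    {
        float sum = 0.0f;
        for (int i = 0; i < n; ++i)
            sum += powf(samples[i], 2.2f);    /* gamma -> linear */
        return powf(sum / n, 1.0f / 2.2f);    /* linear -> gamma */
    }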

I don’t think alpha-to-coverage needs to match alpha blending. It is a better alternative to the aliasing-prone alpha test. You may want to detect whether “gamma correction” is on at application initialization time by doing a simple test (see the sketch below) and then adjust your textures or shader… better than reverting to alpha test.
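A minimal sketch of such a startup test, assuming a multisampled framebuffer is already current and that draw_white_quad() is a hypothetical helper drawing a full-screen white quad with the given alpha:

    /* Draw a white, alpha = 0.5 quad with alpha-to-coverage and read the
     * resolved pixel back. A value near 128 means a linear resolve
     * ("gamma correction" off); a value near 186 means a gamma-corrected
     * resolve, since pow(0.5, 1.0/2.2) * 255 ~= 186. */
    GLubyte px[4];
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);
    glEnable(GL_SAMPLE_ALPHA_TO_COVERAGE);
    draw_white_quad(0.5f);                    /* hypothetical helper */
    glDisable(GL_SAMPLE_ALPHA_TO_COVERAGE);
    glReadPixels(0, 0, 1, 1, GL_RGBA, GL_UNSIGNED_BYTE, px);
    int gamma_corrected = (px[0] > 157);      /* midpoint of 128 and 186 */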

Foliage and fences are typical usages.

Additional notes:

  • Correct me if I’m wrong, but I think on ATI the “gamma correction” is always on (the default) and there is no way to disable it in their control panel.

  • The 2x2 dither pattern is bad when CSAA is involved (16xQ), even though there are also 8 MSAA samples:
    http://cdn-2-service.phanfare.com/images…7f10cf80e168f_4

Cub

To achieve proper colour-space support, the two should match. But this means that if the framebuffer is considered sRGB encoded, then blending should involve sRGB <-> linear RGB conversions as well.
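That is essentially what the sRGB framebuffer extensions offer; a sketch, assuming the color buffer was allocated with an sRGB-capable format (GL_FRAMEBUFFER_SRGB comes from EXT/ARB_framebuffer_sRGB and core GL 3.0):

    /* With an sRGB-capable color buffer, this makes the GL decode
     * destination values sRGB -> linear before blending and re-encode
     * linear -> sRGB on write, so blending and the multisample resolve
     * can agree with a gamma-correct alpha-to-coverage path. */
    glEnable(GL_FRAMEBUFFER_SRGB);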