View Full Version : Alpha blending issues, when drawing frame buffer into default buffer.

Goran Milovanovic
10-06-2015, 10:46 PM
I have two scenes:





I want A to serve as a background scene, and B to serve as a foreground scene, so I do this:

* clear depth buffer
* clear color buffer
* render A
* clear depth buffer
* render B

And that works as expected:


But now, instead of rendering everything directly into the default buffer, I want to render B into a FBO first, and then render the FBO into the default buffer. So I did this:

* clear depth buffer
* clear color buffer
* render A
* clear depth buffer
* render B to FBO
* render FBO

However, that produces a different result:


I did some research as to why this is happening, and it seems that the blend function I'm using (the typical GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA) blends not only the colors but the alpha values as well. Following the math:

(sA*sA) + (dA*(1-sA)) = rA ---> (0.8*0.8) + (1.0*(1.0-0.8)) = 0.84

So, that's < 1, and I guess that's why the background is "leaking" when rendering the FBO into the default buffer.
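That arithmetic can be checked with a quick sketch (plain Python standing in for the GPU blend stage; dA = 1.0 assumes the buffer was cleared with alpha = 1, as in the example above):

```python
def blend_channel(s, d, s_alpha):
    # GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA applied to every channel,
    # including alpha itself
    return s * s_alpha + d * (1.0 - s_alpha)

sA = 0.8   # source alpha written while rendering B into the FBO
dA = 1.0   # destination alpha after clearing
rA = blend_channel(sA, dA, sA)
print(round(rA, 2))  # 0.84 -- less than 1, so the FBO pixel is partially transparent
```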

But, why does it work fine when rendering directly into the default buffer? Why doesn't that leak in the same way?

I would like to use the FBO, but get the same blending that I would when rendering directly. Is there a way to do that?

10-07-2015, 01:43 AM
If you want to render to an intermediate buffer with an alpha channel, you need to use pre-multiplied alpha.

The blend function should be glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA), and any colours (including from textures) need to use pre-multiplied alpha, i.e. the pixel values should be (a*r, a*g, a*b, a) rather than (r,g,b,a).
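As a concrete illustration (a minimal sketch, not tied to any particular image library), converting a straight-alpha pixel to pre-multiplied form and blending it with GL_ONE, GL_ONE_MINUS_SRC_ALPHA looks like:

```python
def premultiply(r, g, b, a):
    # straight (r, g, b, a) -> pre-multiplied (a*r, a*g, a*b, a)
    return (a * r, a * g, a * b, a)

def blend_premul(src, dst):
    # glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA), applied per channel;
    # both src and dst are pre-multiplied RGBA tuples
    sa = src[3]
    return tuple(s * 1.0 + d * (1.0 - sa) for s, d in zip(src, dst))

src = premultiply(1.0, 0.5, 0.0, 0.5)   # half-transparent orange
dst = (0.0, 0.0, 0.0, 1.0)              # opaque black, already pre-multiplied
result = blend_premul(src, dst)
print(result)                           # (0.5, 0.25, 0.0, 1.0)
```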

Blending a source colour+alpha (c1,a1) onto a destination colour cD gives:
cD' = cD*(1-a1)+c1*a1

Blending a second source colour+alpha (c2,a2) onto that gives:
cD'' = cD'*(1-a2)+c2*a2
= (cD*(1-a1)+c1*a1)*(1-a2)+c2*a2
= (1-a1)*(1-a2)*cD + (a2*c2) + (1-a2)*(a1*c1)

An intermediate colour (cI,aI) determined by (c1,a1) and (c2,a2) which can be blended onto the destination cD needs to have
aI*cI = (1-a2)*(a1*c1) + (a2*c2)
aI = 1-(1-a1)*(1-a2)
= a1 + a2 - a1*a2
= (1-a2)*a1 + a2

Note that the colours only occur as terms in which they are multiplied by their associated alpha values, i.e. we're not interested in the colour per se but the colour*alpha product.

Using a blending function of (1-src_alpha)*dst+src and colours which are multiplied by their alpha component yields the desired result.
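The derivation can be sanity-checked numerically (single colour channel; the layer values here are arbitrary, chosen just for the check):

```python
def over_straight(src_rgb, src_a, dst_rgb):
    # GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA on the colour channel
    return src_rgb * src_a + dst_rgb * (1.0 - src_a)

def over_premul(src_rgb_premul, src_a, dst_rgb):
    # GL_ONE, GL_ONE_MINUS_SRC_ALPHA: source colour already holds colour*alpha
    return src_rgb_premul + dst_rgb * (1.0 - src_a)

cD = 0.5          # destination (background) colour
c1, a1 = 0.9, 0.8
c2, a2 = 0.3, 0.6

# direct: blend layer 1, then layer 2, onto the destination
direct = over_straight(c2, a2, over_straight(c1, a1, cD))

# via intermediate: build (aI*cI, aI) as derived above, then blend once
aI = 1.0 - (1.0 - a1) * (1.0 - a2)
cI_premul = (1.0 - a2) * (a1 * c1) + a2 * c2
via_fbo = over_premul(cI_premul, aI, cD)

print(abs(direct - via_fbo) < 1e-12)  # True -- the two paths agree
```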

For textures, the multiplication by alpha can either be done in a shader or applied to the texture data before uploading. The latter has the advantage that linear filtering works correctly (i.e. the colour of nearly-transparent pixels doesn't bleed into the result). Some image-loading libraries have an option to return pre-multiplied data.
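A quick numeric sketch of the filtering point: average one opaque red texel with one fully transparent texel (stored as black, as texture atlases commonly do), then composite the filtered sample over a white background.

```python
# Two texels as (red_channel, alpha) pairs.
opaque_red  = (1.0, 1.0)
transparent = (0.0, 0.0)

# Linear filtering halfway between them averages each channel.
r = (opaque_red[0] + transparent[0]) / 2.0   # 0.5
a = (opaque_red[1] + transparent[1]) / 2.0   # 0.5

background = 1.0  # red channel of an opaque white background

# Straight alpha + GL_SRC_ALPHA blending: the black colour of the
# transparent texel has bled into the filtered colour.
straight = r * a + background * (1.0 - a)
print(straight)  # 0.75 -- visibly dark fringe

# Pre-multiplied data + GL_ONE blending: r already holds colour*alpha,
# so the transparent texel contributes nothing.
premul = r + background * (1.0 - a)
print(premul)    # 1.0 -- the expected 50/50 mix of red over white
```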

Goran Milovanovic
10-07-2015, 10:14 PM
Pre-multiplying alpha is not very convenient for me.

Is there no other way?

10-08-2015, 07:11 AM
Pre-multiplying alpha is not very convenient for me.

Is there no other way?
It appears that using glBlendFuncSeparate() means that only the intermediate buffer needs to use pre-multiplied alpha; the source textures/colours can use un-multiplied alpha.

For rendering into the intermediate buffer, use

glBlendFuncSeparate(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA, GL_ONE, GL_ONE_MINUS_SRC_ALPHA)

For compositing the intermediate buffer onto the final buffer, use

glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA)
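A CPU sketch of that two-function setup (plain Python standing in for the blend stage; single colour channel, values chosen arbitrarily; assumes the FBO is cleared to transparent black) shows that un-multiplied sources rendered this way match direct rendering:

```python
def into_fbo(src_rgb, src_a, dst):
    # glBlendFuncSeparate(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA,
    #                     GL_ONE,       GL_ONE_MINUS_SRC_ALPHA)
    d_rgb, d_a = dst
    rgb = src_rgb * src_a + d_rgb * (1.0 - src_a)   # colour factors
    a   = src_a * 1.0     + d_a   * (1.0 - src_a)   # alpha factors
    return (rgb, a)

def composite(fbo_px, dst_rgb):
    # glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA) -- FBO holds premultiplied data
    f_rgb, f_a = fbo_px
    return f_rgb * 1.0 + dst_rgb * (1.0 - f_a)

def over(src_rgb, src_a, dst_rgb):
    # direct GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA rendering, for comparison
    return src_rgb * src_a + dst_rgb * (1.0 - src_a)

cD  = 0.5                       # background already in the default buffer
fbo = (0.0, 0.0)                # FBO cleared to transparent black
fbo = into_fbo(0.9, 0.8, fbo)   # scene B, layer 1 (un-multiplied colour)
fbo = into_fbo(0.3, 0.6, fbo)   # scene B, layer 2
via_fbo = composite(fbo, cD)

direct = over(0.3, 0.6, over(0.9, 0.8, cD))
print(abs(direct - via_fbo) < 1e-12)  # True -- no more background "leaking"
```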