Subtractive blending

03-10-2001, 09:38 AM

I'm trying to use subtractive blending on a GeForce 2 with the Linux NVIDIA drivers.

The basic problem is that I want to subtract some alpha-only geometry from the on-screen image. I'm doing this in several passes with jittering to get some AA, so a simple GL_ONE_MINUS_SRC_ALPHA blend function won't work for me.

I'm currently using:

glBlendFunc(GL_ONE, GL_ONE);

for (int i = 0; i < num_passes; i++) {  /* num_passes: one pass per jitter sample */
    glPushMatrix();                     /* keep the jitter offsets from accumulating */
    glTranslatef(jitter[i].x, jitter[i].y, 0.0f);
    /* render alpha-only image */
    glPopMatrix();
}

Unfortunately, I see no effect from this at all. If I use glBlendFunc(GL_ONE, GL_SRC_ALPHA) instead, I see an inverted version of what I expect, but I can't get a positive one. glGetError reports nothing wrong.
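For reference, the subtract itself has to come from the EXT_blend_subtract blend equation, which the snippet above doesn't show; here's a minimal sketch of the full state being assumed (glBlendEquationEXT comes from EXT_blend_minmax, and the extension strings should be checked first):

/* Sketch of the assumed subtractive setup: needs EXT_blend_subtract
 * and EXT_blend_minmax (check glGetString(GL_EXTENSIONS) first). */
glEnable(GL_BLEND);
glBlendEquationEXT(GL_FUNC_REVERSE_SUBTRACT_EXT);  /* dest = dest - src */
glBlendFunc(GL_ONE, GL_ONE);                       /* unscaled src and dest */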

Any hints?


03-10-2001, 11:47 AM
Hmm... replying to myself.

Well, I've worked out part of my problem. As I said, it's an alpha-only image, with the color components set to 0. So I'm not seeing any result because the blend is only affecting the framebuffer's alpha channel.

So, my new attempt looks like:

glBlendColorEXT(1.0f, 1.0f, 1.0f, 1.0f);

...but this still does nothing for me. My understanding is that a GL_CONSTANT_COLOR factor should scale my blend color (1, 1, 1, 1) by the alpha generated by the rendered prims, and then compute dest = dest - (const_color * src_alpha).
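Here's a sketch of the complete blend state assumed to go with that call (GL_CONSTANT_COLOR_EXT and glBlendColorEXT are from EXT_blend_color):

/* Sketch of the assumed state, per EXT_blend_color: the constant
 * color is a blend *factor*, applied to the incoming source color. */
glBlendColorEXT(1.0f, 1.0f, 1.0f, 1.0f);
glBlendEquationEXT(GL_FUNC_REVERSE_SUBTRACT_EXT);
glBlendFunc(GL_CONSTANT_COLOR_EXT, GL_ONE);  /* dest = dest - src * const_color */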

What am I missing?

03-10-2001, 12:39 PM
Originally posted by jsgf:
dest = dest - (const_color * src_alpha).

No, what you've done is dest = dest - src.

I'd suggest an intensity texture and going back to ONE,ONE.
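A rough sketch of that suggestion, assuming the alpha map is uploaded as a GL_INTENSITY texture so that R = G = B = A = intensity (width, height, and pixels are placeholder names):

/* Upload the 8-bit alpha map as intensity so it lands in all channels. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_INTENSITY8, width, height, 0,
             GL_LUMINANCE, GL_UNSIGNED_BYTE, pixels);
/* Then dest = dest - intensity in every channel: */
glBlendEquationEXT(GL_FUNC_REVERSE_SUBTRACT_EXT);
glBlendFunc(GL_ONE, GL_ONE);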

- Matt

03-10-2001, 11:10 PM
whats your clear color? when you have it to 0, then it doesnt change anything.. second make sure your texture-alpha values are correct.. and even with the constantcolor you cant see anything..