View Full Version : alpha blend

09-26-2002, 01:28 AM

Does anyone know how I can stop alpha blending for the alpha channel? When I draw a fullscreen poly with alpha=0xff, and on top of that another poly with alpha=0x7d, I want the framebuffer alpha to stay 0xff, not 0xbf (which is what I get with blendfunc SRC_ALPHA, ONE_MINUS_SRC_ALPHA).


09-26-2002, 01:45 AM
glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_FALSE) should work. It will prevent the alpha channel from being updated.

09-26-2002, 02:00 AM
Yeah, but then alpha will be 0, not 0xff as I would like.

09-26-2002, 02:27 AM
You might want to look at the EXT_blend_func_separate extension:

"Blending capability is extended by defining a function that allows independent setting of the RGB and alpha blend factors for blend operations that require source and destination blend factors. It is not always desired that the blending used for RGB is also applied to alpha."


09-26-2002, 02:33 AM
I will look into it.

Thanks man!


09-26-2002, 03:00 AM
Isn't EXT_blend_func_separate SGI-only? Can't find support for it on either a GF4 or a Wildcat VP board.


09-26-2002, 04:12 AM
No, the color mask should work. Just clear the buffer with glClearColor(R, G, B, 1.0);
and then set the color mask during blending.
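Put together, the per-frame sequence would be something like this; a state-setup sketch assuming an RGBA framebuffer (needs a live GL context, so no test harness):

```c
#include <GL/gl.h>

void frame(void)
{
    /* Clear with alpha = 1.0 so the framebuffer starts at 0xff */
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);

    /* Freeze the alpha channel while blending so it stays 0xff */
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_FALSE);
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

    /* ... draw blended geometry here ... */

    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);  /* restore full writes */
}
```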


09-26-2002, 08:50 AM
Originally posted by nehh:
Isn't EXT_blend_func_separate SGI-only? Can't find support for it on either a GF4 or a Wildcat VP board.


My ATI Radeon 8500LE supports it.


09-26-2002, 04:36 PM
Most simple transparency operations use SRC_ALPHA, ONE_MINUS_SRC_ALPHA. If that's all you're doing, it really doesn't matter what shows up in the alpha channel. In fact, there's no need to even ask for alpha bitplanes. Drivers can sometimes squeeze out a little extra performance if they don't have to worry about getting alpha "right".

You can't "see" stored alpha. If your blending doesn't use it, just ignore it.

If you *REALLY* need separate blend functions on alpha, EXT_blend_func_separate is what you need. This is standard in OpenGL 1.4, but not universally supported in hardware.

Current NVIDIA platforms support it only in our software renderer. The upcoming NV30 hardware will support it natively.

For extensions that have made it into the OpenGL core, we have an unofficial policy of exposing the extension string only if the feature is supported in hardware.

Leyder Dylan
09-26-2002, 04:55 PM
On my website, I have a little example of using the alpha channel with a TGA file.

09-27-2002, 01:58 AM
Leyder Dylan:
Thanks, I'll check it out.

I use SRC_ALPHA, ONE_MINUS_SRC_ALPHA.
What shows up in the alpha channel doesn't matter as long as you're not going to use it for anything other than display. But I'm grabbing the framebuffer and using it as a texture on a broadcast framebuffer board, so I do need the alpha. The EXT_blend_func_separate thing would do the trick, but I don't use boards that support it. So I ended up drawing multipass with glColorMask and different blend functions.
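The multipass workaround described here could look roughly like this; a sketch that draws the geometry twice, and assumes forcing alpha to the source value in the second pass is what's wanted (needs a live GL context, so no test harness):

```c
#include <GL/gl.h>

void draw_multipass(void)
{
    glEnable(GL_BLEND);

    /* Pass 1: blend RGB normally, leave alpha alone */
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_FALSE);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    /* ... draw geometry ... */

    /* Pass 2: write only alpha, with whatever blend the alpha needs,
       e.g. replace it with the source value (outA = srcA) */
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_TRUE);
    glBlendFunc(GL_ONE, GL_ZERO);
    /* ... draw geometry again ... */

    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
}
```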

The policy of exposing the extension string only if it's supported in hardware is the way to do it. Anything else would be kinda stupid :)

But thanks for the help, guys.