View Full Version : Greenscreen effect
10-13-2002, 06:46 PM
I have a QuickTime component playing live video from a camera onto a quad, but I would like to be able to remove the background from a "character", similar to a weatherman's greenscreen effect. Preferably I'd like the background to be white, but I'll take what I can get.
comments, ideas, suggestions, questions?
10-14-2002, 04:42 AM
Well, if you're talking about a character you're going to render, why would you need the screen at all? Just render the character after you render the background. Sorry if I misunderstood you…
10-14-2002, 05:09 AM
Ah I see, I'll explain better.
The live video is coming from a FireWire web camera, and the character would be me against a white wall or some other easy-to-remove color.
I know that I can walk through the video feed buffer and manually set the alpha of each pixel that closely matches my key color. But I was hoping that OpenGL has some blending function that would act like a luminance or color key.
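For what it's worth, the manual approach is only a few lines on the CPU. A minimal sketch, assuming a tightly packed RGBA8 buffer and a per-channel tolerance (both illustrative choices, nothing QuickTime- or GL-specific):

```c
#include <stdlib.h>

/* Chroma-key an RGBA8 buffer in place: any pixel whose color is within
 * `tol` (per channel) of the key color gets alpha 0; everything else
 * keeps alpha 255.  Buffer layout and tolerance are assumptions made
 * for illustration. */
static void chroma_key(unsigned char *rgba, int width, int height,
                       unsigned char keyR, unsigned char keyG,
                       unsigned char keyB, int tol)
{
    int i, n = width * height;
    for (i = 0; i < n; ++i) {
        unsigned char *p = rgba + i * 4;
        int dr = abs((int)p[0] - keyR);
        int dg = abs((int)p[1] - keyG);
        int db = abs((int)p[2] - keyB);
        p[3] = (dr <= tol && dg <= tol && db <= tol) ? 0 : 255;
    }
}
```

The keyed buffer can then be uploaded as an RGBA texture and drawn with ordinary alpha blending or alpha testing.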
10-14-2002, 05:14 AM
Depends on your hardware. I think with a GeForce and its register combiners you could do color keying.
But more importantly, your color won't be perfect, due to shading and video artefacts, so you have to key away a range of colors.
Haven't thought much about it (yet). But since you get the data anyway, you could process it before you send it to GL, and try it again in hardware afterwards if you need to. Dunno..
10-14-2002, 06:42 AM
To reduce artefacts, you may need a lot of light.
Also, there are very useful algorithms that can detect your background color no matter the noise (hey, within some limits obviously), but they are not really real-time oriented.
Does your background detection need to run in real time?
10-14-2002, 12:57 PM
Yeah, lighting will be an issue, but I've dealt with similar situations before.
I realize that 30fps at 640x480 is probably outrageous for the live feed, so we're shooting for about 10fps at a 320x240 capture size, while the OpenGL stuff rolls as fast as it can.
I think I'm resigning myself to pixel-by-pixel handling, writing my own alpha values into my own RGBA buffer. Blech. I was shooting in the dark hoping there was some secret glBlendFunc that magically did what I wanted.
10-14-2002, 01:03 PM
The best thing I can think of would be to subtract the fragment color from the "target" color and force alpha to 0 if it's close enough, else force alpha to 1 (with probably some fade between).
Ways of doing this include using a 3D texture with a transparent texel at the key color, using R/G/B as look-up values, or writing a fragment shader that just does the math and scales/clamps. It seems hard to do in register combiners (and in that case it's not really portable anyway).
You could potentially use a 2D and a 1D texture, using R/G and then B as your look-up values, with GL_ADD as the texture combine mode, to end up with an appropriate alpha value. It still won't cram into two 2D texture stages, which is all that's there on pre-GF3 hardware, though. It'll work on a Radeon, luckily :-)
10-15-2002, 02:07 AM
Maybe you can also achieve the test by using sums of dot products. Only the ARB_texture_env_combine and ARB_texture_env_dot3 extensions are needed (available on most ATI and NVIDIA cards).
I was thinking of computing something like: dot_product(RGB, Green) + dot_product(1-RGB, 1-Green), where RGB is the texel from your video and where Green is something like (0,1,0).
The value of this equation will be in the range [-3,+3], where +3 means that the texel is green and where -3 means the contrary (the texel is magenta). You can then compress this value in the range [-1,+1] using ALPHA_SCALE from ARB_texture_env_combine. Set this value in the alpha component of your fragment, and you can use the alpha function (glAlphaFunc) to skip the rendering of texels that are too close to Green, corresponding to alpha values close to 1. If you want to eliminate "perfect green" only, call :
glAlphaFunc(GL_LESS, 1.0f); and if you want to be less strict, call : glAlphaFunc(GL_LESS, 0.75f); or something like that.
Hope this makes sense.
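To make the arithmetic concrete, here is a CPU sketch of that dot product using the signed mapping DOT3 applies (channel c in [0,1] becomes 2c-1). One detail worth noting: under that mapping, 1-RGB is just the negation of RGB, so the second dot product equals the first, and a single signed dot product already spans the [-3,+3] range, with +3 at the key color and -3 at its complement:

```c
/* CPU mirror of the dot-product test described above, using the signed
 * remapping that GL's DOT3 combiner applies: channel c -> 2c - 1.
 * Returns a similarity score in [-3,+3]: +3 for an exact key match,
 * -3 for the complementary color.  Inputs are normalized to [0,1]. */
static float key_similarity(float r, float g, float b,
                            float keyR, float keyG, float keyB)
{
    float sr = 2.0f * r - 1.0f, sg = 2.0f * g - 1.0f, sb = 2.0f * b - 1.0f;
    float kr = 2.0f * keyR - 1.0f;
    float kg = 2.0f * keyG - 1.0f;
    float kb = 2.0f * keyB - 1.0f;
    return sr * kr + sg * kg + sb * kb;
}
```

In hardware, ALPHA_SCALE would compress this into the alpha channel and glAlphaFunc would do the thresholding, exactly as described above.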
10-15-2002, 02:15 AM
You can do that a little easier :)
Use ARB_texture_env_combine (and ARB_texture_env_dot3)
First stage: subtract key color from incoming color (set as 'constant' color).
Second stage: Alpha=RGB dot3 RGB
Result will be that the alpha channel is zero for the exact key color, and positive for everything else (squares are always positive).
Then you set an alpha test with
glAlphaFunc(GL_GREATER,0); <- will discard only the exact key color
[This message has been edited by zeckensack (edited 10-15-2002).]
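The math those two combiner stages compute, mirrored on the CPU (a sketch of the idea only, not the glTexEnv setup itself):

```c
/* CPU mirror of the two-stage trick above: stage one subtracts the key
 * color, stage two takes the dot product of the difference with itself.
 * The result is zero only for an exact key match and positive everywhere
 * else ("squares are always positive"), so glAlphaFunc(GL_GREATER, 0)
 * discards only the exact key color.  Colors are in [0,1]. */
static float key_alpha(float r, float g, float b,
                       float keyR, float keyG, float keyB)
{
    float dr = r - keyR, dg = g - keyG, db = b - keyB;
    return dr * dr + dg * dg + db * db;
}
```

Raising the alpha-test threshold above zero would widen the keyed range, which helps with the shading and video noise mentioned earlier in the thread.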
10-15-2002, 04:49 AM
I really like that method, apart from the fact that you cannot set ALPHA to DOT3 and RGB to PRIMARY in the same stage.
10-15-2002, 05:45 AM
Oh, that's good. I didn't think about doing it that way. I'll start putting that together and see what I can come up with.
gracious thanks and admiration!
10-16-2002, 11:14 AM
Originally posted by vincoof:
I really like that method, apart from the fact that you cannot set ALPHA to DOT3 and RGB to PRIMARY in the same stage.
Whoops. You're right.
I've modified that a bit without checking.
But it can be done with either a Radeon (you get at least three texture units) or a GeForce (you have to use register combiners).
Oh well ...
You can still generate a stencil mask with only ARB_texture_env_combine + DOT3 and two texture units, if you can live with multipassing (that's what I did).