glAlphaFunc style function for colors?

I was curious, is there a function that will drop pixels based on color, similar to how glAlphaFunc drops them based on alpha value? I am trying to save some space on old video cards and load 16-bit textures instead of 32-bit ones. BUT, I am currently using the alpha test (which of course requires a 32-bit texture). So what I'm looking for is a way to say: if the pixel equals this color, don't draw it to the screen. Do any functions like this exist?

This technique is usually called “chroma keying”.
Try this thread:
http://www.opengl.org/discussion_boards/ubb/Forum3/HTML/000655.html

Well, crap. Thanks for the quick answer, and from what I read, it isn't possible. That sucks; it really should be something they implement in 2.0, don't you think? It would make some things much, much smaller in memory. Ah well. Thanks.

AdrianD: if the conclusion is that chroma keying is not possible in OpenGL, then I think it's wrong.

Dig up, man!
Ah, there it is: http://www.opengl.org/discussion_boards/ubb/Forum3/HTML/007658.html

ARB_fragment_program can do this using the KIL instruction.

Could you explain that one a little more? Please and thank you.

Explain what, please?
For color keying as seen in the thread I linked, all the info is in that topic. You “only” need a GeForce or a Radeon (and maybe even less, I think).

For ARB_fragment_program you need at least a Radeon 9700 or a GeForce FX.
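
Very roughly, a color-keying fragment program and the C code to load it could look something like this. This is only an untested sketch, not code from the linked thread: the magenta key color, the 0.01 squared-distance threshold, and the variable names are placeholders, and it assumes the ARB_fragment_program entry points have already been resolved via your extension loader.

    #include <GL/gl.h>
    #include <GL/glext.h>
    #include <string.h>

    /* Fragment program: compute the squared RGB distance from a key color
     * (stored in local parameter 0, threshold in its .w) and kill the
     * fragment when it is within the threshold. */
    static const char *fp_src =
        "!!ARBfp1.0\n"
        "PARAM key = program.local[0];\n"   /* key RGB in .xyz, threshold in .w */
        "TEMP texel, diff, dist;\n"
        "TEX texel, fragment.texcoord[0], texture[0], 2D;\n"
        "SUB diff, texel, key;\n"           /* per-channel difference from key */
        "DP3 dist.x, diff, diff;\n"         /* squared RGB distance to key     */
        "SUB dist.x, dist.x, key.w;\n"      /* negative when close to the key  */
        "KIL dist.x;\n"                     /* discard fragment if negative    */
        "MOV result.color, texel;\n"
        "END\n";

    void setup_colorkey_program(void)
    {
        GLuint prog;
        glGenProgramsARB(1, &prog);
        glBindProgramARB(GL_FRAGMENT_PROGRAM_ARB, prog);
        glProgramStringARB(GL_FRAGMENT_PROGRAM_ARB, GL_PROGRAM_FORMAT_ASCII_ARB,
                           (GLsizei)strlen(fp_src), fp_src);
        /* key color = magenta, threshold = 0.01 (squared distance) */
        glProgramLocalParameter4fARB(GL_FRAGMENT_PROGRAM_ARB, 0,
                                     1.0f, 0.0f, 1.0f, 0.01f);
        glEnable(GL_FRAGMENT_PROGRAM_ARB);
    }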

Chroma keying (killing based on the color of a texture lookup) is not supported in standard OpenGL, and I’m not aware of any extensions that provide the functionality, either.

There are some significant issues with that approach. For example, what happens if you are using a linear filter and some of your samples have the magic color and others don’t? You can’t partially kill a fragment.

ARB_fragment_program gets the post-filtered texture lookup results, so it can’t really handle this situation, either.

The usual approach to working around this problem is to use a texture with an alpha channel, in combination with alpha test. If you just want to encode visible/invisible, you only need a single bit of alpha. It wouldn’t surprise me if most/all video cards of interest supported RGB5_A1, or at minimum, RGBA4 texture formats. You could pass in one of these formats, but there’s no guarantee that you get one. To know for sure, you can use proxy textures (or real ones) and call GetTexLevelParameteriv to query the texture component sizes that you would get.
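
For instance, a quick (untested) sketch of that query using a proxy texture; the 256x256 size is arbitrary:

    /* Ask the driver what an RGB5_A1 texture of this size would actually
     * give us, without allocating anything. */
    GLint alphaBits = 0;
    glTexImage2D(GL_PROXY_TEXTURE_2D, 0, GL_RGB5_A1, 256, 256, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glGetTexLevelParameteriv(GL_PROXY_TEXTURE_2D, 0,
                             GL_TEXTURE_ALPHA_SIZE, &alphaBits);
    if (alphaBits == 0) {
        /* no alpha bits at this format/size; fall back to something else */
    }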

If you are using a transparent encoding, you need to be careful to ensure that your final filtered/blended result is correct. Assume that you have a texture that is black and white, where black means “transparent”. If you have a texture lookup where you get 75% white, 25% transparent, your RGBA values will be (0.75, 0.75, 0.75, 0.75). You can use alpha test to kill fully transparent fragments. But you shouldn’t use standard SRC_ALPHA, ONE_MINUS_SRC_ALPHA blending. In this example, the color contribution of the fragment would be 0.75 * 0.75 = 0.5625 (56% gray). You would want a contribution of 75% gray.
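
One common way to get that 75% contribution (just a sketch of one option, not necessarily what Pat has in mind) is to treat the texture colors as pre-multiplied by alpha, which the black-means-transparent texture in the example already is, and blend with ONE instead of SRC_ALPHA so the filtered color isn't scaled by alpha a second time:

    glEnable(GL_ALPHA_TEST);
    glAlphaFunc(GL_GREATER, 0.0f);                /* drop fully transparent fragments */

    glEnable(GL_BLEND);
    glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);  /* premultiplied-alpha blending */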

Hope this helps,
Pat

Pat,

I agree that filtering would make using chroma keying with KIL not do the right thing.

However, I believe that many instances of chroma keying use a 1:1 texel-to-pixel ratio, such as with live video display, in which case you can use NEAREST filtering and KIL-keying will work just fine.
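
(For what it's worth, that just means the standard nearest-neighbour filter setup, so no texel mixing ever happens:)

    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);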

I'm not aware of any video capture devices that give you a separate alpha channel :) So I guess it all depends on what your source and usage model are.