Some questions about Alphatesting

Hi

I'd like to add alpha testing to my "engine" to be able to render complex objects like fences etc. with only a few polys.
So I played a bit with that feature to find out how it works.

My documentation says that alpha testing is only used if the texture has an alpha layer.
I thought this meant that a texture which has no alpha values will always be drawn. However, it seems to be the other way round: all textures that don't have alpha values are never drawn. Can I change this behaviour?

I also tested my app with alpha testing enabled and disabled, using only textures with an alpha layer. I also made sure that my app is fillrate-limited, because I thought that alpha testing is a per-pixel operation, so it should slow down my app if I am already fillrate-limited.

But big surprise! It didn't slow down at all. It even sped up a bit (from 330 FPS to 333 FPS :wink: ).
Does that mean that alpha testing is implemented in hardware in a way that has no performance impact whether it is enabled or not?

BTW: I have a Geforce 2 Ti 64MB.

Jan.

Alpha testing does speed up rendering, and that's normal: when the test fails, the fragment does not continue through the rest of the pipeline.

Use glAlphaFunc to control the behaviour of the alpha test (and make sure the test is actually enabled with glEnable(GL_ALPHA_TEST)). Don't know why you're getting nothing drawn for RGB textures - the default func is GL_ALWAYS (which means it should always draw the fragment).
Do this:-
glAlphaFunc(GL_GREATER, 0.0f);

This is basically saying: only draw fragments with an alpha value greater than zero. If a texture has no alpha component (RGB), then OpenGL assumes its alpha is 1, so those fragments should always be drawn (1 is greater than 0). Remember, though, that if your texture environment is GL_MODULATE on the first texture unit, the colour/material alpha is multiplied by the alpha of the texture (1 if the texture has no alpha component), which could still give you a final alpha value of 0 - if your material/colour alpha is 0, of course.

The biggest speed improvement you'll see is when doing alpha blending - the above alpha func allows GL to discard any fragment with an alpha of zero, so it doesn't need to do any blending operation on it. If you don't use the alpha test, OpenGL will do the blending maths on fragments with zero alpha, which in most cases is a waste of time.

[This message has been edited by knackered (edited 12-24-2002).]

Originally posted by Jan2000:
My documentation says that alpha testing is only used if the texture has an alpha layer.
Your documentation is wrong. Alpha testing can indeed be used without any texturing at all, or with textures that don't have alpha channels. It's up to you how you set up your alpha channel; GL_ARB_texture_env_combine is the first method that comes to mind.

When in doubt, read the OpenGL spec, it’s all in there.

Btw, your small speedup is easily explained by fragments that are discarded by the alpha test. Each one saves at least one colour+Z write, which is always good.

PS: Please steer clear of 'pure' alpha testing. It breaks texture filters and multisampled anti-aliasing, people really hate that. If you really have to do transparent stuff with minimum polygon counts, use blending first. Throw in alpha test as an added bonus, if at all.

Originally posted by zeckensack:
Please steer clear of 'pure' alpha testing. It breaks texture filters and multisampled anti-aliasing, people really hate that. If you really have to do transparent stuff with minimum polygon counts, use blending first. Throw in alpha test as an added bonus, if at all.

Eh?
Breaks multisampling? Texture filters? In what way?
You really should use the alpha test as a matter of course when doing alpha blending, simply because of the savings in framebuffer reads/writes already mentioned. If I remember correctly, this is something that is constantly being emphasised at least by the NVIDIA driver developers.

[This message has been edited by knackered (edited 12-24-2002).]

Originally posted by knackered:
Eh?
Breaks multisampling?

Alpha introduces edges that don’t lie at polygon boundaries. Supersampling will catch those, but multisampling won’t. Jaggies result.

Blending in conjunction with a reasonable texture filter will smooth these jaggies out, at the cost of the additional bandwidth burden of blending, of course.

Texture filters? In what way?
You can't generally use alpha testing and texture filters together (without blending also being used). Whatever threshold you choose, you'll always end up with fragments that aren't discarded even though you would want them to be. Say you have a simple fence: alpha values 1.0, 0.0, 1.0 for a thingy, a gap and another thingy directly next to each other. Filter these together (minification) and you'll end up with some alpha between 0 and 1. If your alpha test threshold is too low, the fence will completely disappear in the distance; set the threshold too high and the fence will be completely opaque.

The simple solution is not to use texture filtering, which is bad, very very bad. The better solution is to throw in blending and add the alpha test only to eliminate those pixels that won't contribute much. The exact meaning of 'not contributing much' depends on your blend function; I'd suggest alpha smaller than 10% as a starting point.
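In fixed-function OpenGL that combined setup might look like this. It is only a state-setup sketch, assuming a current GL context, and the 10% reference value is just the starting point suggested above:

```c
/* Blend for smooth filtered edges... */
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

/* ...and use the alpha test only to reject near-invisible fragments,
 * saving their framebuffer read/modify/write. */
glEnable(GL_ALPHA_TEST);
glAlphaFunc(GL_GREATER, 0.1f);
```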

Emphasis here: 'not contributing much' cannot be defined if blending is not enabled.