Hardware Anti-Aliasing

Hi,
I posted this a while ago and ended up very confused!

My NVIDIA card has settings for 2x and 4x anti-aliasing, plus an option for application control. From a generic standpoint, how do you detect the card's capabilities and control such options? I got a kind response that directed me towards OpenGL techniques for anti-aliasing support, but this didn't seem to be the same thing as simply switching a driver option.

Secondly, with anti-aliasing enabled, does this process also operate during glRenderMode(GL_SELECT)? It obviously isn't needed during a pick pass, but my timings for pick operations are definitely slower with anti-aliasing enabled.

Many thanks

Andrew

Go to your driver config utility and set "Let the apps determine the antialiasing mode". That's the only general, reasonable option. The other options are there only to provide AA functionality for old apps that never ask for AA themselves.

Then, in your code, use the OpenGL ARB extensions (GL_ARB_multisample, together with WGL_ARB_pixel_format / WGL_ARB_multisample on Windows to obtain a multisampled pixel format) to enable and disable AA. Enable it for rendering, and disable it for picking if that improves speed.
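
To make that concrete, here is a minimal sketch of the detection and enable/disable side, assuming a GL context is already current. On Windows you would also need to request a multisampled pixel format at context creation (via WGL_ARB_pixel_format / WGL_ARB_multisample), which is omitted here; render_scene and pick_objects are just placeholder names for your own routines:

```c
#include <GL/gl.h>   /* on Windows, include <windows.h> first */
#include <string.h>

#ifndef GL_MULTISAMPLE_ARB
#define GL_MULTISAMPLE_ARB 0x809D  /* token from the ARB_multisample spec */
#endif

/* Returns nonzero if the driver advertises GL_ARB_multisample.
   (A strict check would tokenize the extension string, but a
   substring search is safe for this particular name.) */
static int has_multisample(void)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    return ext != NULL && strstr(ext, "GL_ARB_multisample") != NULL;
}

static void render_scene(void)          /* placeholder draw routine */
{
    if (has_multisample())
        glEnable(GL_MULTISAMPLE_ARB);   /* AA on for visible rendering */
    /* ... draw the scene ... */
}

static void pick_objects(void)          /* placeholder pick routine */
{
    GLuint select_buf[512];
    GLint  hits;

    if (has_multisample())
        glDisable(GL_MULTISAMPLE_ARB);  /* AA buys nothing in GL_SELECT
                                           and may cost time */
    glSelectBuffer(512, select_buf);    /* must be set before GL_SELECT */
    glRenderMode(GL_SELECT);
    /* ... draw the scene with glPushName/glLoadName ... */
    hits = glRenderMode(GL_RENDER);     /* returns the hit count */
    (void)hits;

    if (has_multisample())
        glEnable(GL_MULTISAMPLE_ARB);   /* restore AA for normal drawing */
}
```

Whether multisampling actually affects GL_SELECT performance is driver-dependent, so it's worth timing your picks with and without the glDisable call.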

Drivers often have options that let the user override application choices. The main examples are MIPmap filters, anisotropic filtering settings, antialiasing options and internal texture quality representations.

Just look at it as the card makers giving users a way of overriding your application code. I think there are ways of preventing this, but it probably involves messing around with the registry, and it's almost certainly non-standard (different between driver manufacturers).

You can see why: for applications that don't explicitly enable antialiasing, a vendor might want to force it on to make the card look good. So there are two ways of enabling AA: through the driver GUI, and through OpenGL code (which the driver may override).

Yeah, I don't blame you for being confused by this. None of it is standard or in any way consistent across manufacturers.