
View Full Version : Antialiasing questions



Cheps
07-30-2004, 04:05 AM
I'm trying to make a simple program to show the differences between antialiasing at 1x, 2x, 4x, etc. I've read a lot about antialiasing, but I'm not sure I understood everything correctly, so I have a few questions:

1) Is using GL_POLYGON_SMOOTH the same as using GL_MULTISAMPLE set to 1x? Also, in my driver settings I can only choose "off", 2x and 4x. Does 1x not exist for multisampling? Would that just be single sampling?

2) When using multisampling, I can clearly see a difference between 2x and 4x. When I use GL_BLEND with glBlendFunc(GL_SRC_ALPHA_SATURATE, GL_ONE), GL_POLYGON_SMOOTH gives me exactly the same result as 4x multisampling. If I use 2x multisampling, GL_POLYGON_SMOOTH still gives better results (i.e. it looks like 4x). So it seems that GL_POLYGON_SMOOTH with the blend function above uses the best my graphics card can do, so what is the point of multisampling? Is it faster because it's a hardware 4x, while GL_POLYGON_SMOOTH has to compute it four times in software?

3) Since GL_SAMPLES is a constant you can only query, you can't change the multisampling level from your application. But my driver settings have an "Application-controlled" option which supposedly allows me to change it (when I select it, multisampling is deactivated)?

Xmas
07-30-2004, 09:05 AM
Originally posted by Cheps:
1) Is using GL_POLYGON_SMOOTH the same as using GL_MULTISAMPLE set to 1x? Also, in my driver settings I can only choose "off", 2x and 4x. Does 1x not exist for multisampling? Would that just be single sampling?

You can enable or disable GL_MULTISAMPLE. According to the spec, enabling GL_MULTISAMPLE disables any effects of GL_*_SMOOTH.
1x is "single sampling", or simply no antialiasing.


Originally posted by Cheps:
2) When using multisampling, I can clearly see a difference between 2x and 4x. When I use GL_BLEND with glBlendFunc(GL_SRC_ALPHA_SATURATE, GL_ONE), GL_POLYGON_SMOOTH gives me exactly the same result as 4x multisampling. So what is the point of multisampling?

Polygon smoothing and multisampling work in completely different ways. The former requires alpha blending and is therefore order dependent; it also requires you to find silhouette edges. It looks fast, but only until you count the cost of sorting, blending, and silhouette finding. And it fails on intersecting polygons, where there is no edge to smooth.
It is done in hardware or not at all.

Multisampling, on the other hand, just works. And it's quite fast on modern hardware.
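A minimal sketch of the two approaches being compared, assuming a valid OpenGL context is already current (this is a fragment for illustration, not a full program):

```c
/* Approach 1: polygon smoothing.
 * Order dependent: geometry must be drawn depth-sorted, front to back,
 * for GL_SRC_ALPHA_SATURATE blending to accumulate coverage correctly. */
glEnable(GL_POLYGON_SMOOTH);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA_SATURATE, GL_ONE);  /* classic smooth-polygon blend */

/* ... draw polygons here, sorted front to back ... */

/* Approach 2: multisampling.
 * No sorting or blending needed; the sample count was fixed when the
 * multisampled pixel format / visual was created. */
glDisable(GL_BLEND);
glDisable(GL_POLYGON_SMOOTH);
glEnable(GL_MULTISAMPLE);

/* ... draw polygons in any order ... */
```

Note the trade-off the post describes: the smoothing path pushes the ordering burden onto the application, while the multisampling path needs nothing beyond a suitable framebuffer.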


Originally posted by Cheps:
3) Since GL_SAMPLES is a constant you can only query, you can't change the multisampling level from your application. But my driver settings have an "Application-controlled" option which supposedly allows me to change it?

Use wglChoosePixelFormatARB with the attribute WGL_SAMPLES_ARB set to the number of samples desired.
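A sketch of that attribute list, using the WGL_ARB_pixel_format and WGL_ARB_multisample extensions. It assumes wglChoosePixelFormatARB has already been fetched via wglGetProcAddress (which itself requires a temporary context) and that hdc is a valid device context:

```c
/* Request a 4x multisampled, double-buffered RGBA pixel format.
 * The list is attribute/value pairs, terminated by 0. */
const int attribs[] = {
    WGL_DRAW_TO_WINDOW_ARB, GL_TRUE,
    WGL_SUPPORT_OPENGL_ARB, GL_TRUE,
    WGL_DOUBLE_BUFFER_ARB,  GL_TRUE,
    WGL_PIXEL_TYPE_ARB,     WGL_TYPE_RGBA_ARB,
    WGL_COLOR_BITS_ARB,     24,
    WGL_DEPTH_BITS_ARB,     16,
    WGL_SAMPLE_BUFFERS_ARB, 1,   /* ask for a multisample buffer */
    WGL_SAMPLES_ARB,        4,   /* ask for 4 samples per pixel  */
    0                            /* terminator */
};

int  pixelFormat;
UINT numFormats;
if (!wglChoosePixelFormatARB(hdc, attribs, NULL, 1,
                             &pixelFormat, &numFormats)
    || numFormats == 0) {
    /* no matching format; retry with fewer samples or none */
}
```

The driver's "Application-controlled" setting simply means it honors whatever sample count the application picks at pixel-format selection time, instead of forcing one.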

arekkusu
07-30-2004, 01:05 PM
GL_*_SMOOTH and multisampling are completely separate, and the number of subpixel bits used in smoothing is hardware dependent (http://homepage.mac.com/arekkusu/bugs/invariance/HWAA.html) and can vary for each primitive (point/line/triangle). On some cards you can get 8 subpixel bits with SMOOTH, which would require 256 samples in some FSAA implementations.

Cheps
08-01-2004, 01:08 AM
Ok thanks for the answers, and your link is interesting arekkusu. This leads me to one more question:


Originally posted by Xmas:
Use wglChoosePixelFormatARB with the attribute WGL_SAMPLES_ARB set to the number of samples desired.

I'm using GLUT (trying to be as platform-independent as possible), so I can't do that. I suppose there is no other way?

arekkusu
08-01-2004, 10:07 AM
man glutInitDisplayString.

e.g.

glutInitDisplayString("rgb double depth=16 samples=4");
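Fleshing that one-liner out into a minimal, hedged sketch (glutInitDisplayString needs GLUT 3.4 or later; the fallback check via GLUT_DISPLAY_MODE_POSSIBLE is a standard GLUT query, though driver support for it varies):

```c
/* Sketch: requesting a 4x multisampled visual through GLUT,
 * with a fallback when no such visual exists. */
#include <GL/glut.h>

int main(int argc, char **argv)
{
    glutInit(&argc, argv);

    glutInitDisplayString("rgb double depth>=16 samples=4");
    if (!glutGet(GLUT_DISPLAY_MODE_POSSIBLE)) {
        /* No multisampled visual available: drop the samples criterion. */
        glutInitDisplayString("rgb double depth>=16");
    }

    glutCreateWindow("FSAA test");
    /* ... register callbacks, then: */
    glutMainLoop();
    return 0;
}
```

This keeps the program platform independent while still letting the driver pick a multisampled framebuffer when one is offered.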

Cheps
08-01-2004, 11:12 AM
Ok thanks, I couldn't find anything about glutInitDisplayString because it's not in the docs I have. I have the GLUT API version 3 docs and I couldn't find anything else on this website. Maybe you can link me to a more recent one?

Anyway thanks for your help!

arekkusu
08-01-2004, 11:42 AM
glutInitDisplayString was added in GLUT 3.4 ( changelog (http://www.opengl.org/resources/libraries/glut/CHANGES) ). But the online HTML/PDF docs (and the man pages for some distributions) have not been updated since GLUT 3.2.

Google will easily find you an updated manpage, though.