Multisampling and Polygon Smoothing

I have an application that displays a triangle mesh. It’s all working well except that it looks pixelated. I have read that polygon smoothing and multisampling are two ways to fix that problem. I tried turning on multisampling using:
glEnable( GL_MULTISAMPLE );

This had no effect. I then tried turning on polygon smoothing, even though my book says that for this to work my polygons must be ordered from front to back. I used this code for polygon smoothing:

glEnable( GL_BLEND );
glBlendFunc( GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA );
glEnable( GL_POLYGON_SMOOTH );

I did this on two different machines. The first is running an Intel 82845G graphics chip. There was no change on this machine. Then I tried it on an Nvidia Quadro4 550 XGL. On this machine the pixelation seemed to go away, but it looks like there’s a wireframe over the entire model in the background color of the window. Can anyone help me get multisampling working (or even figure out whether the graphics cards I am using support it), or help me resolve the wireframe issue on the mesh with polygon smoothing? Thanks -

From your results, the Intel doesn’t seem to support polygon antialiasing.
The edges between the polygons come from the depth test. The edge pixels are blended with the background color once, and the depth value written at that pixel blocks the adjacent polygon’s coverage at that pixel. That is why the OpenGL Programming Guide says to use the destination blending mode “saturate”, disable the depth test, and render from front to back. The saturation of the destination alpha channel (which must be present in the pixel format!) keeps farther polygons from updating the screen.
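A minimal sketch of the recipe above, assuming a simple triangle representation and a centroid-depth sort key (both are illustrative choices, not from the original post). The GL state settings are shown as comments because they need a live context; the sort itself is plain C:

```c
#include <stdlib.h>

/* GL state for coverage-based polygon antialiasing, per the recipe above:
 *   glDisable(GL_DEPTH_TEST);
 *   glEnable(GL_BLEND);
 *   glBlendFunc(GL_SRC_ALPHA_SATURATE, GL_ONE);
 *   glEnable(GL_POLYGON_SMOOTH);
 * The pixel format must have a destination alpha channel, and triangles
 * must then be submitted front to back. */

typedef struct { float v[3][3]; } Tri;  /* three xyz vertices, eye space */

/* Average eye-space depth of a triangle; in a standard right-handed
 * eye space, more negative z is farther from the viewer. */
static float tri_depth(const Tri *t) {
    return (t->v[0][2] + t->v[1][2] + t->v[2][2]) / 3.0f;
}

/* qsort comparator: nearest triangle (largest eye-space z) first */
static int front_to_back(const void *a, const void *b) {
    float za = tri_depth((const Tri *)a), zb = tri_depth((const Tri *)b);
    return (za < zb) - (za > zb);
}

/* Sort triangles front to back before submitting them. */
void sort_front_to_back(Tri *tris, size_t n) {
    qsort(tris, n, sizeof(Tri), front_to_back);
}
```

A per-frame centroid sort like this is only correct for non-intersecting triangles, but it matches what the saturate blend mode needs.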

Multisampling cannot simply be enabled. You must have requested a multisample pixel format beforehand.
Check the tutorials. You need GL_ARB_multisample and WGL_ARB_pixel_format for that. The specs are here:
http://oss.sgi.com/projects/ogl-sample/registry/

Note that on multisample pixel formats, enabling GL_POLYGON_SMOOTH has no additional effect.
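For reference, the multisample request happens at pixel-format selection time. A sketch of the attribute list one would pass to wglChoosePixelFormatARB; the constant values are copied from the WGL_ARB_pixel_format and multisample specs so the sketch does not need wglext.h, and the sample count of 4 is an arbitrary choice:

```c
/* Values from the WGL_ARB_pixel_format / multisample specs. */
#define WGL_DRAW_TO_WINDOW_ARB   0x2001
#define WGL_SUPPORT_OPENGL_ARB   0x2010
#define WGL_DOUBLE_BUFFER_ARB    0x2011
#define WGL_PIXEL_TYPE_ARB       0x2013
#define WGL_TYPE_RGBA_ARB        0x202B
#define WGL_COLOR_BITS_ARB       0x2014
#define WGL_DEPTH_BITS_ARB       0x2022
#define WGL_SAMPLE_BUFFERS_ARB   0x2041
#define WGL_SAMPLES_ARB          0x2042

/* Zero-terminated name/value pairs for wglChoosePixelFormatARB. */
static const int multisample_attribs[] = {
    WGL_DRAW_TO_WINDOW_ARB, 1,
    WGL_SUPPORT_OPENGL_ARB, 1,
    WGL_DOUBLE_BUFFER_ARB,  1,
    WGL_PIXEL_TYPE_ARB,     WGL_TYPE_RGBA_ARB,
    WGL_COLOR_BITS_ARB,     24,
    WGL_DEPTH_BITS_ARB,     24,
    WGL_SAMPLE_BUFFERS_ARB, 1,   /* request a multisample buffer */
    WGL_SAMPLES_ARB,        4,   /* 4x multisampling */
    0
};

/* Typical usage on Windows (needs a current GL context to get the
 * function pointer, see below in the thread):
 *   int format; UINT count;
 *   wglChoosePixelFormatARB(hdc, multisample_attribs, NULL, 1, &format, &count);
 *   SetPixelFormat(hdc, format, &pfd);
 *   ...create the real context on that format, then glEnable(GL_MULTISAMPLE);
 */
```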

The two cards I am working with don’t support the WGL_ARB_pixel_format extension, nor do their drivers have pixel formats with multisampling support. So I’m wondering: how widely supported is multisampling?

Next question: if multisampling is not widely supported, then what is the standard way to make 3D models look smooth and not pixelated?

Thanks for the help!

NVIDIA should have support for the WGL_ARB_pixel_format extension. Read the specs, it’s not in the GL_EXTENSIONS string, but queried with the wglGetExtensionsStringARB() function from the WGL_ARB_extensions_string extension.

If the multisample extension is not supported, there still can be a control panel option which allows to force full scene antialiasing, but this can not be enabled programmatically.

The other usual way to do full-scene antialiasing is the accumulation buffer (see the Red Book examples), but it’s not hardware accelerated on your boards either.
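The accumulation-buffer approach renders the scene several times with sub-pixel jitter and averages the results. A sketch, with the GL calls left as comments since they need a live context; the regular grid of offsets here is an assumption (the Red Book uses hand-tuned jitter patterns):

```c
#include <stddef.h>

typedef struct { float x, y; } Jitter;

/* Fill 'out' with n*n sub-pixel offsets on a regular grid centered in
 * the pixel, each component in (-0.5, 0.5). Returns the sample count. */
size_t make_jitter_grid(Jitter *out, size_t n) {
    for (size_t i = 0; i < n; ++i)
        for (size_t j = 0; j < n; ++j) {
            out[i * n + j].x = (i + 0.5f) / n - 0.5f;
            out[i * n + j].y = (j + 0.5f) / n - 0.5f;
        }
    return n * n;
}

/* Typical accumulation loop (needs a pixel format with accum bits):
 *   glClear(GL_ACCUM_BUFFER_BIT);
 *   for each jitter offset (dx, dy):
 *       shift the projection frustum by (dx, dy) pixels;
 *       draw the scene;
 *       glAccum(GL_ACCUM, 1.0f / num_samples);
 *   glAccum(GL_RETURN, 1.0f);
 */
```

Without hardware acceleration each glAccum pass falls back to software, which is why it is impractical on the boards discussed here.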

You can find a dump of supported extensions on various boards and drivers here: http://www.delphi3d.net/hardware/listreports.php
Multisampling is widely supported today.

Relic - I looked on the graphic card registry and you’re right, the nVidia should support multisampling.

I am using the OpenGL SuperBible’s helper function to determine if the WGL extension is supported. It first tries to get the “wglGetExtensionsStringARB” function pointer to determine if the pixel_format extension is supported. It is failing there and returning NULL, indicating that the system thinks the extensions-string extension is not supported, even though the registry page said it was for both of my graphics cards. I am making the call from the OnCreate method of my window. Do you have any suggestions why I’m not able to get a function pointer to “wglGetExtensionsStringARB”? Thanks -

I’ve just used glGetString( GL_EXTENSIONS ) to look at all the extensions my card says it supports. Even though the website said my cards should support the WGL_ARB_extensions_string extension, it is not in the list. Does this mean my card doesn’t support it, and thus I can’t look at which WGL extensions it supports, or is there something wrong with what I’m doing? Thanks
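As an aside, when scanning the extension string it is safer to match whole space-delimited tokens than to use a plain substring search, since one extension name can be a prefix of another. A portable sketch of such a check (the helper name is made up):

```c
#include <string.h>

/* Return 1 if 'name' appears as a whole space-delimited token in the
 * extension string 'list', 0 otherwise. */
int has_extension(const char *list, const char *name) {
    size_t len = strlen(name);
    const char *p = list;
    while ((p = strstr(p, name)) != NULL) {
        int starts = (p == list) || (p[-1] == ' ');
        int ends   = (p[len] == '\0') || (p[len] == ' ');
        if (starts && ends)
            return 1;
        p += len;  /* substring hit, keep scanning */
    }
    return 0;
}
```

With a current context, `has_extension((const char *)glGetString(GL_EXTENSIONS), "GL_ARB_multisample")` would tell you whether the multisample extension is exported.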

do you have the latest drivers?

Yeah, I just got the drivers for both of the cards.

You need to have a dummy GL context current to get the wgl function pointers. Check some tutorials on wglChoosePixelFormatARB.
Here’s one: http://nehe.gamedev.net/data/lessons/lesson.asp?lesson=46
(Use at your own risk, I have never used NeHe code.)
But if your GL_EXTENSIONS string does not contain ARB_multisample, you are not getting anywhere with the wglChoosePixelFormatARB function.

Hi,
From my experience with antialiasing triangle strips, the best quality is achieved by
blending the computed coverage. For polygons you should use the appropriate blend function:
glBlendFunc( GL_SRC_ALPHA_SATURATE, GL_ONE );
and sort them as Relic mentioned (if they overlap). But you have to start with a black
transparent background - if you want a white background, just draw a final white quad.
Multisampling is a bit below the level of quality of the blending-coverage solution - at least
on GF4Ti and GFFX 59xx, but you don’t have to sort the polygons anymore. GF 6800 may do
this even better, but I don’t have one yet. The Quadros should do the polygon antialiasing
much better. Ati Radeon 9xxx don’t seem to support polygon antialiasing (they do it for lines).
Relic, as a VERY frequent contributor, maybe you can tell me something: how to enable
4x supersampling on nVidia cards. Multisampling is not working properly together with the
blending-coverage antialiasing, but supersampling does (tested it years ago on GF2Ti).
I know it would be slower, but I am interested in quality (on GFFX 5900 this kind of job is
bus-limited anyway, when using vertex arrays).
Thank you

The current rendering context was the problem. Thanks Relic -

As for multisampling, it’s not supported on either of my cards, and I don’t think I have accumulation buffers in which to do my own oversampling. So is the only way to do full-screen antialiasing on machines like mine to order the polygons and use polygon smoothing?

Originally posted by Tzupy:
Ati Radeon 9xxx don’t seem to support polygon antialiasing (they do it for lines).
Polygon smoothing is not supported on Radeon 9500+. It works on Radeon 7000-9200.

Hi,
Tarek: For the integrated Intel graphics, you can forget about antialiasing, but the
Quadro should get you good results with the blending-coverage antialiasing. It also might be
able to support supersampling, but the driver doesn’t expose it.
Arekkusu: Thank you for the correction - Radeon 9200 is about the same technology as the
8500. I remember seeing polygon antialiasing work on Radeon 7500, and I was very
disappointed when it didn’t work on the 9700. Rumors are that Ati bought the R300 architecture
from a startup company and didn’t bother to integrate some R200 features into it.
Can anyone answer my question on 4x supersampling? I think it’s on topic…

Originally posted by Tzupy:
Relic, as a VERY frequent contributor, maybe you can tell me something: how to enable
4x supersampling on nVidia cards. Multisampling is not working properly together with the
blending-coverage antialiasing, but supersampling does (tested it years ago on GF2Ti).
I know it would be slower, but I am interested in quality (on GFFX 5900 this kind of job is
bus-limited anyway, when using vertex arrays).
Thank you
AFAIK, the Quadro4 550 XGL is an NV17GL based board (http://pciids.sourceforge.net/iii/?i=10de).
It doesn’t support multisampling, so exporting ARB_multisample is not useful, which means there are no pixelformats with samples > 1 to select from programmatically.
You can only force supersampling in the control panel and keep your app unchanged.
For chips supporting multisampling, that is what you get, because it’s faster to calculate the color at the multisample point than at the four supersamples. There is no way to select supersampling via the pixel formats then.

Hi Relic,
I want to enable 4x supersampling from the control panel, if there is some registry tweak that would allow it.
AFAIK the 8x mode is 4x multisampling and 2x supersampling (that’s why it’s so slow), so there should be some supersampling capability hidden in the nVidia drivers, maybe there’s some way to access it…
For applications that require best quality - what I am drawing are MANY tiny lines and quads or triangles (in strips, of course) it would make a difference, added to the blending antialiasing.

Don’t know. Does your control panel have an antialiasing slider? What happens if you set it to 4x?
For registry tweaks I’d look on guru3d.com

Hi,
Of course I can set 4x FSAA from the advanced properties panel. But it’s 4x multisampling, not supersampling. nVidia documentation says that coverage-blending AA doesn’t work with multisampling, probably due to the way the samples are taken. But I’m sure it works with supersampling - I had it working on a GeForce2 Ti, with a tweaked driver for the coverage-blending and an additional registry setting for the supersampling…
On old nVidia drivers, 3x.xx I think, it was possible to enable 16x FSAA (4x multisampling and 4x supersampling). I think Quadro FX have it in the FSAA panel, but I’m not going to get one, they are damn expensive (only going to get a 6800 GT). I am not aware if new drivers and tweaks (RivaTuner) can do 16x FSAA. If they can, then pure 4x supersampling should be doable, too.
Tarek, sorry for hijacking the thread…