Polygon smooth antialiasing on white background - please help!

Hi,

When rendering smooth (GL_NICEST) polygons, triangle strips, or quad strips on a white background, the following happens:

  • if the blending function parameters are GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA, the antialiasing is fine (on white, black, or other backgrounds), but there are some cracks between triangles; is this because of the low (just 4 bits) subpixel precision? NVIDIA claims that their Quadro FX, with 12 bits of subpixel precision, gets rid of such cracks.
  • if the blending function parameters are GL_SRC_ALPHA_SATURATE, GL_ONE (as recommended in the OpenGL Programming Guide and the MSDN topic on polygon antialiasing), then the antialiasing works properly ONLY on a black background, but without the cracks…

Do you have any suggestions to get rid of the cracks between triangles AND render on a white background?
Would it be possible to build a custom blending function, as a fragment program, that does the desired job?

Thanks

You need to sort the polygons and render them in front-to-back order for the saturate blend to accumulate coverage correctly. See the very informative FAQ.

The other option is either to draw only the silhouette edges as antialiased lines or to use multisample AA. See the OpenGL spec under multisampling for details. To use multisample AA you need to request it in your pixel format when you create your window.
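On Windows that request goes through the WGL_ARB_multisample extension. Below is a minimal sketch, assuming a dummy context is already current (so wglGetProcAddress() works) and that <GL/wglext.h> supplies the tokens; the attribute values are illustrative, not a drop-in implementation:

#include <windows.h>
#include <GL/gl.h>
#include <GL/wglext.h>

// Sketch: pick a multisampled pixel format (e.g. samples = 4).
int ChooseMultisampleFormat( HDC hdc, int samples )
{
    PFNWGLCHOOSEPIXELFORMATARBPROC wglChoosePixelFormatARB =
        (PFNWGLCHOOSEPIXELFORMATARBPROC) wglGetProcAddress( "wglChoosePixelFormatARB" ) ;
    if ( !wglChoosePixelFormatARB )
        return 0 ; // extension missing - fall back to ChoosePixelFormat()

    const int attribs[] = {
        WGL_DRAW_TO_WINDOW_ARB, GL_TRUE,
        WGL_SUPPORT_OPENGL_ARB, GL_TRUE,
        WGL_DOUBLE_BUFFER_ARB,  GL_TRUE,
        WGL_PIXEL_TYPE_ARB,     WGL_TYPE_RGBA_ARB,
        WGL_COLOR_BITS_ARB,     24,
        WGL_ALPHA_BITS_ARB,     8,        // destination alpha (also needed for saturate blending)
        WGL_SAMPLE_BUFFERS_ARB, 1,        // ask for a multisample buffer...
        WGL_SAMPLES_ARB,        samples,  // ...with this many samples per pixel
        0
    } ;
    int format = 0 ;
    UINT count = 0 ;
    if ( !wglChoosePixelFormatARB( hdc, attribs, NULL, 1, &format, &count ) || count == 0 )
        return 0 ;
    return format ; // pass to SetPixelFormat(), then create the real context
}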

NVIDIA’s subpixel precision cannot get rid of these cracks (they are referring to a different type of crack with a different cause). The cracks are and should be there because that is what you rendered.

The previous poster is correct about the saturate alpha requiring a sort. One way to get a color clear would be to draw a colored polygon last over the whole screen to clear to the desired background color.
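A hedged sketch of that last pass: once the scene has been drawn with the saturate blend (so destination alpha holds the accumulated coverage), draw the full-screen background polygon with

glBlendFunc( GL_ONE_MINUS_DST_ALPHA, GL_ONE ) ;

so it contributes only where the scene left pixels uncovered. A complete sequence is sketched further down the thread.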

Hi,
Thank you for the replies.
I’m still not able to make the polygon AA work properly with the (GL_SRC_ALPHA_SATURATE, GL_ONE) blending setting. I’m posting this piece of code; please look at it and tell me what’s wrong.

glDisable( GL_DEPTH_TEST ) ;
glClearColor( 1.0, 1.0, 1.0, 0.0 ) ; // white
glClear( GL_COLOR_BUFFER_BIT ) ;
GLfloat mulval = 0.01f ; // you might want to change this

glEnable( GL_BLEND ) ;
glBlendFunc( GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA ) ; // works fine, except cracks
//glBlendFunc( GL_SRC_ALPHA_SATURATE, GL_ONE ) ; // works only on black background…
glPolygonMode( GL_FRONT, GL_FILL ) ;
glEnable( GL_POLYGON_SMOOTH ) ;
glHint( GL_POLYGON_SMOOTH_HINT, GL_NICEST ) ;

glColor4f( 0.0, 0.0, 0.0, 1.0 ) ; // black, opaque
glBegin( GL_QUADS ) ;
glVertex2f( mulval * -250.0, mulval * -250.0 ) ;
glVertex2f( mulval * -200.0, mulval * 200.0 ) ;
glVertex2f( mulval * 250.0, mulval * 250.0 ) ;
glVertex2f( mulval * 200.0, mulval * -200.0 ) ;
glEnd() ;

glColor4f( 1.0, 1.0, 1.0, 1.0 ) ; // white, opaque
glBegin( GL_QUADS ) ;
glVertex2f( mulval * -150.0, mulval * -150.0 ) ;
glVertex2f( mulval * -100.0, mulval * 100.0 ) ;
glVertex2f( mulval * 150.0, mulval * 150.0 ) ;
glVertex2f( mulval * 100.0, mulval * -100.0 ) ;
glEnd() ;

glColor4f( 0.7, 0.5, 0.3, 1.0 ) ; // some color, opaque
glBegin( GL_QUADS ) ;
glVertex2f( mulval * -100.0, mulval * -100.0 ) ;
glVertex2f( mulval * -50.0, mulval * 50.0 ) ;
glVertex2f( mulval * 100.0, mulval * 100.0 ) ;
glVertex2f( mulval * 50.0, mulval * -50.0 ) ;
glEnd() ;

glDisable( GL_BLEND ) ;

It’s just 2D drawing, for simplicity.
Drawing the edges separately as AA lines or using FSAA results in lower antialiasing quality. I only have 4x FSAA; with 16x it would probably be acceptable and I wouldn’t care about this blending issue.

Thanks again.

No obvious problem. What are the results of the above program?
With the saturate blending it should result in a black quad on white background and the other two quads should not draw at all, because the first quad has saturated the alpha channel.
If that’s not the case, make sure your pixelformat has a destination alpha channel!
(Beware, this ain’t gonna work in 16 bit color.)
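For reference, a minimal sketch of requesting destination alpha in the pixel format and verifying you actually got it (hdc is assumed to be a valid device context):

// Sketch: requesting destination alpha bitplanes on Windows.
PIXELFORMATDESCRIPTOR pfd ;
ZeroMemory( &pfd, sizeof( pfd ) ) ;
pfd.nSize      = sizeof( pfd ) ;
pfd.nVersion   = 1 ;
pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER ;
pfd.iPixelType = PFD_TYPE_RGBA ;
pfd.cColorBits = 24 ;
pfd.cAlphaBits = 8 ;       // destination alpha - saturate blending needs this
pfd.cDepthBits = 24 ;

int format = ChoosePixelFormat( hdc, &pfd ) ;
SetPixelFormat( hdc, format, &pfd ) ;

// ChoosePixelFormat() may silently return a format without alpha,
// so read back what you actually got:
DescribePixelFormat( hdc, format, sizeof( pfd ), &pfd ) ;
if ( pfd.cAlphaBits == 0 )
{
    // No alpha buffer: the GL treats destination alpha as 1.0,
    // so f = min(As, 1 - 1) = 0 and saturate blending draws nothing.
}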

Hi,

The results of the posted code are:

  • white background, black quad, white quad, light brown quad, using the GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA blending settings; the cracks between the two triangles of each quad seem to appear because the edge fragments of the second triangle are blended not with the background pixels but with the already blended pixels of the first triangle…
  • white background and white quad (invisible), using the GL_SRC_ALPHA_SATURATE, GL_ONE setting; if the background is black, this blending setting works without cracks, can’t figure out yet why (the blend arithmetic is sketched below)…

This topic was first posted in the beginner’s forum by someone else, but only I replied, so I posted it here, thinking that some gurus might have better ideas.
Any suggestions, please?
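A note on the "why": per the GL spec, GL_SRC_ALPHA_SATURATE gives the blend factor (f, f, f, 1) with f = min(As, 1 - Ad), and the GL_ONE destination factor leaves the framebuffer contribution untouched. Working through one edge pixel makes the black-background restriction clear:

\[
  C_{\mathrm{out}} = \min(A_s,\,1 - A_d)\,C_s + C_d ,
  \qquad
  A_{\mathrm{out}} = A_s + A_d .
\]

For an edge pixel shared by two abutting white triangles, each covering half (so C_s = 1, A_s = 0.5), over a black, zero-alpha clear:

\[
  \text{pass 1: } C = \min(0.5,\,1.0)\cdot 1 + 0 = 0.5 ; \qquad
  \text{pass 2: } C = \min(0.5,\,0.5)\cdot 1 + 0.5 = 1.0 .
\]

Full coverage, no crack. Over a white background (C_d = 1), however, the GL_ONE destination factor adds the background back at full weight, so C_out >= C_d and nothing can ever darken the framebuffer.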

Hi,
I forgot to say that I checked for alpha bitplanes, and it’s 8, as requested; strange that the color bits are 32, although I asked for 24; the MS docs say the color bits are the size of RGBA minus the alpha bitplanes…

Hi,
I was able to sort out the issue of drawing AA polygons on white background - by using the GL_SRC_ALPHA_SATURATE, GL_ONE setting on a transparent black background and drawing the white background as the last polygon.
Thank you.
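For anyone finding this thread later, a minimal sketch of the sequence described above (coordinates and colors are illustrative, an identity projection is assumed, and the AA geometry should be sorted front to back):

// Composite on transparent black, then paint the background last.
glClearColor( 0.0f, 0.0f, 0.0f, 0.0f ) ;       // transparent black, NOT white
glClear( GL_COLOR_BUFFER_BIT ) ;

glEnable( GL_BLEND ) ;
glEnable( GL_POLYGON_SMOOTH ) ;
glHint( GL_POLYGON_SMOOTH_HINT, GL_NICEST ) ;

// 1) Antialiased geometry, front to back, accumulating coverage in dst alpha:
glBlendFunc( GL_SRC_ALPHA_SATURATE, GL_ONE ) ;
// ... glBegin()/glEnd() geometry here ...

// 2) Fill whatever was left uncovered with the desired background color:
glDisable( GL_POLYGON_SMOOTH ) ;
glBlendFunc( GL_ONE_MINUS_DST_ALPHA, GL_ONE ) ;
glColor4f( 1.0f, 1.0f, 1.0f, 1.0f ) ;          // white background
glBegin( GL_QUADS ) ;                          // full-screen quad
glVertex2f( -1.0f, -1.0f ) ;
glVertex2f(  1.0f, -1.0f ) ;
glVertex2f(  1.0f,  1.0f ) ;
glVertex2f( -1.0f,  1.0f ) ;
glEnd() ;
glDisable( GL_BLEND ) ;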

Originally posted by Tzupy:
Hi,
I was able to sort out the issue of drawing AA polygons on white background - by using the GL_SRC_ALPHA_SATURATE, GL_ONE setting on a transparent black background and drawing the white background as the last polygon.
Thank you.

Hello,

I tried to use GL_SRC_ALPHA_SATURATE and GL_ONE. However, it doesn’t work in DoubleBuffer mode (on Windows 2000, my video card is a GF4 MX). I drew a polygon (color (1.0f, 0.0f, 0.0f, 0.6f)) on a transparent background ((0.0f, 0.0f, 0.0f, 0.0f)). Nothing shows on the screen.

However, when I turn off double buffering, I see the polygon drawn. Does anyone know where the problem is?

Thanks!

Does this blend functionality need hardware acceleration to work?

Another post back from the dead.

Are you calling SwapBuffers()?

Originally posted by rgpc:
Another post back from the dead.

Are you calling SwapBuffers()?

Yes, I call SwapBuffers(). I set up the PIXELFORMATDESCRIPTOR in Windows. However, the Windows generic implementation doesn’t support destination alpha planes. That’s why blending with (GL_SRC_ALPHA_SATURATE, GL_ONE) doesn’t work.
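A quick sketch of how to check for that case (hdc and the chosen format index are assumed from your setup code); the flag combination below identifies Microsoft’s generic software renderer:

PIXELFORMATDESCRIPTOR pfd ;
DescribePixelFormat( hdc, format, sizeof( pfd ), &pfd ) ;
if ( ( pfd.dwFlags & PFD_GENERIC_FORMAT ) &&
     !( pfd.dwFlags & PFD_GENERIC_ACCELERATED ) )
{
    // Microsoft generic software implementation: no destination alpha,
    // so (GL_SRC_ALPHA_SATURATE, GL_ONE) cannot work here.
}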

Thanks, Tzupy.

Hi,
I was a bit surprised to see my old topic rise from the dead!
It’s a bit strange that you can’t get the alpha bitplanes when requesting a double buffer. I had other work to do and couldn’t check yet; I only need a single buffer for now.
But the generic implementation doesn’t support alpha bitplanes even with a single buffer, as far as I know. What happens with double buffering could be a matter of graphics card / driver capabilities.

Well, isn’t it better that he replied to a topic about the thing he wants to achieve rather than starting a new one?

Or what do you guys think?