Antialiasing and cracks

Hi everyone,

I’m trying to draw semi-transparent antialiased polygons. Here’s a code snippet that tries to test this out:

(where w and h are the window dimensions in pixels)

glClearColor( 1, 1, 1, 0 );	/* white background */
glEnable( GL_BLEND );

glViewport( 0, 0, w, h );
glMatrixMode( GL_PROJECTION );
glLoadIdentity();
glOrtho( 0, w, 0, h, -1.0, 1.0 );	/* pixel-aligned 2D projection */

glMatrixMode( GL_MODELVIEW );

glClear( GL_COLOR_BUFFER_BIT );
glEnable( GL_POLYGON_SMOOTH );
glBlendFunc( GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA );
glLoadIdentity();
glRotatef( -10, 0, 0, 1 );

/* opaque blue bar */
glColor4f( 0, 0, 1, 1 );
glBegin( GL_POLYGON );
	glVertex2f( 140, 300 );
	glVertex2f( 160, 300 );
	glVertex2f( 160, 50 );
	glVertex2f( 140, 50 );
glEnd();

/* semi-transparent red square drawn on top of it */
glColor4f( 1, 0, 0, 0.5 );
glBegin( GL_POLYGON );
	glVertex2f( 100, 100 );
	glVertex2f( 200, 100 );
	glVertex2f( 200, 200 );
	glVertex2f( 100, 200 );
glEnd();

And here’s the result:
www.leadtogold.com/temp/cracks.jpg

Everything is exactly how I want it in terms of the transparency and antialiasing effects, except for those faint cracks running between triangles! The adjacent triangles share the same vertices, so I know that’s not the problem. I can make the cracks go away if I either disable antialiasing or switch to a blending mode that kills my transparency effect, but I’d really like to have both.

Any suggestions would be very much appreciated!

Unrelated, but you should use GL_QUAD instead of GL_POLYGON.

EDIT: you also specify your polygons with two different orientations (the first is wound clockwise, the second counter-clockwise).


Originally posted by kehziah:
Unrelated, but you should use GL_QUAD instead of GL_POLYGON.

No such animal as GL_QUAD, at least in my version of OpenGL. And the cracks appear in both polygons, so orientation likely isn’t the issue here (but thanks for pointing that out).

Another interesting data point: I had a friend try it out on his Geforce FX, which has antialiasing controls. Everything looked perfect when he told the card to use one of its antialiasing settings, but it had the same cracks when he switched it to “Application” control. Does this indicate that there’s a problem in the software implementation of polygon smoothing which nVidia doesn’t have in its hardware implementation?

Originally posted by Samwise415:
No such animal as GL_QUAD, at least in my version of OpenGL.

A quick look in any decent documentation on glBegin should make you understand what he meant by GL_QUAD. Hint: there’s a letter missing somewhere.

Using GL_QUADS yields the exact same “cracked” result as GL_POLYGON.

Sorry for the typo, it’s GL_QUADS, not GL_QUAD. I suggested this because GL_POLYGON is notoriously slower than any other primitive type: the driver must split it into triangles before rendering (OK, a quad is split into two triangles as well, but that’s a trivial task). Of course, it’s not much of a problem for such a simple scene. But I also had in mind that GL_POLYGON is rarely used, so it’s possible that this path is buggy when used in conjunction with antialiasing (which is another rarely used feature).

I am also inclined to think that it’s a driver bug (although the driver should be quite low on the list of suspects when an unexpected result is observed). Basically, fragments on the interior’s (artificial) edge should not be affected by antialiasing.

Try your app on as many configurations as possible and see what happens (and try with quads too).

That’s what edge flags are for. No one ever uses them, but they are standard vertex attributes and should remedy this problem.

(I say should because, as no one uses them, they may not be implemented well.)

Edit: Whoops. No need for edge flags on polygons or quads. Welcome to the real world.

Try glBlendFunc(GL_SRC_ALPHA,GL_ONE) instead, and see if this helps.
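In your snippet that’s a one-line change; a sketch of just the affected state, everything else as posted:

glEnable( GL_BLEND );
glEnable( GL_POLYGON_SMOOTH );
glBlendFunc( GL_SRC_ALPHA, GL_ONE );	/* additive: coverage from both triangles sums instead of blending against the background twice */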


Using the modified blending function as suggested does kill the cracks, but it also seems to ignore the alpha settings I use when drawing the polygons - I’m no longer able to produce the effect of an opaque object behind a semitransparent one.

Clearing to white with glClear( GL_COLOR_BUFFER_BIT ) also doesn’t seem compatible with this approach: since additive blending can only brighten a pixel, anything drawn over the white background stays white, and subsequent drawing operations with that blend function enabled don’t appear to draw anything.

Is there perhaps some other method I should be pursuing to get that transparency effect?

I would try it the other way around: keep your blending code and drop GL_POLYGON_SMOOTH. It’s a legacy path anyway. You’d be better off with MSAA (see the GL_ARB_multisample extension).

That’s not in the Red Book, and Google isn’t immediately turning up anything useful, but I’m intrigued. Got any links for me with code examples?

The extension specification is available here: http://oss.sgi.com/projects/ogl-sample/registry/ARB/multisample.txt

More info here: http://developer.nvidia.com/object/gdc_ogl_multisample.html
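If you want something concrete to start from, here is a minimal sketch using GLUT (assumptions: GLUT 3.7+, which has GLUT_MULTISAMPLE, and a hand-defined GL_MULTISAMPLE_ARB token, since the stock Windows GL/gl.h stops at 1.1):

#include <GL/glut.h>

#ifndef GL_MULTISAMPLE_ARB
#define GL_MULTISAMPLE_ARB 0x809D	/* token value from the ARB_multisample spec */
#endif

void display(void)
{
	glClear( GL_COLOR_BUFFER_BIT );
	/* ... draw the polygons exactly as before, but without GL_POLYGON_SMOOTH ... */
	glutSwapBuffers();
}

int main(int argc, char **argv)
{
	glutInit(&argc, argv);
	/* request a double-buffered RGBA visual with multisample buffers */
	glutInitDisplayMode( GLUT_DOUBLE | GLUT_RGBA | GLUT_MULTISAMPLE );
	glutCreateWindow("multisample test");
	glEnable( GL_MULTISAMPLE_ARB );	/* usually enabled by default when sample buffers exist */
	glutDisplayFunc(display);
	glutMainLoop();
	return 0;
}

You can verify that you actually got sample buffers by querying glGetIntegerv with SAMPLES_ARB (0x80A9 in the spec).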

So how exactly do I use this? Do I have to upgrade to a newer version of OpenGL, or download some extra libraries and headers from the parties who are developing it? There’s not much info on opengl.org about it.

From the spec, it looks like this is a part of OpenGL version 1.2 or thereabouts…? Which might not be an option, since Windows only ships with 1.1, and I need my application to be usable by people who don’t necessarily have OpenGL video cards.

I’ll have to do more research into this… could I develop against a later version of OpenGL and have it be backwards-compatible with older versions of the runtime, as long as I do tests before using newer features?

Originally posted by Samwise415:
From the spec, it looks like this is a part of OpenGL version 1.2 or thereabouts…? Which might not be an option, since Windows only ships with 1.1

ARB_multisample is an extension and has been integrated into OpenGL 1.3.
More info about extensions:
http://www.opengl.org/developers/faqs/technical/extensions.htm
http://www.mesa3d.org/brianp/sig97/exten.htm

and I need my application to be usable by people who don’t necessarily have OpenGL video cards.

I don’t know what you mean here. If their video cards don’t have an OpenGL driver, it won’t work no matter what version you need.

I’ll have to do more research into this… could I develop against a later version of OpenGL and have it be backwards-compatible with older versions of the runtime, as long as I do tests before using newer features?

Yes. See links above.
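For the runtime test itself, the usual pattern is to check the extension string before touching anything beyond 1.1. A sketch (the match has to respect whole, space-delimited names, since a plain substring search could hit a longer extension name):

#include <string.h>
#include <GL/gl.h>

/* Returns 1 if the current rendering context advertises the named
   extension. glGetString is only valid once a context is current. */
int has_extension(const char *name)
{
	const char *ext = (const char *)glGetString(GL_EXTENSIONS);
	size_t len = strlen(name);

	if (ext == NULL)
		return 0;
	for (;;)
	{
		const char *hit = strstr(ext, name);
		if (hit == NULL)
			return 0;
		/* accept only whole, space-delimited tokens */
		if ((hit == ext || hit[-1] == ' ') &&
		    (hit[len] == ' ' || hit[len] == '\0'))
			return 1;
		ext = hit + len;
	}
}

/* usage: if (has_extension("GL_ARB_multisample")) { ... } */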

Originally posted by kehziah:
and I need my application to be usable by people who don’t necessarily have OpenGL video cards.

I don’t know what you mean here. If their video cards don’t have an OpenGL driver, it won’t work no matter what version you need.

If I understand the info on the downloads page correctly, anyone who has Windows automatically has the OpenGL 1.1 runtime (presumably implemented in software even if they don’t have hardware for it).

As long as my code will be backwards-compatible, though, it’s all good. Thanks for the links!

To make things clearer: yes, the opengl32.dll that ships with every Windows system (since Windows 95, IIRC) has entry points for OpenGL 1.1 functionality.

Drivers from hardware vendors usually support later versions of OpenGL, so the functionality is there and you can use it. The bad part is that you need to load function pointers from the driver at runtime, because opengl32.dll is stuck at 1.1 (and you must have typedef’ed these function pointers beforehand).
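The manual version looks roughly like this (a sketch; glSampleCoverageARB is just one example entry point from ARB_multisample, and the typedef is normally supplied by glext.h):

#include <windows.h>
#include <GL/gl.h>

/* function pointer type for the ARB_multisample entry point */
typedef void (APIENTRY *PFNGLSAMPLECOVERAGEARBPROC)(GLclampf value, GLboolean invert);

PFNGLSAMPLECOVERAGEARBPROC glSampleCoverageARB = NULL;

void load_entry_points(void)
{
	/* wglGetProcAddress is only valid while a rendering context is current */
	glSampleCoverageARB = (PFNGLSAMPLECOVERAGEARBPROC)
		wglGetProcAddress("glSampleCoverageARB");

	if (glSampleCoverageARB == NULL)
	{
		/* driver does not export it: fall back to a non-multisampled path */
	}
}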
The good news is that there is a very nice library that will do it for you: http://www.levp.de (extension loading library, on the left).