Antialiasing for Outline Fonts

Hello. I’m using an outline font to display some information in my application; for various reasons it must be outline. Each character is drawn using

glCallLists(1, GL_UNSIGNED_BYTE, &c);
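
For reference, the lists themselves are built during initialization with wglUseFontOutlines (a sketch of my setup; hdc and hFont are my device context and font handles, and the list base of 1000 is arbitrary):

HFONT oldFont = (HFONT) SelectObject(hdc, hFont);    // select the font into the DC
GLYPHMETRICSFLOAT gmf[256];                          // per-glyph metrics, filled by the call
wglUseFontOutlines(hdc, 0, 256, 1000,                // glyphs 0..255 -> lists 1000..1255
                   0.0f, 0.0f,                       // chordal deviation, extrusion depth
                   WGL_FONT_POLYGONS, gmf);          // tessellate each glyph into polygons
SelectObject(hdc, oldFont);
glListBase(1000);                                    // so glCallLists indexes by character code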

The text itself displays correctly, but it looks quite ugly. I’m using an ATI graphics card, and when I change its antialiasing option from “use application settings” to e.g. “level x4”, the fonts become nice, sharp and clear. I want to get the same result with the default card settings (“use application settings”). For lines, I’ve added this code:


glEnable (GL_LINE_SMOOTH);
glEnable (GL_BLEND);
glBlendFunc (GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glHint (GL_LINE_SMOOTH_HINT, GL_DONT_CARE);

and they look way better now.
I thought that using similar code for fonts:


glEnable (GL_POLYGON_SMOOTH);
glEnable (GL_BLEND);
glBlendFunc (GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glHint (GL_POLYGON_SMOOTH_HINT, GL_DONT_CARE);

would also work, but it makes no difference.
So I would be very grateful for some advice on how to use AA with this type of font.

I don’t recommend ever using glEnable (GL_LINE_SMOOTH), glEnable (GL_POLYGON_SMOOTH) or glEnable (GL_POINT_SMOOTH). Polygon smoothing in particular only works properly with glBlendFunc (GL_SRC_ALPHA_SATURATE, GL_ONE) and geometry sorted front to back, and many drivers don’t really implement it anyway, which is why your code above makes no difference.

If you want AA, I recommend Full Scene AA (FSAA):
http://www.opengl.org/wiki/Multisampling

Why? Especially for lines, smoothing has a good chance of looking much better than multisample AA, AFAIK.

Unfortunately, it seems that I’m not able to use FSAA. I’m using Visual Studio 2003 and there is a compilation error at this line:

glEnable(GL_MULTISAMPLE);

That’s surprising, but apparently that edition of VS ships OpenGL headers older than version 1.3. I should admit that this is scientific software, so efficiency is not as essential as in games (and there isn’t that much to display). So if there is some way to make it look good, even using the SMOOTH modes, it’ll be fine.

Lucky for you, GL_MULTISAMPLE is initially enabled.
You just need to select a pixelformat that is multisampled.
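
(The compile error is just a header problem, by the way. The token is an ordinary enum, so if you ever do need it with an old gl.h you can define it yourself; this is the value from the extension registry:

#ifndef GL_MULTISAMPLE
#define GL_MULTISAMPLE 0x809D    // same value as GL_MULTISAMPLE_ARB
#endif

But since it defaults to enabled, you shouldn’t need to call glEnable at all.)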

But do I really have to deal with all that dummy window stuff, or will just setting the right pixel format be enough?

You can just

SetPixelFormat(hdc, <x>, 0);

(I can’t remember off the top of my head whether the third parameter can be null, but I believe it can)

if you know the correct value for <x>; and finding the correct value for <x> involves creating a context and using WGL_ARB_pixel_format to discover the format you need.

If you already know the correct value for your system, you can just feed it to SetPixelFormat and it will work. But pixel format IDs change depending on the vendor, driver version, etc., so you don’t want to hard-code one outside of your testing environment.
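
The discovery step looks roughly like this (a sketch, assuming a temporary context is already current so wglGetProcAddress works, and that the tokens and typedefs come from wglext.h):

PFNWGLCHOOSEPIXELFORMATARBPROC wglChoosePixelFormatARB =
    (PFNWGLCHOOSEPIXELFORMATARBPROC) wglGetProcAddress("wglChoosePixelFormatARB");

const int attribs[] = {
    WGL_DRAW_TO_WINDOW_ARB, GL_TRUE,
    WGL_SUPPORT_OPENGL_ARB, GL_TRUE,
    WGL_DOUBLE_BUFFER_ARB,  GL_TRUE,
    WGL_PIXEL_TYPE_ARB,     WGL_TYPE_RGBA_ARB,
    WGL_COLOR_BITS_ARB,     24,
    WGL_DEPTH_BITS_ARB,     24,
    WGL_SAMPLE_BUFFERS_ARB, 1,    // ask for a multisample buffer...
    WGL_SAMPLES_ARB,        4,    // ...with 4 samples per pixel
    0                             // terminator
};
int format;
UINT count;
if (wglChoosePixelFormatARB(hdc, attribs, NULL, 1, &format, &count) && count > 0)
{
    // <format> is the value to pass to SetPixelFormat on the real window
}

Remember that SetPixelFormat can only be called once per window, which is why the dummy window exists: you need a current context to call wglGetProcAddress, but you don’t want to spend your real window’s one shot at a pixel format on it.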

I’m writing just to say thanks.
As the FSAA document linked above says at the end, it took some work, but the effect is great.

One more thing:
During initialization there is a point where I need to specify how many samples multisampling should use (the value of the WGL_SAMPLES_ARB attribute). I’ve found that the display looks better with 4 samples than with 2, which is expected. But when I set 6 samples, the fonts are ugly again. I guess 6 samples is too many and the card falls back to no multisampling at all. If I’m right, is the maximum sample count graphics-card dependent, and if so, is there any way to check how many samples the card can handle?

To find out the maximum multisample level the video card supports, you can just check all of the pixel formats it exposes. There are probably over 100 pixel formats (depending on the video card, of course).
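
A sketch of that enumeration, again assuming the WGL_ARB_pixel_format tokens and typedefs from wglext.h and a current context:

PFNWGLGETPIXELFORMATATTRIBIVARBPROC wglGetPixelFormatAttribivARB =
    (PFNWGLGETPIXELFORMATATTRIBIVARBPROC)
        wglGetProcAddress("wglGetPixelFormatAttribivARB");

// ask how many pixel formats the device exposes
// (the format index is ignored for this particular query)
int countAttrib = WGL_NUMBER_PIXEL_FORMATS_ARB;
int numFormats = 0;
wglGetPixelFormatAttribivARB(hdc, 1, 0, 1, &countAttrib, &numFormats);

// scan every format for the highest sample count
int maxSamples = 0;
for (int i = 1; i <= numFormats; ++i)
{
    int query[2] = { WGL_SAMPLE_BUFFERS_ARB, WGL_SAMPLES_ARB };
    int values[2];
    if (wglGetPixelFormatAttribivARB(hdc, i, 0, 2, query, values)
        && values[0] != 0 && values[1] > maxSamples)
        maxSamples = values[1];
}
// maxSamples is now the largest WGL_SAMPLES_ARB any format offers

If no format offers 6 samples, a request for 6 simply matches nothing, and you end up with whatever non-multisampled format your fallback path picks, which would explain the ugly fonts.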

Ok, thanks. I think that ends this topic for now.