View Full Version : Antialiasing for Outline Font

03-07-2011, 09:51 AM
Hello. I'm using an outline font to display some information in my application. For various reasons it must be an outline font. Each character is drawn using

glCallLists(1, GL_UNSIGNED_BYTE, &c);
The text itself is displayed correctly, but it looks quite ugly. I'm using an ATI graphics card, and when I change its antialiasing option from "use application settings" to e.g. "level x4" the fonts become nice, sharp and clear. I want to get the same result with the default graphics card settings ("use app. settings"). For lines, I've added the code:

glEnable (GL_LINE_SMOOTH);
glEnable (GL_BLEND);
glBlendFunc (GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA); /* without this, the default blend func (GL_ONE, GL_ZERO) ignores the coverage alpha */

and they look way better now.
I thought that using similar code for the fonts:

glEnable (GL_BLEND);

would also work, but it makes no difference.
So I would be very grateful for some advice on how to use AA with this type of font.
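For context, the display lists behind that glCallLists call would typically have been built with wglUseFontOutlines. A minimal sketch, assuming an hdc with the desired font already selected into it (the range of 256 glyphs and the function name are just illustrative):

```c
#include <windows.h>
#include <GL/gl.h>

GLuint base;                     /* id of the first display list */
GLYPHMETRICSFLOAT gmf[256];      /* per-glyph metrics filled in by WGL */

void buildOutlineFont(HDC hdc)
{
    base = glGenLists(256);
    /* deviation 0.0f = maximum curve fidelity; extrusion 0.0f = flat 2D
       glyphs; WGL_FONT_POLYGONS tessellates each glyph into filled
       triangles (as opposed to WGL_FONT_LINES, which gives wireframes) */
    wglUseFontOutlines(hdc, 0, 256, base,
                       0.0f,     /* chordal deviation */
                       0.0f,     /* extrusion depth */
                       WGL_FONT_POLYGONS, gmf);
    glListBase(base);
}
```

With WGL_FONT_POLYGONS the glyphs are filled triangle meshes, which is why GL_LINE_SMOOTH has no effect on them.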

03-07-2011, 11:05 AM
I don't recommend ever using glEnable (GL_LINE_SMOOTH), glEnable (GL_POLYGON_SMOOTH) or glEnable (GL_POINT_SMOOTH).

If you want AA, I recommend full-scene AA (FSAA, i.e. multisampling).

03-07-2011, 11:24 AM
I don't recommend ever using glEnable (GL_LINE_SMOOTH)
Why? Especially for lines, this has a good chance of looking much better than multisampled AA, AFAIK.

03-07-2011, 11:55 AM
Unfortunately, it seems that I'm not able to use FSAA. I'm using Visual Studio 2003, and there is a compilation error at the line:

That's surprising, but apparently that edition of VS ships headers for an OpenGL version older than 1.3. I will just admit that this is scientific software, and efficiency is not as essential as in games (and there is not that much to display). So if there is some way to make it look good, even using the SMOOTH modes, it'll be fine.

03-07-2011, 11:59 AM
Luckily for you, GL_MULTISAMPLE is enabled by default.
You just need to select a pixel format that is multisampled.
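A minimal sketch of such a selection, assuming the wglChoosePixelFormatARB function pointer has already been obtained via wglGetProcAddress, and with illustrative attribute values (the WGL_* tokens come from wglext.h in the OpenGL extension registry):

```c
#include <windows.h>
#include <GL/gl.h>
#include <GL/wglext.h>   /* WGL_*_ARB tokens */

/* wglChoosePixelFormatARB must be a function pointer fetched earlier
   via wglGetProcAddress("wglChoosePixelFormatARB") */

/* returns a 4x-multisampled pixel format id for hdc, or 0 on failure */
int pickMultisampledFormat(HDC hdc)
{
    static const int attribs[] = {
        WGL_DRAW_TO_WINDOW_ARB, GL_TRUE,
        WGL_SUPPORT_OPENGL_ARB, GL_TRUE,
        WGL_DOUBLE_BUFFER_ARB,  GL_TRUE,
        WGL_PIXEL_TYPE_ARB,     WGL_TYPE_RGBA_ARB,
        WGL_COLOR_BITS_ARB,     24,
        WGL_DEPTH_BITS_ARB,     24,
        WGL_SAMPLE_BUFFERS_ARB, 1,   /* request a multisample buffer */
        WGL_SAMPLES_ARB,        4,   /* 4x MSAA */
        0                            /* terminator */
    };
    int  format = 0;
    UINT numFormats = 0;

    if (!wglChoosePixelFormatARB(hdc, attribs, NULL, 1,
                                 &format, &numFormats)
        || numFormats == 0)
        return 0;
    return format;   /* pass this to SetPixelFormat on the real window */
}
```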

03-07-2011, 12:16 PM
But do I really have to deal with all that dummy-window stuff, or will just setting the right pixel format be OK?

03-07-2011, 12:24 PM
You can just call

SetPixelFormat(hdc, <x>, 0);

(I can't remember off the top of my head whether the third parameter can be NULL, but I believe it can)

if you know the correct value for <x>; and discovering the correct value for <x> involves creating a context and using WGL_ARB_pixel_format to find the format you need.

If you know the correct value for your system you can just feed it to SetPixelFormat and it will work, but pixel format IDs change depending on vendor, driver version, etc., so you don't want to hard-code one outside of your testing environment.
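The dummy-window step exists because wglGetProcAddress only works with a current GL context, and SetPixelFormat can be called at most once per window. A rough sketch of the bootstrap sequence (window class, error handling and cleanup details are simplified, and the function name is made up):

```c
#include <windows.h>
#include <GL/gl.h>

typedef BOOL (WINAPI *PFNWGLCHOOSEPIXELFORMATARBPROC)
    (HDC, const int *, const FLOAT *, UINT, int *, UINT *);

PFNWGLCHOOSEPIXELFORMATARBPROC wglChoosePixelFormatARB;

void loadWglChoosePixelFormat(void)
{
    /* 1. create a throwaway window and get its DC */
    HWND dummy = CreateWindowA("STATIC", "", WS_POPUP,
                               0, 0, 1, 1, NULL, NULL, NULL, NULL);
    HDC  dc    = GetDC(dummy);

    /* 2. set any basic pixel format and make a temporary context current */
    PIXELFORMATDESCRIPTOR pfd = { sizeof(pfd), 1,
        PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL, PFD_TYPE_RGBA, 24 };
    SetPixelFormat(dc, ChoosePixelFormat(dc, &pfd), &pfd);
    HGLRC rc = wglCreateContext(dc);
    wglMakeCurrent(dc, rc);

    /* 3. only now can the extension entry point be resolved */
    wglChoosePixelFormatARB = (PFNWGLCHOOSEPIXELFORMATARBPROC)
        wglGetProcAddress("wglChoosePixelFormatARB");

    /* 4. tear the temporary context and window down again */
    wglMakeCurrent(NULL, NULL);
    wglDeleteContext(rc);
    ReleaseDC(dummy, dc);
    DestroyWindow(dummy);
}
```

After this, the real window can be created and given the multisampled format that wglChoosePixelFormatARB reports.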

03-08-2011, 05:50 AM
I'm writing just to say: thanks.
Just as the document about FSAA said at the end, some work was needed, but the effect is great.

One more thing:
During initialization there is a moment when I need to define how many samples will be used for multisampling (the value of the WGL_SAMPLES_ARB attribute). I've checked that the display with 4 samples is better than with 2 samples, which is expected. But when I set 6 samples, the fonts are ugly again. I guess that 6 samples is too many and the card falls back to no multisampling. If I'm right, is the maximum number of samples graphics-card dependent, and if so, is there any way to check how many samples can be handled?

03-08-2011, 08:24 AM
To find the maximum multisample level the video card supports, you could just check all the pixel formats. There are probably over 100 pixel formats (depending on the video card, of course).
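A sketch of such a scan, assuming wglGetPixelFormatAttribivARB has been fetched the same way as wglChoosePixelFormatARB (via wglGetProcAddress on a live context), and that wglext.h provides the tokens:

```c
#include <windows.h>
#include <GL/gl.h>
#include <GL/wglext.h>   /* WGL_NUMBER_PIXEL_FORMATS_ARB, WGL_SAMPLES_ARB */

/* wglGetPixelFormatAttribivARB must be a function pointer fetched
   earlier via wglGetProcAddress */

/* returns the largest WGL_SAMPLES_ARB value among all pixel formats */
int queryMaxSamples(HDC hdc)
{
    int attrib = WGL_NUMBER_PIXEL_FORMATS_ARB;
    int numFormats = 0;
    int maxSamples = 0;
    int i;

    /* how many pixel formats does this DC expose? */
    wglGetPixelFormatAttribivARB(hdc, 1, 0, 1, &attrib, &numFormats);

    attrib = WGL_SAMPLES_ARB;
    for (i = 1; i <= numFormats; ++i) {   /* format ids are 1-based */
        int samples = 0;
        if (wglGetPixelFormatAttribivARB(hdc, i, 0, 1, &attrib, &samples)
            && samples > maxSamples)
            maxSamples = samples;
    }
    return maxSamples;
}
```

Note that sample counts are usually powers of two (2, 4, 8, ...), which would also explain why requesting 6 samples failed.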

03-08-2011, 10:25 AM
OK, thanks. I think that ends this topic for now.