Antialiasing under Linux

Hi,
I'm trying to apply antialiasing to an OpenGL
application (C++ under Linux).
I'm looking for some OpenGL code or a tutorial
that can help me.
I haven't found anything, and Windows tutorials don't seem usable because the main work for antialiasing is done at initialisation time, which is different under Linux.

thanks.

It depends on your hardware (and whether it supports multisampling).

Mikael

I found this link, mostly about NV hardware: http://www.libsdl.org/pipermail/sdl/2003-September/056182.html

Check with glxinfo whether any visual has "ms" (multisampling) in it.

PS: the official doc is hard to read but useful: http://oss.sgi.com/projects/ogl-sample/registry/ARB/multisample.txt

I had FSAA working in my app until I upgraded to the latest Nvidia drivers for Linux; for some reason it doesn't work now. I don't use GLUT or anything, so this might not work for you if you use a helper library.

First create a normal GL window and context.
I used the following attributes to glXChooseVisual.

int attrListDbl[] = { GLX_RGBA, GLX_DOUBLEBUFFER,
                      GLX_RED_SIZE, 4,
                      GLX_GREEN_SIZE, 4,
                      GLX_BLUE_SIZE, 4,
                      GLX_DEPTH_SIZE, 16,
                      None };

Then I check whether GL_ARB_multisample is in the extension string. Just check out the nehe.gamedev.net tutorial on extensions to see how to do that.
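Roughly, the check looks like this (a sketch, not my exact code; hasExtension is just an illustrative helper name, and it assumes a GL context is already current):

#include <GL/gl.h>
#include <cstring>

// Returns true if 'name' appears as a whole token in the GL extension string.
bool hasExtension(const char *name)
{
    const char *ext = reinterpret_cast<const char *>(glGetString(GL_EXTENSIONS));
    if (!ext)
        return false;
    const size_t len = std::strlen(name);
    for (const char *p = ext; (p = std::strstr(p, name)) != NULL; p += len) {
        // make sure we matched a whole extension name, not a prefix of a longer one
        if ((p == ext || p[-1] == ' ') && (p[len] == ' ' || p[len] == '\0'))
            return true;
    }
    return false;
}

// usage: if (hasExtension("GL_ARB_multisample")) { /* recreate the window */ }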

If the extension exists, I destroy the window and create a new one with the following arguments to glXChooseVisual:

int attrListDblmultis[] = { GLX_RGBA, GLX_DOUBLEBUFFER,
                            GLX_RED_SIZE, 4,
                            GLX_GREEN_SIZE, 4,
                            GLX_BLUE_SIZE, 4,
                            GLX_DEPTH_SIZE, 16,
                            GLX_SAMPLE_BUFFERS_ARB, 1,  // request a multisample-capable visual
                            GLX_SAMPLES_ARB, 1,
                            None };

Of course, you should probably iterate over the GLX_SAMPLES_ARB value to find the best mode your graphics card supports.
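Something like this should do it (a rough sketch, assuming you already have the Display* and screen number; chooseMultisampleVisual is just an illustrative name):

#include <GL/glx.h>

// Try decreasing sample counts until glXChooseVisual returns a visual.
XVisualInfo *chooseMultisampleVisual(Display *dpy, int scr)
{
    for (int samples = 8; samples >= 2; samples /= 2) {
        int attribs[] = { GLX_RGBA, GLX_DOUBLEBUFFER,
                          GLX_RED_SIZE, 4,
                          GLX_GREEN_SIZE, 4,
                          GLX_BLUE_SIZE, 4,
                          GLX_DEPTH_SIZE, 16,
                          GLX_SAMPLE_BUFFERS_ARB, 1,
                          GLX_SAMPLES_ARB, samples,
                          None };
        XVisualInfo *vi = glXChooseVisual(dpy, scr, attribs);
        if (vi)
            return vi;   // highest sample count the driver would give us
    }
    return NULL;         // no multisample visual available, fall back to attrListDbl
}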

Then enable multisampling by
glEnable(GL_MULTISAMPLE_ARB);

I also wrapped all of my antialiasing code in #ifdefs so I could build the project against old versions of GL which don't have multisample support.
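Something along these lines (just a sketch; the guard works because the extension headers define GL_ARB_multisample as a preprocessor symbol):

#ifdef GL_ARB_multisample
    // only compiled when the headers know about the multisample tokens
    glEnable(GL_MULTISAMPLE_ARB);
#endif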

This seemed to work great for me until I upgraded the Nvidia drivers. I'm not sure if I'm doing something wrong or if the drivers have a bug. If I set the Nvidia environment variables to force FSAA, the app will be antialiased, and calling glDisable switches the AA from the level the environment variable specified to the level I specified in the app. Strange. Maybe you'll have better luck.

Check to see if GLX_ARB_multisample is listed in the GLX extensions, then request a visual (or fbconfig) with GLX_SAMPLES > 1. That’s the proper way to do it.
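For example, something like this (a sketch, not tested code; pickMultisampleConfig is just an illustrative name, and it assumes GLX 1.3+ for glXChooseFBConfig plus an open Display* and screen number):

#include <GL/glx.h>
#include <cstring>

GLXFBConfig pickMultisampleConfig(Display *dpy, int scr)
{
    // 1. make sure the GLX extension is advertised
    const char *glxExts = glXQueryExtensionsString(dpy, scr);
    if (!glxExts || !std::strstr(glxExts, "GLX_ARB_multisample"))
        return 0;

    // 2. ask for a doublebuffered, multisampled fbconfig
    int attribs[] = { GLX_RENDER_TYPE, GLX_RGBA_BIT,
                      GLX_DOUBLEBUFFER, True,
                      GLX_RED_SIZE, 4, GLX_GREEN_SIZE, 4, GLX_BLUE_SIZE, 4,
                      GLX_DEPTH_SIZE, 16,
                      GLX_SAMPLE_BUFFERS, 1,
                      GLX_SAMPLES, 2,
                      None };
    int count = 0;
    GLXFBConfig *configs = glXChooseFBConfig(dpy, scr, attribs, &count);
    if (!configs)
        return 0;
    GLXFBConfig best = 0;
    if (count > 0)
        best = configs[0];   // first match is good enough for a sketch
    XFree(configs);
    return best;
}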

So this attribute list would be sufficient, given that the extension exists:

int attrListDblmultis[] = { GLX_RGBA, GLX_DOUBLEBUFFER,
                            GLX_RED_SIZE, 4,
                            GLX_GREEN_SIZE, 4,
                            GLX_BLUE_SIZE, 4,
                            GLX_DEPTH_SIZE, 16,
                            GLX_SAMPLES, 2,
                            None };

I thought that GLX_SAMPLE_BUFFERS, 1 gives you a multisampling visual and GLX_SAMPLES sets the level of AA. I just noticed that GLX_SAMPLES and GLX_SAMPLES_ARB have the same value in glxext.h, so I guess there is no difference there. It really bugs me that this worked as expected before the newest Nvidia Linux drivers.
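One way to double-check what a visual actually gives you is to query it back (rough sketch; dpy and vi are assumed to be the Display* and XVisualInfo* the context was created from, and the ARB tokens should be queryable through glXGetConfig when GLX_ARB_multisample is present):

#include <GL/glx.h>
#include <cstdio>

void printSampleInfo(Display *dpy, XVisualInfo *vi)
{
    int sampleBuffers = 0, samples = 0;
    glXGetConfig(dpy, vi, GLX_SAMPLE_BUFFERS_ARB, &sampleBuffers);  // 1 if the visual is multisampled
    glXGetConfig(dpy, vi, GLX_SAMPLES_ARB, &samples);               // samples per pixel
    std::printf("sample buffers: %d, samples: %d\n", sampleBuffers, samples);
}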

thanks

In the NVidia readme files for their drivers, see:


(app-e) APPENDIX E: OPENGL ENVIRONMENT VARIABLE SETTINGS


FULL SCENE ANTI-ALIASING

Anti-aliasing is a technique used to smooth the edges of objects in a
scene to reduce the jagged “stairstep” effect that sometimes appears.
Full scene anti-aliasing is supported on GeForce or newer hardware.
By setting the appropriate environment variable, you can enable full
scene anti-aliasing in any OpenGL application on these GPUs.

Several anti-aliasing methods are available and you can select between
them by setting the __GL_FSAA_MODE environment variable appropriately.
Note that increasing the number of samples taken during FSAA rendering
may decrease performance.

The following tables describe the possible values for __GL_FSAA_MODE
and their effect on various NVIDIA GPUs.

__GL_FSAA_MODE    GeForce3, Quadro DCC, GeForce4 Ti, GeForce4 4200 Go,
                  and Quadro4 700,750,780,900,980 XGL

0    FSAA disabled
1    2x Bilinear Multisampling
2    2x Quincunx Multisampling
3    FSAA disabled
4    4x Bilinear Multisampling
5    4x Gaussian Multisampling
6    2x Bilinear Multisampling by 4x Supersampling
7    FSAA disabled

__GL_FSAA_MODE    GeForce FX, Quadro FX

0    FSAA disabled
1    2x Bilinear Multisampling
2    2x Quincunx Multisampling
3    FSAA disabled
4    4x Bilinear Multisampling
5    4x Gaussian Multisampling
6    2x Bilinear Multisampling by 4x Supersampling
7    4x Bilinear Multisampling by 4x Supersampling

NOTE: 2x Bilinear Multisampling by 4x Supersampling and 4x Bilinear
Multisampling by 4x Supersampling are not available when using UBB.


Just type this in your shell before running the application:

export __GL_FSAA_MODE=4

I know that method works fine, but I would like to enable this from within the application itself. That way the application can determine the level of AA, or allow the user to configure AA while it is running. It seems like a very simple thing to do; in fact I had it working. It just stopped working from within the app with the newest driver (53.28) for me.

So I tried playing around with this again since I hadn't looked at it in a while. When my app starts in windowed mode with a multisample visual (GLX_SAMPLE_BUFFERS_ARB, 1, GLX_SAMPLES_ARB, 2), I can enable and disable FSAA via glEnable/glDisable(GL_MULTISAMPLE_ARB).

When I switch the app to fullscreen, FSAA doesn’t work anymore. I can no longer turn it on with glEnable.

Now for the crazy part. If I close the app and re-run it, I can no longer get FSAA, regardless of being in fullscreen or windowed mode. I have to restart X to get FSAA working in windowed mode again. This is really strange.

Does anyone use FSAA inside their own app written for Linux, and have you tried the newest Nvidia drivers? I'm using Fedora Core 1, with the minion patch to fix the VIA chipset oops. Maybe I should switch back to the 46.20 beta drivers; this all seemed to work with them.

GLX_SAMPLES_ARB is the name of the enum from when it was just an ARB extension (i.e., not yet part of core GLX). When the functionality was moved into core GLX, the _ARB suffix was dropped.

What you describe in your latest post sounds like a driver bug. Have you reported this to Nvidia?

I figured it was probably a driver bug. I haven't reported it yet. I am using the minion.de patch to the kernel module because I have a VIA chipset which causes the new driver to oops. I'm pretty sure that has been reported, because many people have experienced it already. I'm not sure how seriously they would take the bug since I'm using the minion patch. It isn't really important to me since my application is just a hobby, but I'll probably report it to them soon anyway.
