VSync with GLUT

Hello!
Maybe somebody can help me out. I am experimenting with GLUT and OpenGL: I have an animated camera driven by gluLookAt. Everything works fine, but there is flickering around the rendered objects. The GLUT window is configured as double-buffered, and glutSwapBuffers is called in the display func, but there is still flickering in the animation. I guess that I have a VSync problem.
I switched VSync on in NVIDIA’s control panel, but no change… Maybe I must edit xorg.conf?

It sounds like what you may be seeing is either “tearing” or “drawing” (watching the scene appear while it is being rendered).

If it is drawing, you don’t have a double-buffered framebuffer allocated. To allocate a double-buffered window framebuffer with GLUT, specify GLUT_DOUBLE in this call, like so:

glutInitDisplayMode    ( GLUT_RGBA    | GLUT_DEPTH | 
                         GLUT_STENCIL | GLUT_DOUBLE );

Then the user won’t be able to see what you are drawing until you actually call glutSwapBuffers. With a single buffer, they see the scene while you’re drawing it, which generates something like “flashing”.
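For reference, here is a minimal sketch of the resulting double-buffered flow in a GLUT display callback (the drawing step is just a placeholder): everything renders into the back buffer, and nothing is visible until the swap.

void display( void )
{
    glClear( GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT );

    /* ... draw the scene into the back buffer ... */

    glutSwapBuffers();   /* present the finished frame */
}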

If OTOH you are seeing “tearing”, then you need to force sync-to-vblank on. The portable way is to call glXSwapIntervalEXT( display, drawable, 1 ) (see GLX_EXT_swap_control), or wglSwapIntervalEXT (WGL_EXT_swap_control) on MSWin.
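In case it helps, here is a rough Linux-side sketch of that (assuming GL/glx.h is available and pulls in the PFNGLXSWAPINTERVALEXTPROC typedef from glxext.h; the function name enableVsync is mine):

#include <GL/glx.h>

void enableVsync( void )
{
    /* The entry point is an extension, so look it up at runtime. */
    PFNGLXSWAPINTERVALEXTPROC glXSwapIntervalEXT =
        (PFNGLXSWAPINTERVALEXTPROC) glXGetProcAddress(
            (const GLubyte *) "glXSwapIntervalEXT" );

    if ( glXSwapIntervalEXT )
        glXSwapIntervalEXT( glXGetCurrentDisplay(),
                            glXGetCurrentDrawable(),
                            1 );   /* 1 = at most one swap per vblank */
}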

Another way (NVidia-specific, and maybe Linux-specific) is to put “__GL_SYNC_TO_VBLANK=1” in your environment before initializing OpenGL. For instance:


putenv( (char *) "__GL_SYNC_TO_VBLANK=1" );   /* must happen before GL initializes */
glutInit( &argc, argv );
...
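Equivalently, you can set it in the shell before launching the program (a usage sketch; “./myapp” stands in for your binary):

export __GL_SYNC_TO_VBLANK=1
./myapp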

Thanks for the help.
So far, I have allocated the double-buffered framebuffer as described…
I tried the environment variable solution, but without success. Is there a hint on what I must configure in my xorg.conf?
Also, how do I use glXSwapInterval? Is it like an init function? And what happens to glutSwapBuffers?

Before you conclude that sync-to-vblank isn’t happening, time the period between the top of one frame (the beginning of the display function) and the top of the next. Unless you’re taking too long to draw a frame, it should be rock-solid steady. If you have an LCD monitor, it’ll likely be nailed at 60Hz (16.6ms/frame).
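One quick way to measure that is GLUT’s built-in millisecond clock (a sketch; assumes <GL/glut.h> and <stdio.h> are included):

void display( void )
{
    static int lastMs = 0;
    int nowMs = glutGet( GLUT_ELAPSED_TIME );

    printf( "frame period: %d ms\n", nowMs - lastMs );   /* expect ~16-17 ms at 60Hz */
    lastMs = nowMs;

    /* ... draw the scene ... */

    glutSwapBuffers();
}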

Is there a hint where I must configure my xorg.conf?

Don’t think you’ll need to do anything there, unless you’re using some option that causes the NVidia driver not to sync-to-vblank. One example: if you are using TwinView, you can only sync to the retrace of one monitor, not both. Having both sync to each other is only supported on Quadros with G-Sync capability (AFAIK).

But you can look at it in /etc/X11/xorg.conf (on my Linux box, anyway). Your NVidia driver settings will likely all be in a “Device” section. For instance:


Section "Device"
    Identifier     "Device[0]"
    Driver         "nvidia"
    VendorName     "NVIDIA"
    BoardName      "GeForce GTX 285"

Also, how do I use glXSwapInterval? Is it like an init function?

It can be. Call it after initializing GL (GLUT). You can also call it dynamically while rendering, to toggle syncing to vblank on and off.

And what happens to glutSwapBuffers?

Call it as usual, at the end of a frame. What we’re tuning with this sync-to-vblank stuff is whether or not the pipeline blocks (stops processing GL commands) until the vertical retrace whenever it finishes drawing a frame.
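To make the “dynamically” part concrete, here is a sketch of toggling it from a GLUT keyboard callback (binding it to the ‘v’ key is just an arbitrary choice; this assumes the driver exports glXSwapIntervalEXT directly, as NVidia’s does):

void keyboard( unsigned char key, int x, int y )
{
    static int vsync = 1;

    if ( key == 'v' )
    {
        vsync = !vsync;
        glXSwapIntervalEXT( glXGetCurrentDisplay(),
                            glXGetCurrentDrawable(),
                            vsync );   /* 0 = free-running, 1 = synced */
    }
}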

OK…
I checked my “xorg.conf” against yours, and there is nothing special… OK…

Which libraries do I have to install (for functions like “glXSwapInterval”)? You are right, it is just an LCD display (my laptop)…

It’s already in the NVidia driver as glXSwapIntervalEXT:


> nm -Do /usr/lib64/libGL.so | grep glXSwapInterval
/usr/lib64/libGL.so:0000000000067950 T glXSwapIntervalEXT
/usr/lib64/libGL.so:0000000000067b00 T glXSwapIntervalSGI
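You can also check for it at runtime instead of with nm, e.g. by searching the GLX extensions string (a sketch; assumes <string.h> and <stdio.h>, dpy is your Display pointer, and screen 0 via DefaultScreen):

const char *exts = glXQueryExtensionsString( dpy, DefaultScreen( dpy ) );
if ( strstr( exts, "GLX_EXT_swap_control" ) )
    printf( "GLX_EXT_swap_control is supported\n" );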

So far… I have implemented the example/documentation which you sent to me. maxSwap = 8 and minSwap = 1. I passed swap = 1 to the glXSwapInterval function, but it is still tearing. Which value can you recommend, or is it a problem with my graphics driver? I installed the latest one from NVIDIA’s site. My graphics hardware is a GeForce 8400M GS. I implemented the following code from the example (after creating the GLUT window):


    glewInit();
    Display *dpy = glXGetCurrentDisplay();
    GLXDrawable drawable = glXGetCurrentDrawable();
    unsigned int swap = 0, maxSwap = 0;

    if (drawable) {
        glXQueryDrawable(dpy, drawable, GLX_MAX_SWAP_INTERVAL_EXT,
                         &maxSwap);

        /* Request sync-to-vblank: at most one swap per retrace. */
        glXSwapIntervalEXT(dpy, drawable, 1);

        /* Read the interval back to verify the driver accepted it. */
        glXQueryDrawable(dpy, drawable, GLX_SWAP_INTERVAL_EXT, &swap);
    }

Is there something missing?

Does the number of milliseconds per frame change?
I mean, between swap = 0 and swap = 8, for example?

Yes, it does! For example… if the number is high, the animation seems to be slower…

So it sounds like vsync does its job.
But you should not see tearing with swap = 1.
Maybe something later in the chain is messing with the timings? A bad compositing window manager?

Hmm… I don’t know… What should I configure in the window manager? Run it without any effects?

For example, yes.
Which video driver do you use? The binary one from nVidia? Or open source? Which version, in any case?
What are the results of:
http://www.opengl.org/sdk/docs/man/xhtml/glGetString.xml
(except GL_EXTENSIONS, which is too verbose)
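In code, that is just (a sketch; the strings are only valid once a GL context is current):

printf( "Version:  %s\n", (const char *) glGetString( GL_VERSION ) );
printf( "Vendor:   %s\n", (const char *) glGetString( GL_VENDOR ) );
printf( "Renderer: %s\n", (const char *) glGetString( GL_RENDERER ) );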

Version: 3.3.0 NVIDIA 260.19.29
Vendor: NVIDIA Corporation
Renderer: GeForce 8400M GS/PCI/SSE2/3DNOW!

I am using NVIDIA’s binary driver…

And did running “without any effects” improve things?

No!
I don’t know, but is it possible that laptop graphics cards have more problems than the desktop variants? I tested the game Balazar (an open-source game), and there was no tearing…
Any idea?