
View Full Version : wglSwapIntervalEXT problem



ewerton
01-06-2013, 01:31 PM
I need vsync in my project, so I called wglSwapIntervalEXT(60). For BOOL wglSwapIntervalEXT(int interval), should the interval parameter be my monitor's refresh rate? Do I need to call another function to enable vsync?

I just called wglSwapIntervalEXT(60), but it doesn't work...

aqnuep
01-06-2013, 01:51 PM
If you had read the spec, you would know:


The parameter <interval> specifies the minimum number of video frames that are displayed before a buffer swap will occur.

Thus, no matter what your monitor's refresh rate is, if you want vsync to match it you have to set the interval to 1, which is actually the default behavior.

ewerton
01-06-2013, 02:20 PM
Now I call the function this way: wglSwapIntervalEXT(1)

I'm using cout to print the interval between frames. I expect the values to be equal across iterations, and since my monitor's refresh rate is 60 Hz, each interval should be 1000/60 = 16.66 ms. But I only get values of 1 or 2, as if vsync were disabled... My code:

while (true) {
    // draw code...
    SDL_GL_SwapBuffers();
    current = clock();
    cout << current - old << endl;  // elapsed clock ticks since last frame
    old = current;
}

aqnuep
01-06-2013, 02:48 PM
Don't forget that there may be settings in your driver's control panel that prevent the swap interval from taking effect. Usually there are three options in the control panel: always on, always off, and application controlled. Make sure you don't have "always off" selected.

ewerton
01-06-2013, 03:06 PM
I didn't find this option in my driver's control panel =/ But wglSwapIntervalEXT(1) returns 1. Does that mean vsync should be enabled?

Aleksandar
01-07-2013, 12:04 AM
Are you serious?

eile
01-07-2013, 12:28 AM
Don't forget vsync works only in fullscreen mode.

No, it doesn't.

Nowhere-01
01-07-2013, 12:45 AM
No, it doesn't.

OK, I've fixed that and I'm going to remove the misleading post.

For vsync to work, your PFD should include the flags PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER; read more about PIXELFORMATDESCRIPTOR (http://www.opengl.org/wiki/Creating_an_OpenGL_Context_%28WGL%29) and contexts. You should end every frame with SwapBuffers(hdc). And check that your driver doesn't override vsync (NVIDIA Control Panel / AMD CCC, or google how to allow vsync through the driver).
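A minimal sketch of such a pixel format setup, assuming a Win32 device context named hdc already exists (the field values here are typical choices of mine, not taken from anyone's code in this thread):

```cpp
PIXELFORMATDESCRIPTOR pfd = {};
pfd.nSize      = sizeof(pfd);
pfd.nVersion   = 1;
pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
pfd.iPixelType = PFD_TYPE_RGBA;
pfd.cColorBits = 32;            // color depth; see later in this thread about 24 vs 32
pfd.cDepthBits = 24;
pfd.iLayerType = PFD_MAIN_PLANE;

int format = ChoosePixelFormat(hdc, &pfd);  // closest supported match
SetPixelFormat(hdc, format, &pfd);
```

ChoosePixelFormat is useful precisely because it picks the closest format the driver actually supports, rather than failing outright on an exact mismatch.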

A little piece of advice: I wouldn't enable vsync in any input-precision-critical application, because it creates slight input lag. That's fine for a third-person game, a slower first-person game, demos, or school projects. But for a fast or multiplayer FPS it is unacceptable, and I wouldn't even offer the option, because plenty of clueless people will enable it, notice aiming problems some time later, and forget which setting introduced the lag.

ewerton
01-07-2013, 02:44 AM
My example worked on a machine with one NVIDIA driver... but I tested on another machine with a different NVIDIA driver and it didn't work...
I use GLEW to load wglSwapIntervalEXT. glewinfo shows wglSwapIntervalEXT is OK:
WGL_EXT_swap_control: OK
---------------------
wglGetSwapIntervalEXT: OK
wglSwapIntervalEXT: OK

But... my code says it's not OK!:

if (WGLEW_EXT_swap_control) {
    cout << "wglSwapIntervalEXT ok" << endl;
    BOOL ret = wglSwapIntervalEXT(1);
    cout << "ret = " << ret << endl;
} else {
    cout << "wglSwapIntervalEXT - extension not ok" << endl;  // this is the cout I get...
    return 0;
}

What could be wrong?

ewerton
01-07-2013, 03:12 AM
Using the other NVIDIA driver I mentioned, I realized that GLEW wasn't loading any functions, even though glewInit wasn't returning an error. I changed the number of bits per pixel to 24 and everything worked, including wglSwapIntervalEXT! Now my problem is: how do I find the correct configuration for each machine?

Nowhere-01
01-07-2013, 03:32 AM
And you tried 16 bpp before? If 32 bits didn't work, there's something wrong with your test machine.

int bitsPerPixel = GetDeviceCaps(hdc, BITSPIXEL);  // query the current system's color depth

However, I'd say it's safe to assume that every machine now uses 32 bits per pixel; otherwise there's something wrong with its drivers, or with the person who uses it.