View Full Version : Rendering jitter

01-17-2010, 04:12 AM

I'm new to OpenGL and I'm trying to build a simple rendering engine in C/C++ on Linux (Ubuntu).
When running it, however, I experience jittering/flickering: every half a second or so the image is no longer vertically aligned. I first thought this was caused by the lack of vertical synchronization, so I implemented a simple frame limiter:

while (!m_doExit) {
    m_oldTime = m_currTime;

    // frame limiter
    m_currTime = getTimeOfDayInMillisecs();
    while (m_currTime - m_oldTime < 1000 / DEFAULT_MAX_FPS) {
        usleep((m_currTime - m_oldTime) * 1000 / DEFAULT_MAX_FPS);
        m_currTime = getTimeOfDayInMillisecs();
    }
Where DEFAULT_MAX_FPS is set to 60 by default. With this code I get around 61 to 62 FPS.
However, OpenGL doesn't seem to like the sleeping in the code: when I remove the frame limiter, everything works fine (though I then get a pointless 1200 FPS).
Is there a better way to reduce CPU load without getting the jitter effect?
I'm not a big fan of the sleeping approach either, since the application can't catch mouse/keyboard events during that time anyway...
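Not from the original post, but one common way to reduce CPU load is to sleep only for the time *remaining* in the current frame, rather than a value derived from the elapsed time as in the snippet above. A minimal sketch, assuming millisecond timestamps like the poster's `getTimeOfDayInMillisecs()` (the helper name `remaining_frame_ms` is my own):

```c
#include <unistd.h>  /* usleep, used in the usage sketch below */

/* Milliseconds still left in the current frame, or 0 if the
   frame budget (e.g. 1000/60 ms for 60 FPS) is already spent. */
long remaining_frame_ms(long frame_start_ms, long now_ms, long frame_budget_ms)
{
    long elapsed = now_ms - frame_start_ms;
    return elapsed >= frame_budget_ms ? 0 : frame_budget_ms - elapsed;
}

/* Usage inside the main loop (sketch):
 *   long left = remaining_frame_ms(m_oldTime, getTimeOfDayInMillisecs(),
 *                                  1000 / DEFAULT_MAX_FPS);
 *   if (left > 0)
 *       usleep(left * 1000);   // usleep takes microseconds
 */
```

This sleeps a single time per frame for roughly the right duration, instead of repeatedly sleeping by a scaled elapsed time; it still doesn't fix tearing by itself, which is what vsync is for.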

Stephen A
01-17-2010, 05:03 AM
You need to enable vertical synchronization (vsync). You can do this using glXSwapIntervalSGI (http://techpubs.sgi.com/library/tpl/cgi-bin/getdoc.cgi?coll=0650&amp;db=man&amp;fname=/usr/share/catman/g_man/cat3/OpenGL/glxswapintervalsgi.z) - just beware that this is a GLX extension that you must query with glXGetProcAddress.
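A sketch of what Stephen suggests, assuming a GLX context is already current. The whole-token scan in `hasExtension` and the `HAVE_GLX` compile guard are my additions, not from the thread; the guard just marks the part that needs `<GL/glx.h>` and a live X display:

```c
#include <string.h>

typedef int (*SwapIntervalSGIProc)(int interval);

/* Returns 1 if `name` appears as a whole, space-separated token in
   `extList` (the format glXQueryExtensionsString returns). A plain
   strstr() would also match names that are merely prefixes. */
int hasExtension(const char *extList, const char *name)
{
    size_t len = strlen(name);
    const char *p = extList;
    while ((p = strstr(p, name)) != NULL) {
        if ((p == extList || p[-1] == ' ') &&
            (p[len] == ' ' || p[len] == '\0'))
            return 1;
        p += len;
    }
    return 0;
}

#ifdef HAVE_GLX  /* requires <GL/glx.h>, a live X display and GLX context */
#include <GL/glx.h>

void enableVSync(Display *dpy, int screen)
{
    const char *exts = glXQueryExtensionsString(dpy, screen);
    if (!exts || !hasExtension(exts, "GLX_SGI_swap_control"))
        return;  /* extension not supported: leave the swap interval alone */

    SwapIntervalSGIProc swapIntervalSGI = (SwapIntervalSGIProc)
        glXGetProcAddress((const GLubyte *)"glXSwapIntervalSGI");
    if (swapIntervalSGI)
        swapIntervalSGI(1);  /* swap at most once per vertical retrace */
}
#endif
```

Checking the extension string first, and then the returned function pointer, is the "query before calling" step Stephen mentions.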

01-17-2010, 05:31 AM
Thank you very much Stephen!

I'll try this in a second.
Will my previous frame limiter implementation now be worthless? In other words, will OpenGL block inside its functions, or will 'just' the display be updated 60 times per second?

EDIT: I managed to get vertical synchronization working by looking at this tutorial:

Though I'd recommend checking that the swapInterval function pointer is valid before calling it, for systems which do not support this extension.