Portable way to throttle frame rate

I’m writing a small non-interactive animation in OpenGL that I’d like to run at a constant rate on different computers. Otherwise it will run far too fast on souped-up hardware and lose its effect.

I’ve tried the wglSwapIntervalEXT extension but have already found platforms where it isn’t implemented.

Is there a better way to throttle frame rate to say 60 fps in code?

Rely on the system timer; then everything will be fine. You can either wait out the remainder of each frame or skip updates until enough time has passed.
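A minimal sketch of the “wait” approach using the system timer, assuming a POSIX environment (`clock_gettime`/`nanosleep`); the function names are illustrative:

```c
#define _POSIX_C_SOURCE 199309L
#include <time.h>

/* Current time in seconds from a monotonic clock. */
static double now_seconds(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return (double)ts.tv_sec + (double)ts.tv_nsec / 1e9;
}

/* Sleep off whatever is left of this frame's time budget.
   frame_start is a timestamp taken before rendering the frame;
   target_dt is the desired frame duration (e.g. 1/60 s). */
static void throttle(double frame_start, double target_dt)
{
    double remaining = target_dt - (now_seconds() - frame_start);
    if (remaining > 0.0)
    {
        struct timespec req;
        req.tv_sec  = (time_t)remaining;
        req.tv_nsec = (long)((remaining - (double)(time_t)remaining) * 1e9);
        nanosleep(&req, NULL);
    }
}
```

Each pass through the render loop would then be: take a timestamp, draw the frame, call `throttle()`. Note this only caps the rate; on hardware too slow to hit the target it does nothing, which is one reason to base movement on elapsed time instead.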

In most cases it’s better to base your animation on real time rather than on frames. For instance, if you were to try to “throttle” the app to 60 FPS, what happens when you are running on a slow machine that can only manage to push out 20 FPS?

If you base the animation on time, instead, you can be sure that it will run at the same rate on all computers, and if you have a faster computer it will just make the animation that much smoother.
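For illustration, a time-based update might look like this (the rotation rate and function name are made up for the example):

```c
/* Time-based animation: the pose is computed from elapsed wall-clock
   time, so every machine shows the same motion; faster machines just
   sample it more often. */
#define DEGREES_PER_SECOND 90.0  /* illustrative rotation speed */

/* Rotation angle after `elapsed_seconds` of animation, wrapped to [0, 360). */
double angle_at(double elapsed_seconds)
{
    double a = elapsed_seconds * DEGREES_PER_SECOND;
    return a - 360.0 * (double)(long)(a / 360.0);
}
```

Each frame you measure the time elapsed since the animation started and pass it to `angle_at()`, instead of adding a fixed increment per frame.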

Thanks! Your comments helped me refine my research and I found the following article:
http://www.gamedev.net/reference/articles/article753.asp

It will take some refactoring of my code (and brain as I try to wrap it around this concept) but it shouldn’t be too much trouble. Thanks again.

…just a small addition to the article above, which is valid if you intend to target ALL systems (regardless of how slow they are):

If your delta frame time (i.e. 1/fps) is very large (say half a second or so), you may experience “bugs” in your movements (I’ve even seen program crashes). Basically, all movements that are based on acceleration (i.e. that have a non-constant speed) are sensitive to the length of the time step.

The solution is to partition one frame into several small time steps, and iteratively update movements (etc.) in small steps until the entire frame time is covered. This is not too difficult; do something like:

new_time = time();
dt_total = new_time - old_time;   /* full frame time since last update */
while( dt_total > 0.0 )
{
    /* clamp each sub-step to MAX_DELTA_TIME */
    if( dt_total > MAX_DELTA_TIME )
    {
        dt = MAX_DELTA_TIME;
    }
    else
    {
        dt = dt_total;
    }
    update_all_movements( dt );
    dt_total -= dt;
}
old_time = new_time;
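To make the idea concrete, here is a self-contained version of that sub-stepping loop; the gravity integrator, the `body` struct, and the 0.05 s cap are illustrative choices for the example, not from the article:

```c
#define MAX_DELTA_TIME 0.05  /* largest allowed sub-step, in seconds */

typedef struct { double position; double velocity; } body;

/* One integration step of a falling body. The acceleration term is
   what makes the result sensitive to step size, hence the clamping. */
static void update_all_movements(body *b, double dt)
{
    const double gravity = -9.8;
    b->velocity += gravity * dt;
    b->position += b->velocity * dt;
}

/* Cover dt_total with sub-steps no longer than MAX_DELTA_TIME. */
static void advance(body *b, double dt_total)
{
    while (dt_total > 0.0)
    {
        double dt = (dt_total > MAX_DELTA_TIME) ? MAX_DELTA_TIME : dt_total;
        update_all_movements(b, dt);
        dt_total -= dt;
    }
}
```

With this structure a half-second frame is simulated as ten 0.05 s steps instead of one big jump, so the motion stays stable regardless of how long the frame took.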

Thanks for all the help, folks! After some careful refactoring, my simulation now runs at a constant rate on various platforms, smoother of course on the faster ones.