GlutTimerFunc and a Constant Framerate

Hi,
I am trying to get a constant framerate from an openGL application that uses GLUT for its windows. The program is to run on both Win32
and SGI, although the latter is the most important platform (where I really
need the constant framerate).

I thought it would be a good idea
to use glutTimerFunc to tell the frame loop to hold off on calling
glutSwapBuffers() until a particular amount of time has passed.

// The loop advancing the frames (registered with glutIdleFunc()) does:

if (wait) {
    sleep(0);     // yield and try again on the next idle pass
    return;
}
glutSwapBuffers();
wait = true;
glutTimerFunc(40, allowFrame, 1);

// And the allowFrame callback does:
void allowFrame(int value) {
    wait = false;   // the value argument is unused
}

I expected this to give me frames of 40 ms (or the nearest video refresh).
Instead, I get really long frames, and the timer seems to expire at almost
random times.

How can I get an accurate timer in GLUT? Or does anyone know a better way to
get a constant framerate?

Thanks,

Bart

As far as I know, GLUT's time measurement is not the most accurate thing in the world. But if you want your app to run both on Windows and on SGI, you'll have to resort to some cross-platform library.
Nevertheless, I would suggest an alternative:

In your idle function, start by getting the elapsed number of milliseconds with GLUT, but only for the first frame:

if (init == 0)
    init = glutGet(GLUT_ELAPSED_TIME);

// render your scene
glutSwapBuffers();

// busy-wait until the desired number of milliseconds per frame has passed
total = glutGet(GLUT_ELAPSED_TIME);
while (total - init < FRAME_TIME_MS) {   // whatever number of milliseconds per frame you're looking for
    total = glutGet(GLUT_ELAPSED_TIME);
}

// this last call resets init to the current number of elapsed milliseconds
init = glutGet(GLUT_ELAPSED_TIME);
// end of idle function
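
In case it helps, here is roughly how that idle function could be hooked up (just a sketch; the window setup, the callback names, and the frame-time constant are placeholders for whatever your program already uses):

#include <GL/glut.h>

#define FRAME_TIME_MS 40   /* target milliseconds per frame */

int init = 0;    /* elapsed ms at the start of the current frame */
int total = 0;   /* elapsed ms re-read while waiting */

void display(void);   /* your normal rendering callback */
void idle(void);      /* the frame-limiting idle function above */

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
    glutCreateWindow("constant framerate");
    glutDisplayFunc(display);
    glutIdleFunc(idle);
    glutMainLoop();
    return 0;
}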

Antonio www.fatech.com/tech

The problem is that, while your timer may
be accurate (depending on implementation),
the GLUT idle function is only as accurate
as the scheduler on your OS, and often less
accurate than that.

If you can change the scheduling mode of the
GLUT idle thread (assuming there is such a
thing) then try setting it to FIFO, which will
make it run as close to the intended wake-up
time as the underlying OS allows (which
may or may not be good enough).
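
For what it's worth, on systems that support POSIX real-time scheduling you could try something along these lines (a rough sketch only; it applies FIFO scheduling to the whole process, since GLUT does not expose its internals, and it normally needs superuser privileges):

#include <sched.h>
#include <stdio.h>

void use_fifo_scheduling(void)
{
    struct sched_param param;

    /* the lowest FIFO priority is usually enough to beat the timesharing class */
    param.sched_priority = sched_get_priority_min(SCHED_FIFO);

    if (sched_setscheduler(0, SCHED_FIFO, &param) != 0)
        perror("sched_setscheduler");   /* typically fails without privileges */
}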

Worst case, you have to use setitimer() and
try to swap from your timer signal handler
(and thus not use GLUT) – there are some
severe reentrancy implications there, though.
And setitimer() is, of course, not available
under Windows.
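
Something like the following is the general shape of the setitimer() route (a sketch, not tested; to sidestep the reentrancy problem it only sets a flag in the handler and leaves the actual glutSwapBuffers() call to the main loop):

#include <signal.h>
#include <sys/time.h>

volatile sig_atomic_t frame_due = 0;

void on_alarm(int sig)
{
    frame_due = 1;    /* just flag it; don't touch GL from a signal handler */
}

void start_frame_timer(long ms)
{
    struct itimerval tv;

    signal(SIGALRM, on_alarm);
    tv.it_interval.tv_sec  = ms / 1000;
    tv.it_interval.tv_usec = (ms % 1000) * 1000;
    tv.it_value = tv.it_interval;         /* first expiry after one period */
    setitimer(ITIMER_REAL, &tv, NULL);    /* SIGALRM every ms milliseconds */
}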

Take note that glutGet(GLUT_ELAPSED_TIME) also suffers from inaccuracies on Windows (most pronounced on NT and 2000). I use QueryPerformanceCounter() on Windows. On Macintosh and IRIX, glutGet(GLUT_ELAPSED_TIME) seems more accurate. Use some #ifdefs.
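
Roughly what I mean by the #ifdef split (a sketch; note that only differences between two calls are meaningful, since the two branches measure from different starting points):

#ifdef _WIN32
#include <windows.h>

double elapsed_ms(void)
{
    LARGE_INTEGER freq, now;
    QueryPerformanceFrequency(&freq);   /* ticks per second */
    QueryPerformanceCounter(&now);      /* current tick count */
    return 1000.0 * (double)now.QuadPart / (double)freq.QuadPart;
}
#else
#include <GL/glut.h>

double elapsed_ms(void)
{
    return (double)glutGet(GLUT_ELAPSED_TIME);
}
#endif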

I discovered the inaccuracy when using glutGet(GLUT_ELAPSED_TIME) to benchmark a routine: the framerate derived from it frequently came out as "infinite". Which couldn't be true: that would mean I was getting over 1000 fps. Which I wasn't.