View Full Version : glutTimerFunc inaccuracy

12-21-2009, 06:45 AM

I am relatively new to C++ and OpenGL.

I am trying to create a rhythm game in which players tap certain keys in time with the beat. I am using FMOD for the sound and OpenGL for the graphics. I am using glutTimerFunc to play a "tick" on each beat and to change the colour of a quad.

The problem I am facing is that glutTimerFunc only takes milliseconds. If a song has a bpm (beats per minute) of 166, the simple calculation 60000/166 gives the milliseconds between beats, in this case 361.44. This means the timer must fire again at exactly 361.44 milliseconds for the ticks and colour change to stay in sync with the song.

Since glutTimerFunc only takes milliseconds as integers, I cannot pass 361.44 into it, and rounding to 361 or 362 makes everything go off-sync towards the end of the song. Furthermore, the timer does not always fire at exactly the millisecond you tell it to. A simple timeGetTime() measurement showed that glutTimerFunc fired anywhere between 358 and 363 milliseconds when I had given it 361. Such inconsistencies have a huge impact on my timing-critical game.

Is there any way at all to get glutTimerFunc to fire at, for example, half-millisecond intervals, and to fire at precisely the requested time each time? If not, what other strategies can I adopt?


12-21-2009, 02:30 PM
GLUT is not well suited to this kind of time handling.
The time resolution itself is not that important (a 60 Hz monitor means 16.667 ms of granularity), but it is crucial that there is no drift, no accumulation of errors, and timer-based control is prone to drift unless you have hard real-time capabilities.

Instead of GLUT I really like GLFW; it is better suited to game-like applications. In this case you just call glfwGetTime() to get the total elapsed time, so there is no accumulation of small errors.

Something like:

double startTime = glfwGetTime();
int ticks = 0;
while (gamePlaying) {
    double elapsed = glfwGetTime() - startTime;
    // beats elapsed so far = seconds * BPM / 60
    while (elapsed * BPM / 60.0 > ticks) {
        // play the tick, update the quad colour, etc.
        ticks++;
    }
    // render the frame ...
}

GLFW's time code is quite optimized last I checked, with the most precise method used for each supported platform.

12-22-2009, 03:32 PM
Thanks ZbuffeR,

I've looked into using GLFW and I've attempted to port my project over to this library and do away with GLUT. I was enthusiastic about the high precision timer which GLFW offers but I still get the problem with accumulated errors.

The timer runs at a much faster rate than the main loop, so I cannot do

if (glfwGetTime() == 0.4332){
because the value 0.4332 will more than likely have long passed by the time this line of code executes, so I have to resort to the >= operator, which means errors accumulate and I am back to the glutTimerFunc problem I had initially.

I think I will need something running outside of the main loop that executes glfwGetTime() in a loop at around 0.1 second intervals.

Anyone have any advice? My code for the main loop is below.

void runGameLoop()
{
    handle  = FSOUND_Sample_Load(1, "Metronome/MainBeat.wav", 0, 0, 0); // Load the high pitch tick
    handle1 = FSOUND_Sample_Load(2, "Metronome/SubBeat.wav", 0, 0, 0);  // Load the low pitch tick
    glfwSetTime(0.0);                         // Set time to 0.0

    while (running)                           // While still in the game loop
    {
        if (glfwGetTime() >= 0.4332)          // song is 138.5 bpm, so 60/138.5 = 0.4332 s per beat
        {
            if (beat == 1)                    // If beat count is 1
                FSOUND_PlaySound(1, handle);  // play the high metronome tick
            else
                FSOUND_PlaySound(2, handle1); // else play the low metronome tick (beats 2, 3 and 4)

            if (beat == 4)                    // If we are on the 4th beat
                beat = 1;                     // reset the beat count to 1
            else
                beat++;                       // else increase the beat count

            if (red == 1.0f)                  // Toggle the red value
                red = 0.0f;
            else
                red = 1.0f;

            glfwSetTime(0.0);                 // Reset the timer
        }

        glClear(GL_COLOR_BUFFER_BIT);         // Clear the colour buffer
        glClear(GL_DEPTH_BUFFER_BIT);         // Clear the depth buffer
        glColor3f(red, 1.0f, 0.0f);           // Set the colour for the quad
        glRectf(-5.0f, 5.0f, 5.0f, -5.0f);    // Draw the quad
        glfwSwapBuffers();                    // Swap front and back rendering buffers
    }
}

12-23-2009, 12:16 PM
Just an update to let you know that I've fixed the timing issue by using GLFW's multithreading feature. The timer now runs in its own thread, which seems to have fixed the problem.