I'm a total beginner and would like some help with this. Imagine a simple animation of a moving object, something like:
// The coordinates of the initial position of the object that I want to animate
GLfloat x1 = 0.0f;
GLfloat y1 = 10.0f;
// Step size for the animation
GLfloat xstep = 0.5f;
GLfloat ystep = 0.5f;
// Timer function that advances the animation
void Timer(int value)
{
    // Moves the object
    x1 += xstep;
    y1 += ystep;
    glutPostRedisplay();          // redraw with the new position
    glutTimerFunc(16, Timer, 0);  // re-arm the timer (roughly 60 fps)
}
From what I read on the web, if the CPU is at 100% usage while glutTimerFunc drives the animation, the animation can slow down.
It is said that in this case glutGet(GLUT_ELAPSED_TIME) is the better option. I know that GLUT_ELAPSED_TIME returns the number of milliseconds since the application started (since glutInit was called), and that it can be used to make the animation independent of the frame rate and CPU load.
But I have no idea how to implement this. What should I change in the code above so that the animation speed is controlled with GLUT_ELAPSED_TIME?
You need to decide what units your world (and models) use; for example, one unit in OpenGL could be one meter. Then decide what speed you want your object to move at (in meters per second), and from the time elapsed since the last frame you can determine how far to move the object in the current frame.
Sorry if I didn't understand correctly: I want the time between frames to be the same, but I don't know how to relate that to the object I want to animate. Should I replace the object's fixed step size with a value computed from the elapsed time? If anyone could give me an example it would be very helpful.