time

How do you make your game run the same on all machines? I mean, if the game runs at 70fps on one machine, it's literally going twice as fast as it would running at 35fps on a different machine…

Since it really doesn't need to go any faster than 30fps, would it be possible to force it to run at 30fps or lower? Like:

if (fps < 30)
{
    DrawGame();
}
calculateFPS();

and would that be a good idea or are there other better ways?

You could use a timer such that you call your rendering function every 1/30th or 1/60th of a second. However, you would probably be better off calculating the time between renders and basing your speed on that.

For Windows, see SetTimer() for the first part. For the second, it's best to use a high-performance timer - see QueryPerformanceFrequency used in conjunction with QueryPerformanceCounter.
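For example, a rough sketch of the second part (Windows-only; the InitTimer/ElapsedSeconds names are just made up for illustration):

#include <windows.h>

static LARGE_INTEGER g_freq;   // counter ticks per second
static LARGE_INTEGER g_start;  // counter value when the timer was started

void InitTimer()
{
    QueryPerformanceFrequency(&g_freq);
    QueryPerformanceCounter(&g_start);
}

double ElapsedSeconds()
{
    LARGE_INTEGER now;
    QueryPerformanceCounter(&now);
    return (double)(now.QuadPart - g_start.QuadPart) / (double)g_freq.QuadPart;
}

Call ElapsedSeconds() once per frame and subtract the previous value to get the frame time in seconds.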

For other platforms, I have no clue.

Time and FPS are two different things and should be treated that way.

Now, if you have a lot of movement you want the display updated as fast as possible, but when the scene is not changing, redrawing it 70 times a second is a waste of processing time.

First we set up an event routine to see if anything needs to be handled.

Let’s take the example of a bullet whose movement is tied to the FPS. On one machine it would slowly creep across the screen; on another it would move so fast you could not see it.

So to keep it the same on all machines, we make an event loop that is called every X seconds.

Event_loop()
{
    // Process the bullet.
    // The bullet will have a direction, speed, position and range.
    // We also need some type of FPS factor with which to adjust for slower machines.

    // bullet.update      // time since the bullet was last moved
    // bullet.speed
    // bullet.direction
    // bullet.position
    // bullet.range       // How far our bullet can travel

    if (bullet.update > bullet.speed)
    {
        // How far has the bullet traveled since the last update?
        // fps_offset is the adjustment for distance based on time and the FPS;
        // it makes sure the bullet covers the same distance in the same real time.
        // On a slower machine you would need to increase the distance moved each
        // update; on a faster machine the distance moved each update would be smaller.
        bullet.position = Bullet_Update(fps_offset);
    }
}

Note: just example code, but I hope it gives you the idea.
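To make that a bit more concrete, here is a minimal sketch of the same idea written with a per-frame time delta instead of an FPS factor (all names here are made up for illustration, not taken from the post above):

// dt is the time in seconds since the last update.
struct Bullet
{
    float x, y;        // position
    float dx, dy;      // direction (unit vector)
    float speed;       // units per second
    float range;       // how far the bullet can travel before it dies
    float travelled;   // distance covered so far
};

void UpdateBullet(Bullet* b, float dt)
{
    float step = b->speed * dt;   // same real-world distance no matter the FPS
    b->x += b->dx * step;
    b->y += b->dy * step;
    b->travelled += step;
    if (b->travelled > b->range)
    {
        // bullet has gone out of range - remove it here
    }
}

On a slow machine dt is large and the bullet moves further per update; on a fast machine dt is small and it moves less, so the speed in real time is identical.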

Originally posted by colinisinhere:
[b]How do you make your game run the same on all machines? I mean, if the game runs at 70fps on one machine, it's literally going twice as fast as it would running at 35fps on a different machine…

Since it really doesn't need to go any faster than 30fps, would it be possible to force it to run at 30fps or lower? Like:[/b]

Actually, if you call your render routine only once every 1/60th or 1/30th of a second, you probably won't be able to finish even one frame on a slow computer. A better solution is to call your render function again and again and to calculate the scene change based on the frame rate. This way you need the timer only to calculate your FPS.


Basically, the strategy is this:

For each frame, do:

t = get_time_with_some_timing_function();
update_world_state( t );
draw_world();

Here, update_world_state uses the time t when updating things like physics propagation (internally, it may maintain the time for the last frame, t_old, in order to form dt = t - t_old, which can be useful for differential equations etc).

This way you are always guaranteed that your display is in sync with “real time”, no matter how fast the computer is.
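In code the loop might look roughly like this (GetTimeSeconds() stands in for whatever high-precision timer you use - see point 3 below; here dt is formed in the loop rather than inside update_world_state, which amounts to the same thing):

double t_old = GetTimeSeconds();

while (running)
{
    double t  = GetTimeSeconds();
    double dt = t - t_old;       // seconds elapsed since the last frame
    t_old = t;

    update_world_state(dt);      // e.g. position += velocity * dt
    draw_world();
}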

The “only” problems/errors with this method are:

  1. The time for a given frame is actually wrong, since you are using the time for “this” frame to generate the “next” frame (you cannot predict how long it will take to render the following frame). This error is unavoidable, and in fact negligible. AFAIK most games/demos suffer from this - but it’s not noticeable.

  2. If you are running on a very slow computer (e.g. software rendering on a Pentium 100), things like physics will most probably “bail out” due to overly large time steps. This can be handled by detecting if dt > max_dt (a predefined limit), and if so iterating update_world_state() in small steps (max_dt) until the desired time (dt) has been simulated - see the sketch after this list. Since this may cause the computer to virtually stop, you should also have some kind of detection that says “no use - quit program”, or something like that.

  3. You need to use a timer with good precision. GetTickCount, timeGetTime, and SDL_GetTicks are NOT good timers. Use glfwGetTime, QueryPerformanceCounter, RDTSC, gettimeofday, CLOCK_CYCLE_SGI, gethrtime, or similar (depending on system).
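As mentioned in point 2, here is a sketch of the clamping idea (MAX_DT and the function names are placeholders, not from any particular library):

const double MAX_DT = 0.05;   // never simulate more than 50 ms at once

void advance_world(double dt)
{
    while (dt > 0.0)
    {
        double step = (dt > MAX_DT) ? MAX_DT : dt;
        update_world_state(step);   // physics etc. with a safe step size
        dt -= step;
    }
}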