The only problem is that it changes every 10 frames to (sometimes) a completely different value! I was wondering how others calculate fps to see if there is something better…
Err, yeah: for each frame, calculate the time it took to draw. The FPS then equals 1/(time for frame). Do it every frame, not every 10; that seems a bit daft to me.
One simple method is to use tick counts to judge how long each frame takes. Divide this into 1000 to get FPS.
Then use a trick like the following to average out the framerate:
a = 1000/TickCount
b = 50
fps = (fps*(b-1)+a)/b
fps will then hold a nice, accurate sort-of average of recent framerates.
The higher you choose b, the further back the reading extends. Sadly the early readings will be inaccurate until about 2b frames have rendered. No problem if b ain't huge.