
View Full Version : A Bit OT - Correct FPS Count



Aster
08-16-2001, 09:02 PM
I am creating an animation for which I specify distances and velocities in SI units (m and m/s). I am using the function clock() and the constant CLOCKS_PER_SEC from time.h to find the time it takes to render one frame. However, the problem is that CLOCKS_PER_SEC has a hard-coded value of 1000 on my platform. How can I find the system's actual clock rate? Is it just the processor's frequency? I would like to do this in a platform-independent manner.

I appreciate any suggestions.

Aster
08-17-2001, 09:50 PM
Actually, my problem is to create a smooth animation. So far I am getting some choppy movement. I need a correct way to estimate fps so that I can calculate new frames based on the specified speed of objects in the environment, not the speed they appear to move at. Please help.

Lucky
08-17-2001, 10:56 PM
look at http://partners.nvidia.com/Marketing/Developer/DevRel.nsf/pages/E459B619DE2869DB88256A8C008253FC

Obli
08-18-2001, 01:22 AM
Originally posted by Aster:
I am using function clock() and constant CLOCKS_PER_SEC from time.h to find time it takes to render one frame. However, the problem is that constant CLOCKS_PER_SEC has a hard-coded value of 1000.
...
I would like to do it in a platform-independent manner.



Yes, I have the same problem, and I tried the program above, but it seems to be Windows-dependent. I don't want Windows to be in the way.

Try glutGet(GLUT_ELAPSED_TIME); I am beginning to use it, but I'm not sure it works.

Bye

Dodger
08-20-2001, 04:34 AM
I'd say timer functions are almost always system-dependent (unless provided by an API like GLUT). Once you've found a timer to use, I recommend accumulating frame times over one second and averaging the value. That way you can still read your frame rate even when it's fluctuating a lot.

Hofi
08-20-2001, 09:50 AM
Hi,
Even if you use GLUT, your code may be system-independent, but the performance (response time and resolution) of your timer may still be system-dependent.
For Windows NT, I found QueryPerformanceFrequency / QueryPerformanceCounter very convincing.
greetings,
Hofi

Aster
08-23-2001, 09:54 PM
I probably didn't want to admit it to myself, but there is no way (that I know of) to create a system-independent timer function. Anyway, thanks everyone for your help, I really appreciate your answers.

Cheers

MofuX
08-23-2001, 11:00 PM
A simple way to get the maximum frame rate and the same speed on every machine is to poll the keyboard input at a fixed interval (say, every 50 ms).
The keyboard input then runs at 20 fps, and your game always renders the maximum fps it can!
(Important: you have to do the game math in the keyboard routine too, not in your drawscene routine.)

08-25-2001, 11:41 PM
Windows has a timer function in mmsystem.h, I believe. GLUT also has one; I was looking for this same exact info today. I wanted one that was not dependent on Windows, and I found it in GLUT =)

glutTimerFunc will call Timer after 1 ms, passing it value.

Timer will update the global counter, in this case value. But inside Timer you have to register another glutTimerFunc call if you want it to keep going, else it'll only fire once.

1 ms is a bit extreme =)

int value = 0;                       /* global millisecond counter */

void Timer(int unused)
{
    value += 1;                      /* update the global, not a shadowing parameter */
    glutTimerFunc(1, Timer, unused); /* re-register, or the timer fires only once */
}

int main()
{
...
    glutTimerFunc(1, Timer, value);
...
}