Thread: Calculate FPS of a scene in OpenGL

1. Calculate FPS of a scene in OpenGL

Hey guys,

I have managed to create a scene in OpenGL, and right now I want to count the number of FPS.
I found something about this on the internet, but I don't understand why it uses that formula. Here is the function:

Code cpp:
```
void numberOfFPS()
{
    frameCount++;
    frame_per_sec_count++;

    if (frame_per_sec_count == frame_per_sec_limit)
    {
        char local_fps[256];

        // The average frame time comes back in milliseconds; convert
        // it to seconds, then invert to get frames per second.
        float ifps = 1.f / (sdkGetAverageTimerValue(&timer) / 1000.f);
        sprintf(local_fps, " %3.1f fps", ifps);
        glutSetWindowTitle(local_fps);

        // Wait roughly one second's worth of frames (at least 1)
        // before updating the title again.
        frame_per_sec_limit = ftoi(MAX(1.0f, ifps));
        ++i;
        fps_cout_simple = frame_per_sec_limit; // FPS value to print
        frame_per_sec_count = 0;
        sdkResetTimer(&timer);
    }
}
```

Is there some formula for FPS? Why is this used:

Code cpp:
`    float ifps = 1.f / (sdkGetAverageTimerValue(&timer) / 1000.f);`

Thanks

3. Originally Posted by Cristi
now I want to count the number of FPS.
Is there some formula for FPS?
First, FPS is a pretty useless metric. Gamerz use it, but graphics developers don't; they work in time per frame (milliseconds) instead. A few short pages on this: link, link.
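
To see why (my own illustration, not from the linked pages): frame time is just 1000 / fps milliseconds, and the mapping is nonlinear, so a "1 fps drop" means very different things at different frame rates:

Code cpp:
```
#include <cstdio>

int main()
{
    // Milliseconds per frame for a few FPS values. Note how losing
    // "1 fps" costs ~0.3 ms at 60 fps but ~1.1 ms at 30 fps; the FPS
    // scale is nonlinear, which is why frame time is the more useful
    // unit for graphics work.
    const float fps_values[] = { 120.f, 60.f, 59.f, 30.f, 29.f };
    for (float fps : fps_values)
        printf("%6.1f fps = %6.2f ms/frame\n", fps, 1000.f / fps);
    return 0;
}
```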

Why is this used:

Code cpp:
`    float ifps = 1.f / (sdkGetAverageTimerValue(&timer) / 1000.f);`
This is apparently asserting that:

ifps = Frames / Second = 1.0 / ( sdkGetAverageTimerValue(&timer) / 1000.f )

so, just do the math:

Seconds / Frame = sdkGetAverageTimerValue(&timer) / 1000.f
(1000 * Seconds) / Frame = sdkGetAverageTimerValue(&timer)
Milliseconds / Frame = sdkGetAverageTimerValue(&timer)     (since 1 second = 1000 milliseconds)

So apparently, sdkGetAverageTimerValue() returns the average elapsed time per frame, in milliseconds.
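
For reference, here is a minimal sketch of the same idea without the SDK timer, using std::chrono (countFPS is my own hypothetical helper, not an SDK or GLUT function). It accumulates frame times for about a second, then applies the same 1 / (ms / 1000) formula to the average:

Code cpp:
```
#include <chrono>
#include <cstdio>

// Hypothetical helper: call once per frame, e.g. at the end of the
// GLUT display callback. Prints the average frame time and FPS
// roughly once per second.
void countFPS()
{
    using clock = std::chrono::steady_clock;
    static clock::time_point last = clock::now();
    static double elapsed_ms = 0.0;
    static int frames = 0;

    clock::time_point now = clock::now();
    elapsed_ms += std::chrono::duration<double, std::milli>(now - last).count();
    last = now;
    ++frames;

    if (elapsed_ms >= 1000.0) // report about once per second
    {
        double avg_ms = elapsed_ms / frames;   // average milliseconds per frame
        double fps = 1.0 / (avg_ms / 1000.0);  // same formula as above
        printf("%6.2f ms/frame, %6.1f fps\n", avg_ms, fps);
        elapsed_ms = 0.0;
        frames = 0;
    }
}
```

Unlike the counter above, this averages over a fixed time window rather than a fixed number of frames, which keeps the update rate steady even when the frame rate swings.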
