Calculate FPS of a scene in OpenGL



Cristi
05-26-2017, 03:33 AM
Hey guys,

I have succeeded in creating a scene in OpenGL, and right now I want to count the number of FPS.
I found something about this on the internet, but I don't understand why that formula is used. Here is the function:


void numberOfFPS()
{
    frameCount++;
    frame_per_sec_count++;

    if (frame_per_sec_count == frame_per_sec_limit)
    {
        char local_fps[256];
        // average frame time is in milliseconds, so divide by 1000 to get seconds
        float ifps = 1.f / (sdkGetAverageTimerValue(&timer) / 1000.f);
        sprintf(local_fps, " %3.1f fps", ifps);
        glutSetWindowTitle(local_fps);

        // wait roughly 'ifps' frames (about one second) before the next update
        frame_per_sec_limit = ftoi(MAX(1.0f, ifps));
        ++i;
        fps_cout_simple = frame_per_sec_limit; // fps print
        frame_per_sec_count = 0;
        sdkResetTimer(&timer);
    }
}


Is there some formula for FPS? Why is this used:


float ifps = 1.f / (sdkGetAverageTimerValue(&timer) / 1000.f);


Thanks :D

Cristi
05-27-2017, 04:57 AM
I still can't find an answer; does anyone have an idea about this?

Dark Photon
05-27-2017, 06:10 AM
Now I want to count the number of FPS.
Is there some formula for FPS?

First, FPS is a pretty useless metric. Gamerz use it, but graphics developers don't. A few short pages on this: link (https://www.mvps.org/directx/articles/fps_versus_frame_time.htm), link (http://www.humus.name/index.php?ID=279).



Why is this used:


float ifps = 1.f / (sdkGetAverageTimerValue(&timer) / 1000.f);


This is apparently asserting that:

ifps = Frames / Second = 1.0 / ( sdkGetAverageTimerValue(&timer) / 1000.f )

so, just do the math:

Seconds / Frame = sdkGetAverageTimerValue(&timer) / 1000
(1000 * Seconds) / Frame = sdkGetAverageTimerValue(&timer)
Milliseconds / Frame = sdkGetAverageTimerValue(&timer)


So apparently, sdkGetAverageTimerValue() returns the average elapsed time per frame in milliseconds.