
Thread: Calculate FPS of a scene in OpenGL

  1. #1
     Newbie · Join Date: May 2017 · Posts: 6

    Calculate FPS of a scene in OpenGL

    Hey guys,

    I have managed to create a scene in OpenGL, and now I want to count the FPS.
    I found something about this on the internet, but I don't understand why that formula is used. Here is the function:

    Code cpp:
    void numberOfFPS()
    {
        frameCount++;           // total frames rendered since startup
        frame_per_sec_count++;  // frames rendered since the last title update

        if (frame_per_sec_count == frame_per_sec_limit)
        {
            char local_fps[256];
            // sdkGetAverageTimerValue() returns the average frame time in
            // milliseconds, so 1 / (ms / 1000) gives frames per second
            float ifps = 1.f / (sdkGetAverageTimerValue(&timer) / 1000.f);
            sprintf(local_fps, " %3.1f fps", ifps);
            glutSetWindowTitle(local_fps);

            // wait roughly one second's worth of frames before the next update
            frame_per_sec_limit = ftoi(MAX(1.0f, ifps));
            ++i;
            fps_cout_simple = frame_per_sec_limit;  // fps print
            frame_per_sec_count = 0;
            sdkResetTimer(&timer);
        }
    }

    Is there a formula for FPS? Why is this used:

    Code cpp:
        float ifps = 1.f / (sdkGetAverageTimerValue(&timer) / 1000.f);

    Thanks
    Last edited by Dark Photon; 05-26-2017 at 06:01 AM.

  2. #2
     Newbie · Join Date: May 2017 · Posts: 6

     I still can't find an answer. Does anyone have an idea about this?

  3. #3
     Dark Photon · Senior Member, OpenGL Guru · Join Date: Oct 2004 · Location: Druidia · Posts: 4,124
    Quote Originally Posted by Cristi
    now I want to count the FPS.
    Is there a formula for FPS?
    First, FPS is a pretty useless metric. Gamerz use it, but graphics developers don't. A few short pages on this: link, link.

    Why is this used:

    Code cpp:
        float ifps = 1.f / (sdkGetAverageTimerValue(&timer) / 1000.f);
    This is apparently asserting that:

    ifps = Frames / Second = 1.0 / ( sdkGetAverageTimerValue(&timer) / 1000.f )

    so, just do the math:

    Seconds / Frame = sdkGetAverageTimerValue(&timer) / 1000.f
    (1000 * Seconds) / Frame = sdkGetAverageTimerValue(&timer)
    Milliseconds / Frame = sdkGetAverageTimerValue(&timer)


    So apparently, sdkGetAverageTimerValue() returns the average elapsed time per frame, in milliseconds.
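
    For comparison, here is a minimal sketch of the same idea using plain std::chrono instead of the CUDA SDK timer helpers. The function and variable names below are made up for illustration; it just averages the elapsed time per frame over a one-second window and converts it with the same 1000 / (ms per frame) relationship:

    Code cpp:
    #include <chrono>
    #include <cstdio>

    // Hypothetical helper: call this once per rendered frame.
    // It averages the frame time over a one-second window and prints the result.
    void updateFpsCounter()
    {
        using clock = std::chrono::steady_clock;
        static clock::time_point windowStart = clock::now();
        static int framesInWindow = 0;

        ++framesInWindow;

        const float elapsedMs =
            std::chrono::duration<float, std::milli>(clock::now() - windowStart).count();

        if (elapsedMs >= 1000.0f)  // one-second window has elapsed
        {
            const float msPerFrame = elapsedMs / framesInWindow;  // average frame time
            const float fps        = 1000.0f / msPerFrame;        // same formula as above

            std::printf("%.2f ms/frame  (%.1f fps)\n", msPerFrame, fps);

            framesInWindow = 0;
            windowStart = clock::now();
        }
    }

    As the links above suggest, the ms/frame number is usually the more useful one to watch, since it scales linearly with the amount of work done per frame.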
    Last edited by Dark Photon; 05-27-2017 at 06:21 AM.
