Surprise: the frame rate varies all the time!

Hi all, when I run my project the frame rate varies up and down, and I cannot understand why!
Let me introduce my project here.
It is programmed with SDL & OpenGL. In main.cpp I use the following code:

while (!done)
{
    SDL_Event event;
    int lasttype = 0, lastbut = 0;
    while (SDL_PollEvent(&event))
    {
        // deal with the message...
    }

    SuperPG.Draw();
    glFinish();

    SDL_GL_SwapBuffers();
}

Every 3D object is rendered in the SuperPG.Draw() function, which looks like this:

void Draw()
{
    millis = GetTickCount();   // current time in ms

    if (lastmillis)            // global, initialized to 0, so the first frame is skipped
    {
        curtime = millis - lastmillis;   // duration of the last frame in ms

        m_NewDesktop.Draw();

        DrawStatus();          // print the fps
    }

    lastmillis = millis;
}

In the DrawStatus() function I compute the fps:

{
    if (curtime != 0)
    {
        fps = 1000.0 / curtime;
        oldfps = fps;
    }
    else
    {
        fps = oldfps;
    }

    sprintf(fpschar, "%d", (int) fps);
    printf("%s\n", fpschar);
}

Who can tell me why the fps varies so strongly? Sometimes 12 fps,
sometimes 15 fps,
sometimes 16 fps,
sometimes 21 fps.
Does the message polling affect the frame rate? I never touch the mouse or keyboard after the program starts running. Why can't I see a steady fps?
Can anybody tell me the reason? Thank you very much.

Do you always draw the same thing, and from the same point of view?

Read this:

http://www.gaffer.org/articles/Timestep.html

It's the solution to all your problems.

GetTickCount() has very low resolution. It does not give 1-millisecond accuracy; more likely 10 ms or 50 ms.

To test, call GetTickCount() thousands of times and print the return values. You will see it suddenly jump from one value to the next.

…unless you call timeBeginPeriod(1) in your initialisation function, and timeEndPeriod(1) in your exit function. Then you get much better resolution with the timer functions.

I don't think it is a good idea to display the frames/sec with printf(): output to a console will slow your app down. A better way is to display the fps in your app's window (not so hard if you use a bitmap font). And since the value can change very quickly, it's probably better to compute it only every 10 frames (or 20, or 50, depending on your app's performance), thus getting an averaged value.

knackered,
Thanks for the tip; I did not know about timeBeginPeriod(1). But even with it, the resolution is still ~16 ms on my WinXP.

Thanks to all!

to jide: I draw the same thing in each frame, without any change of the viewport.

to herc: thank you for the link, which gives me a good way to control object motion simulation. But my question is why I cannot get a stable frame rate, not how to make motion look realistic under a varying frame rate.

to RigidBody: I would like to take your suggestion, but a bitmap font uses extra memory and adds rendering work, so drawing the fps itself costs extra time and slows the app down too!

to songho and knackered:

The following numbers are the printed fps, one line per frame:


22
22
22
22
22
21
20
22
23
22
22
22
22
22
22
22
22
22
11
125
22
20
20
22
23
22
22
23
22
22
22
22
22
23
22
22
22
21
21
22
11
111
22
23
22
22
22
22
22
22
22
22
22
21
20
22
22
22
22
23
22
22
22
11
111
23
22
22
22
21
21
23
23
22
22
23
22
22
22
23
22
23
21
22
23
11
111
23
22
22
23
21
23
22
22
23
22
22
22
22
22

So there is a pattern!
I don't think the pattern is due to GetTickCount(), because the return value of GetTickCount() does not vary periodically.

So what's the reason?
When I replace glFinish() with glFlush(), the result is the same:

22
23
22
21
21
22
21
24
22
22
23
22
22
23
23
22
23
22
11
100
22
22
23
22
22
22
23
22
22
22
23
23
23
22
22
21
21
22
22
23
22
11
111
22
23
23
22
22
22
23
22
21
21
22
22
22
22
22
23
22
22
23
23
22
11
111
22
21
20
22
22
22
21
23
23
22
22
22
23
22
22
22
23
21
21
22
11
111
23
22
22
22
22
22
23
22
22
23
22
21
21
23
22
22
23
23
22
22
22
11
111
22
22
23
21
21
21
23
22
22
23
22
22
22
22
22
23
22
22
22
22
21
11
125
22
22
22
22
22
23
22
22
23
22
22
23
22
21
21
21
22
22
22
22
11
100
22
22
21
23
22
21
22

Why? Is the problem related to the OpenGL command buffer? Does "the buffer is full" cause the frame rate to slow down?
I need your help! Thanks again!

If you have something bigger than a Z80, the memory used by the bitmap font should not cause any problems. When you create a bitmap font using glXUseXFont/wglUseFontBitmaps you can choose the range of ASCII characters that is used: with characters 46-57 you get the dot ('.') plus the digits 0 to 9, which is enough to display the fps value.

Concerning the performance: I tried a very simple example which only clears the color and depth buffers. On my system I get ~20 fps LESS when using printf.

From my point of view your program is drawing at 22 Hz. The differences might come from things external to your program, like the OS scheduler or other programs running. If you look carefully, it 'periodically' jumps up to ~100 fps, but just before that it slows down to half speed. So I really think something outside your program is involved. Or do you do special things inside your program?

concerning the performance: i tried a very simple example which only clears the color and depth buffer. on my system i get ~20 fps LESS when using printf.
Yes, with a simple example (say >1000 fps), outputting to a console at a high rate will negatively impact performance, but for a normal app it won't.

To the OP: as suggested, you need to take the average of, say, the last 10 frames; the more samples the better.
Even with an (in theory) deterministic system like a computer, if you run the same CPU benchmark twice you will get two different timings (perhaps a big difference) for a single iteration. If you take the average over thousands of iterations, the difference won't be large.

Nope. The frame rate was about 300/s when I used a bitmap font to display the fps, and dropped to about 150-250/s when using printf each frame.

You don't need to take an average, just use a lerp.

Something like:

float newFPS = calcFPS();
m_FPS = m_FPS > 0.0f ? LERP(m_FPS, newFPS, 0.5f) : newFPS;

Thanks to all.

To jide: if my app runs at 22 fps, why can it touch 100 fps? If something outside my program interfered, my app should be slowed down, not accelerated.
To simplify the problem, I now only draw a quad 1000 times. All the code in Draw() is as follows:

glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glPushMatrix();

for (int i = 0; i < 1000; i++)
{
    glColor4f(1.0f, 1.0f, 1.0f, 0.4f);

    glPushMatrix();
        glBegin(GL_QUADS);
        glNormal3f( 0.0f, 0.0f, 1.0f);
        glTexCoord2d(0.0, 0.0);
        glVertex3f( 90.0f,  100.0f, 0.0f);
        glTexCoord2d(0.0, 1.0);
        glVertex3f( 90.0f, -100.0f, 0.0f);
        glTexCoord2d(1.0, 1.0);
        glVertex3f(-90.0f, -100.0f, 0.0f);
        glTexCoord2d(1.0, 0.0);
        glVertex3f(-90.0f,  100.0f, 0.0f);
        glEnd();
    glPopMatrix();

    glColor4f(1.0f, 1.0f, 1.0f, 1.0f);
    //glFlush();
}
glPopMatrix();

In each frame, the only periodic work is handling the message queue and rendering everything in the Draw function. So what are these "special things"?

To zed and RigidBody:
I think printf can only slow my app down; it cannot make the frame rate vary up and down, because I call it the same number of times in each frame. And right now I just want to trace the fps, so I still use it. The result is as follows:

30
31
31
31
28
33
31
32
32
31
33
31
30
32
31
32
32
31
32
29
34
31
31
32
32
32
31
31
30
33
31
32
31
31
30
31
32
32
28
33
32
32
30
32
32
32
31
32
31
32
31
32
32
32
32
32
32
33
30
29
34
32
32
32
31
29
27
30
32
27
34
31
32
32
33
31
32
32
32
32
32
29
34
32
31
31
32
32
29
33
33
29
32
31
33
32
31
32
31
22
34
31
32
32
30
29
34
32
33
31
32
30
33
31
32
32
33
31
32
31
32
30
33
29
34
30
32
30
34
27
32
32
32
32
29
34
32
32
30
32
32
32
32
31
32
31
31
32
32
32
28
33
32
30
31
28
30
33
27
33
31
32
30
32
31
32
31
31
31
32
32
31
27
33
33
32
32
32
32
30
30
24
33
33
32
31
33
29
32
32
31
32
32
30
22
32
30
33
32
32
32
32
30
32
31
31
31
32
32
32
32
32
32
28
25
34
32
32
27
34
31
25
33
32
31
32
32
33
31
32
32
32
22
33
31
32
32
31
31
32
30
33
31
32
33
32
22
31
33
29
32
32
27
34
32
31
31
33
22
33
32
32
32
30
33
32
32
33
31
32
28
33
31
32
32
32
31
31
31
31
30
32
31
31
32
30

To my surprise, when I uncomment the glFlush() shown in the Draw() code above, I get a stable frame rate:

34
33
34
34
33
34
34
33
34
34
32
32
34
34
33
34
34
30
33
34
32
34
33
35
34
33
34
35
33
34
34
34
33
34
31
33
33
34
34
33
34
34
33
34
34
32
34
34
34
33
34
32
32
32
35
34
33
32
34
34
34
34
34
33
35
32
33
34
34
33
33
34
34
33
34
35
33
34
34
33
33
32
34
33
34
33
34
32
32
32
33
35
34
34
33
34
34
34
34
34
33
34
34
33
33
33
33
33
35
34
34
33
32
34
33
33
34
34
32
33
33
34
31
35
34
33
34
34
33
33
34
34
34
34
34
33
32
34
33
34
34
34
33
34
34
33
34
34
33
33
34
34
31
32
34
33
32
34
34
33
33
35
33
34
34
35
33
34
34
33
34
34
33
34
33
34
33
33
34
33
34
34
34
32
32
34
31
34
34
34
31
34
34
33
34
35
34
33
34
34
32
34
34
33
34
34
34
33
33
33
33
34
35
34
33
34
34
33
33
34
32
32
34
31
33
34
34
34
34
34
34
31
35
34
33
34
34
34
33
34
34
34
34
33
33
34
34
34
31
34
34
33
34
33
33
30
32
34
34
33
35
34
33
34
35
33
34
34
34
33
34
34
31
35
34
33
34
34
33
30
34
33
34
33
33
34
33
33
31
32
30
34
34
33
33
33
34

How can an OpenGL call affect the steadiness of the frame rate? How can glFlush() accelerate my app?
What's the reason?
Thanks again!

GL won't explicitly control it, you have to do it yourself (sorry, didn't read all of the post, it was too long). Look at the idle func etc. When calling Windows timing stuff, remember to keep processing other window messages.

Thank you. I added an SDL_Delay() call to cap the frame rate at 25 fps, and it is quite stable now.
Thanks to all for the help!