
Thread: Frame time increases significantly after exactly 30 seconds


  1. #1
    Newbie
    Join Date
    Nov 2012
    Posts
    2

    Frame time increases significantly after exactly 30 seconds

    I'm developing a game using OpenGL and have run into a strange issue. Exactly 30 seconds after creating a context, the frame time increases by 2-3x depending on the scene, and then remains constant. I am using query objects with GL_TIME_ELAPSED to get the frame time. Below is a small demo that demonstrates the issue.

    Code :
    #include <stdio.h>
    #include <GL/glew.h>
    #include <GL/freeglut.h>
     
    int main(int argc, char** argv)
    {
    	glutInit(&argc, argv);
    	glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE);
    	glutCreateWindow("Frame time test");
    	glewExperimental = GL_TRUE;
    	glewInit();
    	GLuint query;
    	glGenQueries(1, &query);
     
    	while(1) {
    		glClear(GL_COLOR_BUFFER_BIT);
     
    		/* Time the GPU work issued between Begin/End */
    		glBeginQuery(GL_TIME_ELAPSED, query);
    		glBegin(GL_TRIANGLES);
    		glVertex3f(-1, -1, 0);
    		glVertex3f(1, -1, 0);
    		glVertex3f(0, 1, 0);
    		glEnd();
    		glEndQuery(GL_TIME_ELAPSED);
     
    		/* Blocks until the GPU has produced the result */
    		GLuint drawTime;
    		glGetQueryObjectuiv(query, GL_QUERY_RESULT, &drawTime);
     
    		/* Result is in nanoseconds; display as milliseconds */
    		char timeStr[32];
    		sprintf(timeStr, "%f", drawTime / 1000000.0f);
    		glutSetWindowTitle(timeStr);
     
    		glutSwapBuffers();
    		glutMainLoopEvent();
    	}
    	return 0;
    }
    I know I shouldn't be using glBegin/glEnd; my actual game uses vertex buffers, and the issue is exactly the same. I've also tried GLFW, and the exact same thing happened.
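
    (Side note on the demo: reading GL_QUERY_RESULT immediately after glEndQuery forces a full CPU/GPU sync every frame. A minimal non-blocking sketch, assuming it's acceptable to read a result that is a frame or two old, polls GL_QUERY_RESULT_AVAILABLE first; the function and variable names here are illustrative, not from the demo above.)

    Code :
    #include <stdio.h>
    #include <GL/glew.h>
     
    /* Non-blocking timer query readback: only fetch GL_QUERY_RESULT once
       GL_QUERY_RESULT_AVAILABLE says it is ready, so the CPU never stalls. */
    static GLuint query;
    static int queryInFlight = 0;
     
    void timedDraw(void)
    {
    	if (queryInFlight) {
    		GLuint available = 0;
    		glGetQueryObjectuiv(query, GL_QUERY_RESULT_AVAILABLE, &available);
    		if (available) {
    			GLuint ns;
    			glGetQueryObjectuiv(query, GL_QUERY_RESULT, &ns);
    			printf("draw: %f ms\n", ns / 1000000.0f);  /* ns -> ms */
    			queryInFlight = 0;
    		}
    	}
    	if (!queryInFlight) {
    		glBeginQuery(GL_TIME_ELAPSED, query);
    		/* ... issue draw calls here ... */
    		glEndQuery(GL_TIME_ELAPSED);
    		queryInFlight = 1;
    	}
    }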

    Is there something I'm doing wrong, or is this a driver bug? I've been doing OpenGL development for quite a while and have never seen this before.

    I'm on Linux with an NVIDIA GTX 560 Ti running the latest drivers (310.19).

  2. #2
    Senior Member OpenGL Guru
    Join Date
    May 2009
    Posts
    4,948
    "the frame time increases by 2-3x depending on the scene, and then remains constant"
    So? You're not actually doing anything yet. Your frame time is negligibly small.

    What does it matter if a frame time measured in microseconds jumps by a factor of 3?
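
    One way to check whether a jump like that would ever show up in practice: time whole frames on the CPU as well, not just the draw call on the GPU. A rough sketch, assuming freeglut (glutGet(GLUT_ELAPSED_TIME) reports milliseconds since glutInit); if the wall-clock frame time stays flat while GL_TIME_ELAPSED jumps, only the GPU's clock rate changed, not your delivered frame rate.

    Code :
    #include <stdio.h>
    #include <GL/freeglut.h>
     
    int main(int argc, char** argv)
    {
    	glutInit(&argc, argv);
    	glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE);
    	glutCreateWindow("Wall-clock frame time");
     
    	int prev = glutGet(GLUT_ELAPSED_TIME);  /* ms since glutInit */
    	while(1) {
    		glClear(GL_COLOR_BUFFER_BIT);
    		/* ... draw the same scene as before ... */
    		glutSwapBuffers();
    		glutMainLoopEvent();
     
    		int now = glutGet(GLUT_ELAPSED_TIME);
    		printf("frame: %d ms\n", now - prev);  /* whole-frame CPU time */
    		prev = now;
    	}
    	return 0;
    }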

  3. #3
    Newbie
    Join Date
    Nov 2012
    Posts
    2
    It may not equate to much in this example, but it can mean a significant FPS drop in a real-world application.

    Anyway, I managed to fix it by disabling adaptive clocking in nvidia-settings.

  4. #4
    Senior Member OpenGL Guru
    Join Date
    May 2009
    Posts
    4,948
    "It may not equate to much in this example, but it can mean a significant FPS drop in a real-world application."
    Or... it may not. If it's only a few microseconds, real-world applications won't notice. In short, your profiling is too artificial to detect anything worrisome.

    The fact that disabling adaptive clocking "fixed" it is evidence of that. You were detecting your graphics card looking at your workload and saying, "Oh, nothing much here. No need to stress myself."

    Real-world applications will not have this problem.
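
    If you want the measurement to be less artificial, a quick sketch: give the GPU enough work per frame that it can't stay downclocked, and see whether the relative jump persists. (Illustrative only; this just bulks up the demo's draw call using the same immediate-mode style.)

    Code :
    	/* Replace the single triangle inside the query with a heavier
    	   workload, e.g. ~100k triangles per frame: */
    	glBeginQuery(GL_TIME_ELAPSED, query);
    	glBegin(GL_TRIANGLES);
    	int i;
    	for (i = 0; i < 100000; i++) {
    		glVertex3f(-1, -1, 0);
    		glVertex3f( 1, -1, 0);
    		glVertex3f( 0,  1, 0);
    	}
    	glEnd();
    	glEndQuery(GL_TIME_ELAPSED);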

  5. #5
    Junior Member Newbie
    Join Date
    Oct 2012
    Posts
    19
    You haven't posted the rest of your code, but it may be that you're calling heavy setup or rendering functions from inside your main display function, so they get re-executed on every pass. Try moving that work into a void init() { /* heavy one-time work */ } function that runs once, and attach only the cheap per-frame work to your display callback, as sketched below.
    It may fix the problem you have.
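
    A minimal sketch of that structure, assuming GLUT callbacks (the function names are illustrative):

    Code :
    #include <GL/freeglut.h>
     
    void init(void)
    {
    	/* one-time, expensive setup: create buffers, load textures,
    	   compile shaders -- nothing here runs per frame */
    }
     
    void display(void)
    {
    	/* per-frame work only */
    	glClear(GL_COLOR_BUFFER_BIT);
    	/* ... draw ... */
    	glutSwapBuffers();
    }
     
    int main(int argc, char** argv)
    {
    	glutInit(&argc, argv);
    	glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE);
    	glutCreateWindow("init/display split");
    	init();                   /* heavy work happens exactly once */
    	glutDisplayFunc(display); /* display callback stays cheap */
    	glutMainLoop();
    	return 0;
    }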
