I’m developing a game using OpenGL and have run into a strange issue. Exactly 30 seconds after creating a context, the frame time increases by 2–3x (depending on the scene) and then stays constant. I am measuring frame time with query objects (GL_TIME_ELAPSED). Below is a small demo that reproduces the issue.
#include <stdio.h>
#include <GL/glew.h>
#include <GL/freeglut.h>

int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE);
    glutCreateWindow("Frame time test");
    glewExperimental = GL_TRUE;
    glewInit();

    GLuint query;
    glGenQueries(1, &query);

    while (1) {
        glClear(GL_COLOR_BUFFER_BIT);

        /* Time the draw call on the GPU. */
        glBeginQuery(GL_TIME_ELAPSED, query);
        glBegin(GL_TRIANGLES);
        glVertex3f(-1, -1, 0);
        glVertex3f(1, -1, 0);
        glVertex3f(0, 1, 0);
        glEnd();
        glEndQuery(GL_TIME_ELAPSED);

        /* GL_QUERY_RESULT blocks until the result is available;
           that stall is acceptable for this demo. */
        GLuint drawTime;
        glGetQueryObjectuiv(query, GL_QUERY_RESULT, &drawTime);

        /* Show the draw time (in milliseconds) in the window title. */
        char timeStr[32];
        sprintf(timeStr, "%f", drawTime / 1000000.0f);
        glutSetWindowTitle(timeStr);

        glutSwapBuffers();
        glutMainLoopEvent();
    }
    return 0;
}
I know I shouldn’t be using glBegin/glEnd; my actual game uses vertex buffers, and the issue is exactly the same. I’ve also tried GLFW instead of freeglut, and the exact same thing happened.
Is there something I’m doing wrong, or is this a driver bug? I’ve been doing OpenGL development for quite a while and have never seen this before.
I’m on Linux with an NVIDIA GTX 560 Ti running the latest drivers (310.19).