GL_EXT_timer_query --> driver bug ?

Tested with an 8800 Ultra, driver 175.51.

Consider this simple test; there is something wrong with the extension and/or the glFlush implementation:

glBeginQuery(GL_TIME_ELAPSED_EXT, id);
// You can draw any glBegin()/glEnd() primitives here, but it is not required to reproduce the problem
glFinish(); // Needed for the proof of concept: it forces glBeginQuery() and glEndQuery() into separate command buffers
glEndQuery(GL_TIME_ELAPSED_EXT);
glFlush();

// From this point, anything we do should normally not influence the timer… BUT this is not the case:
Sleep(777);
glFinish();

The problem is that the duration of the Sleep call (here 777 ms) is included in the query result!
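For completeness, here is roughly the full proof of concept including the readback, which I omitted above. This is only a sketch: context creation and extension loading are left out, and glGetQueryObjectui64vEXT / GL_QUERY_RESULT come from GL_EXT_timer_query itself:

```c
/* Sketch only: assumes a current GL context and that the
   GL_EXT_timer_query entry points have already been loaded. */
GLuint id;
glGenQueries(1, &id);

glBeginQuery(GL_TIME_ELAPSED_EXT, id);
glFinish();                 /* force Begin/End into separate command buffers */
glEndQuery(GL_TIME_ELAPSED_EXT);
glFlush();

Sleep(777);                 /* should NOT be counted by the query... */
glFinish();

GLuint64EXT elapsed_ns = 0; /* result is in nanoseconds */
glGetQueryObjectui64vEXT(id, GL_QUERY_RESULT, &elapsed_ns);
printf("elapsed: %.3f ms\n", elapsed_ns / 1.0e6); /* ~777 ms on the buggy driver */
```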

I also have a second (possibly related?) problem, which is hard to reduce to a quick test without making the bug disappear, but here is what I’m doing:

… after SwapBuffers()

glBeginQuery(GL_TIME_ELAPSED_EXT, id);
// Here I draw a scene
glEndQuery(GL_TIME_ELAPSED_EXT);
glFinish(); // I don’t trust glFlush()

This gives me a result of 1.5 ms in my app. What is wrong is that if I add a glFinish() at the beginning…

… after SwapBuffers()

glFinish();
glBeginQuery(GL_TIME_ELAPSED_EXT, id);
// Here I draw a scene
glEndQuery(GL_TIME_ELAPSED_EXT);
glFinish(); // I don’t trust glFlush()

… I now get 6 ms!! Measuring on the CPU with QueryPerformanceCounter (now that there is a glFinish() before and after) also gives 6 ms, so the first result looks wrong…

I’ve re-tested with a GTX 280 and driver 177.79: the bug is gone.

The first problem seems to be limited to the 175 driver branch. I’m not sure about my second problem, though.