GLUT: Erratic frame rate despite limiting (Using Windows)

Edit:
(1) Just realized the attachment is too small to really be visible, so I included the stdout at the bottom of the post - sorry about that.

The Problem:
As you can see in the attachment below, the frame rates I am getting are very erratic despite my attempt to limit them. I'm assuming I'm missing something fairly obvious. Does anyone have any insight into why this might be happening?

My System:
Alienware M18x
Windows 7 Home Premium
Dual Radeon 6990m cards
2nd Gen Intel Core i7, 2.4-3.1 GHz

Compiler:
MinGW - gcc version 4.7.2

Compiler Command:
g++ -o Modeler -Wall *.cpp glut32.lib -lopengl32 -lglu32 -static -std=c++11

Any insight at all would be very appreciated.

Thank you for your time,
Brandon Murphy

My display function:


int multiplier = 1;

/**
 * Callback function to display the scene
 */
void Window::displayCallback()
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    static int theta = 1;

    // TODO: Pull this out into the model object
    glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_DECAL);
    glBindTexture(GL_TEXTURE_2D, texName);

    glPushMatrix();
        glTranslatef(0, 0, -4 * multiplier);
        glRotatef(theta, 0.1, 0.5, 1);
        for(int i=0; i<multiplier; i++)
        {
            for(int j=0; j<multiplier; j++)
            {
                for(int k=0; k<multiplier; k++)
                {
                        GLHelper::drawCube(i, j, k, 1, 1, 1);
                }
            }
        }
    glPopMatrix();


    theta++;

    glutSwapBuffers();


    // TODO: this is a mess
    int dT = ChubbyEngine::getFrameTimeDifference();
    ChubbyEngine::frameRendered();
    dT = (dT <= 0) ? 1 : dT;
    int sleepTime = 1000/ChubbyEngine::getFrameRate() - dT; // getFrameRate() returns 60
    sleepTime = (sleepTime <= 0) ? 1 : sleepTime;
    Time::sleep(sleepTime);
    cout<<"FPS: "<<(int)(1000/dT)<<"\tCubeCount: "<< multiplier*multiplier*multiplier<<"    Render Time: "<<dT<<"ms"<<endl;
    glutPostRedisplay();
}

Time.cpp

#include "Time.h"

int Time::getTime()
{
    #ifdef WINDOWS
        SYSTEMTIME st;
        GetSystemTime(&st);
        return (((st.wHour)*60 + st.wMinute)*60 + st.wSecond)*1000 + st.wMilliseconds;
    #endif

    #ifdef LINUX
        // time(0) only has one-second resolution, which is useless for a
        // frame limiter; use a monotonic millisecond clock instead
        // (requires <time.h>)
        struct timespec ts;
        clock_gettime(CLOCK_MONOTONIC, &ts);
        return ts.tv_sec * 1000 + ts.tv_nsec / 1000000;
    #endif
}

void Time::sleep(int milliseconds)
{
    #ifdef WINDOWS
        Sleep(milliseconds);
    #endif

    #ifdef LINUX
        std::this_thread::sleep_for(std::chrono::milliseconds(milliseconds));
    #endif
}

Standard Out (short selection):


FPS: 71 	CubeCount: 1    Render Time: 14ms
FPS: 1000 	CubeCount: 1    Render Time: 1ms
FPS: 500 	CubeCount: 1    Render Time: 2ms
FPS: 66 	CubeCount: 1    Render Time: 15ms
FPS: 71 	CubeCount: 1    Render Time: 14ms
FPS: 1000 	CubeCount: 1    Render Time: 1ms
FPS: 500 	CubeCount: 1    Render Time: 2ms
FPS: 66 	CubeCount: 1    Render Time: 15ms
FPS: 71 	CubeCount: 1    Render Time: 14ms
FPS: 1000 	CubeCount: 1    Render Time: 1ms
FPS: 500 	CubeCount: 1    Render Time: 2ms
FPS: 66 	CubeCount: 1    Render Time: 15ms
FPS: 71 	CubeCount: 1    Render Time: 14ms
FPS: 1000 	CubeCount: 1    Render Time: 1ms

[ATTACH=CONFIG]428[/ATTACH]

GetSystemTime is very unlikely to be usable for the task at hand - use QueryPerformanceCounter and QueryPerformanceFrequency.
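A portable way to get the same thing is std::chrono::steady_clock, which MSVC and recent MinGW implement on top of QueryPerformanceCounter/QueryPerformanceFrequency. A minimal sketch (nowMillis is a made-up helper name, not part of the poster's code):

```cpp
#include <chrono>
#include <cstdint>

// Monotonic millisecond timestamp. steady_clock never jumps backwards,
// unlike wall-clock sources such as GetSystemTime, so deltas between
// two calls are safe to use for frame timing.
int64_t nowMillis()
{
    using namespace std::chrono;
    return duration_cast<milliseconds>(
        steady_clock::now().time_since_epoch()).count();
}
```

The frame delta then becomes a simple subtraction of two nowMillis() values, with no midnight-wraparound problem.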

However, the primary cause of your trouble is actually this: “Sleep(milliseconds);” - remove it. It does not simply pause for the specified time; it tells the OS that you have nothing useful to do for ROUGHLY that long, and that IF the OS would be so kind, it may hand control back AFTER more than that time (an unspecified amount of “more”) has passed.

Use vertical sync.
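On Windows/WGL that means the WGL_EXT_swap_control extension. A sketch of toggling it at runtime (enableVSync is a hypothetical helper; it must be called after glutCreateWindow, while a GL context is current):

```cpp
#include <windows.h>
#include <GL/gl.h>

// wglSwapIntervalEXT(1) locks SwapBuffers to the monitor refresh;
// wglSwapIntervalEXT(0) turns that off again.
typedef BOOL (WINAPI *PFNWGLSWAPINTERVALEXTPROC)(int interval);

bool enableVSync(bool on)
{
    PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
        (PFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress("wglSwapIntervalEXT");
    if (!wglSwapIntervalEXT)
        return false; // extension not exposed by this driver/context
    return wglSwapIntervalEXT(on ? 1 : 0) != FALSE;
}
```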

GetSystemTime is very unlikely to be usable for the task at hand - use QueryPerformanceCounter and QueryPerformanceFrequency.

Ah ok, that makes sense.

However, the primary cause of your trouble is actually this: “Sleep(milliseconds);” - remove it. It does not simply pause for the specified time; it tells the OS that you have nothing useful to do for ROUGHLY that long, and that IF the OS would be so kind, it may hand control back AFTER more than that time (an unspecified amount of “more”) has passed. Use vertical sync.

I would eventually like to let the user choose whether or not to use vsync, so is there something else I can do? Possibly the GLUT timer callbacks? Are they accurate enough for what I am trying to do?
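For reference, the timer-callback pattern being asked about usually looks like this (a sketch only; glutTimerFunc fires no sooner than the requested delay, with OS-timer resolution of roughly 1-15 ms on Windows, so it is steadier than Sleep in the display callback but still not frame-exact):

```cpp
#include <GL/glut.h>

static const int FRAME_MS = 1000 / 60; // target ~60 FPS

// Re-arming timer: request a redraw, then schedule the next tick.
void onTimer(int value)
{
    glutPostRedisplay();                 // ask GLUT to call the display func
    glutTimerFunc(FRAME_MS, onTimer, 0); // re-register for the next frame
}

// In main(), after glutCreateWindow(...) and before glutMainLoop():
//     glutTimerFunc(FRAME_MS, onTimer, 0);
// and remove the Sleep/glutPostRedisplay pair from displayCallback.
```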

Thanks!
Brandon