Hi,
I am new here. I am trying to render two images as fast as I can, so I draw a teapot in OpenGL and switch its color between green and purple in an idle function. My graphics card is an NVIDIA GeForce 310, and I have a CRT monitor (running at 75 Hz) and a DLP projector (running at 120 Hz) connected to it.
When I put the render window on the CRT (with vsync on), the measured frame rate is about 60 Hz and I can see the flashing between green and purple. When I put the render window on the projector (with vsync on), the measured frame rate is about 100 Hz and I see only a white teapot (green and purple blended together).
- Can anybody tell me the refresh rate of each individual image (the green teapot or the purple one)? Is it just (total refresh rate) / 2?
- I want to use my camera to capture just one of the two teapots, so I tried a Point Grey Firefly running at 60 fps, but it cannot distinguish green from purple; it sees the same thing I do, a white teapot. Why?
- I guess maybe it’s because the camera and projector are not synchronized. I searched the forum, and some people say that with a Quadro card the interval between the green and purple frames is stable, so that interval can be used to sync the camera and the projector. Can I do this without a Quadro card?
- It is said that when the frame rate is higher than the human flicker fusion frequency (about 75 Hz), we should not see flashing. My CRT refreshes at 75 Hz, so why can I still see the flashing?
Here is my code; I calculate the OpenGL frame rate using QueryPerformanceCounter.
#include <windows.h>
#include <iostream>
#include <GL/glut.h>
#include <stdio.h>
#include <time.h>
using namespace std;
double PCFreq = 0.0;      // performance-counter ticks per millisecond
__int64 CounterStart = 0; // counter value when StartCounter() was called
float CL = 1; // red component; its sign flips each frame
float CR = 1; // blue component; its sign flips each frame
void StartCounter()
{
    LARGE_INTEGER li;
    if (!QueryPerformanceFrequency(&li))
        cout << "QueryPerformanceFrequency failed!\n";
    PCFreq = double(li.QuadPart) / 1000.0; // ticks per millisecond
    QueryPerformanceCounter(&li);
    CounterStart = li.QuadPart;
}
double GetCounter()
{
    LARGE_INTEGER li;
    QueryPerformanceCounter(&li);
    return double(li.QuadPart - CounterStart) / PCFreq; // elapsed milliseconds
}
double CalFrequency()
{
    // Average the frame rate over batches of 100 frames. The original version
    // restarted the counter on every call and mixed clock() with GetCounter(),
    // so the result was meaningless.
    static int count = 0;
    static double save = 0.0;
    static bool started = false;
    if (!started)
    {
        StartCounter();
        started = true;
    }
    if (++count < 100)
        return save;
    count = 0;
    double elapsedMs = GetCounter(); // milliseconds since StartCounter()
    StartCounter();                  // restart timing for the next batch
    save = 100.0 * 1000.0 / elapsedMs;
    return save;
}
void myDisplay(void)
{
    double FPS = CalFrequency();
    printf("FPS = %f\n", FPS);
    glClear(GL_COLOR_BUFFER_BIT);
    glColor3f(CL, 0.5, CR);
    glutSolidTeapot(0.5);
    glutSwapBuffers(); // for GLUT_DOUBLE
    //glFlush();       // for GLUT_SINGLE
}
void myIdle(void)
{
    // Negative components clamp to 0 in glColor3f, so the teapot alternates
    // between (1, 0.5, 1) purple and (0, 0.5, 0) green each frame.
    CL = -CL;
    CR = -CR;
    glutPostRedisplay(); // let GLUT schedule the redraw
}
int main(int argc, char *argv[])
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE);
    //glutInitDisplayMode(GLUT_RGB | GLUT_SINGLE);
    glutInitWindowPosition(100, 100);
    glutInitWindowSize(400, 400);
    glutCreateWindow("Time");
    glutDisplayFunc(&myDisplay);
    glutIdleFunc(&myIdle);
    glutMainLoop();
    return 0;
}
You can test my code to see if there’s something wrong.
Thanks,