millisecond accuracy

Dear OpenGLers,
I am running psychophysics experiments and need up to millisecond control over the display of (relatively simple) objects and, more importantly, over the timing of key presses.
Basically, I need an accurate timer throughout the trials of the experiment. I would be thankful if anyone can help me with his/her expertise.
Thanks,
Mehrdad

Maybe you can use the timeGetTime() function from mmsystem.h (link with winmm.lib)!
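
A minimal sketch of that idea (Windows only; note that timeGetTime()'s default granularity can be several milliseconds unless you raise the timer resolution with timeBeginPeriod()):

#include <windows.h>
#include <mmsystem.h>   // timeGetTime, timeBeginPeriod; link with winmm.lib
#include <stdio.h>

int main()
{
    timeBeginPeriod(1);                       // request 1 ms timer resolution
    DWORD start = timeGetTime();              // milliseconds since system start
    // ... stimulus / response code here ...
    DWORD elapsedMs = timeGetTime() - start;
    printf("elapsed: %lu ms\n", (unsigned long)elapsedMs);
    timeEndPeriod(1);                         // restore the previous resolution
    return 0;
}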

In Windows, look for the performance counter feature on MSDN.
In Linux, use gettimeofday().
On anything else, I don't know.
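
For the Linux case, a minimal sketch using gettimeofday() (microsecond resolution; clock_gettime() with a monotonic clock is another option where it is available):

#include <sys/time.h>
#include <stdio.h>

int main()
{
    struct timeval start, end;
    gettimeofday(&start, NULL);               // wall-clock time with microsecond fields
    // ... stimulus / response code here ...
    gettimeofday(&end, NULL);
    double elapsedMs = (end.tv_sec  - start.tv_sec)  * 1000.0
                     + (end.tv_usec - start.tv_usec) / 1000.0;
    printf("elapsed: %.3f ms\n", elapsedMs);
    return 0;
}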

What precision do you need?
Will 64 bits be enough?
If so, you could use __asm rdtsc (see the Intel docs on using rdtsc) to read how many clock ticks have elapsed, then divide that by the frequency of the processor to get the time.
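
A minimal sketch of that approach (assumes an x86 compiler that provides the __rdtsc() intrinsic, and that CPU_HZ below is replaced with your actual processor frequency; the TSC can misbehave across cores and power-state changes, so treat this as a rough tool):

// Sketch only: x86 with MSVC (<intrin.h>) or GCC/Clang (<x86intrin.h>).
#ifdef _MSC_VER
#include <intrin.h>
#else
#include <x86intrin.h>
#endif
#include <stdio.h>

static const double CPU_HZ = 2.0e9;           // hypothetical 2 GHz; use your own CPU's frequency

int main()
{
    unsigned long long t0 = __rdtsc();        // read the time-stamp counter
    // ... code to time here ...
    unsigned long long t1 = __rdtsc();
    double seconds = (double)(t1 - t0) / CPU_HZ;
    printf("elapsed: %.6f ms\n", seconds * 1000.0);
    return 0;
}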

I hope I've put everything right…

Have fun

Here is something that I use to get sub-millisecond accuracy.

You need to include windows.h

LARGE_INTEGER freq, startTime, endTime;
BOOL accurateTime = QueryPerformanceFrequency(&freq);   // FALSE if there is no high-resolution counter

QueryPerformanceCounter(&startTime);
// ... code to time goes here ...
QueryPerformanceCounter(&endTime);

double time = (endTime.QuadPart - startTime.QuadPart) / double(freq.QuadPart);
// Elapsed time in milliseconds: time * 1000.0

Your display typically takes 16.7 milliseconds to refresh (at 60 Hz), and even then that's assuming it's a CRT. So you might have millisecond control in rendering, but you cannot display the results with that precision without special hardware. If you raise the refresh rate and reduce the resolution you can improve on these numbers, but not down to a millisecond.

On Windows you cannot guarantee that you will be able to present a new image to the viewer each frame, although you could measure the time and see whether you did, and with the right code you'll achieve it on most frames.
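
One way to do that measurement, as a hedged sketch (presentFrame() below is a stand-in for your real swap call, e.g. SwapBuffers() with vsync enabled; here it just sleeps roughly one 60 Hz frame so the sketch runs on its own):

#include <windows.h>
#include <stdio.h>

// Stand-in for the real present/swap call of your rendering code.
void presentFrame() { Sleep(16); }

int main()
{
    LARGE_INTEGER freq, prev, now;
    QueryPerformanceFrequency(&freq);
    QueryPerformanceCounter(&prev);

    for (int i = 0; i < 60; ++i)
    {
        // ... draw the stimulus ...
        presentFrame();                        // blocks until the next refresh when vsync is on
        QueryPerformanceCounter(&now);
        double ms = (now.QuadPart - prev.QuadPart) * 1000.0 / double(freq.QuadPart);
        if (ms > 20.0)                         // noticeably longer than one 60 Hz frame (16.7 ms)
            printf("frame %d missed its refresh: %.2f ms\n", i, ms);
        prev = now;
    }
    return 0;
}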

You may want to look at Linux with real-time options compiled into the kernel if you want a better guarantee that jitter won't occasionally defeat your experiment.

Typical displays refresh about once every 1/60th to 1/80th of a second. That means about 12-17 milliseconds of latency, just for the beam to get back to the start.

Then comes the phosphor. If it's a CRT, the decay has probably fallen to half intensity by the time of the next refresh. If it's an LCD, there's some inherent slowness in turning each pixel on/off, which might be even worse than a CRT.

Then comes the speed and jitter of your input device. The old-style PS/2 keyboard port is a fairly slow serial bus. And while USB is faster (in throughput), it chops everything up into millisecond granules, which end up with even more jitter than that by the time they reach the CPU.

Then comes the timing stability of the machine. First, all the available PC timers have some kind of defect (see http://www.mindcontrol.org/~hplus/pc-timers.html ). Second, there may be device drivers running on your machine that at times disable interrupts for more than a millisecond, leading to additional jitter in your program's execution.

To get millisecond stimuli times, I suggest using a laser and a MEMS mirror system, or something else with better-than-millisecond response time. Then I suggest using some bounce-free input, such as an optical breaker system, wired to some high-precision scientific counter. General purpose computer hardware and OSes (be it regular Linux or Windows) aren’t very good when it comes to millisecond accurate real-time stuff.

Originally posted by mjaz:
Dear OpenGLers,
I am running psychophysics experiments and need up to millisecond control over the display of (relatively simple) objects and, more importantly, over the timing of key presses.
Basically, I need an accurate timer throughout the trials of the experiment. I would be thankful if anyone can help me with his/her expertise.
Thanks,
Mehrdad

For millisecond-precise display, the refresh rate would have to be 1 kHz, and I don't think there is any monitor or video card that can do that. Why is the display timing so important?

The input device, I can understand. I think such devices are listed as being “data acquisition” devices.

Vman, jwatte and dorbie,
Thanks for your comments. You put your finger exactly on the problem I am having.
The reason I need an accurate measure of time is twofold:

  1. I am presenting subjects with visual stimuli that are somewhat hard to discriminate in one of their visual features (i.e. direction of motion). The subjects view the stimulus for a specific amount of time and then have to report their judgment. I need accurate and parametric control over the viewing time so that I can study the strategies they use to gather evidence towards their final judgment.
  2. I also need to measure their reaction time, as it may change both with viewing time and with the difficulty of the discrimination. This can also help me in understanding how gathering evidence leads to decision making and the final commitment (e.g. reporting the judgment via key press).

Therefore, I need to have control over the presentation time and also accurate measurement of the subject's response time.

All your input was very useful; I gather there is not much I can do about the viewing time.

By the way, I am working on an OS X machine.

Thanks all the same and would appreciate any other feedback.

Mehrdad.
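
Since you mention OS X: the usual high-resolution clock there is mach_absolute_time(), converted to nanoseconds with mach_timebase_info(). A minimal sketch (it only addresses the timer itself, not the display-refresh and input-latency issues discussed above):

#include <mach/mach_time.h>
#include <stdint.h>
#include <stdio.h>

int main()
{
    mach_timebase_info_data_t tb;
    mach_timebase_info(&tb);                  // numer/denom convert ticks to nanoseconds

    uint64_t start = mach_absolute_time();
    // ... stimulus / response code here ...
    uint64_t end = mach_absolute_time();

    double ns = (double)(end - start) * tb.numer / tb.denom;
    printf("elapsed: %.3f ms\n", ns / 1.0e6);
    return 0;
}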

I am not sure, but maybe the latency could be lowered by using two threads: one for the display and one for input. The input thread could gather (and timestamp) key presses while the display thread is still waiting on the monitor.
To get the best out of it, one should play a bit with thread priorities.

Jan.
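
To illustrate Jan's idea, here is a rough sketch of a separate input thread that timestamps a key press while the main thread is busy presenting the stimulus. It is a sketch only: poll_key() is a hypothetical non-blocking keyboard poll you would implement with your platform's input API, and the stand-in here never reports a press.

#include <atomic>
#include <chrono>
#include <thread>
#include <cstdio>

// Stand-in for a real non-blocking keyboard poll (replace with your platform's API).
bool poll_key() { return false; }

std::atomic<bool> running(true);
std::atomic<long long> keyTimeUs(-1);         // microseconds since t0; -1 means no press yet

void inputThread(std::chrono::steady_clock::time_point t0)
{
    while (running.load())
    {
        if (poll_key() && keyTimeUs.load() < 0)
        {
            auto dt = std::chrono::steady_clock::now() - t0;
            keyTimeUs.store(std::chrono::duration_cast<std::chrono::microseconds>(dt).count());
        }
        std::this_thread::yield();            // busy-ish poll; raise this thread's priority via the OS if needed
    }
}

int main()
{
    auto t0 = std::chrono::steady_clock::now();
    std::thread input(inputThread, t0);

    // ... render/present the stimulus here; this thread may block on vsync ...
    std::this_thread::sleep_for(std::chrono::milliseconds(500));   // placeholder stimulus duration

    running.store(false);
    input.join();
    if (keyTimeUs.load() >= 0)
        std::printf("reaction time: %.3f ms\n", keyTimeUs.load() / 1000.0);
    else
        std::printf("no key press detected\n");
    return 0;
}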