
View Full Version : vertical retrace, linux, OpenGL, SGI230, nvidia



simon_hosking
02-11-2001, 08:56 PM
Hi everyone,
I am a 2nd year Ph.D. student studying VISUAL PERCEPTION at Deakin
University, Melbourne, Australia. I am investigating, via a series of
computer-based and applied experiments, how people learn to judge
time-to-contact with approaching objects — for example, how people
learn to time a successful catch of an approaching tennis ball.

Over the last year I have taught myself (there is no one at Deakin
University familiar with OpenGL) how to program in C and create the
stimuli for my experiments using the OpenGL API. I have written a simple
program that displays a sphere which translates along the z-axis towards
the observer (camera) for x frames. Each frame is synchronized
to the vertical refresh using export GL_SYNC_TO_VBLANK=1.

My problem is this. The experiments that I am running are extremely time
critical. I need to time exactly how long the sphere is translating for,
so that I can calculate its velocity and calculate exactly when the
sphere would "collide" with the observer.

I have been told by users of other graphics APIs and OSes that the best
way to get accurate timing is to use the vertical retrace. That is, time the
animation of objects by counting the number of vertical retraces, such
that in my sphere example, if the vertical retrace is at 60 Hz and I wanted
the ball to translate for ONE SECOND, I would write code for the
translation to occur for 60 frames/retraces.

Would anyone have any code or information they could share about how to
determine the actual refresh rate in effect while my program is running,
or how to specify a particular frame rate (glXSwapIntervalSGI() does not
appear to be supported)? And/or how to time the translation of the sphere
so that it is synchronized with VBLANK and buffers are swapped for X
retraces?

I am using an SGI230 workstation (Intel processor), Red Hat Linux 6.2, and
NVIDIA's GeForce 256 DDR. None of the OpenGL extensions [e.g.,
glXWaitVideoSyncSGI(); GLX_SGI_swap_control] that might have been able to
address this issue appear to be supported.

Any replies would be greatly appreciated as I'm really stuck on this
one,
Simon


PS: sorry about my naive description of vertical retrace issues; I'm
really a novice at this stuff.

rts
02-12-2001, 11:59 AM
Sorry, I don't know anything about vertical retrace, but I do know one way to time things nicely is to use, well, a timer. And the SDL library has a very nice timer. Check out www.libsdl.org.

Meantime I'll try and figure out vsync in X.

joekrahn
02-14-2001, 12:54 PM
If using GLUT, you can just use glutGet(GLUT_ELAPSED_TIME) to find the current time in milliseconds since the program started.

I found that the proprietary NVIDIA GLX for Linux has the function glXGetVideoSyncSGI(), which returns the video frame count, incremented at each vblank. It also has glXWaitVideoSyncSGI(), but it busy-waits (hogs the CPU) until it reaches the vblank. Not surprising, since these are not advertised in the GLX extensions string and not really ready yet.
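A minimal sketch of how those two calls (from the GLX_SGI_video_sync extension) could drive an animation for a fixed number of retraces — assuming the driver really exposes them, which is untested on the SGI230 setup described above:

```c
#include <GL/glx.h>

/* Prototypes from GLX_SGI_video_sync; on this driver they may need to
   be fetched at run time if the headers don't declare them. */
extern int glXGetVideoSyncSGI(unsigned int *count);
extern int glXWaitVideoSyncSGI(int divisor, int remainder,
                               unsigned int *count);

/* Run the animation for `n_frames` vertical retraces (sketch).
   `dpy` and `win` are the usual GLX display and drawable. */
void animate_for_retraces(Display *dpy, GLXDrawable win, int n_frames)
{
    unsigned int start, now;

    glXGetVideoSyncSGI(&start);
    do {
        /* ... update sphere position and draw one frame ... */
        glXSwapBuffers(dpy, win);
        /* divisor 1, remainder 0: return at the very next vblank
           (busy-waits on this driver, as noted above) */
        glXWaitVideoSyncSGI(1, 0, &now);
    } while (now - start < (unsigned int)n_frames);
}
```

Counting retraces this way, rather than counting swaps, means a missed frame (dropped because the scene took too long to draw) still advances the counter and the total stimulus duration stays correct.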

Joe Krahn

02-15-2001, 11:18 PM
First of all, I don't think that using the vertical retrace is the best way to implement timed code, because the retrace rate depends on your video mode, your monitor, and your video board.
There are several ways to implement real timing functions.


Get the current time with the appropriate ANSI C function.
Or use an equivalent function such as:
- SDL_GetTicks();
- glutGet(GLUT_ELAPSED_TIME) (what joekrahn said).

Or use a timer callback function:
- glutTimerFunc(number of ms before the call, function to call, value to pass to the function).
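The callback approach in the last item above might look like this in GLUT — a sketch only, with the drawing code reduced to a bare swap; the 16 ms interval is an assumption approximating 60 Hz, not something GLUT guarantees:

```c
#include <GL/glut.h>
#include <stdio.h>

/* glutTimerFunc fires a callback once, roughly `msecs` later; the
   callback re-registers itself to get a repeating timer. */
static void tick(int value)
{
    printf("tick %d at %d ms\n", value, glutGet(GLUT_ELAPSED_TIME));
    glutPostRedisplay();                 /* request a redraw */
    glutTimerFunc(16, tick, value + 1);  /* ~60 Hz if the machine keeps up */
}

static void display(void)
{
    /* ... draw the sphere here ... */
    glutSwapBuffers();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
    glutCreateWindow("timer demo");
    glutDisplayFunc(display);
    glutTimerFunc(16, tick, 0);
    glutMainLoop();
    return 0;
}
```

Note that GLUT timers are not synchronized to the retrace, so for retrace-locked stimuli they are a scheduling convenience, not a substitute for vblank sync.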

tfpsly
02-26-2001, 08:24 AM
The vertical sync is never exactly 60 Hz.
If you need high accuracy, this won't work properly.

You should use a high-accuracy interpolated timer, such as the one in the Linux kernel: it interpolates time between the mainboard timer's interrupts quite well.