
View Full Version : Display() stalled by screen refresh?



brown12
10-30-2000, 12:17 PM
I have a problem which seems to recur in multiple projects using GL with GLUT.
I call my Display callback as often as possible to do animation, and even when Display contains the smallest possible drawing code (e.g. glDrawPixels with a one-by-one rectangle, or a glBegin/glEnd block with only one polygon), the Display call seems to wait for the next screen refresh before returning control. I did some timing: the Display call lasts about 1/120 second on average, ranging from almost 0 to about 1/60 second, so I think it must be waiting for the refresh. I'm working on a Solaris machine, but have observed similar behavior on WinNT. Is there any way around this -- can I get Display to return earlier, or can I synchronize it with the screen refresh? I realize I could run Display in a separate thread, but I only want to try that as a last resort.
Currently, half my CPU cycles are going to Display, and about 95% of those seem to be spent just waiting for the refresh.
Thanks for any suggestions,

Joel

Antonio
10-30-2000, 03:25 PM
Are you calling the render function yourself?

Have you tried registering a callback for the idle function? This way GLUT calls your rendering function as often as possible.

Antonio www.fatech.com/tech (http://www.fatech.com/tech)
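A minimal sketch of the idle-callback approach Antonio describes (the window title and callback names here are made up, and the one-triangle display mirrors Joel's tiny test case):

```c
#include <GL/glut.h>

/* Hypothetical minimal GLUT program: the idle callback asks GLUT to
 * redisplay, so the display callback runs as often as possible. */

static void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT);
    glBegin(GL_TRIANGLES);          /* one tiny polygon, as in Joel's test */
    glVertex2f(-0.5f, -0.5f);
    glVertex2f( 0.5f, -0.5f);
    glVertex2f( 0.0f,  0.5f);
    glEnd();
    glutSwapBuffers();              /* any vsync wait happens around here */
}

static void idle(void)
{
    glutPostRedisplay();            /* schedule another frame immediately */
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
    glutCreateWindow("idle-driven animation");
    glutDisplayFunc(display);
    glutIdleFunc(idle);
    glutMainLoop();
    return 0;
}
```

Because this opens a window and never returns from glutMainLoop, it has to be run interactively rather than in a test harness.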

brown12
10-30-2000, 03:31 PM
Yes, I do use an Idle callback, and Idle calls my Display callback as often as possible. The problem is that each of these Display calls stalls for up to 1/60 second waiting for the screen refresh, rather than returning control to my Idle loop.
Thanks,
Joel



Originally posted by Antonio:
Are you calling the render function yourself?

Have you tried registering a callback for the idle function? This way GLUT calls your rendering function as often as possible.

Antonio www.fatech.com/tech (http://www.fatech.com/tech)

mcraighead
10-30-2000, 09:30 PM
This sounds like intentional and correct behavior. Drivers generally only permit themselves to batch up a certain number of frames at a time, since batching up too many causes perceived lag.

If you have vsync on, this will be synchronized with the refresh rate, i.e., you will not be able to finish a frame in less time than the monitor's refresh interval allows, unless you don't have any frames batched up. If vsync is off, you can go faster, but at some point the driver _will_ still wait to make sure you don't get too far ahead.

There are formulas for this, but they are messy functions of how fast the HW is going, how fast the SW is going, how many frames the driver batches maximum, how big the driver's command buffer is, and what kind of data you're causing the driver to put in the command buffer.

- Matt

Antonio
10-31-2000, 09:26 AM
Can you turn vsync off? I've never had a problem with drivers batching frames, so I can't comment on that, but if you're seeing about 120 frames per second then I guess that couldn't be the problem; otherwise it would be impossible to go over that rate.

Antonio www.fatech.com/tech (http://www.fatech.com/tech)

Succinct
11-01-2000, 07:50 AM
I'm curious why you would turn vsync off. Won't that introduce tearing in your final animation?

Pauly
11-01-2000, 07:54 AM
Speed freaks turn V-sync off all the time to woo people with silly numbers of FPS :)

You can too in your own apps with the swap control extension (WGL_EXT_swap_control on Windows)...
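On Windows, per-application vsync control is exposed through the WGL_EXT_swap_control extension. A hedged sketch (it assumes an OpenGL rendering context is already current, and that the driver actually exports the entry point):

```c
#include <windows.h>
#include <GL/gl.h>

/* WGL_EXT_swap_control: an interval of 0 turns vsync off, 1 re-enables
 * it.  Must be called with a current OpenGL rendering context. */
typedef BOOL (WINAPI *PFNWGLSWAPINTERVALEXTPROC)(int interval);

static void set_vsync(int on)
{
    PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
        (PFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress("wglSwapIntervalEXT");
    if (wglSwapIntervalEXT)     /* NULL if the driver lacks the extension */
        wglSwapIntervalEXT(on ? 1 : 0);
}
```

Since it needs a live GL context (and a Windows driver), this can only be exercised inside a real application.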

beavis
11-02-2000, 12:43 AM
Actually, turning off vsync introduces really noticeable tearing mainly when the refresh rate is low (60 Hz). If you set it high enough (e.g. 85 Hz) with DEVMODE.dmFields |= DM_DISPLAYFREQUENCY, you get a high framerate and much less visible tearing...
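A sketch of the Win32 call beavis is referring to (the 85 Hz value is just an example, and ChangeDisplaySettings may refuse modes the monitor doesn't support):

```c
#include <windows.h>

/* Request a new display refresh rate via ChangeDisplaySettings.
 * Returns nonzero on success. */
static int set_refresh_rate(DWORD hz)
{
    DEVMODE dm;
    ZeroMemory(&dm, sizeof dm);
    dm.dmSize = sizeof dm;
    if (!EnumDisplaySettings(NULL, ENUM_CURRENT_SETTINGS, &dm))
        return 0;
    dm.dmDisplayFrequency = hz;            /* e.g. 85 */
    dm.dmFields |= DM_DISPLAYFREQUENCY;
    return ChangeDisplaySettings(&dm, CDS_FULLSCREEN) == DISP_CHANGE_SUCCESSFUL;
}
```

This actually changes the display mode, so it only makes sense to run on a real Windows desktop, typically paired with a matching ChangeDisplaySettings(NULL, 0) on exit to restore the registry mode.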