vsync

I have a GeForce 1 64MB card… and I recently made an OpenGL program that lets you walk around in a “world”. I have the program running with a timer making it go 30fps, and I get lots of tearing. Also… I realized I had turned off OpenGL vsync through Display Properties, so I turned it back on… however, when vsync is on the tearing is gone, but I get a lot of weird slowdowns… sometimes going to a crawl for a bit, then speeding up (they are random). I have a good ViewSonic monitor that is refreshing at 60Hz, so I don’t think the monitor is too slow refreshing. All I am asking is: is this a bug in NVIDIA’s drivers, or is it with my program? Thanks for your help.

Waiting on the vertical blank (vsync) of the monitor means you cannot go faster than the refresh rate of the monitor, 60 fps in your case.
Furthermore, if drawing a frame takes longer than the 1/60 s you have, the driver has to wait for the next vblank, which has just halved your frame rate for that image.
The same is true for even longer drawing times, which means you can only get 60/1, 60/2, 60/3, 60/4 … frames per second. So you see a dramatic performance decrease from 60 to 30 fps, then to 20, then 15, and so on…
That also means the effect depends on the current monitor refresh rate, and your program could behave differently at higher or lower Hz!
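
To see how brutal that quantization is, here’s a tiny sketch (the numbers are just for illustration) of the effective frame rate you get when a swap can only happen on a vblank:

```cpp
#include <cmath>
#include <cstdio>

// With vsync on, a buffer swap can only happen on a vblank, so the
// effective rate is quantized to refresh/1, refresh/2, refresh/3, ...
double effectiveFps(double refreshHz, double frameTimeSec)
{
    // Whole vblank intervals the frame occupies (at least one).
    double intervals = std::ceil(frameTimeSec * refreshHz);
    return refreshHz / intervals;
}

int main()
{
    // 17 ms of rendering just misses the 16.7 ms budget at 60 Hz,
    // so the driver waits for the next vblank: 60 fps becomes 30.
    std::printf("%.1f fps\n", effectiveFps(60.0, 0.017)); // 30.0
    std::printf("%.1f fps\n", effectiveFps(60.0, 0.034)); // 20.0
    return 0;
}
```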

With vsync off, the image is just swapped at the moment it’s ready, even if the raster beam is in the middle of the screen. That also means you see parts of multiple images at once, which explains the tearing effect, highly visible with camera rotations around the “up” vector.

A couple of possibilities here besides monitor issues. If you use the Windows timer function, it is notoriously inaccurate. Also, be sure to use double buffering and include a glFinish at the end of your rendering loop. And if your slowdowns occur when you can see more of your scene (rather than less), they could be natural, especially if you are rendering everything in immediate mode (no display lists, for example) with lots of state changes (translate/rotate, for example). The OpenGL SuperBible has an excellent chapter on real-time rendering that addresses your problems directly. Hey, it’s my problem too; probably everybody’s. Using multi-threading and avoiding the Windows messaging and Windows timer functions will get you a big performance boost; a rough sketch of a timer-free loop is below. I think Richard Wright should pay me for pumping his book so much! But it sure works for me.
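
Something like this is what I mean by driving rendering from the message loop instead of WM_TIMER. It’s only a sketch, assuming a plain Win32 window you’ve already created; RenderFrame is a hypothetical stand-in for your own drawing code:

```cpp
#include <windows.h>

// Hypothetical: draws one frame and calls SwapBuffers on the window's DC.
void RenderFrame(HDC hdc);

// Render whenever the message queue is empty instead of waiting on
// WM_TIMER ticks, which fire far less precisely than you'd expect.
int RunMessageLoop(HDC hdc)
{
    MSG msg;
    for (;;)
    {
        // Drain pending window messages without blocking.
        while (PeekMessage(&msg, NULL, 0, 0, PM_REMOVE))
        {
            if (msg.message == WM_QUIT)
                return (int)msg.wParam;
            TranslateMessage(&msg);
            DispatchMessage(&msg);
        }
        // Queue is empty: render as fast as the machine allows.
        RenderFrame(hdc);
    }
}
```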

Well, for what it’s worth, my opinion is that locking rendering to a specific rate is not exactly a good idea. Of course it seems like one, and I used to try to do the same thing when I started graphics coding. The fact that you are waiting for the vsync is only going to exacerbate the problem. Not that I’m suggesting you shouldn’t wait for the vsync.

Anyway, using the performance counters (if you are using Windows) and coding the engine so that it will run at any given frame rate (by interpolating object movements using the calculated frame time) is probably a much better way to do it.
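
A minimal sketch of the interpolation idea, assuming Windows; Object and its fields are just placeholders for whatever your engine moves:

```cpp
#include <windows.h>

// Placeholder for something your engine animates.
struct Object { float position; float velocity; };

// Scale movement by the measured frame time so the simulation runs at
// the same speed no matter what frame rate the vsync leaves you with.
void StepObject(Object& obj)
{
    static LARGE_INTEGER freq, last;
    static bool initialized = false;
    if (!initialized)
    {
        QueryPerformanceFrequency(&freq);
        QueryPerformanceCounter(&last);
        initialized = true;
    }

    LARGE_INTEGER now;
    QueryPerformanceCounter(&now);
    float dt = (float)(now.QuadPart - last.QuadPart) / (float)freq.QuadPart;
    last = now;

    // Units per frame = units per second * seconds this frame took, so a
    // 20 fps frame moves the object exactly as far as three 60 fps frames.
    obj.position += obj.velocity * dt;
}
```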

But El Jefe, doesn’t it come to the point where you eventually have to deal with the vsync problem, regardless of whether your program runs ‘unmetered’ at the fastest framerate possible?

I’m asking because I’m having that very problem with my Disasteroids 3D game. Things run great on my GeForce card, but on my systems with Intel, AMD, and Voodoo2 chipsets the SwapInterval OpenGL extension isn’t supported by their drivers, and the framerate gets jerky as the game plays because of the vsync issue.
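
For reference, the extension in question is WGL_EXT_swap_control; a rough check looks something like this (TrySetSwapInterval is my own hypothetical helper, and it needs a current OpenGL context):

```cpp
#include <windows.h>

// Function pointer type for wglSwapIntervalEXT from WGL_EXT_swap_control.
typedef BOOL (WINAPI *PFNWGLSWAPINTERVALEXTPROC)(int interval);

// Try to set the swap interval (1 = wait for vsync, 0 = don't).
// On drivers that don't export the extension (the Intel/AMD/Voodoo2
// case above), wglGetProcAddress returns NULL and nothing changes.
bool TrySetSwapInterval(int interval)
{
    PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
        (PFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress("wglSwapIntervalEXT");
    if (wglSwapIntervalEXT == NULL)
        return false; // extension not supported by this driver
    return wglSwapIntervalEXT(interval) != FALSE;
}
```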

Disasteroids 3D uses the performance timers, and the game is designed to run flat out, as fast as the computer will allow, but I’m still getting jerkiness on these computers (regardless of their MHz) because of the vsync.

What can be done to make sure the game runs smoothly when vsync IS enabled? I’d sure like to know!

Thanks in advance (to anyone who can answer this)!