View Full Version : Frame rate issue?

02-27-2011, 08:46 PM
I am running an OpenGL program on my machine and it runs fine!

When I run my executable on another machine, the frame rate is really high and the program responds to input far too quickly. How would you guys recommend fixing this?

My first thought is to write a delay function that makes the display function wait x milliseconds.

Is there any way to limit the refresh rate?

Any ideas would be appreciated.

02-27-2011, 08:52 PM
It depends on many factors. If your render loop runs unthrottled, OpenGL will try to process frames as fast as possible; you need a timer to manage your frame rate and prevent this. I had this issue very early on when getting started with OpenGL.
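A minimal sketch of the timer idea (my own example, not code from the thread): pace the loop by sleeping until each frame's deadline, so a fast machine doesn't spin through the loop thousands of times per second.

```cpp
#include <chrono>
#include <thread>

// Sketch of a simple frame limiter: sleep until the next frame deadline
// so the loop runs at roughly the target rate regardless of machine speed.
class FrameLimiter {
    using clock = std::chrono::steady_clock;
public:
    explicit FrameLimiter(double target_fps)
        : frame_time_(1.0 / target_fps), next_frame_(clock::now()) {}

    // Call once per loop iteration, after drawing and swapping buffers.
    void wait() {
        next_frame_ += std::chrono::duration_cast<clock::duration>(frame_time_);
        std::this_thread::sleep_until(next_frame_);
    }

private:
    std::chrono::duration<double> frame_time_;  // seconds per frame
    clock::time_point next_frame_;
};
```

Advancing a stored deadline (instead of sleeping a fixed amount each frame) keeps the average rate steady even when individual frames take varying time to render.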

Also, if you are using shaders, poor design can cause severe frame-rate loss. Passing variables through every shader stage instead of sending them directly to the one shader that needs them can cause problems, as can setting the geometry shader's max vertices higher than you actually use. Fixing these minor shader issues recently raised my frame rate from 30fps to ~166fps.

Simon Arbon
02-27-2011, 09:51 PM
Call wglSwapIntervalEXT(1) in your initialisation, after your wglMakeContextCurrentARB call.
This makes each SwapBuffers call wait for the next vertical sync before the finished frame is presented.

The value passed to wglSwapIntervalEXT is the number of vertical syncs to wait for, so a higher value will make it even slower.
wglSwapIntervalEXT(0) means 'don't wait'.

Note that this setting can be overridden in the video card's control panel.
If it still runs fast even with wglSwapIntervalEXT(1), then the owner of that computer may have set 'Force off' for the 'Vertical sync' option in their control panel.
It should be set to 'Use the 3D application setting'.
(That is what it is called in the NVIDIA control panel; AMD or Intel may use slightly different wording in theirs.)
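One detail worth noting for anyone trying this: wglSwapIntervalEXT is an extension function, so it has to be looked up at runtime with wglGetProcAddress after a GL context is current. A hedged, Windows-only sketch of that loading step (the function name and typedef come from the WGL_EXT_swap_control extension; the enableVSync wrapper is my own):

```cpp
// Windows-only sketch: load and call wglSwapIntervalEXT at runtime.
// wglGetProcAddress only returns a valid pointer while a GL context is current.
#include <windows.h>

typedef BOOL (WINAPI *PFNWGLSWAPINTERVALEXTPROC)(int interval);

// Returns false if the WGL_EXT_swap_control extension is unavailable
// (e.g. very old drivers), so the caller can fall back to a timer.
bool enableVSync(int interval) {
    PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
        (PFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress("wglSwapIntervalEXT");
    if (!wglSwapIntervalEXT)
        return false;
    // 1 = wait one vsync per swap, 0 = don't wait
    return wglSwapIntervalEXT(interval) != FALSE;
}
```

Call enableVSync(1) once during initialisation, after the context has been made current, as Simon describes above.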

02-28-2011, 08:40 AM
I think I have temporarily solved the problem. I did what I said and placed a wait function to make my display function wait about 20 milliseconds, then I placed a glFinish() at the end of my display function.

Now on any machine I run it on, so far, it runs consistently. I think that since I am running it on different machines with different clock speeds, one runs the loop more times per second than the other?

Thanks for the input, and @Simon, I will look into that; I want to put vsync in my programs if they ever get big.