I'm programming a geometric visualization system and I need to measure the time it takes to render a 3D scene. The reason is that I use a timer to make the camera orbit around an object: each time the timer fires, the camera takes a step along its orbit, producing an animation.
To make the animation smooth, though, I need to set the timer's delay properly. My idea was to take the time used to render a single frame and use it as the timer's delay. But every attempt to measure it gives me the time the system needs to send the OpenGL commands to the video board, not the rendering time itself.
So, how can I measure it? Actually, any information on keeping the animation's frame rate dynamically adjustable according to the hardware and the scene's complexity is very much appreciated. I know this is an issue every game faces, but I don't know any technique to approach the problem.
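For illustration, my attempt looks roughly like the sketch below (drawScene() just stands in for my actual drawing code, and std::chrono for whatever clock you prefer); the elapsed time it reports only seems to cover handing the commands to the driver:

```cpp
#include <chrono>

// Placeholder for the real OpenGL drawing code.
void drawScene() { /* glClear, draw calls, swap buffers, ... */ }

// Rough sketch of the measurement attempt: time a single frame.
double measureFrameSeconds()
{
    auto start = std::chrono::steady_clock::now();
    drawScene();   // issues the OpenGL commands
    auto end = std::chrono::steady_clock::now();

    // This difference only covers submitting the commands; the video
    // board may still be busy rendering when the clock is read.
    return std::chrono::duration<double>(end - start).count();
}
```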
If you are programming under Windows, there is no way to set the timer to exactly the right interval (especially not at every interval!), and it is not a good idea to reprogram timers every frame anyway.
What you should do is (a short sketch of the loop follows the list):
1. Read the time.
2. Render the first frame.
3. Read the time again and take the difference from the last reading.
4. Use the difference to adjust your animation steps.
5. Render the next frame.
6. Continue with step 3.
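Something like this rough sketch, for instance (std::chrono is used as the clock here, but QueryPerformanceCounter works the same way; drawFrame() and the orbit variables are placeholders for your own code):

```cpp
#include <chrono>

// Placeholder for the real OpenGL drawing code.
void drawFrame() { /* draw the scene, swap buffers */ }

const double ORBIT_SPEED = 45.0;  // degrees the camera orbits per second
double cameraAngle = 0.0;

void animationLoop()
{
    using clock = std::chrono::steady_clock;

    auto lastTime = clock::now();            // 1. read time
    drawFrame();                             // 2. render first frame

    for (;;)                                 // 6. continue with 3.
    {
        auto now = clock::now();             // 3. read time, difference to last time
        double dt = std::chrono::duration<double>(now - lastTime).count();
        lastTime = now;

        cameraAngle += ORBIT_SPEED * dt;     // 4. scale the animation step by the difference

        drawFrame();                         // 5. render next frame
    }
}
```

Because the step is scaled by the measured frame time, the orbit moves at the same speed whether a frame takes 5 ms or 50 ms.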