Pakk
10-04-2004, 03:51 AM
Hi,
I'm programming a geometrical visualization system and I need to measure the time it takes to render a 3D scene. The reason is that I use a timer to make the camera orbit around an object: each time the timer fires, the camera takes a step along its orbit, producing an animation.
To make the animation smooth, though, I need to set the timer's delay properly. My idea was to measure the time it takes to render a single frame and use that as the timer's delay. But every attempt to do so gives me only the time the system needs to send the OpenGL commands to the video board, not the rendering time itself.
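To make it concrete, here is a rough sketch of the kind of measurement I have been trying. drawScene() is just a placeholder for the real rendering routine, and the Windows QueryPerformanceCounter calls stand in for whatever high-resolution timer is available; the point is that the timed span covers nothing but issuing the GL calls:

#include <windows.h>   // QueryPerformanceCounter / QueryPerformanceFrequency

// Placeholder for the routine that issues all the OpenGL calls for one frame.
extern void drawScene();

// Returns the measured "frame time" in seconds.
double timeOneFrame()
{
    LARGE_INTEGER freq, start, stop;
    QueryPerformanceFrequency(&freq);

    QueryPerformanceCounter(&start);
    drawScene();                       // this only queues the commands for the video board
    QueryPerformanceCounter(&stop);

    // What I get here is essentially the command-submission time,
    // not how long the board actually takes to render the frame.
    return double(stop.QuadPart - start.QuadPart) / double(freq.QuadPart);
}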
So, how can I measure the actual rendering time? Actually, any information about keeping the animation's frame rate dynamically adjustable according to the hardware and the scene's complexity would be very much appreciated. I know this is an issue every game faces, but I don't know any technique for approaching the problem.
Thank you very much!
Fabio Pakk