but it isn't the real "server side" framerate; it's the time my application needs to send a sequence of commands to the OpenGL driver. Is there any way to get the pipeline frame rate?
Thanks a million!
If you're timing a whole frame, wrapping CPU timers around the whole thing isn't that bad. The first frame or two's timing may be off due to pipelining, but in the steady state it quickly evens out and you get good timings. Just disable SYNC_TO_BLANK, of course. If you want to be really conservative, don't start and stop your timer at points "within" a frame; instead, time between frame starts at the same point in your code (which will include all of your frame overhead: init/setup/physics/render/kitchen sink/SwapBuffers downsample/present/cleanup/etc.).
For finer timing granularity, ARB_timer_query is probably what you're looking for. It times things on the GPU side rather than the CPU side.
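A rough sketch of the ARB_timer_query usage (core in OpenGL 3.3+); this assumes a current GL context and omits error checking:

```c
/* Time a stretch of GPU work with a GL_TIME_ELAPSED query. */
GLuint query;
GLuint64 ns;

glGenQueries(1, &query);

glBeginQuery(GL_TIME_ELAPSED, query);
/* ... draw calls to be timed ... */
glEndQuery(GL_TIME_ELAPSED);

/* Ideally read this back a frame or two later; asking for
   GL_QUERY_RESULT immediately blocks until the GPU finishes. */
glGetQueryObjectui64v(query, GL_QUERY_RESULT, &ns);
printf("GPU time: %.3f ms\n", ns / 1.0e6);

glDeleteQueries(1, &query);
```

Note that this measures GPU execution time for those commands, which is exactly the "server side" number the question asks about; the CPU timer from before measures only command submission.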
Also, report time in ms/frame, not FPS. FPS is non-linear and nearly useless for comparison; it's just a "gee whiz" number for gamerz. For more detail, see this link, among many others.