Server Side Framerate

Hi!

In my OpenGL project I get more than 1800 FPS (obtained with a typical client-side frame rate counter):

resetTime();            // start the CPU timer
render();               // issue the frame’s GL commands
getTime();              // stop the CPU timer
computeStatistics();    // accumulate per-frame numbers

but that isn’t the real “server side” framerate; it’s just the time my application needs to send a sequence of commands to the OpenGL driver. Is there any way to get the pipeline frame rate?
Thanks a million!

If you’re timing a whole frame, snapping CPU timers around the whole thing isn’t that bad. Yeah, maybe the first frame or two’s timing isn’t perfect due to pipelining, but in the steady state it quickly evens out and you get good timings. Just disable SYNC_TO_BLANK (vsync), of course. If you want to be really conservative, don’t time start and stop at points “within” a frame; time between frame starts at the same point in your code (which will include all of your frame overhead: init/setup/physics/render/kitchen sink/SwapBuffers downsample/present/cleanup/etc.).
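In case the vsync part isn’t obvious, here’s a minimal sketch of turning it off at runtime, assuming a Windows/WGL context with WGL_EXT_swap_control available (check the extension string first; on X11 the analogue is glXSwapIntervalEXT):

// Assumes <windows.h> and a current GL context.
typedef BOOL (WINAPI *PFNWGLSWAPINTERVALEXTPROC)(int interval);
PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
    (PFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress("wglSwapIntervalEXT");
if (wglSwapIntervalEXT)
    wglSwapIntervalEXT(0);   // swap interval 0 = don't wait for vblank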

For finer timing granularity, ARB_timer_query is probably what you’re looking for. It times things on the GPU side rather than the CPU side.
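A minimal sketch of that, assuming an OpenGL 3.3+ context (or the extension) and that render() is the GPU work you want timed:

GLuint query;
glGenQueries(1, &query);

glBeginQuery(GL_TIME_ELAPSED, query);
render();                                    // the GPU work being measured
glEndQuery(GL_TIME_ELAPSED);

// Results come back asynchronously; poll rather than stalling immediately.
GLint available = 0;
while (!available)
    glGetQueryObjectiv(query, GL_QUERY_RESULT_AVAILABLE, &available);

GLuint64 gpuNs = 0;                          // elapsed GPU time in nanoseconds
glGetQueryObjectui64v(query, GL_QUERY_RESULT, &gpuNs);
printf("GPU: %.3f ms\n", gpuNs / 1.0e6);

glDeleteQueries(1, &query);

In a real app you’d round-robin a few query objects and read each result a frame or two late, so the availability poll never actually blocks.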

Also, time in ms/frame, not FPS. FPS is non-linear and useless for comparison; it’s just a “gee whiz” number for gamerz. For instance, dropping from 1800 to 1700 FPS costs about 0.03 ms/frame, while dropping from 160 to 60 FPS costs about 10.4 ms/frame: the same FPS delta, wildly different meaning. For more detail, see this link, among many others.

If you are using this

resetTime();
render();
getTime();

then yes, it measures the time the driver takes to accept your commands, which is often very fast. That doesn’t measure the actual FPS.

The normal way to measure FPS is to collect the elapsed time between frames (NewTime - OldTime) and compute the FPS from that:

CollectTimePast();   // delta = NewTime - OldTime; OldTime = NewTime
ComputeFPS();        // FPS = 1.0 / delta (average over many frames for stability)
render();            // draw + SwapBuffers

This takes into account the time the GPU needs to render, the buffer swap, vsync, your own CPU usage, and CPU usage by other applications.
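For example, here’s a concrete sketch of that loop in C++ with std::chrono; appIsRunning() and render() are placeholders for your own loop condition and drawing code:

#include <chrono>
#include <cstdio>

bool appIsRunning();   // placeholder: your exit condition
void render();         // placeholder: draw + SwapBuffers

void runLoop() {
    using clock = std::chrono::steady_clock;
    auto oldTime = clock::now();

    while (appIsRunning()) {
        auto newTime = clock::now();
        double ms = std::chrono::duration<double, std::milli>(newTime - oldTime).count();
        oldTime = newTime;

        // ms covers one full frame: render, SwapBuffers, any vsync wait,
        // and whatever else happens between two passes of this point.
        if (ms > 0.0)
            std::printf("%.3f ms/frame (%.1f FPS)\n", ms, 1000.0 / ms);

        render();
    }
}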

That’s what I meant by “time between frame starts at the same point in your code”. Sorry that apparently wasn’t clear.

Thanks for your answers… they have been very useful to me!!