CPU usage: How high is too high?

With just a GL context and not much else (not drawing anything whatsoever), my CPU usage jumps right up to a whopping 50%. This sounds ridiculously high, and I’m wondering just how much CPU a GL context on Windows should take.

I’m on a laptop PC with an ATI FireGL V5250.
OpenGL version: 6.14.10.6479

Let me guess: you run an empty rendering loop?

I run an empty loop, but I also tested a simple loop drawing a triangle to the screen with a buffer swap; it’s still at 50%.

If you don’t add a delay or vsync, such a “simple loop” will use all your CPU (as the GPU eats triangles for breakfast).

If it is only 50%, maybe you have a dual core :slight_smile:

Hi, I don’t know. I can take any simple GL example (like one of the first lessons from NeHe) and it will hover around 50% CPU use no matter what (there are no apps running in the background, btw)… I’m actually comforted by the fact that this could be limited to just this laptop config, but still: has anyone else run into this problem with their laptop/ATI hardware?

You don’t listen, do you?

Sorry, yeah, I listened, but I figured that if vsync were enabled this problem shouldn’t persist, and even with it on I still have the problem (which makes me wonder if it’s really enabled or not). And yes, I stuck in a ‘Sleep(10)’ and down she went to 10-20%. Obviously I’m not getting vsync, right? Maybe it’s a default setting in the driver. Does it seem like a good solution to just keep the Sleep() in there? It doesn’t seem I can count on setting vsync, even though I test for it and it is changeable.

Thanks… sorry for not listening sooner!
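
For reference, here is a minimal sketch of how vsync is usually requested on Windows, through the WGL_EXT_swap_control extension. Whether the driver actually honors the interval is exactly what’s in doubt here, so treat a missing entry point or a FALSE return as “no vsync”:

```cpp
#include <windows.h>

typedef BOOL (WINAPI *PFNWGLSWAPINTERVALEXTPROC)(int interval);

bool EnableVsync()
{
    // wglGetProcAddress only works while a GL context is current
    PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
        (PFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress("wglSwapIntervalEXT");
    if (!wglSwapIntervalEXT)
        return false;                      // driver doesn't expose the extension
    return wglSwapIntervalEXT(1) != FALSE; // 1 = block each SwapBuffers on vblank
}
```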

Some drivers do busy-loops internally for vsync (which is pretty stupid).
The best way to check whether you really have vsync on is to measure FPS: if it is locked to the display refresh rate, then you have vsync. For a simple triangle scene, you may get many hundreds of FPS without vsync.
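
A rough sketch of such an FPS check, counting buffer swaps per second with plain Win32 timing (call it once per SwapBuffers):

```cpp
#include <windows.h>

// If the printed number sits at your monitor's refresh rate (e.g. ~60),
// vsync is on; hundreds or more means it is off.
void CountFps()
{
    static DWORD lastTick = GetTickCount();
    static int   frames   = 0;

    ++frames;
    DWORD now = GetTickCount();
    if (now - lastTick >= 1000)
    {
        char buf[64];
        wsprintfA(buf, "FPS: %d\n", frames);
        OutputDebugStringA(buf);  // watch it in the debugger output window
        frames   = 0;
        lastTick = now;
    }
}
```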

Some of the NeHe examples use glutIdleFunc to perform scene updates. Maybe it’s related…

N.

It sounds like you may not be getting an accelerated pixel format. Check that your pixel format doesn’t have the PFD_GENERIC_FORMAT flag set and that your color depth matches your desktop color depth.
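
A sketch of that check using DescribePixelFormat: if PFD_GENERIC_FORMAT is set without PFD_GENERIC_ACCELERATED, you landed in Microsoft’s software renderer, which would explain the CPU load.

```cpp
#include <windows.h>

// Call after SetPixelFormat to see what the driver actually gave you.
bool IsAccelerated(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd = {0};
    int format = GetPixelFormat(hdc);
    DescribePixelFormat(hdc, format, sizeof(pfd), &pfd);

    bool generic     = (pfd.dwFlags & PFD_GENERIC_FORMAT) != 0;
    bool accelerated = (pfd.dwFlags & PFD_GENERIC_ACCELERATED) != 0;
    return !generic || accelerated;  // hardware ICD, or an accelerated MCD
}
```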

If you’ve updated your drivers recently (or haven’t updated them at all), they may not be installed correctly. Laptops are the worst in terms of getting drivers to install correctly.

It sounds like you aren’t working on a game but on an application.
Just update the scene when there is a WM_PAINT event.
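
Something like this sketch, in plain Win32 (the window proc and DrawScene are hypothetical placeholders): the process blocks in GetMessage and only draws when a WM_PAINT arrives, so it uses essentially no CPU while nothing changes on screen.

```cpp
#include <windows.h>

void DrawScene(); // hypothetical: your GL drawing code

LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wp, LPARAM lp)
{
    switch (msg)
    {
    case WM_PAINT:
    {
        PAINTSTRUCT ps;
        HDC hdc = BeginPaint(hwnd, &ps);
        DrawScene();
        SwapBuffers(hdc);    // assumes the pixel format was set on this window
        EndPaint(hwnd, &ps); // validates the window; no further WM_PAINTs
        return 0;
    }
    case WM_DESTROY:
        PostQuitMessage(0);
        return 0;
    }
    return DefWindowProc(hwnd, msg, wp, lp);
}
```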

If you are using GLUT, get rid of glutIdleFunc.
All you need is to pass your render function to glutDisplayFunc; when a WM_PAINT event arrives, it will update.
CPU usage will be close to 0%.
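
A minimal GLUT sketch of that setup: only a display callback is registered, so the process sleeps in the message loop until a repaint is requested.

```cpp
#include <GL/glut.h>

void Display()
{
    glClear(GL_COLOR_BUFFER_BIT);
    glBegin(GL_TRIANGLES);
        glVertex2f(-0.5f, -0.5f);
        glVertex2f( 0.5f, -0.5f);
        glVertex2f( 0.0f,  0.5f);
    glEnd();
    glutSwapBuffers();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
    glutCreateWindow("idle CPU test");
    glutDisplayFunc(Display);   // note: no glutIdleFunc registered
    glutMainLoop();
    return 0;
}
```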

@Bucky, this is just the way it works. Even if you run an empty loop, there is no way the OS can tell it isn’t doing any meaningful work. Basically, the OS tries to give as much CPU time as possible to your application, and since the application never signals “I don’t need this much time”, it ends up eating all of it, resulting in 100% CPU usage. If you want to reduce the CPU usage, you have to give the OS some way of knowing you don’t need all that time. One possibility is the Sleep() function.
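
A sketch of that suggestion, with a hypothetical DrawScene standing in for whatever the frame actually renders: Sleep(10) caps the loop at roughly 100 iterations per second and hands the rest of each timeslice back to the scheduler, which is why Task Manager then shows the process mostly idle.

```cpp
#include <windows.h>

void DrawScene(); // hypothetical: whatever GL drawing you do per frame

void RenderLoop(HDC hdc, volatile bool &running)
{
    while (running)
    {
        DrawScene();
        SwapBuffers(hdc);
        Sleep(10); // yield the rest of the timeslice back to the OS
    }
}
```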
