View Full Version : cpu usage and fps optimization



enjoycrf
04-02-2011, 08:33 PM
OK, so, probably like many others, I start obsessing over my project once I get something good going, so I was trying to see what I could optimize. I looked at CPU usage: I have 8 threads, and the busiest one sits at about 80%. In my display function I have a timer which limits rendering to 100 frames per second, but the weird thing is that the actual in-game FPS meter reads between 70-90 on average and never reaches 100. So I tried removing the rendering of the map, the models, pretty much everything, so it's just a black window, but doing that only *dropped* my FPS, which is totally backwards. Then I took it further by experimenting with glutIdleFunc and glutDisplayFunc, trying without one or the other, but that either crashes GL or pegs one of the threads. Taking the idle func out does lower my CPU use a lot, but I think that just leaves me with a static GL image.

I just want to know why it seems that the more things I animate at once, the higher my FPS and the lower my CPU use, yet at the same time too many polygons slow me down. I feel stuck in some kind of balance. I also heard that new video cards actually run faster at higher resolutions, so perhaps GL works like that too, like there is some kind of threshold.

ZbuffeR
04-03-2011, 02:00 AM
You are at the point where you should change toolkits and stop using GLUT.
I suggest GLFW, which is cross-platform, simple to use, well documented, and well suited to 3D games.
http://www.glfw.org/


1) know your display refresh rate, typically 60 Hz.
There is no point in rendering more frames per second than this value unless you are benchmarking. When benchmarking, measure and compare frame rendering time in milliseconds, not in hertz.

2) verify whether you have vsync enabled or not.
This can be set programmatically with wglSwapIntervalEXT(1)/glXSwapIntervalEXT(1)/etc., but it can also be set or overridden through the display drivers. Vsync should be on to prevent tearing and to avoid rendering faster than the display rate, which limits CPU usage too. For benchmarking, disable vsync: *SwapInterval(0).

3) never worry about performance if you are at or above the display refresh rate.

4) when asking for advice, post your card, OS, and driver version, plus the GL_VENDOR, GL_RENDERER, GL_VERSION, and GL_SHADING_LANGUAGE_VERSION strings returned by glGetString from within a correctly initialized GL context:
http://www.opengl.org/sdk/docs/man/xhtml/glGetString.xml

5) Read and understand this too : http://www.opengl.org/wiki/Performance

enjoycrf
04-03-2011, 04:40 PM
Thanks.

All I have in my NVIDIA GT 220 settings is:
allow flipping: ON
sync to vblank: OFF
and the performance slider is set to high performance.

enjoycrf
04-03-2011, 05:09 PM
Hey, what do you think about my ordering of things?

void display(void) {

    // Get the current time in seconds
    fpsTimer.time = (float)uptime() + ((float)uptimen() / 1000);

    // Frame limiter: 0.016666667 = 60 fps | 0.01 = 100 fps
    // Note: returning early here still lets the idle callback spin at full speed.
    if ((fpsTimer.time - fpsTimer.lastTime) < 0.016666667) { //86 max
        return;
    }

    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glLoadIdentity();

    // Give OpenGL our camera coordinates to look at
    LookCamera();

    /* LIGHTS */
    //glEnable(GL_LIGHTING);

    /* render map (-50 fps) */
    g_Level.RenderLevel();

    /* render lights */
    //g_Level.RenderLights();

    /* LIGHTS OFF */
    //glDisable(GL_LIGHTING);

    /* RENDER MODELS */
    if (camera.perspective == 0) {
        ninja._i->angle[2] = camera.angle[2];
        ninja._i->angle[0] = 90;

        VectorCopy(camera.outputEnd, ninja._i->pos);
        ninja._i->pos[2] -= camera.half;

        ninja._i->scale = 6.4;

        ninja.OnIdle();
        ninja.OnDraw("models/ninja/nskinbl.jpg");
    } //else {

    viewModel._i->angle[2] = camera.angle[2];
    viewModel._i->angle[0] = 90;

    VectorCopy(camera.pos, viewModel._i->pos);
    viewModel._i->pos[2] += 10;

    viewModel._i->flip = 1;
    viewModel._i->scale = 1;

    viewModel.OnIdle();
    viewModel.OnDraw("models/ninja/nskinbl.jpg");
    //}

    /* RENDER MODELS */
    AnimateChicken();
    drawCrate();

    /* BULLETS (-15 fps) */
    fireRay();
    animateRays();
    animateShells();

    /* SKY + HUD */
    skyBox();
    drawHud();

    // glFlush()/glFinish() are redundant before a buffer swap;
    // glFinish() in particular stalls the CPU until the GPU is idle.
    //glFlush();
    //glFinish();

    /* CALCULATE STUFF AND SWAP (why does this make my line flash?) */
    GetDeltaTime();
    CalculateFrameRate();

    /* move through the scene */
    MoveStrafeCamera();
    MoveCamera();
    RespawnCamera();
    MoveCameraSound();
    AnimateCamera();

    // swap
    glutSwapBuffers();

    // save the time
    fpsTimer.lastTime = fpsTimer.time;
}

enjoycrf
04-03-2011, 06:14 PM
OK, I just read over the docs. Can't wait to convert to GLFW.
Which package should I install:
libglfw-dev or libglfw2?

ZbuffeR
04-04-2011, 12:31 AM
Both. Install the -dev package to be able to compile programs that depend on GLFW (glfw.h); it will also pull in the other one, which provides the compiled dynamic library needed at runtime.


EDIT: and please use code blocks for your code (the button with the # sign), otherwise even fewer people will take the pains to read through long code sections.