
Program running at different speeds on different graphics cards



Chuffy_345
02-24-2004, 05:58 AM
As the subject suggests, I've got a program where some balls move around on the screen using collision detection, but when I run it at home on my Radeon 9800 the balls move around at the speed of light. When I run it at uni the speed is slower and more normal. How do I get my program to run at the same speed no matter what graphics card is installed?

Cheers

davepermen
02-24-2004, 06:14 AM
Not an advanced question, and not OpenGL related.

Measure the time it took to update the image, and update the next scene depending on it.
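
Something like this, as a very rough sketch (GetTickCount() is a coarse timer but enough to show the idea; ballX and ballSpeed are just example variables standing in for your own ball state):

#include <windows.h>

static float ballX     = 0.0f;   // example ball position
static float ballSpeed = 2.0f;   // units per second

void runFrame()                  // call this once per frame
{
    static DWORD last = GetTickCount();
    DWORD now   = GetTickCount();
    float delta = (now - last) / 1000.0f;   // seconds the last frame took
    last = now;

    ballX += ballSpeed * delta;  // same visible speed on any machine
}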

hh10k
02-24-2004, 06:22 AM
Most examples of OpenGL on the Internet tend to do something like "position.x += 0.1f" every frame, but this is a Bad Idea. If your home machine has a frame rate that is 10x faster than the other one, this code will be run 10x as often and everything will happen 10x as fast.

If you're using Windows, you should use something like:


#include <windows.h>

// Returns the current time in seconds from the high-resolution performance counter.
float getTime()
{
    LARGE_INTEGER counter, frequency;
    QueryPerformanceFrequency(&frequency);  // ticks per second
    QueryPerformanceCounter(&counter);      // current tick count
    return (float)counter.QuadPart / (float)frequency.QuadPart;
}

Then, every frame use "position.x += speed * delta", where 'delta' is the difference between the previous frame's time and the current frame's time. 'speed' is the distance you want it to move per second. Of course, this applies to rotations as well.
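
Putting that together, a per-frame update built on the getTime() above might look roughly like this (ballX and speedX are just example variables standing in for your ball state):

static float lastTime = 0.0f;   // set this with getTime() once at startup
static float ballX    = 0.0f;   // example ball position
static float speedX   = 2.0f;   // desired speed in units per second

void updateFrame()              // call this once per frame
{
    float now   = getTime();
    float delta = now - lastTime;   // seconds spent on the previous frame
    lastTime    = now;

    ballX += speedX * delta;        // moves at the same rate on any card
}

Initialise lastTime with getTime() before the first frame so the first delta isn't huge.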

If you use GLUT, you may be able to use glutGet(GLUT_ELAPSED_TIME), but I've never touched it before. :)
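
For what it's worth, a rough sketch of the same idea with GLUT (GLUT_ELAPSED_TIME is in milliseconds since glutInit, so divide by 1000 to get seconds; ballX and speedX are example variables again):

#include <GL/glut.h>

static int   lastMs = 0;      // previous frame's timestamp in milliseconds
static float ballX  = 0.0f;   // example ball position
static float speedX = 2.0f;   // units per second

void idle()
{
    int   nowMs = glutGet(GLUT_ELAPSED_TIME);   // milliseconds since glutInit()
    float delta = (nowMs - lastMs) / 1000.0f;   // seconds since the last frame
    lastMs = nowMs;

    ballX += speedX * delta;                    // frame-rate independent movement
    glutPostRedisplay();                        // ask GLUT to redraw the window
}

Register it with glutIdleFunc(idle) and set lastMs to glutGet(GLUT_ELAPSED_TIME) right after glutInit() so the first delta is sensible.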