am i doing this right??

Hi! I'm making a demo and I want it to run at the same speed on almost all computers. I have a speed variable called vel (initial value 100.0f), and I divide it by the current frames per second to get sync_vel:
sync_vel = vel / fps;
The logic is that if computer A runs at 40 fps, sync_vel will be 2.5f, and if computer B runs at 20 fps, sync_vel will be 5.0f. On a slower PC the per-frame speed is bigger to compensate, and on a faster PC it is smaller. So am I doing this correctly, and will it work?

It is better to use the time between two frames rather than the FPS, because that can be measured more accurately. If your object moves with velocity v and the time between the last frame and this frame is t, you have to move it by v*t.

Search the forums for “clamping the framerate” and “framerate”, because we have answered this question at least ten times this year.

Use a timer independent of the frame rate. I have suggested in the past that you simulate in an entirely different thread from the one that renders.

thanks!