OK, I have no idea how to do this; I've never timed code before. Obviously implementing it the naive way (counting loop iterations or whatever) just produces a number that's going to look the same regardless of what machine I run it on. Has anyone done this before who could tell me how to use C++'s time functions to time a chunk of code?
(Basically, the input for the code being timed is going to change a lot, and I want to know how long it takes to execute on a small input vs. a big one.)
On Windows you can use GetTickCount() to time an event. If you want a piece of code to run at the same speed on different computers, take a look at this tutorial.
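Something along these lines (rough sketch; the busy loop is just a placeholder for whatever you're actually timing, and note GetTickCount() only resolves to roughly 10-16 ms, so very short chunks may read as 0):

```cpp
#include <windows.h>
#include <iostream>

int main() {
    DWORD start = GetTickCount();   // milliseconds since the system started

    // ... the code you want to time goes here; this loop is just a stand-in
    volatile long sum = 0;
    for (long i = 0; i < 100000000; ++i) sum += i;

    DWORD elapsed = GetTickCount() - start;
    std::cout << "elapsed: " << elapsed << " ms\n";
    return 0;
}
```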
I don't know if there's anything C++-specific for this; I believe you'd have to rely on C functions. I think I did something similar once with gettimeofday() on Linux, but my gut feeling is that it's an OS function rather than part of C, so I don't know about cross-platform support on the Mac. You could take a look at it, though.
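For reference, gettimeofday() usage looks roughly like this (a sketch assuming a POSIX system; again the loop is just a placeholder for the real code):

```cpp
#include <sys/time.h>
#include <cstdio>

int main() {
    timeval start, end;
    gettimeofday(&start, 0);        // wall-clock time, microsecond resolution

    // ... code to time goes here; placeholder loop
    volatile long sum = 0;
    for (long i = 0; i < 100000000; ++i) sum += i;

    gettimeofday(&end, 0);
    double ms = (end.tv_sec  - start.tv_sec)  * 1000.0
              + (end.tv_usec - start.tv_usec) / 1000.0;
    printf("elapsed: %.3f ms\n", ms);
    return 0;
}
```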
I believe there are ANSI C functions for this: time() and difftime(). As I remember they're ANSI, so that makes them somewhat portable (though they only resolve to whole seconds, which may be too coarse for short chunks of code). If you want to cross-compile on different OSes, you could use for example GetTickCount() on Windows, gettimeofday() on Linux (or whatever you use), and a Mac-specific function on the Mac, and toggle between them with some #ifdefs.
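A rough sketch of what that #ifdef toggle could look like (the now_ms() helper is just a name I made up for this example; as far as I know gettimeofday() also works on Mac OS X since it's BSD-based, so the non-Windows branch can cover both):

```cpp
#include <iostream>

#if defined(_WIN32)
  #include <windows.h>
  // Windows: GetTickCount() returns milliseconds since boot
  static double now_ms() { return static_cast<double>(GetTickCount()); }
#else
  #include <sys/time.h>
  // Linux / Mac OS X: gettimeofday() gives wall-clock time in microseconds
  static double now_ms() {
      timeval tv;
      gettimeofday(&tv, 0);
      return tv.tv_sec * 1000.0 + tv.tv_usec / 1000.0;
  }
#endif

int main() {
    double start = now_ms();

    // ... code to time; placeholder loop
    volatile long sum = 0;
    for (long i = 0; i < 100000000; ++i) sum += i;

    std::cout << "elapsed: " << (now_ms() - start) << " ms\n";
    return 0;
}
```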