View Full Version : timer



PrinceofZ
11-18-2000, 04:14 PM
it's not really opengl related, but i'm using it in an ogl program...

can someone write me complete code for a win32 application (under vc++) that prints something every second?
(the code below doesn't work, i made something wrong and i don't know what.)

thanks in advance

like this:
#include <time.h>
#include <iostream.h>
#include <stdlib.h>
#include <windows.h>

/*
int WINAPI WinMain( HINSTANCE hInstance,     // Instance
                    HINSTANCE hPrevInstance, // Previous Instance
                    LPSTR     lpCmdLine,     // Command Line Parameters
                    int       nCmdShow )     // Window Show State
{
*/
void WinMain( HINSTANCE hInstance,     // Instance
              HINSTANCE hPrevInstance, // Previous Instance
              LPSTR     lpCmdLine,     // Command Line Parameters
              int       nCmdShow )
{
    long target_time, wait_time = 1;
    for( int i = 0; i < 10; i++ )
    {
        target_time = time(NULL) + wait_time;
        while( time(NULL) < target_time )
        {
            // do nothing until wait_time has passed
        }
        cout << "second ";
    }
}

Moz
11-19-2000, 03:32 AM
Hmmm !

You should use windows' timers (never used them and i don't have the MS help file but you should be able to find it somewhere).
Anyway, if you like to waste processor time you can do this:

long refTime;
for( int i = 0; i < 10; i++ )
{
    refTime = GetTickCount();
    while( (GetTickCount() - refTime) < 1000 )
        ;   // spin until a second has passed
    cout << "second";
}

GetTickCount returns the number of milliseconds that have passed since you started your PC (since Windows started, to be exact).
You can also put the code above in a separate thread (just in case you wanted to do something else in your app at the same time ;)).
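For example, a rough sketch with CreateThread (TimerLoop is just a name i made up, and error handling is omitted):

DWORD WINAPI TimerLoop( LPVOID )    // thread entry point
{
    for( int i = 0; i < 10; i++ )
    {
        DWORD refTime = GetTickCount();
        while( (GetTickCount() - refTime) < 1000 )
            ;   // spin for one second
        cout << "second";
    }
    return 0;
}

// somewhere at startup:
// CreateThread( NULL, 0, TimerLoop, NULL, 0, NULL );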

Moz

Michael Steinberg
11-20-2000, 06:24 AM
Since it seems that timing isn't very critical for that purpose, you could also try WM_TIMER messages, since that doesn't block the whole system while you wait a second. Of course you won't get exactly 1 second, but it should be accurate enough.
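A rough sketch of what that looks like (untested; IDT_SECOND and the MessageBeep call are just placeholders for your own timer id and per-second work):

#include <windows.h>

#define IDT_SECOND 1    // arbitrary timer id

LRESULT CALLBACK WndProc( HWND hWnd, UINT msg, WPARAM wParam, LPARAM lParam )
{
    switch( msg )
    {
    case WM_CREATE:
        SetTimer( hWnd, IDT_SECOND, 1000, NULL );   // fire roughly every 1000 ms
        return 0;
    case WM_TIMER:
        if( wParam == IDT_SECOND )
            MessageBeep( MB_OK );                   // stand-in for your per-second work
        return 0;
    case WM_DESTROY:
        KillTimer( hWnd, IDT_SECOND );
        PostQuitMessage( 0 );
        return 0;
    }
    return DefWindowProc( hWnd, msg, wParam, lParam );
}

int WINAPI WinMain( HINSTANCE hInstance, HINSTANCE, LPSTR, int nCmdShow )
{
    WNDCLASS wc = { 0 };
    wc.lpfnWndProc   = WndProc;
    wc.hInstance     = hInstance;
    wc.lpszClassName = "TimerDemo";
    RegisterClass( &wc );

    HWND hWnd = CreateWindow( "TimerDemo", "timer", WS_OVERLAPPEDWINDOW,
                              CW_USEDEFAULT, CW_USEDEFAULT, 320, 240,
                              NULL, NULL, hInstance, NULL );
    ShowWindow( hWnd, nCmdShow );

    MSG msg;
    while( GetMessage( &msg, NULL, 0, 0 ) )
    {
        TranslateMessage( &msg );
        DispatchMessage( &msg );
    }
    return (int)msg.wParam;
}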

Succinct
11-20-2000, 06:27 AM
you can't use cout w/ a winmain...
cout only works w/ main, etc..

in windows you have to do a whole bunch of crap to get the window ready to display text...

it's very messy, and it's not very conducive to c++, at least in a console style. anyway, here's what help i can offer (ya gotta find out how to print to the screen urself! ;) )

I've been told that GetTickCount isn't very accurate (i've tested it myself and at most it's only been off by around 25 ms). sooo, if u guys care, here's a more accurate timing scheme (i'm assuming that this IS win32 based ;) ):

unsigned GetTime( void )
{
    static LARGE_INTEGER Frequency, Time;
    // evaluated once: does the hardware have a high-resolution counter?
    static bool HighPerformanceTimerExists = QueryPerformanceFrequency( &Frequency ) != 0;

    if( HighPerformanceTimerExists )
    {
        QueryPerformanceCounter( &Time );

        // convert ticks to milliseconds so the result matches GetTickCount's units
        return unsigned( 1000u * Time.QuadPart / Frequency.QuadPart );
    }
    else
        return GetTickCount( );
}

the reason for the if is that it's possible that the hw doesn't support a high performance timer... but i've never seen a system that doesn't
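if you want to drop it into the original loop, usage is the same as with GetTickCount (untested sketch):

for( int i = 0; i < 10; i++ )
{
    unsigned start = GetTime();
    while( GetTime() - start < 1000 )
        ;   // spin until roughly one second has passed
    // do your once-per-second stuff here (remember, no plain cout in a WinMain!)
}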

Deiussum
11-20-2000, 06:46 AM
Actually, I think you can use cout in a WinMain so long as you create a console first using AllocConsole(). You need to kill the console when you're done with it by using FreeConsole().
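Untested sketch, but the usual recipe is to reopen stdout on the new console so cout actually has somewhere to write:

#include <windows.h>
#include <stdio.h>
#include <iostream.h>

int WINAPI WinMain( HINSTANCE, HINSTANCE, LPSTR, int )
{
    AllocConsole();                        // give this GUI app a console
    freopen( "CONOUT$", "w", stdout );     // point stdout at the new console

    cout << "hello from WinMain" << endl;  // now this actually shows up

    Sleep( 3000 );                         // leave the console up long enough to read
    FreeConsole();                         // clean up when you're done
    return 0;
}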