glutDisplayFunc not updating?

OK, as some of you know, I'm trying to make a pixel-by-pixel scrolling RPG engine. I just put the keyboard controls in, and for some reason the display isn't updating: I have to minimize the OpenGL window and maximize it again for it to reflect the character's position on the screen. Why isn't it updating? Once you call glutMainLoop, shouldn't the display function automatically be called every few milliseconds or so?

My init code:

glutInit(&argc, argv); 

glutInitDisplayMode (GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH); 
glutGameModeString("512x384:32@120");
glutEnterGameMode(); 

glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0.0,512.0,384.0,0.0,0.0,10.0);

glEnable(GL_TEXTURE_2D);
glutSetCursor(GLUT_CURSOR_NONE);

glutDisplayFunc(drawscreen);
glutKeyboardFunc(inputevent);
glutSpecialFunc(sinputevent);
glClearColor(0.0f, 0.0f, 0.0f, 0.0f); 
FocusCamera(0);
load_gl_textures();
glutMainLoop();

So shouldn't the display code keep refreshing? Or am I missing something to get it to refresh automatically?

I don't use GLUT that often, but shouldn't you specify an idle func? I think it is called glutIdleFunc().

The way your code is written, the window will only be updated on a window event, i.e. resize, move, etc.

Say you want the screen to update after a mouse move or a key press: call glutPostRedisplay at the end of that routine.

example:

void My_mouse_event(int button, int state, int x, int y)
{
// check the mouse position, move if needed

glutPostRedisplay(); // redraw the screen with the new mouse data
}

For game timing and idle time, I like to use glutTimerFunc(time in ms, callback func, value).

example:
Insert before glutMainLoop:

glutTimerFunc(10, My_timer_event, 1);

void My_timer_event(int te)
{
update_objects(); // code to update your objects

glutPostRedisplay(); // redraw the screen with the new object data

glutTimerFunc(10, My_timer_event, 1); // the timer is one-shot, so it must be re-registered each time it fires. By using a timed event, your application should run at about the same speed on any machine.
}

You can use glutIdleFunc, but then you have no idea how fast the updates will be, so a fast machine and a slow machine will get different update rates. I prefer glutTimerFunc.

Thanks guys… I put the timer method in, so it should update every millisecond now :)

glutTimerFunc(1, glutTimer, 1);

void glutTimer(int value)
{
glutPostRedisplay();
glutTimerFunc(1, glutTimer, 1);
}

For constant updates, use the idle function instead.
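
For example, something like this (just a rough sketch; the callback name is made up):

void idle(void)
{
// called whenever GLUT has nothing else to do
glutPostRedisplay(); // ask for a redraw as often as the machine allows
}

// before glutMainLoop():
glutIdleFunc(idle);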

Personally, for constant updates I prefer to use:

void displayfunc()
{
// draw scene here

glutPostRedisplay();
}

int main()
{
// blah blah
glutDisplayFunc(displayfunc);
// blah blah
}

No need for any timer / idle func!
Unless there's something I haven't understood?
Morglum

That's also a way to do it. Personally I think it's clearer to put it in the idle function, and only draw your scene in the display function. When I don't do anything (idle), I want to update my scene again (post a redisplay). But that's a matter of taste.

But putting the redisplay in the timer callback can be confusing. As method5 does, setting the timer to 1 ms makes one think that an event will occur every millisecond. If you have a complex scene that takes more than 1 ms (1000 fps) to render (well, that would be anything but a simple cube, I suppose), you won't have updates at 1 ms intervals anymore, which is what the timer suggests.
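
You can see this for yourself by measuring the real interval inside the display callback, e.g. with glutGet(GLUT_ELAPSED_TIME). A rough sketch (needs <stdio.h>; reusing the drawscreen name from above):

void drawscreen(void)
{
static int last = 0;
int now = glutGet(GLUT_ELAPSED_TIME); // milliseconds since glutInit
printf("frame interval: %d ms\n", now - last); // will be well above 1 ms for any non-trivial scene
last = now;

// ... render the scene ...
glutSwapBuffers();
}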

It could be my misunderstanding of the “glidlefunc” function, but is it called after the GL pipe has completed its entire task, or when there is no input from the user, or maybe a combination of both? But then isn't the speed of the machine also affecting it? So on a slow machine your cube would run at one speed, and on a fast machine your cube would be spinning so fast it looks like a blur…

Maybe I'm doing it wrong, but I like glutTimerFunc since I have some clue as to how fast my screen will be updated. Maybe there is a better method for game/animation timing?
I do agree that 1 ms is a bit too fast when you get to complex scenes.

Originally posted by Bob:
[b]…If you have a complex scene that takes more than 1 ms to render, you won't have updates at 1 ms intervals anymore.[/b]

No problem: you just call QueryPerformanceCounter in the display func to measure the elapsed time and compute the new positions of the objects from that. That way things don't run faster, just smoother!
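
Something along these lines (just a sketch of the idea; pos and speed are made-up example variables):

#include <windows.h>

double pos = 0.0; // example object position
double speed = 100.0; // example speed, in units per second

void displayfunc(void)
{
static LARGE_INTEGER freq, last;
static int first = 1;
LARGE_INTEGER now;

if (first) {
QueryPerformanceFrequency(&freq); // ticks per second
QueryPerformanceCounter(&last);
first = 0;
}
QueryPerformanceCounter(&now);
double dt = (double)(now.QuadPart - last.QuadPart) / (double)freq.QuadPart;
last = now;

pos += speed * dt; // advance by elapsed time, so motion is framerate-independent

// ... draw the scene using pos ...
glutSwapBuffers();
glutPostRedisplay(); // keep the loop going, as in the display-func approach above
}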

It could be my misunderstanding of the “glidlefunc” function, but is it called after the GL pipe has completed its entire task, or when there is no input from the user, or maybe a combination of both?

The function is called “glutIdleFunc” and is a part of GLUT, not OpenGL.

GLUT is an event-driven system. When an event occurs, like a mouse move or a button press (mouse or keyboard, doesn't matter), a callback is performed, assuming you have registered the corresponding callback. When GLUT's event queue is empty and an idle callback is registered, the idle callback function is called. That means, as long as nothing else is happening, you will keep getting idle callbacks. In the idle callback, you post a redisplay message to tell GLUT you want a redisplay. You then have a redisplay event in the event queue, and the display callback is called.

The number of callbacks during a given time unit will of course differ between systems. But the thing is, you will never get more callbacks than your computer can handle, since you only get one when your computer is idle.

Morglum,

I just tried your method and I got exactly the same fps as with the 1 millisecond timer… I can't get past 60 fps here… this sucks… I used to get like 500-600 fps before I translated the game to C++ from QBasic (well, I did have asm routines to draw the screen before too… must have been much faster than OpenGL…). I thought C++ was supposed to be like 1000x faster.

First, why do you need to update at 500 fps?
The eye cannot see any difference past 60 fps.
In some cases C++ is faster, but I would not say 1000 times. And asm is faster still, plus under BASIC you were doing direct draws to the screen.
One thing is that to get a speed increase you may need to rethink how you are doing things.
If you still have the BASIC idea of how things work, you may need to learn how it should be done in C to get the speed increase.

Originally posted by method5:
[b]…I can't get past 60 fps here…[/b]

method5,

it might be the vertical sync. In short: your display driver detects the refresh rate of your screen and decides not to draw faster than that. This is good for you: with 60 Hz plus vsync you get smoother animation than with 80 Hz without vsync, and with lower CPU load.
But I understand that you'd like to know the real speed of your app, so you can disable vsync. Two ways to do it:

  1. Go to your display driver's config utility. There might be an option for that somewhere; at least that's the case with nvidia drivers.
  2. Add the following to your initialization code:

PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT;
wglSwapIntervalEXT = (PFNWGLSWAPINTERVALEXTPROC) wglGetProcAddress("wglSwapIntervalEXT");
wglSwapIntervalEXT (1);

and don't forget to add the includes (with <windows.h> included before them):
#include <gl/glext.h>
#include <gl/wglext.h>

Morglum

Originally posted by Morglum:
wglSwapIntervalEXT (1);

Shouldn’t it be wglSwapIntervalEXT(0); ?

Originally posted by zeckensack:

>>>>>Shouldn’t it be wglSwapIntervalEXT(0);

Yes, of course! I just copy-pasted these lines from my code; wglSwapIntervalEXT(1) enables vsync. Thanks!
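
So, putting it together, disabling vsync from code would look something like this (a sketch; the extension may not be present on every driver, so check the pointer before calling it):

#include <windows.h>
#include <gl/glext.h>
#include <gl/wglext.h>

// must be run after the GL context exists (e.g. after glutEnterGameMode)
PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
(PFNWGLSWAPINTERVALEXTPROC) wglGetProcAddress("wglSwapIntervalEXT");
if (wglSwapIntervalEXT)
wglSwapIntervalEXT(0); // 0 disables vsync, 1 enables it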