Is GLUT responsible for the slowness of my application?

I use GLUT to create a window and for the keyboard, and my motion blur program is very slow. Is GLUT responsible? If I don’t use the keyboard functions of GLUT and just keep the ones that create the window, will the performance be better?

Guess the only way to answer is to try it out.

Yes, but I would save some time if somebody already knows!

You can replace the GLUT keyboard callbacks with native Win32/Linux functions and see what the difference is, but GLUT’s main job is just easier window creation across multiple platforms. Of course, callbacks and keyboard scanning take some time.

OK, thanks.
Do you think it would be better if I used SDL for the keyboard?

However,

The GLUT code that processes the keyboard is pretty trivial stuff …

MSG event;

if(!GetMessage(&event, NULL, 0, 0))	/* bail if no more messages */
  exit(0);
TranslateMessage(&event);		/* translate virtual-key messages */
DispatchMessage(&event);		/* call the window proc */

and the window proc implements the message pump and switches to the code based on the message being processed …

e.g.
  case WM_KEYUP:
    window = __glutGetWindow(hwnd);
    if (!window) {
      break;
    }
    /* Win32 is dumb and sends these messages only to the parent
       window.  Therefore, find out if we're in a child window and
       call the child window's keyboard callback if we are. */
    if (window->parent) {
      GetCursorPos(&point);
      ScreenToClient(hwnd, &point);
      hwnd = ChildWindowFromPoint(hwnd, point);
      window = __glutGetWindow(hwnd);
etc etc …

I’d suspect something else … are you doing software feedback or something …

What happens if you disable the motion blur part? [and just draw the “unblurred” objects]

Rob

And if you use GLUT on Windows, why don’t you use GetKeyboardState in your display callback to retrieve the state of all keys at once into a state array? Or, if you really only use a few keys for input, you could use GetAsyncKeyState for the individual keys (but don’t use GetKeyState, or you’ll probably get the same behavior you have now, since it reads the keyboard messages posted to the window’s message queue).
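For what it’s worth, a minimal sketch of that approach (the display callback body and the cameraX variable are just placeholders I made up):

#include <windows.h>
#include <stdlib.h>
#include <GL/glut.h>

static float cameraX = 0.0f;   /* hypothetical state driven by the keyboard */

/* poll the keyboard once per frame from the display callback */
void display( void )
{
    BYTE keys[256];

    GetKeyboardState( keys );              /* state of all 256 virtual keys in one call */
    if( keys[VK_LEFT] & 0x80 )             /* high-order bit set = key is down */
        cameraX -= 0.1f;

    /* or, for just one or two keys, query them individually: */
    if( GetAsyncKeyState( VK_ESCAPE ) & 0x8000 )
        exit( 0 );

    /* ... render the scene here, then glutSwapBuffers() ... */
}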

As Rob pointed out, GLUT should not steal much time from your application. I use GLFW, which does similar things to GLUT regarding keyboard input etc., and I can easily get over 3000 FPS on my system (Win2k, 700 MHz Athlon, with actual double buffered OpenGL rendering going on), so I doubt that the event loop and input handling steal more than a fraction of a millisecond per frame.

Update: I did a very quick benchmark (a demo that does nothing but process window/input events), and on my system the main loop takes 1.8 microseconds per iteration. Here’s the code, for your reference:

#include <stdio.h>
#include <GL/glfw.h>

int main( void )
{
    int running = 1, samples = 0;
    double t1, t2, per;

    glfwInit();
    glfwOpenWindow( 10,10, 0,0,0,0, 0,0, GLFW_WINDOW );

    t1 = glfwGetTime();
    while( running )
    {
        glfwPollEvents();
        samples ++;
        running = glfwGetWindowParam( GLFW_OPENED ) &&
                  !glfwGetKey( GLFW_KEY_ESC );
    }
    t2 = glfwGetTime();

    per = (t2-t1) / (double) samples;
    printf( "Average period: %.6f us\n", 1e6*per );

    glfwTerminate();
    return 0;
}

If you initialize GLUT wrong, it may fall back to software rendering. Do a glGetString() on GL_RENDERER and see what you get. If it’s “Microsoft”, “GDI”, or “Generic” anything, then that’s the reason things are slow.

I usually just assert that this is not the case, and quit the program immediately.
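Something along these lines (a sketch; checkRenderer is a name I made up, and it has to run after the window exists, since glGetString needs a current context):

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <GL/glut.h>

/* call once, after glutCreateWindow() */
void checkRenderer( void )
{
    const char *renderer = (const char *) glGetString( GL_RENDERER );

    printf( "GL_RENDERER: %s\n", renderer );
    if( strstr( renderer, "GDI" ) != NULL ||
        strstr( renderer, "Generic" ) != NULL )
    {
        fprintf( stderr, "Software rendering - check your pixel format\n" );
        exit( 1 );
    }
}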

Originally posted by jwatte:
If you initialize GLUT wrong, it may fall back to software rendering. Do a glGetString() on GL_RENDERER and see what you get. If it’s “Microsoft”, “GDI”, or “Generic” anything, then that’s the reason things are slow.

You actually get “Microsoft”, “GDI” or “Generic” when no driver is installed for the graphics card. In that case, Windows uses its default driver for the display.
It has nothing to do with any “wrong initialization” of GLUT. But I agree that checking the GL_RENDERER string can avoid surprises.

“My program of motion blur is very slow”

You wouldn’t happen to be using the accumulation buffer for this motion blur, would you?

j

Originally posted by morbac:
You actually get “Microsoft”, “GDI” or “Generic” when no driver is installed for the graphics card. In that case, Windows uses its default driver for the display.
It has nothing to do with any “wrong initialization” of GLUT. But I agree that checking the GL_RENDERER string can avoid surprises.

It has everything to do with wrong initialization. If Windows can’t find a suitable pixel format for the device driver, it will use its own software renderer even if you have a device driver properly installed. As an example, if I set the desktop color depth to 16 bits and initialize GLUT without an alpha channel, I get a pixel format my GF2 MX supports and the vendor is reported as NVIDIA; but if I include GLUT_ALPHA in glutInitDisplayMode, it uses MS’s software renderer instead and the vendor is now Microsoft.
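In other words, on that setup the only difference between hardware and software is one flag (illustrative; the exact fallback depends on the driver and desktop depth):

/* on a 16-bit desktop this picks a format the GF2 MX accelerates: */
glutInitDisplayMode( GLUT_DOUBLE | GLUT_RGB );

/* adding GLUT_ALPHA forces a fallback to Microsoft's software renderer: */
glutInitDisplayMode( GLUT_DOUBLE | GLUT_RGB | GLUT_ALPHA );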

Why are we talking about GLUT initialization when the original poster didn’t even say whether he uses the accumulation buffer for the motion blur? Because if he does, that by itself could cause the slowdown.
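For reference, the classic accumulation-buffer motion blur loop looks roughly like this (a minimal sketch; drawSceneAt and the pass count are placeholders I made up). glAccum was typically unaccelerated on consumer cards, which is why it can be slow all by itself:

/* requires glutInitDisplayMode( ... | GLUT_ACCUM ) */
#define PASSES 8

void display( void )
{
    int i;

    glClear( GL_ACCUM_BUFFER_BIT );
    for( i = 0; i < PASSES; i++ )
    {
        glClear( GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT );
        drawSceneAt( i );                     /* hypothetical: draw the scene at sub-frame i */
        glAccum( GL_ACCUM, 1.0f / PASSES );   /* add a weighted copy of the color buffer */
    }
    glAccum( GL_RETURN, 1.0f );               /* write the accumulated blur back */
    glutSwapBuffers();
}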