View Full Version : Super fast animation



bingo
05-15-2007, 08:28 AM
I need to display a series of 2d images, with a fast frame-rate.

I have the data in a set of BMP images and have managed to load them all in memory and used glDrawPixels for displaying them.

But this is giving a frame rate of only 4.5 frames per second on my 2.6 ghz single core intel with 2GB ram (without a 3d card).

This is for a scientific simulation and I need around 50 frames per second.

#. Is 4.5 fps using glDrawPixels expected? Or am I doing something wrong?

#. How can I get around 50 fps? Is this even possible?
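For reference, the approach described above roughly corresponds to the sketch below (a GL context is assumed to be current; `pixels`, `width`, and `height` are illustrative names, and the BMP data is assumed to be tightly packed 24-bit BGR):

```c
/* Minimal sketch of the glDrawPixels approach described above.
   Assumes a current GL context; names are hypothetical. */
#include <GL/gl.h>

void draw_frame(const unsigned char *pixels, int width, int height)
{
    glClear(GL_COLOR_BUFFER_BIT);
    glRasterPos2i(-1, -1);                    /* bottom-left corner */
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);    /* BMP rows may not be 4-aligned */
    glDrawPixels(width, height, GL_BGR_EXT, GL_UNSIGNED_BYTE, pixels);
    /* then SwapBuffers(hdc) / glXSwapBuffers(...) */
}
```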

k_szczech
05-15-2007, 10:02 AM
It can work very fast with OpenGL if you have a 3D card.
If you don't, OpenGL on Windows falls back to a software library and is extremely slow. Making a better implementation of that library would be, gently speaking, against Microsoft's policy.

Ok, some thoughts:

1. OpenGL is not an image display library. It's meant for rendering.

2. glDrawPixels transfers image data to the GPU on every call, which is costly. It may even turn out that uploading the image as a texture with glTexSubImage2D and then drawing a GL_QUADS primitive covered with that texture is faster.
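A sketch of that alternative, assuming `tex` was created earlier with glTexImage2D at the frame's size (names are illustrative):

```c
/* Sketch: upload the frame with glTexSubImage2D, then draw a
   textured quad instead of calling glDrawPixels. Assumes a current
   GL context and a texture `tex` allocated via glTexImage2D. */
#include <GL/gl.h>

void draw_frame_textured(GLuint tex, const unsigned char *pixels,
                         int width, int height)
{
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, height,
                    GL_RGB, GL_UNSIGNED_BYTE, pixels);
    glEnable(GL_TEXTURE_2D);
    glBegin(GL_QUADS);                 /* full-viewport quad */
        glTexCoord2f(0, 0); glVertex2f(-1, -1);
        glTexCoord2f(1, 0); glVertex2f( 1, -1);
        glTexCoord2f(1, 1); glVertex2f( 1,  1);
        glTexCoord2f(0, 1); glVertex2f(-1,  1);
    glEnd();
}
```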

3. You'd certainly get very high framerates, even on old hardware, if you could upload all images as textures to the graphics card up front and then display them directly from video memory. There is of course a memory limit, so that's only suitable for short animations.

4. You can also use texture compression if you choose to use textures:
- load each image from BMP and upload it to the GPU with glTexSubImage2D, choosing a compressed internal format for the texture (assuming you created this texture earlier with glTexImage2D);
- use glGetCompressedTexImage to read the compressed version of the texture back to the CPU and store it in system memory;
- repeat for all images.
Now you have a set of compressed textures in system memory; you can upload them directly to the GPU just before displaying them. A compressed image means less data to transfer and thus an even higher framerate.
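The steps above can be sketched as follows (requires OpenGL 1.3 or ARB_texture_compression; `bmp_pixels` and friends are illustrative names, and error checking is omitted):

```c
/* Sketch of the compression round-trip: let the driver compress each
   frame once, keep the compressed copy in system memory, and re-upload
   it before display. Names are hypothetical; no error checking. */
GLint fmt, size;

/* one-time, per image: upload with a compressed internal format */
glTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RGB, width, height,
             0, GL_RGB, GL_UNSIGNED_BYTE, bmp_pixels);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0,
                         GL_TEXTURE_INTERNAL_FORMAT, &fmt);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0,
                         GL_TEXTURE_COMPRESSED_IMAGE_SIZE, &size);
void *copy = malloc(size);
glGetCompressedTexImage(GL_TEXTURE_2D, 0, copy);

/* later, per frame: upload the pre-compressed data directly */
glCompressedTexImage2D(GL_TEXTURE_2D, 0, fmt, width, height,
                       0, size, copy);
```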

Kefrens
05-16-2007, 12:20 AM
Without a 3D card I suggest using SDL instead of OpenGL.

bingo
05-16-2007, 12:43 AM
In fact I stumbled upon it yesterday :-)

I tried using textures with OpenGL and could get 9 fps, but that is still less than what I want.

My requirement is closer to that of video players than games. I will be exploring SDL now; any other suggestions are also welcome.

glAren
05-16-2007, 03:51 AM
Many years ago I used this function to display a bitmap:
SetDIBitsToDevice (hdc, x, y, width, height, 0, 0, 0, height, data, &bi, DIB_RGB_COLORS);
where "bi" is a BITMAPINFO structure and data is a pointer to an RGB array. But this is really not OpenGL.

bingo
05-16-2007, 03:13 PM
OK, now I am getting around 80 FPS, both with SDL and OpenGL, thanks :-)

But there is a very visible and very annoying flicker. I tried playing with the vsync setting in the control panel, and I am using SwapBuffers to update the display, so I'm assuming double buffering must be enabled. But the flicker doesn't go away.

How do I fix this for OpenGL? How do I check whether double buffering is enabled and actually being used? How do I enable/check vsync from within my program?

I found this code, but it doesn't work:
http://www.gamedev.net/community/forums/topic.asp?topic_id=317721

Stuart McDonald
05-16-2007, 10:17 PM
This works for me for vsync, though it looks the same as the code you found. I use GLee for access to extensions.

#ifdef WIN32
wglSwapIntervalEXT(1);
#else
glXSwapIntervalSGI(1);
#endif

Just call it once at setup and leave the rest of your code unchanged. I'm using an nvidia 6800 card.
--Stuart.

k_szczech
05-17-2007, 05:12 AM
Make sure your pixelformat supports double buffering.
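On Win32 the active pixelformat can be inspected at runtime; a minimal sketch (assuming `hdc` is the window's device context):

```c
/* Sketch: check whether the current pixelformat is double buffered.
   `hdc` is assumed to be the window's device context. */
#include <windows.h>

int is_double_buffered(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd;
    int fmt = GetPixelFormat(hdc);
    DescribePixelFormat(hdc, fmt, sizeof(pfd), &pfd);
    /* if this is 0, request PFD_DOUBLEBUFFER in the PIXELFORMATDESCRIPTOR
       passed to ChoosePixelFormat when creating the window */
    return (pfd.dwFlags & PFD_DOUBLEBUFFER) != 0;
}
```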

Gremour
06-05-2007, 08:00 AM
I tried to use glXSwapIntervalSGI(1); in my program, but the linker couldn't find its implementation. Is there anything that needs to be done (aside from including <GL/glx.h> and defining GLX_GLXEXT_PROTOTYPES) to make it work? AFAIK, glX functions are implemented in libGL, which is linked into the project. "GLX_SGI_swap_control" is not among the listed extensions. May that be the cause?

Building with g++ under Linux (Fedora Core 6).

songho
06-05-2007, 12:27 PM
Originally posted by Gremour:
I tried to use glXSwapIntervalSGI(1); in my program, but the linker couldn't find its implementation. Is there anything that needs to be done (aside from including <GL/glx.h> and defining GLX_GLXEXT_PROTOTYPES) to make it work? AFAIK, glX functions are implemented in libGL, which is linked into the project. "GLX_SGI_swap_control" is not among the listed extensions. May that be the cause?
You mean "glxext.h", right? GLX_SGI_swap_control should be listed under the client glX extensions if it is supported. If not, try using GLX_MESA_swap_control instead.
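One way to avoid the link-time dependency entirely is to resolve the entry point at runtime, falling back to the MESA variant; a sketch (assuming a live `Display *` and screen number):

```c
/* Sketch: look up a swap-interval function at runtime with
   glXGetProcAddressARB instead of linking it directly, checking
   the extension string first and falling back to the MESA variant. */
#include <GL/glx.h>
#include <string.h>

typedef int (*SwapIntervalProc)(int);

SwapIntervalProc get_swap_interval(Display *dpy, int screen)
{
    const char *ext = glXQueryExtensionsString(dpy, screen);
    if (ext && strstr(ext, "GLX_SGI_swap_control"))
        return (SwapIntervalProc)glXGetProcAddressARB(
            (const GLubyte *)"glXSwapIntervalSGI");
    if (ext && strstr(ext, "GLX_MESA_swap_control"))
        return (SwapIntervalProc)glXGetProcAddressARB(
            (const GLubyte *)"glXSwapIntervalMESA");
    return NULL;  /* no swap-control extension advertised */
}
```

Calling the returned pointer with 1 then enables vsync, e.g. `SwapIntervalProc f = get_swap_interval(dpy, screen); if (f) f(1);`.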