Overlay or blitting or something like that....

Hello,
I am making a 2D real-time application that displays what I call layers. Some layers are static (drawn once), some are dynamic layers that are redrawn every frame, and some are dynamic layers that are redrawn only when an observed piece of data changes.
What I want to avoid is redrawing my dynamic layers every time. I would like to draw them once into a buffer and then blit that buffer to the frame buffer every tick.
I use GLUT and I found these functions: glutEstablishOverlay, glutUseLayer, glutShowOverlay, etc. Should I use these functions for this purpose? It seems that with these functions there is just one overlay plane. Is that right, or can I use several buffers and then blit them? Would this technique be more efficient than display lists?
Is there a way to do what I want using only OpenGL (not GLUT or any other windowing system)?
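For reference, by "display lists" I mean the standard glNewList/glCallList mechanism, roughly like this:

    /* Record the layer once at init time... */
    GLuint staticLayer = glGenLists(1);
    glNewList(staticLayer, GL_COMPILE);
    /* ... glBegin()/glVertex() calls for the layer ... */
    glEndList();

    /* ...then replay it every tick. */
    glCallList(staticLayer);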

Thanks in advance

If you really have a lot of full-screen layers, it may be useful to use a pbuffer to store the precomposited static layers; otherwise, just draw everything every frame and it should be fine.

In your case you don’t have a lot of geometry (100 planes are just 200 triangles), so you can draw it in immediate mode (glBegin(), glVertex(), etc.). Be sure to call glTexImage2D just once per static texture, and use glBindTexture to switch between textures.
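Something like this, to be concrete (NUM_LAYERS, the sizes and the pixels[] array are just placeholders):

    GLuint layerTex[NUM_LAYERS];
    int i;

    /* Init: upload each static texture exactly once with glTexImage2D. */
    glGenTextures(NUM_LAYERS, layerTex);
    for (i = 0; i < NUM_LAYERS; ++i) {
        glBindTexture(GL_TEXTURE_2D, layerTex[i]);
        /* Without mipmaps, the min filter must be changed from its
           mipmapping default or the texture is incomplete. */
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 256, 256, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, pixels[i]);
    }

    /* Per frame: only switch bindings, never re-upload. */
    glBindTexture(GL_TEXTURE_2D, layerTex[k]);
    /* ... draw the quad for layer k ... */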

I don’t advise you to use pbuffers to start with, because they are both complex and not portable. An extension seems to be in the works to fix these problems. Meanwhile, I have put up a web page explaining how I managed to do it, with full source code: http://www.chez.com/dedebuffer/
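For what it is worth, a portable way to cache a rendered layer without any pbuffer extension is to draw it once into the (not yet swapped) back buffer and copy it into a texture with the standard glCopyTexImage2D; a rough sketch, assuming cacheTex was created beforehand:

    /* Draw the layer normally, then grab it from the back buffer.
       Note: on this class of hardware, width/height must be powers of two. */
    glBindTexture(GL_TEXTURE_2D, cacheTex);
    glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 0, 0, 256, 256, 0);

    /* On later ticks, just draw a quad textured with cacheTex
       until the layer's data changes again. */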

Edit: by the way, to display your “planes”, use textured quads, not glDrawPixels.
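E.g. something along these lines (positions and texture name are placeholders):

    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, layerTex);
    glBegin(GL_QUADS);
        glTexCoord2f(0.0f, 0.0f); glVertex2f(x,     y);
        glTexCoord2f(1.0f, 0.0f); glVertex2f(x + w, y);
        glTexCoord2f(1.0f, 1.0f); glVertex2f(x + w, y + h);
        glTexCoord2f(0.0f, 1.0f); glVertex2f(x,     y + h);
    glEnd();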

Thanks a lot for this advice, ZbuffeR!

But I forgot to give an important detail: I am making a tool that generates an OpenGL application that will be embedded in a real-time system. The system’s CPU must execute state machines to process input data and must also handle the display of my layers, so the display application I generate has to spare the CPU as much as possible. That’s why I wondered whether, by storing my dynamic layers and then blitting them, I could save CPU time (assuming there is hardware acceleration).
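In pseudocode, each tick should ideally look like this (the helper names are made up; the caching part is what I am asking about):

    void display_tick(void)
    {
        draw_static_layers();          /* drawn once, replayed cheaply */

        if (observed_data_changed) {   /* dirty flag set by the state machines */
            redraw_dynamic_layers_into_cache();
            observed_data_changed = 0;
        }
        blit_cached_dynamic_layers();  /* ideally hardware-accelerated */

        glutSwapBuffers();
    }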

Any ideas or advice would be much appreciated!
Thanks