Video feedback with buffers

Hi!

I am trying to simulate a video feedback loop using buffers. Physically this is done by connecting a video camera to a TV so that the TV shows what the camera is recording, and then pointing the camera directly at the TV. In this way a loop is created, like an iterated system, and spectacular images can appear.

I have already done this using the accumulation buffer and render-to-texture via glCopyTexImage2D. The content of the accum buffer is returned to the screen, the screen is copied into a texture, the texture is applied and rendered, and finally the result is loaded back into the accum buffer. Then the loop starts again.
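
Roughly, one iteration of that loop looks like the sketch below (simplified; drawTexturedQuad() is just a placeholder for however the quad actually gets drawn, and I am assuming SDL 1.2's SDL_GL_SwapBuffers() for the swap):

    // One iteration of the accum-buffer feedback loop described above.
    void feedbackFrameAccum(GLuint tex, int width, int height)
    {
        // 1. Return the previous frame stored in the accumulation buffer
        glAccum(GL_RETURN, 1.0f);

        // 2. Copy the screen into the texture
        glBindTexture(GL_TEXTURE_2D, tex);
        glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 0, 0, width, height, 0);

        // 3. Draw the texture (possibly zoomed / rotated / colour-shifted)
        drawTexturedQuad(tex);   // placeholder for the actual quad drawing

        // 4. Load the result back into the accumulation buffer
        glAccum(GL_LOAD, 1.0f);

        SDL_GL_SwapBuffers();
    }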

Looking for better performance, I learned a little about framebuffer objects (FBOs) and how textures can be attached to them for off-screen rendering to textures. The new design is like this: two framebuffers and two textures are created and attached to each other, so we have fbo[0] -> tex[0] and fbo[1] -> tex[1]. In the render loop, we first bind fbo[0] and draw the content of tex[1] into it; then we bind fbo[1] and draw the content of tex[0] into it. In that way a rendering loop is created.
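
Here is a minimal sketch of that setup, assuming the core FBO entry points (the EXT variants would be analogous), RGB textures, and again a placeholder drawTexturedQuad():

    GLuint fbo[2], tex[2];

    // Create the two textures and attach one to each framebuffer object.
    void initPingPong(int width, int height)
    {
        glGenTextures(2, tex);
        glGenFramebuffers(2, fbo);
        for (int i = 0; i < 2; ++i) {
            glBindTexture(GL_TEXTURE_2D, tex[i]);
            glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0,
                         GL_RGB, GL_UNSIGNED_BYTE, NULL);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

            glBindFramebuffer(GL_FRAMEBUFFER, fbo[i]);
            glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                                   GL_TEXTURE_2D, tex[i], 0);
        }
        glBindFramebuffer(GL_FRAMEBUFFER, 0);
    }

    // One iteration: draw the previous frame's texture into the other FBO,
    // then show the newest texture on the default framebuffer.
    void feedbackFrameFBO(int frame)
    {
        int src = frame % 2;        // texture holding the previous frame
        int dst = (frame + 1) % 2;  // framebuffer to render into

        glBindFramebuffer(GL_FRAMEBUFFER, fbo[dst]);
        drawTexturedQuad(tex[src]);  // placeholder for the actual quad drawing

        glBindFramebuffer(GL_FRAMEBUFFER, 0);
        drawTexturedQuad(tex[dst]);
    }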

But I have not seen any performance improvement from using FBOs compared to the first approach I described. Why? Surely there is a better way to do this. Can you help me with this issue? Any ideas?

I use C++ and SDL under Linux.

Thanks a lot!