I have an app with a user interface, and I want to run a background thread that continuously renders to a Framebuffer Object (FBO).
After each rendered frame, I want to display (different parts of) the rendered texture in one or more controls in my UI.
Creating a render thread is not a problem (I make sure the context is current in only one thread at a time).
But how do I synchronize so that, once the background thread has rendered a frame, the OpenGL controls in my UI thread display (different parts of) the rendered texture?
I assume I first need to wait for the GPU to finish rendering, and then somehow let the OpenGL controls in the UI thread draw a control-filling quad sampling the desired region of the rendered texture.
So does anybody know how I should actually approach and synchronize all this?
Okay, so say my render thread has its own window and context.
I render, then I have to wait until the GPU is done using an awkward glFinish (the only thing happening on the main thread is UI work), and then I fire an event notifying the UI thread that a frame has finished. The UI thread binds the texture (in its own context) and draws a textured quad with the desired coordinates to the output control.