FBO: Offscreen rendering & Onscreen display

Hello,

I am currently working on a video application. The scene is rendered on the graphics card, and the rendered data is then transferred to a dedicated video I/O card that outputs SDI signals. To keep the video output independent of the display, the scene is rendered offscreen.
For previewing, I would also like to see the rendered scene on my computer display.

The necessary data flow is as follows:

render scene (offscreen) -> load into system RAM -> copy to the video card -> output SDI signal

This works fine in a test environment with pbuffers. Since the “real” renderer uses FBOs, I want to use them for offscreen rendering as well.

I found several tutorials on rendering offscreen into an FBO-attached texture and using that texture onscreen, but I am confused about how to get the offscreen data onto my computer screen.

  1. Is it possible to render offscreen into an FBO and copy the data into system RAM (to pass on to my video card)?

  2. Is it necessary to re-render the created texture onscreen, or is it possible to copy the data from offscreen graphics RAM to the back buffer of the “normal” framebuffer?

  3. If not, do I have to render offscreen into a texture, map it onto an onscreen quad, and render again? Or which technique is preferred?

Thank you for your reply.

Chris

Technically, using FBOs is the same as using pbuffers or the window framebuffer: you can use glReadPixels, glDrawPixels, and glCopyPixels as much as you wish. With the GL_EXT_framebuffer_blit extension it is possible to copy the contents of a framebuffer directly to a window (or to another framebuffer), but this extension is currently only supported by NVIDIA. The best (most platform-friendly) way is to render to a texture and map it onto a fullscreen quad.
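In case it helps, here is a rough sketch of that render-to-texture approach (untested; `fbo`, `colorTex`, `width`, `height`, `windowWidth`, `windowHeight`, and `drawScene` are placeholder names, and an already-created FBO with a color texture attachment is assumed):

```c
/* Sketch: render offscreen into an FBO-attached texture, then draw that
 * texture on a fullscreen quad in the window's default framebuffer.
 * Assumes a current GL context with GL_EXT_framebuffer_object. */

/* 1. Render the scene into the FBO. */
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
glViewport(0, 0, width, height);
drawScene();                           /* whatever renders your scene */

/* 2. Switch back to the window's framebuffer. */
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);
glViewport(0, 0, windowWidth, windowHeight);

/* 3. Draw the FBO's color texture on a fullscreen quad
 *    (identity modelview/projection assumed, so the quad
 *    spans clip space from -1 to 1). */
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, colorTex);
glBegin(GL_QUADS);
    glTexCoord2f(0.0f, 0.0f); glVertex2f(-1.0f, -1.0f);
    glTexCoord2f(1.0f, 0.0f); glVertex2f( 1.0f, -1.0f);
    glTexCoord2f(1.0f, 1.0f); glVertex2f( 1.0f,  1.0f);
    glTexCoord2f(0.0f, 1.0f); glVertex2f(-1.0f,  1.0f);
glEnd();
```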

Hi Zengar,
thank you for your reply.
I have a few additional questions about FBOs:

glBindFramebufferEXT( GL_FRAMEBUFFER_EXT, g_frameBuffer );
glFramebufferTexture2DEXT( GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT, GL_TEXTURE_2D, g_dynamicTextureID, 0 );

How can I access the buffer?

glReadBuffer(GL_COLOR_ATTACHMENT0_EXT);
glReadPixels(0, 0, RENDERBUFFER_WIDTH, RENDERBUFFER_HEIGHT,GL_RGB,GL_UNSIGNED_BYTE,pdata);

Would that be the right way to access the data?
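Putting it together, I imagine the complete readback would look roughly like this (I added GL_PACK_ALIGNMENT because RGB rows are not generally 4-byte aligned; `pdata` is assumed to be large enough):

```c
/* Sketch: read the FBO's color attachment back into system RAM.
 * g_frameBuffer and the attached texture are set up as above; pdata must
 * point to at least RENDERBUFFER_WIDTH * RENDERBUFFER_HEIGHT * 3 bytes. */
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, g_frameBuffer);
glReadBuffer(GL_COLOR_ATTACHMENT0_EXT);
glPixelStorei(GL_PACK_ALIGNMENT, 1);   /* RGB rows may not be 4-byte aligned */
glReadPixels(0, 0, RENDERBUFFER_WIDTH, RENDERBUFFER_HEIGHT,
             GL_RGB, GL_UNSIGNED_BYTE, pdata);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);  /* back to the window buffer */
```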

Can you give me more information about the blit extension? I found only this site http://msi.unilim.fr/~porquet/glexts/GL_EXT_framebuffer_blit.txt.html , but without any examples or implementation. How does this work?

Thank you!

Chris

I guess yes. I haven’t used it myself, so I can’t tell you anything more concrete (I haven’t programmed 3D for several years now, but I am going to start again soon) :slight_smile:

No one can give you much information about the blit extension yet, because it is very new. I thought the spec had some examples at the end of the document? It seems pretty straightforward to me: you set your offscreen FBO as the read framebuffer and your window as the draw framebuffer, then call either CopyPixels or the blit function…
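From my reading of the spec, I’d expect it to look roughly like this (untested, since I don’t have the hardware; `fbo`, `width`, and `height` are placeholders):

```c
/* Sketch: copy an offscreen FBO to the window via GL_EXT_framebuffer_blit.
 * Assumes the extension is supported and the FBO holds the rendered scene. */
glBindFramebufferEXT(GL_READ_FRAMEBUFFER_EXT, fbo);  /* read from the FBO  */
glBindFramebufferEXT(GL_DRAW_FRAMEBUFFER_EXT, 0);    /* draw to the window */
glBlitFramebufferEXT(0, 0, width, height,            /* source rectangle   */
                     0, 0, width, height,            /* dest rectangle     */
                     GL_COLOR_BUFFER_BIT, GL_NEAREST);
```

If the source and destination rectangles differ in size, the last parameter (GL_NEAREST or GL_LINEAR) controls how the copy is filtered.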

p.s. try it yourself and tell us the results :slight_smile:

p.p.s. The blit extension is only present on NVIDIA cards starting with the GeForce FX and the latest 97.46 (or similar) drivers…

Here is the problem: I have a Quadro FX 3500 with the latest driver installed (91.36 on Windows XP).

This driver (or the card itself) does not support the blit extension.

I think I will try out the “texture” technique…

Thank you for all your answers

Thanks! I’ll get back to you when I have another question :slight_smile:

Bye Chris

I am afraid you’ll have to use beta drivers (if that is possible for you) to get the blit extension. Link: http://downloads.guru3d.com/download.php?det=1547

The render-to-texture approach is still more “compatible”.

All right, feel free to do that :slight_smile:

Taras