Frame rendering freeze -- GLSL in MFC

Hello All,

I am playing an AVI file in an MFC window. I am using GLSL to write a shader that processes each frame before the next frame arrives.

I am able to initialize OpenGL by setting up the rendering context and pixel format, and GLSL by calling glewInit().


CMFC_ProjectView::initOpengl()
{
  // set up the rendering context,
  // i.e. GetDC(...)

  // set the pixel format using wgl functions

  ...

  // set the viewport and window size
  // (similar to glutInit() and glutInitWindowSize())

  // initialize GLEW for GLSL
  glewInit();
  glGenTextures(3, tex);
  glGenFramebuffersEXT(1, &fb);

  // the current MFC device context is the on-screen frame buffer
  glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);
}

Now, the problem: I am using an FBO (framebuffer object) so the shader can manipulate each frame.

Inside the MFC function where I start OpenGL (similar to glutInit()), I generate the texture IDs and the FBO ID.

After this I call glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0); to select the current MFC context (the on-screen frame buffer).

The above gives me valid FBO and texture IDs.

Where I invoke my shader, I do this:


OnPaint()
{
 ...
 if( flag_glsl == TRUE ) 
 {
   glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fb);
   shader();

 }
 ...
}

Now when I run my application, the video freezes on the first frame.

My conjecture was that after the first frame is rendered to the on-screen frame buffer, everything else is rendered to the off-screen buffer via glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fb);, so the next frame in the sequence is never observed.

To overcome this, after returning from the shader code I added the following:

glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);

in the hope of re-selecting the main on-screen frame buffer.


OnPaint()
{
 ...
 if( flag_glsl == TRUE ) 
 {
   glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fb);
   shader();

   //reset to on-screen buffer
   glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);
 }
 ...
}


But when I run the application, after the first frame of my video the subsequent frames disappear!

Does anybody have a clue on what is going wrong?

Help & suggestions appreciated.

Does glGetError() return any errors?

Zero in both cases, i.e. GL_NO_ERROR:


OnPaint()
{
  ...
  GLenum err = GL_NO_ERROR;
  glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fb);
  err = glGetError();

  // shader function call
  ...

  err = GL_NO_ERROR;  // reset err value
  glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);  // switch back to the on-screen buffer
  err = glGetError();
  ...
}

I referred to "Platform specifics: Windows" on the OpenGL Wiki, and I am destroying my framebuffer and textures in OnDestroy().

The link mentions using wglMakeCurrent() to set up the context (ref: wglMakeCurrent function (wingdi.h) - Win32 apps | Microsoft Learn).

For my case, do I have to reset wglMakeCurrent() to achieve the effect of glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0) [i.e. switching back to the on-screen buffer so that the next frame is rendered to the MFC window]?

If so, I do not know how to relate glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0) to wglMakeCurrent() regarding arguments etc.

A pointer or short snippet would be appreciated.

Forgot to mention; the following may be helpful:
I am on Vista 32-bit, GLSL 4.0, NVIDIA 8800 GS.

For reference, check chapter 19 of the SuperBible book here:
http://www.opengl.org/sdk/docs/books/SuperBible/
(Just as a sanity check for the MFC part of it…)

Another good read on MFC and GL:
http://my.safaribooksonline.com/9780321620491

For the framebuffer end of it, be sure you’ve set your viewport, scissor, etc. when switching between render targets and views. Otherwise it should just work, IMO.

Thank you Brolingstanz for the time spent and for the links. Your advice regarding the viewport setting gave me the clue to the error: the "viewport" setting was indeed the cause.

Yes. Inside my shader function, in between the glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, <id>) calls:


OnPaint()
{
  ...
  GLenum err = GL_NO_ERROR;
  glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fb);
  err = glGetError();

  // shader function call
  foo();
  ...

  err = GL_NO_ERROR;  // reset err value
  glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);  // switch back to the on-screen buffer
  err = glGetError();
  ...
}


I.e. inside foo(), I change the viewport size according to the texture dimensions (the stream converted to a 2D texture).

Then, as I come out, I restore the viewport to its original size. As soon as I do, a yellow rectangle is displayed.

This yellow rectangle is exactly of the same dimensions as the original video frame.

Thank you for the input; it helped me track down the exact cause. Now I am wondering how to get the next frame rendered instead of the yellow rectangle.

Any help in that direction will be hugely appreciated. :)

My new observation.

Referring to my OnPaint(), if I do the following:


OnPaint()
{
  ...
  GLenum err = GL_NO_ERROR;
  glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fb);
  err = glGetError();

  // shader function call
  foo();
  ...

  // restore viewport settings to the original image frame dimensions
  setViewPort(w, h);

  err = GL_NO_ERROR;  // reset err value
  glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);  // switch back to the on-screen buffer
  err = glGetError();
  ...
}

and inside foo()


  // change viewport settings w.r.t. the texture size used by the shader for read/write operations
  setViewPort(Tex_W, Tex_H);

  setTexture(tex_id, tex_target, type, ...);

  // call shader
  SetShader();

  // retrieve results
  retrieve(data, buffer);  // uses glReadPixels() etc.

Now, when I disable the SetShader() call in foo(), the screen shows the video frames, but on a yellow background. Otherwise I get only the yellow rectangle.

Ques 1. What could be the reason behind this?


Ques 2. My explanation is that when an off-screen frame buffer is bound as the current frame buffer and SetShader() is called, the image data is replaced by the processed non-color pixel data.

P.S.: I am attaching textures to process non-color data of each pixel, such as positions etc.

Am I right in the above assertion?


Ques 3. When an image is rendered, the data goes to the on-screen frame buffer's color buffer. When I use an FBO and attach 4 color buffer attachments to it, is my main (on-screen) frame buffer's image (color) data held in COLOR_ATTACHMENT1, COLOR_ATTACHMENT2, or what?

Hoping for the best. Thanks, all.

  1. Could be a bug in your shader or your viewport setup.

  2. If SetShader actually means RunShader, then yes, the image data will be replaced by the results of the shader.

  3. The main buffer is FBO #0. It doesn’t have anything to do with color attachments.

If you want to output to the main buffer, just call glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0).

Thank you Stephen for replying.


#1. My viewport settings are changed and then restored according to the original frame dimensions, and that works fine. I confirmed this by changing and restoring the viewport settings while disabling the call to SetShader() [texture setup, binding and the actual shader code reside there].

Result: an uninterrupted supply of video frames.


#3. I see. I am calling glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0) after my shader processes a frame.


By #2, my assessment is correct, and I have confirmed it.

But now the problem is that I want to process each image frame with the shader program without disturbing the frame data being displayed.

In other words, I want a continuous display of my video frames with simultaneous computation on each frame via the shader program.

I had thought of the following steps:
a) Bind an FBO with 4 texture color attachments.
a) Bind a FBO with 4 texture color attachments.

b) Attachment #1 will store current video frame pixel color data.

c) Attachment #2 has non-color data that is less than the total number of pixels in current image frame.

d) Attachments #3 and #4 are to be used for processing and temporary storage (i.e. ping-ponging etc.).

But the hurdle with the above is that the FBO requires all texture color attachments to have the same dimensions, and the dimensions of attachment #1 are greater than those of #2.

Q1. Do you suggest the above, or some better way?


I came across the following thread, which was a little helpful, but I am unable to understand whether it applies to MFC with a PBO or whether an FBO can be used.
Ref:
http://www.opengl.org/discussion_boards/…true#Post257439

The thread talks about a multi-threading concept where multiple RCs are attached to a single DC using wglShareLists(), which I have never used.

Q2. Will the solution given therein help me achieve my objective?

Please comment.

I am using EXT_FBO, which places the restrictions that all attached texture images must have the same dimensions and that all images attached to color attachments must have the same internal formats.

To solve the problem stated in my previous posts, I am aiming to capture the video frame/stream data and process it side by side using the shader program.

Please correct me if following approach is wrong:

I take one FBO, bind 3 color attachments to 3 texture objects, and bind one attachment to a renderbuffer object.

I then capture/render my video frame directly to the renderbuffer object and use the remaining 3 attached texture objects for processing by the shader code.

Will this approach succeed, given the restriction placed by EXT_FBO that all attachments must have the same dimensions?

I am stressing attachments and dimensions because the EXT_FBO spec specifies them.

If the above is true, then we could emulate ARB_FBO using EXT_FBO, as the former has a provision for differently sized attached textures.

Note: here the 3 attached textures have the same format and size. Only the renderbuffer object has different dimensions.

Please comment.

I think I got my answer after reading the spec again. The answer should be no, because a renderbuffer is also an attached image, and the spec states that all attached images must have the same dimensions.

Solved.
