Render bugs after recreating FBOs



Ident
12-28-2010, 09:00 PM
Hello,

In my application I render to FBOs with multisampling, which works fine so far. Now I'm trying to extend my application so that the sample count can be changed at run-time.

However, I've run into a weird bug that I cannot trace to any source. I get no OpenGL errors, gDEBugger shows no warnings, and the FBO data shown in the debugger is fine and correct.

I create my framebuffers and renderbuffers once in my window initialization function and then fill and clear them every render step. This works great.

Now if I delete the buffers (glDeleteRenderbuffers, glDeleteFramebuffers) during rendering and recreate them exactly as I created them the first time, I get very strange results on all subsequent render refreshes: it looks like alpha blending no longer works, and the textures also mix with the vertex colours in a different way.

However, if I delete and recreate them at the end of my initialization function, rendering works correctly. I also tried recreating the FBOs near the start of the very first call of my render function, again with the result of getting the artefacts (but only on that first render call; from then on it seems to work fine).

I'm very puzzled. What on earth is going on? I checked all the IDs that I recreate and they were all fine; at all times the data and IDs on the GPU also look fine.

Does anyone have any ideas?
Do I have to do something about the previously called glRenderbufferStorageMultisample and glFramebufferRenderbuffer when cleaning up?

Please help, I've been sitting at this for hours now and I'm making no progress, not even after stepping through the recreation in my app line by line...
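For concreteness, the create/delete cycle described above looks roughly like this. This is a minimal sketch, not the poster's actual code: the function and variable names are illustrative, and a current OpenGL 3.x context with GLEW initialized is assumed.

```c
#include <stdio.h>
#include <GL/glew.h>

/* Illustrative globals; a real app would wrap these in a struct. */
static GLuint fbo, colorRb, depthRb;

static void createMsaaFbo(int width, int height, int samples)
{
    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);

    /* Multisampled colour attachment. */
    glGenRenderbuffers(1, &colorRb);
    glBindRenderbuffer(GL_RENDERBUFFER, colorRb);
    glRenderbufferStorageMultisample(GL_RENDERBUFFER, samples,
                                     GL_RGBA8, width, height);
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                              GL_RENDERBUFFER, colorRb);

    /* Multisampled depth attachment (same sample count is required). */
    glGenRenderbuffers(1, &depthRb);
    glBindRenderbuffer(GL_RENDERBUFFER, depthRb);
    glRenderbufferStorageMultisample(GL_RENDERBUFFER, samples,
                                     GL_DEPTH_COMPONENT24, width, height);
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                              GL_RENDERBUFFER, depthRb);

    if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
        fprintf(stderr, "FBO incomplete\n");
    glBindFramebuffer(GL_FRAMEBUFFER, 0);
}

static void destroyMsaaFbo(void)
{
    /* Unbind first so no soon-to-be-deleted object stays bound. */
    glBindFramebuffer(GL_FRAMEBUFFER, 0);
    glDeleteRenderbuffers(1, &colorRb);
    glDeleteRenderbuffers(1, &depthRb);
    glDeleteFramebuffers(1, &fbo);
}
```

Changing the sample count would then amount to calling destroyMsaaFbo() followed by createMsaaFbo() with the new value, which is the pattern the poster describes.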

Dark Photon
12-28-2010, 10:08 PM
Now if I delete the buffers (glDeleteRenderbuffers, glDeleteFramebuffers) during rendering and recreate them exactly as I created them the first time, I get very strange results on all subsequent render refreshes: it looks like alpha blending no longer works, and the textures also mix with the vertex colours in a different way.
This is not generally good practice, as deleting and recreating these objects can be expensive.

However, it should still work.

For testing, try adding a glFinish() call right before you do the deletion, and another right after you finish the recreation. This forces the pipeline to complete and drain at those stages. I don't think this should be necessary for correctness, only performance, but since you're dynamically deleting and recreating FBOs at render time you probably don't care about that, and it may give you a lead on the problem.
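In code, the diagnostic suggested above would look something like the fragment below. Here destroy_fbos() and create_fbos() are hypothetical placeholders standing in for whatever teardown and setup code the application already has; only the two glFinish() calls are the actual suggestion.

```c
/* Fence the delete/recreate cycle with glFinish() for debugging.
 * destroy_fbos()/create_fbos() are placeholders for the app's own
 * FBO teardown and setup routines. */
glFinish();                           /* drain all pending GL commands that
                                         may still reference the old FBO  */
destroy_fbos();                       /* glDeleteRenderbuffers /
                                         glDeleteFramebuffers             */
create_fbos(width, height, samples);  /* rebuild with the new sample count */
glFinish();                           /* make sure the recreation is fully
                                         processed before rendering again  */
```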

If all else fails, post a short GLUT test program here for everyone to try. It may turn out to be a driver bug, and AMD and NVIDIA are pretty responsive about fixing bugs when you post a test case.

Ident
12-29-2010, 07:50 AM
Thanks for your post Dark Photon.

I wonder why you say deleting and recreating these objects can be expensive - I don't do this continually in my normal application; I have a GUI with a dropdown box and will only do it on request. Also, instead of reconfiguring the bindings and calling the MSAA functions again, the OpenGL wiki says you SHOULD delete and recreate the buffers, according to the "note" section here: Clicky (http://www.opengl.org/wiki/Renderbuffer_Object).
Or maybe I misunderstood this - could you tell me how it was meant?


I added glFinish() before deletion and after creation. No change at all.

Additional info:
I use OpenGLUT and one of the latest versions of GLEW as libraries. I develop in VS2008, but I guess that makes no difference - release and debug builds also produce the same render bugs.
My OpenGL context is an OpenGL 3.3 compatibility profile. At the moment I mix 3.3 and 2.x rendering code, but commenting out the old code made no difference.

I have an NVIDIA graphics card (Quadro FX 1800M); the driver is 259.something. I've experienced regular NVIDIA-driver-related BSODs since I got this card. That should be unrelated to the rendering bugs, but maybe it's a hint at flaky drivers, I don't know.


I want to stress: deleting and recreating works FINE if I do it after initialization and before my "glutRenderFunc". But no matter where inside my glutRenderFunc I recreate the buffers, it [censored] up - sometimes only in the first render step, sometimes ongoing.

This behaviour seems completely inconsistent to me. I don't understand what's going on or why.


Although it's somewhat of a pain to do in my leisure time, I could write a test case - is it fine if I use the same libraries for it (OpenGLUT, GLEW)? I'd also be very thankful if we found a fix without a test case :D

Alfonse Reinheart
12-29-2010, 08:05 AM
I wonder why you say deleting and recreating these objects can be expensive - I don't do this continually in my normal application; I have a GUI with a dropdown box and will only do it on request. Also, instead of reconfiguring the bindings and calling the MSAA functions again, the OpenGL wiki says you SHOULD delete and recreate the buffers, according to Clicky under "note".
Or maybe I misunderstood this - could you tell me how it was meant?

I think you're misunderstanding his advice. He's not saying that you should call glRenderbufferStorage again for the same object. He's saying that allocating new renderbuffer storage in the middle of the application can cause performance problems. It takes time for the driver to allocate a portion of GPU memory. It can also cause GPU memory fragmentation. Whether you actually delete the object and create a new one, or simply respecify the storage for it, you still incur this potential performance hit.
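The alternative being contrasted here, respecifying storage on the existing object rather than deleting and recreating it, can be sketched as below. The variable names and values are illustrative, not from the thread; the point is that this path incurs a fresh GPU allocation just as delete-and-recreate does.

```c
/* Respecify storage on an EXISTING renderbuffer instead of deleting
 * and recreating it. colorRb, width, height and newSamples are
 * illustrative names; a current GL context is assumed. */
glBindRenderbuffer(GL_RENDERBUFFER, colorRb);
glRenderbufferStorageMultisample(GL_RENDERBUFFER, newSamples,
                                 GL_RGBA8, width, height);
/* The FBO attachment still points at colorRb, so it need not be
 * re-attached - but the driver still allocates new GPU storage,
 * which is the performance cost being discussed. */
```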

But as you say, for your application's needs, it may not be a problem.

Ident
12-29-2010, 08:32 AM
The bug is of course more important in this thread - but what would be your recommended way of changing the MSAA sample count at run-time?

Alfonse Reinheart
12-29-2010, 12:14 PM
what would be your recommended way of changing the MSAA sample count at run-time?

There is nothing technically wrong with what you're doing. As long as you're not doing this frequently, it should be relatively OK.

Ident
12-29-2010, 12:27 PM
what would be your recommended way of changing the MSAA sample count at run-time?

There is nothing technically wrong with what you're doing. As long as you're not doing this frequently, it should be relatively OK.

Thanks, good to know - so this can't be the source of the problem either...

If anybody has a clue about my problem, or an idea what else I could check or try, I'm still up for it. Thanks.

Otherwise I'll create a test case in the next few days; maybe someone could then try it on their machine to see if the problem is local to me or restricted to certain drivers...

Ident
09-20-2011, 12:52 PM
Hi, I recently got back to this problem and solved it. I want to report on it in case someone stumbles upon this thread in the future.

The problem wasn't my code; it was somewhere in the OpenGLUT (or freeglut?) version I used. Now I use a different library with the same code, together with GLEW, and it works fine :) No driver problems, just a bad GLUT.

Topic closed