
View Full Version : Resolution independent shading



Maire Nicolas
09-07-2010, 12:53 AM
Hello :)
I would like to apply a blur to my scene, but I'm having problems, since the blur result depends directly on the resolution of the scene texture.
I'm currently using glCopyTexImage2D to copy the scene into a texture, but it gives me a texture with the same resolution as the screen.
Is there any other way (PBO, TBO, FBO?) to render the scene to a texture that has its own resolution, regardless of the screen resolution?

strattonbrazil
09-07-2010, 04:19 AM
What do you mean by "scene texture resolution"? I'm not familiar with that term. Do you mean the scene resolution?

If you want to render your scene to a texture of a given size, you can attach a texture of a size unrelated to your window resolution (like 64x64) to an FBO. Assuming you set the viewport to match, you can just render into it and get a 64x64 version of the scene.
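A minimal sketch of that setup, assuming a current GL context and an extension loader providing the FBO entry points (the 64x64 size is just the example above; error handling is elided):

```c
/* Create a 64x64 color texture and attach it to an FBO, so the scene can be
 * rendered at a resolution unrelated to the window's. */
GLuint fbo, tex;

glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 64, 64, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);   /* allocate storage, no data */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, tex, 0);

if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
    /* handle incomplete framebuffer */;

glViewport(0, 0, 64, 64);   /* match the viewport to the attachment size */
/* ... draw the scene: it ends up in `tex` at 64x64 ... */
glBindFramebuffer(GL_FRAMEBUFFER, 0);
```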

BionicBytes
09-07-2010, 04:58 AM
Yes is the simple answer.
You need to create a series of framebuffer objects, each with a colour attachment.
After rendering the scene (presumably also to an offscreen FBO), use that texture as the source and render into one of the FBOs (set up to be, for example, half the screen size). Then use FBO #1 as the source and render into FBO #2 (a quarter of the width, for example). During these render passes you should be using a blur shader, with a 2D orthographic projection and the viewport set to the size of the target framebuffer object.
As a quick alternative to a blur shader, you could use glBlitFramebuffer to copy from FBO #1 to FBO #2, which are different sizes. The blur quality won't be as good, but you'll get the idea of what it's doing.

Maire Nicolas
09-07-2010, 05:24 AM
Thank you for your replies :)

Strattonbrazil : by "scene texture resolution", I mean the resolution of the texture obtained through glCopyTexImage2D

Strattonbrazil & BionicBytes :
If I understand correctly, your idea is to set the viewport to the texture resolution, or to re-render an FBO into a smaller viewport to get the wanted texture resolution.
Is it valid to do the same with glCopyTexImage2D?
I mean :
-Render the scene
-Call to glCopyTexImage2D -> Texture with Screen Resolution
-Set the viewport to the wanted texture resolution
-Render the texture to a screen-aligned quad
-Call to glCopyTexImage2D -> Texture with Wanted Resolution

strattonbrazil
09-07-2010, 08:25 AM
Why are you even changing the resolution? Are you just trying to downsample the image you're going to blur so it's faster? What's your end goal? Do you just want to blur your screen?

* render the scene to FBO #1 at any size (screen-resolution or half or a quarter or whatever)
* render to FBO #2 (of the same size), drawing a viewport-aligned quad with the texture attached to FBO #1 bound. In your shader, you do the blur using texture coordinate lookups
* if you're putting it back on the screen, take the texture bound to FBO #2 and draw it onto the main window.
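The blur pass in the second step can be sketched as a fragment shader. A minimal horizontal pass (GLSL 1.20-era syntax; the uniform names `scene` and `texelWidth` are placeholders, and a second, vertical pass would offset along y instead):

```glsl
uniform sampler2D scene;      // color texture attached to FBO #1
uniform float texelWidth;     // 1.0 / width of that texture

void main()
{
    vec2 uv  = gl_TexCoord[0].st;
    // 9-tap Gaussian: center tap plus 4 symmetric pairs
    vec4 sum = texture2D(scene, uv) * 0.2270270270;
    sum += (texture2D(scene, uv + vec2(1.0 * texelWidth, 0.0))
          + texture2D(scene, uv - vec2(1.0 * texelWidth, 0.0))) * 0.1945945946;
    sum += (texture2D(scene, uv + vec2(2.0 * texelWidth, 0.0))
          + texture2D(scene, uv - vec2(2.0 * texelWidth, 0.0))) * 0.1216216216;
    sum += (texture2D(scene, uv + vec2(3.0 * texelWidth, 0.0))
          + texture2D(scene, uv - vec2(3.0 * texelWidth, 0.0))) * 0.0540540541;
    sum += (texture2D(scene, uv + vec2(4.0 * texelWidth, 0.0))
          + texture2D(scene, uv - vec2(4.0 * texelWidth, 0.0))) * 0.0162162162;
    gl_FragColor = sum;
}
```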

BionicBytes
09-07-2010, 12:58 PM
Yes, you can use glCopyTexImage2D instead of an FBO, as you suggest.

Maire Nicolas
09-08-2010, 04:51 AM
OK, I will stick with glCopyTexImage2D for the moment, but I may try FBOs soon for the speed they offer.

Well, in fact I would like to downsample my texture to do a faster blur on it.
I know I could just ignore the texture resolution, since I use texture coordinates in my shader. But I'm trying to achieve a bloom effect, and I realized that blurring a supposedly 32x32 texture which in reality had a much higher resolution (the same as the screen) gave me unwanted artifacts when gathering the final bloom textures.

strattonbrazil
09-08-2010, 08:46 AM
By the way, this was on the OpenGL front page this morning. Might be useful to you.

http://rastergrid.com/blog/2010/09/efficient-gaussian-blur-with-linear-sampling/

nickels
09-08-2010, 02:57 PM
Yeah, multilevel blur is the way to go. Lots of framebuffers, lots of box and gaussian kernels.
I blur on the way down and then again on the way up. You can see my pipeline:
Maybe something like this will work for you!

http://public.blu.livefilestore.com/y1pBThf-__OOMlZR0RybcLDzAlFG4T4jl1NamAHGP3BozOa8Ur--8R9btVVCMBp6JmO-RZNxHhajzuNkMksGgNYMg/gl_hdr_pipe.jpg

Maire Nicolas
09-09-2010, 08:41 AM
strattonbrazil: Indeed, this article should be useful to help me optimize my shaders, thanks :)

nickels: You're using several images extracted from the scene, downsampled and then blurred, right? Do you use fixed resolutions for them?

nickels
09-09-2010, 01:10 PM
Maire:
I apply a threshold to the image to select what I will bloom, then downsample aggressively using a 4x4 box filter to get quickly to a smaller image. Then I progressively downsample using a Gaussian filter until I get to a fixed image size (32x32 or 64x64, I don't remember). Then I start recombining on the way up, blurring as I go, and finally add the result to the original scene.
The middle layer in the image above is the reduction of the luminance to a single value. The bottom layer is the bloom chain. The top-right image is the final bloom result, which is added to the final scene on the left.
I got a lot of useful info from:
Bungie HDR (http://www.microsoft.com/downloads/en/details.aspx?FamilyId=995B221D-6BBD-4731-AC82-D9524237D486&displaylang=en)

The resolution independence comes from reducing to a fixed size independent of the original buffer size. So for a larger screen resolution I might have more layers of reduction.

Another way to go would be to apply a Gaussian over and over, more times for larger images, but this would be much slower than the multilayer approach!

nickels
09-10-2010, 05:57 AM
In essence it's like solving a diffusion equation, with the basic realization that multigrid is going to be way faster than a single-resolution solve. Although I make no claims about actually precisely solving any diffusion equation or projecting residuals, etc...

Maire Nicolas
09-19-2010, 10:49 AM
Sorry for the late reply :eek:
I do my bloom the same way, I just do not downsample the images initially.
Thank you for the tips and the link :)