blur effect

Hello,

I am trying to make a Gaussian blur effect. Here is the result. I use a 13x13 Gaussian kernel, separated into horizontal and vertical passes.
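For reference, the weights for such a separable kernel can be generated once on the CPU. This is only a minimal sketch, not code from the original post; the choice of sigma = 2.0 is an arbitrary assumption and should be tuned.

```c
#include <math.h>
#include <stdio.h>

#define TAPS 13

/* Build normalized weights for a 13-tap Gaussian kernel.
   sigma = 2.0 here is an assumed value, not from the original post. */
static void build_gaussian(float weights[TAPS], float sigma)
{
    float sum = 0.0f;
    int center = TAPS / 2;
    for (int i = 0; i < TAPS; ++i) {
        float x = (float)(i - center);
        weights[i] = expf(-(x * x) / (2.0f * sigma * sigma));
        sum += weights[i];
    }
    /* Normalize so the blur neither darkens nor brightens the image. */
    for (int i = 0; i < TAPS; ++i)
        weights[i] /= sum;
}

int main(void)
{
    float w[TAPS];
    build_gaussian(w, 2.0f);
    for (int i = 0; i < TAPS; ++i)
        printf("%+d: %f\n", i - TAPS / 2, w[i]);
    return 0;
}
```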

The problem is that the edges of the blurred object are jagged: I can see squares of different colors along the edges. I need smooth edges.

Here is how I make the blur effect (a code sketch of these passes follows below):

  1. Downsample the objects that need to be blurred from the original 512x512 texture to a 128x128 image and do a horizontal blur.

  2. Copy the result image to another 128x128 image and do a vertical blur.

  3. Copy the result image to the first 128x128 image and do a horizontal blur.

  4. Copy the result image to the second 128x128 image and do a final vertical blur.

The image after all four blur passes is here.
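For what it's worth, the pass structure described above might look roughly like this in code. This is only a sketch under assumptions: it assumes the copies are done with glCopyTexSubImage2D from the current framebuffer, and drawQuadWithBlurShader is a hypothetical helper that draws a screen-aligned quad over the 128x128 viewport with the horizontal or vertical blur shader bound.

```c
#include <GL/gl.h>

/* Hypothetical helper: draws a full-viewport quad sampling srcTex with
   either the horizontal (1) or vertical (0) blur shader bound. */
extern void drawQuadWithBlurShader(GLuint srcTex, int horizontal);

/* texA and texB are two pre-allocated 128x128 textures used as
   ping-pong targets; sceneTex is the original 512x512 scene texture. */
void blurPasses(GLuint texA, GLuint texB, GLuint sceneTex)
{
    glViewport(0, 0, 128, 128);

    /* 1. Downsample + horizontal blur, result copied into texA. */
    drawQuadWithBlurShader(sceneTex, 1);
    glBindTexture(GL_TEXTURE_2D, texA);
    glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, 128, 128);

    /* 2. Vertical blur, result copied into texB. */
    drawQuadWithBlurShader(texA, 0);
    glBindTexture(GL_TEXTURE_2D, texB);
    glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, 128, 128);

    /* 3. Second horizontal blur, back into texA. */
    drawQuadWithBlurShader(texB, 1);
    glBindTexture(GL_TEXTURE_2D, texA);
    glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, 128, 128);

    /* 4. Final vertical blur, into texB, which holds the result. */
    drawQuadWithBlurShader(texA, 0);
    glBindTexture(GL_TEXTURE_2D, texB);
    glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, 128, 128);
}
```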

Maybe I could get a better result by using a two-dimensional kernel rather than separate horizontal and vertical kernels, but that would be much less efficient (169 texture reads per pixel for a 13x13 kernel versus 13 + 13 = 26 for the separable version). Any suggestions?

Doing multiple passes should be OK. What filtering are you using when you scale up the image? NEAREST would produce the 4x4 pixel blocks you see (128x128 magnified to 512x512 is a 4x upscale). Try setting it to LINEAR and see what happens.
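For example, with blurTex standing in for the 128x128 result texture (the name is just for illustration):

```c
/* Ask for bilinear filtering on the blur texture so the 4x
   magnification back to 512x512 is smoothed instead of blocky. */
glBindTexture(GL_TEXTURE_2D, blurTex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
```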

The problem is the resolution of the blur: when you go to a 128x128 image, you then go back up to screen resolution, and you're using a nearest magnification filter for the resize back up.

You should use GL_LINEAR to get bilinear filtering from the texture hardware during the magnified fill. This will eliminate the sampling artifacts you see by applying an appropriate reconstruction filter.

P.S. Yep, just as the previous poster said…

All my textures already have GL_LINEAR for both magnification and minification. My video card is an ATI Mobility 9600. Don't know if that matters.

FLOAT16 formats are not filtered on any ATI GPU (except for the 2900XT, I suppose).
Judging by the image (some rows and columns are doubled), it seems like you do have GL_NEAREST behavior on that texture.

k_szczech, are you saying that even if I specify GL_LINEAR in my program, it still uses GL_NEAREST anyway?

I tried the program on three different machines, all with the newest drivers.

Nvidia Quadro FX 3000 -> No floating-point texture support (GL_ARB_texture_float).

ATI Radeon 9700 pro -> Same jagged edges.

ATI Radeon 9800 pro -> Same jagged edges.

k_szczech, are you saying that even if I specify GL_LINEAR in my program, it still uses GL_NEAREST anyway?
Yes.

Yes. Only GeForce 6 (and above) can filter FLOAT16 textures.

In my HDR implementation my final texture (the one I overlay on the screen) is RGB8. Perhaps you can use the same approach here.
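A sketch of that idea, assuming the blurred FLOAT16 result is currently in the framebuffer: copy it into an RGB8 texture, which this class of hardware can filter with GL_LINEAR, and overlay that texture instead. The function and texture names are made up for illustration.

```c
#include <GL/gl.h>

/* Copy the current 128x128 framebuffer contents into an RGB8 texture
   and return it; RGB8 is filterable even where FLOAT16 is not. */
GLuint makeFilterableCopy(void)
{
    GLuint ldrTex;
    glGenTextures(1, &ldrTex);
    glBindTexture(GL_TEXTURE_2D, ldrTex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, 0, 0, 128, 128, 0);
    return ldrTex;
}
```

Note the tradeoff: an RGB8 copy clamps values to [0, 1], so any HDR range has to be tone-mapped before this point.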