Fragment shader question

I blend two polygons with different textures and transparency into the frame buffer. Can I apply some image processing functions after blending (such as inversion or edge detection) using a pixel/fragment shader?

Well, you can use the register combiners (if you have a GeForce card) to blend the textures and then do other math ops on them.

-SirKnight

Originally posted by SW:
I blend two polygons with different textures and transparency into the frame buffer. Can I apply some image processing functions after blending (such as inversion or edge detection) using a pixel/fragment shader?

Yes, but you’ll need to render to a texture instead of the framebuffer. You can’t access the framebuffer from a shader, and the blending operations with the framebuffer are quite limited, but if you’ve rendered the scene to a texture it’s fully accessible in a fragment shader. Take a look at the WGL_ARB_render_texture OpenGL extension.
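For example, once the blended polygons are in a texture, you can draw a full-screen quad with a fragment shader that reads it back and does the image processing. Here’s a minimal sketch (the GLSL source is kept as a C string; the sampler name "scene" is made up, and you still need the usual GL_ARB_fragment_shader compile/link calls) doing a simple inversion:

/* GLSL fragment shader inverting the rendered scene; bind the
   render-to-texture result to the "scene" sampler before drawing. */
const char *invertFS =
    "uniform sampler2D scene;\n"
    "void main()\n"
    "{\n"
    "    vec4 c = texture2D(scene, gl_TexCoord[0].st);\n"
    "    gl_FragColor = vec4(1.0 - c.rgb, c.a);\n"
    "}\n";

Edge detection works the same way, except the shader samples the texture several times with one-texel offsets and combines the results.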

Register combiners can only access the current fragment’s samples from the different texture units.
So what you’ll have to do is bind several texture units to the same texture image, with different texture coordinate offsets, like (0,0), (1,0) and (0,1)
(in texels, so that will actually be 1/texWidth instead of 1, unless you are using non-power-of-2 textures (texture_rectangle extension)).
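Something like this (an untested sketch, assuming GL_ARB_multitexture; sceneTex, texW and texH stand in for your texture object and its size):

#include <GL/gl.h>
#include <GL/glext.h>  /* ARB_multitexture entry points */

/* Emit one quad corner, sampling the same texture at the current texel,
   one texel to the right, and one texel above. */
static void corner(GLfloat u, GLfloat v, GLfloat du, GLfloat dv,
                   GLfloat x, GLfloat y)
{
    glMultiTexCoord2fARB(GL_TEXTURE0_ARB, u,      v);       /* (0,0) */
    glMultiTexCoord2fARB(GL_TEXTURE1_ARB, u + du, v);       /* (1,0) */
    glMultiTexCoord2fARB(GL_TEXTURE2_ARB, u,      v + dv);  /* (0,1) */
    glVertex2f(x, y);
}

static void drawOffsetQuad(GLuint sceneTex, int texW, int texH)
{
    GLfloat du = 1.0f / texW, dv = 1.0f / texH;  /* one texel in tex coords */
    int unit;

    for (unit = 0; unit < 3; ++unit) {           /* same image on all units */
        glActiveTextureARB(GL_TEXTURE0_ARB + unit);
        glBindTexture(GL_TEXTURE_2D, sceneTex);
        glEnable(GL_TEXTURE_2D);
    }

    glBegin(GL_QUADS);
    corner(0.0f, 0.0f, du, dv, -1.0f, -1.0f);
    corner(1.0f, 0.0f, du, dv,  1.0f, -1.0f);
    corner(1.0f, 1.0f, du, dv,  1.0f,  1.0f);
    corner(0.0f, 1.0f, du, dv, -1.0f,  1.0f);
    glEnd();
}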
Then the register combiners have access to the ‘current’ texel, the one to the right, and the one above. You can then use the register combiner arithmetic (multiplies, scales, adds, etc.) to combine the 3 values into one output, like 2*(0,0) - (1,0) - (0,1), which will give you some kind of edge detection.
This is much faster than multiple rendering passes into the framebuffer, and it gives you more arithmetic flexibility.
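Here’s roughly what the combiner setup for 2*(0,0) - (1,0) - (0,1) looks like (an untested sketch, assuming GL_NV_register_combiners and the three texture units bound as above): stage 0 computes spare0 = tex0 - tex1, stage 1 computes spare1 = tex0 - tex2, and the final combiner adds them.

glEnable(GL_REGISTER_COMBINERS_NV);
glCombinerParameteriNV(GL_NUM_GENERAL_COMBINERS_NV, 2);

/* stage 0: spare0 = tex0*1 + (-tex1)*1  (GL_ZERO inverted is the constant 1) */
glCombinerInputNV(GL_COMBINER0_NV, GL_RGB, GL_VARIABLE_A_NV,
                  GL_TEXTURE0_ARB, GL_UNSIGNED_IDENTITY_NV, GL_RGB);
glCombinerInputNV(GL_COMBINER0_NV, GL_RGB, GL_VARIABLE_B_NV,
                  GL_ZERO, GL_UNSIGNED_INVERT_NV, GL_RGB);
glCombinerInputNV(GL_COMBINER0_NV, GL_RGB, GL_VARIABLE_C_NV,
                  GL_TEXTURE1_ARB, GL_SIGNED_NEGATE_NV, GL_RGB);
glCombinerInputNV(GL_COMBINER0_NV, GL_RGB, GL_VARIABLE_D_NV,
                  GL_ZERO, GL_UNSIGNED_INVERT_NV, GL_RGB);
glCombinerOutputNV(GL_COMBINER0_NV, GL_RGB, GL_DISCARD_NV, GL_DISCARD_NV,
                   GL_SPARE0_NV, GL_NONE, GL_NONE, GL_FALSE, GL_FALSE, GL_FALSE);

/* stage 1: spare1 = tex0*1 + (-tex2)*1 */
glCombinerInputNV(GL_COMBINER1_NV, GL_RGB, GL_VARIABLE_A_NV,
                  GL_TEXTURE0_ARB, GL_UNSIGNED_IDENTITY_NV, GL_RGB);
glCombinerInputNV(GL_COMBINER1_NV, GL_RGB, GL_VARIABLE_B_NV,
                  GL_ZERO, GL_UNSIGNED_INVERT_NV, GL_RGB);
glCombinerInputNV(GL_COMBINER1_NV, GL_RGB, GL_VARIABLE_C_NV,
                  GL_TEXTURE2_ARB, GL_SIGNED_NEGATE_NV, GL_RGB);
glCombinerInputNV(GL_COMBINER1_NV, GL_RGB, GL_VARIABLE_D_NV,
                  GL_ZERO, GL_UNSIGNED_INVERT_NV, GL_RGB);
glCombinerOutputNV(GL_COMBINER1_NV, GL_RGB, GL_DISCARD_NV, GL_DISCARD_NV,
                   GL_SPARE1_NV, GL_NONE, GL_NONE, GL_FALSE, GL_FALSE, GL_FALSE);

/* final combiner: out = A*B + (1-A)*C + D; with A = 0 this is spare0 + spare1,
   i.e. 2*tex0 - tex1 - tex2 (clamped to [0,1] on output) */
glFinalCombinerInputNV(GL_VARIABLE_A_NV, GL_ZERO, GL_UNSIGNED_IDENTITY_NV, GL_RGB);
glFinalCombinerInputNV(GL_VARIABLE_B_NV, GL_ZERO, GL_UNSIGNED_IDENTITY_NV, GL_RGB);
glFinalCombinerInputNV(GL_VARIABLE_C_NV, GL_SPARE0_NV, GL_UNSIGNED_IDENTITY_NV, GL_RGB);
glFinalCombinerInputNV(GL_VARIABLE_D_NV, GL_SPARE1_NV, GL_UNSIGNED_IDENTITY_NV, GL_RGB);
glFinalCombinerInputNV(GL_VARIABLE_G_NV, GL_ZERO, GL_UNSIGNED_INVERT_NV, GL_ALPHA);  /* alpha = 1 */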