I need to do some very crazy blending in a multi-pass rendering loop, and I'm trying to figure out a way to preserve blending coefficients between passes.
I need to compute new blend coefficients in a fragment program that will be used in the next render pass. Ideally I'd like to be able to write this coefficient to the framebuffer so I can then read it back in the next iteration, compute a new value, and write it back.
The problem is that fragment programs can't read the framebuffer. The only way I can think of to get this functionality is to copy the alpha channel to a texture and use that texture as input to a fragment program, but I'd like to avoid the copying overhead when it seems so conceptually simple for a fragment program to source coefficients from the framebuffer.
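For what it's worth, here's roughly what I mean by the copy workaround: a sketch assuming a valid GL context, a fixed window size, and that the coefficients live in the alpha channel. The names (`coeffTex`, `WIDTH`, `HEIGHT`) are just placeholders of mine, and there's no error checking.

```c
#include <GL/gl.h>

#define WIDTH  512
#define HEIGHT 512

static GLuint coeffTex;

void initCoeffTexture(void)
{
    glGenTextures(1, &coeffTex);
    glBindTexture(GL_TEXTURE_2D, coeffTex);
    /* allocate once up front; RGBA so the alpha channel survives the copy */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, WIDTH, HEIGHT, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
}

void endOfPass(void)
{
    /* ...pass N has drawn; the fragment program wrote the new
       coefficients into destination alpha... */

    /* snapshot the framebuffer into the texture for pass N+1 */
    glBindTexture(GL_TEXTURE_2D, coeffTex);
    glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, WIDTH, HEIGHT);

    /* pass N+1 binds coeffTex so the fragment program can sample
       the previous pass's coefficients */
}
```

Using `glCopyTexSubImage2D` into a pre-allocated texture at least avoids a round trip through host memory (no `glReadPixels`), but it's still that per-pass copy I'd like to get rid of.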
I investigated the possibility of writing values to a texture, but that doesn't seem to be possible either.
Does anyone have any ideas on how I can cache data from one render pass to the next?
[edit: the other point is that I can't use render-to-texture (which would also save the copy) because it isn't yet supported under GLX. damn, eh?]
thanks in advance,
[This message has been edited by john (edited 09-03-2003).]