After a lot of headaches I've managed to get some code working that uses Cg and pbuffers as both render targets and texture sources.
Now my problem is that I'm in a situation where I need to render an image in nonoverlapping "tiles", and when rendering a particular tile I need all the previously rendered tiles available as input.
Imagine that my final image is conceptually divided up like a checkerboard. My first rendering pass renders the first square. My second rendering pass takes the first square, uses it as input to render the second square, and passes along the first square unchanged. So now my rendered image has squares 1 and 2 rendered. The process repeats, with square 3 using squares 1 and 2 as input and passing them along unchanged. Etc.
When completed, all 64 squares are rendered and the result is the final image.
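To make sure the data flow I'm describing is clear, here is a minimal sketch of it in Python, with a plain list standing in for the GPU texture. `render_square` is a hypothetical placeholder for the real Cg pass; here it just derives each square from the previous ones so the dependency is visible.

```python
def render_square(i, previous):
    # Placeholder for the actual shading work. The real pass would sample
    # all previously rendered squares; here each square is simply the
    # previous square's value plus one.
    return previous[-1] + 1 if previous else 0

def render_image(n_squares=64):
    squares = []  # all squares rendered so far (the accumulated "texture")
    for i in range(n_squares):
        # Each pass sees everything rendered before it as input.
        squares.append(render_square(i, squares))
    return squares
```

The point is only the shape of the loop: pass *i* reads the accumulated result of passes 0 through *i*-1.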
Unfortunately, I want to keep everything in hardware, and with pbuffers there doesn't seem to be an easy way to do this. According to Apple's documentation, pbuffers can be rendered into but cannot be modified while they are bound as textures. Ultimately I'd like to keep a texture in graphics-card memory (I want it all in hardware!) that I can use as an input texture for every rendering pass, and update by adding each square as it is rendered.
The only solution I can think of is a ping-pong scheme: render the current square into a second pbuffer, use simple textured quads to pass through the previously rendered squares from the first pbuffer, then bind the second pbuffer as the texture input for the next pass, swapping the two roles each time.
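The ping-pong idea can be sketched like this, again in Python with two lists standing in for the two pbuffers (the names and the placeholder shading are my own, not from any real API):

```python
def ping_pong_render(n_squares=64):
    read_buf = [None] * n_squares   # pbuffer currently bound as a texture
    write_buf = [None] * n_squares  # pbuffer currently bound as render target
    for i in range(n_squares):
        # Pass-through step: textured quads copying squares 0..i-1 unchanged
        # from the read buffer into the write buffer.
        for j in range(i):
            write_buf[j] = read_buf[j]
        # Render the new square using the previous results as input
        # (placeholder shading: previous square's value plus one).
        write_buf[i] = (read_buf[i - 1] + 1) if i > 0 else 0
        # Swap roles: what was just rendered becomes next pass's input.
        read_buf, write_buf = write_buf, read_buf
    return read_buf
```

The cost of this approach is the extra pass-through copy of every already-finished square on each pass, which is exactly the overhead I was hoping to avoid by updating a single texture in place.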
I hope my explanation makes sense. If not, please let me know and I'll see if I can clarify.