What I want to do is combine "conventional rendering" (i.e. rasterize something into a render target) with an overlaid ray-traced image.
I have a raytracing kernel that writes into a PBO, which I can later draw as a texture. I can also compute the depth for any point in my ray-traced image, so I can produce a depth buffer with floating-point depth values. Now I want to draw, say, a sphere or a box over my ray-traced image, but I want the sphere or box to be occluded correctly against the depth of the ray-traced image. So the question is: how can I get my floating-point depth values into the depth buffer (a GL_DEPTH_COMPONENT renderbuffer)?
Am I right in assuming that I can do something like this:
and then draw a full-screen quad whose fragment shader outputs the values from my buffer of depth values?
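Roughly this (a sketch of what I have in mind; `depthTex`, `width`, `height`, and `depth_data` are placeholders for my actual variables):

```c
/* Upload the ray tracer's float depth values into a texture
   I can sample from a shader. Assumes depth_data holds one float
   per pixel, already in [0,1] window-space depth. */
GLuint depthTex;
glGenTextures(1, &depthTex);
glBindTexture(GL_TEXTURE_2D, depthTex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT32F,
             width, height, 0,
             GL_DEPTH_COMPONENT, GL_FLOAT, depth_data);
```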
e.g. gl_FragDepth = texture2D(ray_traced_depth, texCoord.xy).r;
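In full, the fragment shader for that quad would be something like this (as far as I understand, depth has to be written to gl_FragDepth, since color outputs never reach the depth buffer):

```glsl
uniform sampler2D ray_traced_depth; // texture holding my ray-traced depth values
varying vec2 texCoord;

void main()
{
    // Write the stored depth into the depth buffer; the color buffer
    // already contains the ray-traced image drawn earlier.
    gl_FragDepth = texture2D(ray_traced_depth, texCoord).r;
}
```

I assume this pass has to be drawn with depth writes enabled (glDepthMask(GL_TRUE)) and with a depth test that always passes (e.g. glDepthFunc(GL_ALWAYS)), and that my stored values must be in the same [0,1] window-space depth range the rasterizer produces, or the comparison against the sphere/box won't be meaningful.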