Hi,
I’m writing a 3D texture-based volume renderer that supports volumetric lighting.
It currently runs using multitexturing, and I’m doing a lot of the work on the CPU, but now I’d like to accelerate it by moving most of that code into vertex and fragment shaders.
The problem is that I know nothing about shaders yet (well, I’ve seen a few examples), so I’m here to ask what is possible and what isn’t. Maybe you can just point me to the right references / books.
I’m mainly developing for NVIDIA/Cg; it’s for a semester thesis.
Algorithm:
I have a 3D object (a volume) which must be rendered. To do so, I slice this object into many 2D slices:
For each of these slices:
1st pass: render the slice for the eye into the framebuffer
– for this render pass, use a 3D texture as the transfer function for the volumetric model
– and use the 2D texture that was generated in the 2nd pass of the previous slice (combine the 2D texture with the 3D texture and blend the result into the eye buffer)
2nd pass: render the same slice for the light into the light buffer
– again use the 3D texture and blend into the light buffer
– render directly into the light buffer, e.g. via the framebuffer_object extension
end for each slice;
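To illustrate, here is roughly what I imagine the fragment program for the 2nd pass would look like (just a sketch in Cg, untested; the sampler and parameter names are made up by me):

```
// Hypothetical 2nd-pass fragment program: classify the voxel via the
// 3D texture and output it; blending into the light buffer would still
// be set up with glBlendFunc in the application.
float4 main(float3 volCoord : TEXCOORD0,
            uniform sampler3D volumeTex) : COLOR
{
    return tex3D(volumeTex, volCoord);
}
```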
So right now I’m using multitexturing for the texture lookups and combination (modulation), glBlendFunc for the blending, etc.
I’ll replace this with fragment shaders.
For the first pass I need a different fragment shader than for the second pass.
And in the first pass I already need the coordinates for the volumetric lighting texture, since I use that texture to modulate the 3D volume data.
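For the 1st pass I imagine something like this (again only a sketch, names are placeholders; I haven’t written real Cg yet):

```
// Hypothetical 1st-pass fragment program: modulate the classified
// voxel by the attenuation stored in the 2D light texture.
// This would replace the GL_MODULATE multitexture combiner.
float4 main(float3 volCoord   : TEXCOORD0,
            float2 lightCoord : TEXCOORD1,
            uniform sampler3D volumeTex,
            uniform sampler2D lightTex) : COLOR
{
    float4 voxel = tex3D(volumeTex, volCoord);
    float4 atten = tex2D(lightTex, lightCoord);
    return voxel * atten;
}
```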
So I thought of a single vertex shader that takes some inputs, computes the vertex coordinates for both passes, and outputs 2 vertices:
1 output vertex for the first pass and 1 output vertex for the 2nd pass.
Maybe the vertex shader could also define which output vertex will use which fragment shader.
Why do I want to combine the 1st and the 2nd pass?
Because the lighting texture coordinates that must be computed for the 1st pass are the same as the vertex coordinates in the light buffer, so I could save some computation there.
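Roughly, what I have in mind for the eye-pass vertex program (hypothetical, untested; the matrix names are my own):

```
// Hypothetical eye-pass vertex program: the vertex is transformed once
// for the eye and once for the light. The light-space position is
// exactly the coordinate needed to look up the light buffer, and it is
// also the position the 2nd pass would rasterize to -- this is the
// shared computation I'd like to do only once.
void main(float4 position : POSITION,
          float3 volCoord : TEXCOORD0,
          uniform float4x4 modelViewProj,   // eye view-projection
          uniform float4x4 lightViewProj,   // light view-projection
          out float4 oPos      : POSITION,
          out float3 oVolCoord : TEXCOORD0,
          out float4 oLightPos : TEXCOORD1)
{
    oPos      = mul(modelViewProj, position);
    oVolCoord = volCoord;
    oLightPos = mul(lightViewProj, position);
}
```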
Questions:
- Can a vertex shader output two vertices?
- Can 2 different fragment shaders be loaded at the same time? And how do you select between them? Can the vertex shader select which fragment shader should be used?
- Can a fragment shader select to which render target / framebuffer it outputs?
- Is there something like “streams” or logically parallel pipelines? I’d like to have 2 different vertex shaders and 2 different fragment shaders loaded at the same time, so that I don’t have to switch bindings x times, etc. (it’s a 2-pass algorithm for each volume slice)
- Where could I find such information? (books, guides, references?)
Thanks - Andy