fragment depth in fragment shader.

Hi…

I am trying to use a fragment shader to encode each fragment’s depth into the RGBA channels, so that instead of doing four rendering passes I can read all the depth values back from the color buffer in a single pass.

But somehow the depth values I get back are wrong compared to those read back without using shaders.
Right now I am multiplying the vertex coordinates by the modelview-projection matrix in the vertex shader, but the values computed in the fragment shader don’t seem to be correct.
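In case it helps to compare: the packing step in the fragment shader could look something like the GLSL sketch below. It writes 8 bits of depth per channel; the bit-shift constants are one common convention, not taken from your code, so treat it as an assumption about what you are trying to do.

```glsl
// Fragment shader sketch: pack gl_FragCoord.z (window-space depth in [0,1])
// into the four 8-bit RGBA channels, most significant byte in .x.
void main()
{
    const vec4 bitShift = vec4(256.0 * 256.0 * 256.0,
                               256.0 * 256.0,
                               256.0,
                               1.0);
    const vec4 bitMask = vec4(0.0, 1.0 / 256.0, 1.0 / 256.0, 1.0 / 256.0);
    vec4 enc = fract(gl_FragCoord.z * bitShift);
    // Subtract the bits already stored in the higher-order channels.
    enc -= enc.xxyz * bitMask;
    gl_FragColor = enc;
}
```

After reading the pixels back, the depth can be recovered as the dot product of the RGBA value with (1/256^3, 1/256^2, 1/256, 1).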

Are you using vertex-/fragment-shaders or vertex-/fragment-programs?

In the latter case, you could specify the ARB_position_invariant option in the vertex program to ensure the depth values are the same for the fixed function and the programmable pipeline.
In the fragment program select fragment.position.z as the depth for a fragment.
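For the ARB-program case, a minimal fragment program along those lines might look like this (a sketch, assuming you just want the raw depth replicated into the color output rather than packed):

```
!!ARBfp1.0
# Copy the window-space depth (fragment.position.z) into the output color.
# The scalar swizzle .z replicates the value across all four channels.
MOV result.color, fragment.position.z;
END
```

Combined with OPTION ARB_position_invariant in the vertex program, this should give you depth values identical to the fixed-function pipeline.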

Nico

What’s wrong with just rendering once, and locking/reading the actual depth buffer?

Originally posted by -NiCo-:
[b]Are you using vertex-/fragment-shaders or vertex-/fragment-programs?

In the latter case, you could specify the ARB_position_invariant option in the vertex program to ensure the depth values are the same for the fixed function and the programmable pipeline.
In the fragment program select fragment.position.z as the depth for a fragment.

Nico[/b]
In the former case you can use ftransform().
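That is, the GLSL vertex shader would reduce to something like this sketch:

```glsl
// Vertex shader sketch: ftransform() guarantees the same result as the
// fixed-function vertex transform, so the depth values will match exactly
// what the fixed-function pipeline would produce.
void main()
{
    gl_Position = ftransform();
}
```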