I need to evaluate the data in my z-buffer texture in world space. Since the depth buffer values undergo the perspective calculation, I need to reverse this process.
What I would like to do in a fragment shader is to associate depth values with OpenGL units in world space.
If somebody has done something similar or knows how to do this, I would be very interested.
Download the Mesa source and take a look at the gluUnProject function.
Write a shader similar to the gluUnProject code. All the inputs are known: the modelview, projection, and viewport passed in as uniforms, x and y from gl_FragCoord, and z from the texture.
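As a CPU-side reference for that shader, the gluUnProject math can be sketched in Python. This is a sketch under my own assumptions: the function names and the Gauss-Jordan inverse are illustrative, the modelview is taken as identity, and a real shader would receive the precomputed inverse matrix as a uniform rather than inverting per fragment.

```python
import math

# Illustrative perspective matrix, matching what gluPerspective builds.
def perspective(fovy_deg, aspect, near, far):
    f = 1.0 / math.tan(math.radians(fovy_deg) / 2.0)
    return [[f / aspect, 0.0, 0.0, 0.0],
            [0.0, f, 0.0, 0.0],
            [0.0, 0.0, (far + near) / (near - far), 2.0 * far * near / (near - far)],
            [0.0, 0.0, -1.0, 0.0]]

def mat_vec(m, v):
    return [sum(m[i][j] * v[j] for j in range(4)) for i in range(4)]

# Generic 4x4 inverse via Gauss-Jordan elimination (illustrative helper).
def inverse(m):
    n = 4
    a = [list(m[i]) + [1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(a[r][col]))
        a[col], a[pivot] = a[pivot], a[col]
        p = a[col][col]
        a[col] = [x / p for x in a[col]]
        for r in range(n):
            if r != col:
                factor = a[r][col]
                a[r] = [x - factor * y for x, y in zip(a[r], a[col])]
    return [row[n:] for row in a]

# The gluUnProject idea: window coords -> NDC -> multiply by the inverse
# of projection*modelview -> divide by w.
def unproject(win_x, win_y, win_z, inv_mvp, viewport):
    vx, vy, vw, vh = viewport
    ndc = [2.0 * (win_x - vx) / vw - 1.0,
           2.0 * (win_y - vy) / vh - 1.0,
           2.0 * win_z - 1.0,
           1.0]
    x, y, z, w = mat_vec(inv_mvp, ndc)
    return [x / w, y / w, z / w]
```

Projecting a known eye-space point forward and feeding the resulting window coordinates back through `unproject` with the same matrix should recover the original point.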
Read the thread “depth buffer as float texture” in the OpenGL advanced forum.
If you don’t need more than 8 bits of precision, it’s fairly straightforward.
I don’t get it.
Eye space still contains the perspective calculation, right!?
So your code should only change the range of the depth data but not linearize it…
I get negative values when using this code with clipping at near=0.5 and far=10.5.
I am not sure if this can help me, but it sure looks more efficient than the approach I am taking at the moment…
I believe the calculation that makes your depth values non-linear is in the projection matrix. So anything transformed to eye-space should still be linear just like the values you get from this method. The negative values you see are correct because in OpenGL the +z axis points toward you.
I derived this method after reading a bunch of literature on the web and solving some equations. I was amazed I could still do that much math. Unfortunately, I don’t know where my sources are anymore. Here’s one source: http://www.opengl.org/resources/faq/technical/depthbuffer.htm But there are a lot more out there if you google for them.
Basically, you need to find the equation that converts linear eye z to normalized device coordinate z, then invert the equation, and then turn it into shader code.
The result should be pretty accurate up close to you and very inaccurate close to the far clip plane. Once you convert depth to NDC, a lot of precision is lost forever.
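A sketch of that derivation, assuming the standard glFrustum-style projection and a [0,1] depth range (the function names are mine): write down the eye-z-to-NDC-z equation, then solve it for eye z.

```python
# Forward map: positive eye-space distance z_eye to window depth d in [0,1],
# as a standard perspective projection produces it (z_ndc = A + B/z_eye form).
def eye_to_window_depth(z_eye, near, far):
    z_ndc = (far + near) / (far - near) - 2.0 * far * near / ((far - near) * z_eye)
    return 0.5 * z_ndc + 0.5

# Inverse map: the equation above solved for z_eye. This is the formula
# that turns a depth-texture value back into a linear eye-space distance.
def window_depth_to_eye(d, near, far):
    return near * far / (far - d * (far - near))
```

The endpoints check out (d = 0 maps back to the near plane, d = 1 to the far plane), and the two functions are exact inverses of each other in between.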
Yes, it looks like I will need to use the projection matrix to get back to world space.
Then the gluUnProject code would be the only way to go. And I thought I would get around matrix calculations…
I got it!
mogumbo had the right idea: near * far / (far - openGLdepthValue * (far - near))
unmaps the z values to linear clip-range data, which is what I was looking for. I now have z data that is linearly related to the pixel-to-eye distance.
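With the near/far planes mentioned earlier in the thread (0.5 and 10.5), the unmapping can be sanity-checked numerically; a small sketch, where the function name is mine:

```python
# mogumbo's unmapping: near*far / (far - d*(far - near)),
# where d is the [0,1] value read back from the depth texture.
def linear_eye_depth(d, near=0.5, far=10.5):
    return near * far / (far - d * (far - near))

# Endpoints: d = 0 gives back the near plane, d = 1 the far plane.
print(linear_eye_depth(0.0))  # 0.5
print(linear_eye_depth(1.0))  # 10.5

# Precision: one 8-bit depth step covers about 0.002 units near the camera
# but about 0.76 units at the far plane -- the precision loss mogumbo
# mentioned near the far clip plane.
step = 1.0 / 255.0
print(linear_eye_depth(step) - linear_eye_depth(0.0))
print(linear_eye_depth(1.0) - linear_eye_depth(1.0 - step))
```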
I had the unproject method working, when I realized that world space wasn’t what I needed…