
converting a depth texture to world space



def
12-10-2004, 07:11 AM
I need to evaluate the data in my z-buffer texture in world space. Since the depth buffer values have undergone the perspective transformation, I need to reverse this process.
What I would like to do in a fragment shader is associate depth values with OpenGL units in world space.

If somebody has done something similar or knows how to do this, I would be very interested.

yooyo
12-10-2004, 07:18 AM
1. Put your depth buffer in a texture.
2. Download the Mesa source and take a look at the gluUnProject function.
3. Write a shader similar to the gluUnProject function. The inputs are all known (modelview, projection and viewport passed as uniforms, x,y from gl_FragCoord and z from the texture).
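The three steps above can be sketched in Python/NumPy (CPU-side, not shader code; the `unproject` helper and matrix names here are illustrative, not taken from the Mesa source):

```python
import numpy as np

def unproject(winx, winy, winz, modelview, projection, viewport):
    """Map window coordinates (winz is the [0,1] depth value) back to object space."""
    vx, vy, vw, vh = viewport
    # window coordinates -> normalized device coordinates, each in [-1, 1]
    ndc = np.array([
        2.0 * (winx - vx) / vw - 1.0,
        2.0 * (winy - vy) / vh - 1.0,
        2.0 * winz - 1.0,
        1.0,
    ])
    # invert the combined transform, then undo the perspective divide
    obj = np.linalg.inv(projection @ modelview) @ ndc
    return obj[:3] / obj[3]
```

In a fragment shader the same math applies per fragment, with the inverse matrix computed once on the CPU and passed in as a uniform.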

yooyo

def
12-11-2004, 08:12 AM
Thanks yooyo, looking into the mesa source is a great idea.
:)

garmonbozia
12-11-2004, 10:39 PM
Please tell me, how can I put the depth buffer in a texture? :confused:

def
12-12-2004, 01:15 PM
Read the thread "depth buffer as float texture" in the OpenGL advanced forum.
If you don't need more than 8 bits of precision, it's fairly straightforward.

mogumbo
12-14-2004, 03:19 PM
Would eye space be good enough? Here's how you can convert a depth buffer value to eye space (ARB_fp; not GLSL, I'm afraid).

First, pass this to program local 0:
far_clip - near_clip, -far_clip, far_clip * near_clip, 0.0f

then do this in the frag program:
PARAM depth_to_eyez = program.local[0];
TEMP depthtex, eyez;
# get depth texture value, which is NDC z remapped to [0, 1]
TEX depthtex.x, fragment.position, texture[1], RECT;
# convert depthtex value to eye z
MAD eyez.x, depthtex.x, depth_to_eyez.x, depth_to_eyez.y;
RCP eyez.x, eyez.x;
MUL eyez.x, eyez.x, depth_to_eyez.z;
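The same three instructions, sketched in Python for reference (assuming `d` is the window-space depth buffer value in [0, 1]; the function name is just illustrative):

```python
def depth_to_eye_z(d, near, far):
    """Recover (negative) eye-space z from a [0,1] depth buffer value."""
    # MAD: d * (far - near) + (-far)
    denom = d * (far - near) - far
    # RCP then MUL: (far * near) / denom
    return (far * near) / denom
```

At d=0 this returns -near, at d=1 it returns -far, i.e. the eye-space z of the near and far clip planes.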

def
12-15-2004, 02:15 AM
I don't get it.
Eye space still contains the perspective calculation, right!?
So your code should only change the range of the depth data, but not linearize it...
I get negative values when using this code with clipping at near=0.5 and far=10.5.

I am not sure if this can help me, but it sure looks more efficient than the approach I am taking at the moment...

Can you explain what the code should do?
Thanks

mogumbo
12-15-2004, 07:04 AM
I believe the calculation that makes your depth values non-linear is in the projection matrix. So anything transformed to eye space should still be linear, just like the values you get from this method. The negative values you see are correct, because in OpenGL the +z axis points toward you.

I derived this method after reading a bunch of literature on the web and solving some equations. I was amazed I could still do that much math :) Unfortunately, I don't know where my sources are anymore. Here's one source: http://www.opengl.org/resources/faq/technical/depthbuffer.htm But there are a lot more out there if you google for them.

Basically, you need to find the equation that converts linear eye z to normalized device coordinate z, then invert the equation, and then turn it into shader code.
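For reference, a sketch of that derivation, assuming the standard OpenGL perspective matrix, with near plane $n$, far plane $f$, eye-space depth $z_e$ and window depth $d$ in $[0, 1]$:

```latex
d \;=\; \frac{f\,(z_e + n)}{(f - n)\,z_e}
\quad\Longrightarrow\quad
z_e \;=\; \frac{f\,n}{d\,(f - n) - f}
```

The inverted form is exactly what the fragment program above computes, which is why $(f-n)$, $-f$ and $f\,n$ are the constants passed in program local 0.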

The result should be pretty accurate up close to you and very inaccurate close to the far clip plane. Once you convert depth to NDC, a lot of precision is lost forever.

def
12-15-2004, 07:42 AM
Yes, it looks like I will need to use the projection matrix to get back to world space.
Then the gluUnProject code would be the only way to go. And I thought I would get around matrix calculations... :(

yooyo
12-16-2004, 09:28 AM
... or you can create an RGBA32F pbuffer and store each pixel's xyz position in RGB. In a second pass, bind the RGBA32F pbuffer as a texture and read RGB as xyz.

yooyo

def
12-16-2004, 10:49 AM
:D I got it! :D
mogumbo had the right idea: near*far/(far-openGLdepthValue*(far-near))
unmaps the z values to linear clipping data; that's what I was looking for. I now have z data which is linearly related to the pixel-to-eye distance.
I had the unproject method working when I realized that world space wasn't what I needed...
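A quick numeric check of that formula (with the near=0.5, far=10.5 clip planes mentioned earlier in the thread; the function name is just illustrative). It is the negation of the eye-space z from mogumbo's method, i.e. a positive distance along the view direction:

```python
def depth_to_distance(d, near, far):
    """Positive linear eye distance from a [0,1] depth buffer value."""
    return near * far / (far - d * (far - near))
```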

Thanks to everybody for your help!