zrzmonkey

06-03-2011, 02:14 AM

I want to unproject gl_FragCoord to get the 3D coordinates in camera (eye) space. This is my code in the fragment shader, but it doesn't work. The input is gl_FragCoord.

vec4 getcoord(vec4 a) // a = gl_FragCoord; Size = viewport size (assumes a square viewport)
{
    // Window space -> NDC: xy from [0, Size] to [-1, 1], z from [0, 1] to [-1, 1]
    vec4 d = vec4(a.x * 2.0 / Size - 1.0, a.y * 2.0 / Size - 1.0, a.z * 2.0 - 1.0, 1.0);
    // Unproject to eye space, then undo the perspective divide
    vec4 c = gl_ProjectionMatrixInverse * d;
    return c / c.w;
}
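To sanity-check the unprojection math outside GLSL, here is a minimal NumPy sketch. It builds a standard OpenGL perspective matrix, projects a known eye-space point forward to window coordinates (what gl_FragCoord would hold), then applies the same window -> NDC -> inverse-projection -> divide-by-w steps as the shader. The field of view, near/far planes, and the single square Size are my assumptions for illustration, not values from the original post.

```python
import numpy as np

def perspective(fovy_deg, aspect, near, far):
    # Standard OpenGL perspective projection matrix (as used by gluPerspective).
    f = 1.0 / np.tan(np.radians(fovy_deg) / 2.0)
    m = np.zeros((4, 4))
    m[0, 0] = f / aspect
    m[1, 1] = f
    m[2, 2] = (far + near) / (near - far)
    m[2, 3] = 2.0 * far * near / (near - far)
    m[3, 2] = -1.0
    return m

size = 512.0  # square viewport, matching the shader's single Size uniform
proj = perspective(60.0, 1.0, 0.1, 100.0)

# An eye-space point in front of the camera (negative z in OpenGL).
eye = np.array([0.3, -0.2, -5.0, 1.0])

# Forward path: eye -> clip -> NDC -> window (what the rasterizer produces).
clip = proj @ eye
ndc = clip / clip[3]
frag = np.array([(ndc[0] * 0.5 + 0.5) * size,   # gl_FragCoord.x
                 (ndc[1] * 0.5 + 0.5) * size,   # gl_FragCoord.y
                 ndc[2] * 0.5 + 0.5])           # gl_FragCoord.z

# Inverse path: the shader's getcoord() logic.
d = np.array([frag[0] * 2.0 / size - 1.0,
              frag[1] * 2.0 / size - 1.0,
              frag[2] * 2.0 - 1.0,
              1.0])
e = np.linalg.inv(proj) @ d
recovered = e / e[3]

print(recovered[:3])  # should match eye[:3]
```

The round trip recovers the original eye-space point, so the window -> NDC -> inverse-projection formula itself is sound; if the shader fails, the problem is more likely a wrong Size, a non-square viewport, or how the input reaches the function.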

Also, I want to calculate eye-space coordinates using a depth buffer. Since I know the depth of a pixel, how do I get its coordinates in camera space? I do it like this in the fragment shader. Does it work?

vec4 getcoord(sampler2D depth, vec2 a) // a = the pixel's 2D window coordinate
{
    // texelFetch returns a vec4; the depth value is in the first component
    float de = texelFetch(depth, ivec2(a), 0).x;
    vec4 b = vec4(a, de, 1.0);
    // Window space -> NDC
    vec4 c = vec4(b.x * 2.0 / Size - 1.0, b.y * 2.0 / Size - 1.0, b.z * 2.0 - 1.0, 1.0);
    // Unproject to eye space, then undo the perspective divide
    vec4 e = gl_ProjectionMatrixInverse * c;
    return e / e.w;
}

Thanks!
