deferred shading // update



Vexator
09-12-2007, 08:16 AM
i want to switch over to deferred shading. i have set up my fat buffer, but i'm not sure what exactly i have to store, and in which way. i guess i need at least diffuse color, normal and depth (to reconstruct the original vertex position).

- how do i store the normal? should i simply pass the vertex normal as a varying to the fragment shader? or should i transform it into view space or tangent space first?
- how do i store depth? should i use a depth texture, or is it sufficient to use the alpha channel of one of the textures?
- how do i restore the original vertex position?

thanks!

Trenki
09-12-2007, 09:55 AM
Hi!

I would use 16 bit floating point render targets for everything.

You could simply store the vertex normal in world space and do the lighting in the deferred pass in world space as well.

For depth I would not use a depth texture (I don't even know if you can easily read from one). You can simply use the alpha channel of one of the textures, for instance the one where you store the normal.

You could store your vertex position as three floats, but that isn't really necessary, as it is possible to recover the original vertex position from depth alone.
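As a rough sketch of that layout (the uniform/varying names here are made up, not anything Trenki posted), a geometry-pass fragment shader could pack everything into two RGBA16F targets like this:

```glsl
// geometry pass fragment shader -- hypothetical names, two RGBA16F MRTs
uniform sampler2D u_DiffuseMap;
varying vec2 v_TexCoord;
varying vec3 v_WorldNormal;  // world-space normal from the vertex shader
varying vec4 v_ViewPosition; // gl_ModelViewMatrix * gl_Vertex from the vertex shader

void main()
{
    // target 0: diffuse color (alpha channel stays free, e.g. for specular intensity)
    gl_FragData[0] = texture2D( u_DiffuseMap, v_TexCoord );

    // target 1: world-space normal in RGB, eye-space depth in alpha
    gl_FragData[1] = vec4( normalize( v_WorldNormal ), -v_ViewPosition.z );
}
```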

[ www.trenki.net (http://www.trenki.net) | vector_math (3d math library) (http://www.trenki.net/content/view/16/36/) | software renderer (http://www.trenki.net/content/view/18/38/) ]

Vexator
09-12-2007, 11:40 AM
it is possible to recover the original vertex position from depth only.

yeah, but how? :)

Zengar
09-12-2007, 12:11 PM
You could probably look it up here: http://forum.beyond3d.com/showthread.php?t=37614

He just reconstructs the world position using the inverse of the projection matrix and the fragment position on screen.

Vexator
09-12-2007, 02:19 PM
thanks for the link, but i have to admit that i cannot quite follow. what would vpos.xy be in glsl? gl_FragCoord.xy?

Zengar
09-12-2007, 02:23 PM
I guess :) You'll have to look it up in the spec though, I am not sure how DX and OpenGL agree on device coordinates and depth representation (the w coordinate in particular)

oc2k1
09-12-2007, 03:07 PM
It's simpler to calculate the distance from the Z-value. The direction is already known from the pixel position (if the bounding volume of a light is drawn)

Vexator
09-13-2007, 03:47 PM
could you elaborate plz?

Y-tension
09-14-2007, 01:56 AM
You can always store the world coordinates in a buffer. This technique was used in S.T.A.L.K.E.R. (there's an excellent article on it in GPU Gems 2).

Brolingstanz
09-14-2007, 07:37 AM
... and the story continues in GPU Gems 3.

P.S. Great book guys :eek:

Vexator
09-14-2007, 11:23 AM
yeah, i read that article. but they're passing the position as a whole. what i'd like to do is reconstruct it from depth only. atm i need 3 textures to store position, normal and diffuse color. that's just too much. i can compute normal.z from normal.xy, so i could drop that.. if i could drop position.xy as well, it'd all fit into 2 textures.

Jan
09-14-2007, 01:43 PM
Take a look at gluUnProject. That function does exactly what you want to do; you only need to implement it in a shader.

For example here:
http://www.opengl.org/documentation/specs/man_pages/hardcopy/GL/html/glu/unproject.html

Jan.
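For reference, the gluUnProject math translated into a fragment shader looks roughly like this (a sketch; u_InvViewProjection, u_Viewport and u_DepthTexture are assumed uniforms, not names from the thread):

```glsl
uniform mat4 u_InvViewProjection; // inverse of projection * view
uniform vec4 u_Viewport;          // x, y, width, height
uniform sampler2D u_DepthTexture;
varying vec2 v_TexCoord;

void main()
{
    float depth = texture2D( u_DepthTexture, v_TexCoord ).r;

    // window coordinates -> normalized device coordinates in [-1,1]
    vec4 ndc;
    ndc.xy = 2.0 * ( gl_FragCoord.xy - u_Viewport.xy ) / u_Viewport.zw - 1.0;
    ndc.z  = 2.0 * depth - 1.0;
    ndc.w  = 1.0;

    // unproject and divide by w, exactly as gluUnProject does
    vec4 world = u_InvViewProjection * ndc;
    world /= world.w;

    gl_FragColor = vec4( world.xyz, 1.0 ); // reconstructed world position
}
```

The divide by w at the end is easy to forget but essential, since the inverse projection produces a homogeneous coordinate.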

oc2k1
09-14-2007, 04:32 PM
float Z = DepthParameter.y / ( DepthParameter.x - texture2DRect( g_depth, gl_FragCoord.xy ).r );
vec3 ModelView = vec3( unpro.xy / unpro.z * Z, Z );

Try that to recalculate the modelview position. unpro is the modelview position of a pixel of the light's bounding volume...

Vexator
09-16-2007, 08:08 AM
@oc2k1: what would DepthParameter.x/.y be? and how and why do i get a pixel of the light's bounding volume?

thanks guys! don't lose patience plz :p

oc2k1
09-17-2007, 07:26 AM
DepthParameter.x/.y are two uniform values that are required to calculate the distance from the depth buffer value; both depend on the far and near planes. For more information read this:
http://www.sjbaker.org/steve/omniv/love_your_z_buffer.html

unpro.xy/unpro.z is the direction to the pixel. It's a varying that has to be filled with the view-space position in the vertex shader.
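For a standard perspective projection with depth range [0,1], those two uniforms work out as follows (my reading of the snippet above, not something oc2k1 spelled out; in practice you would compute them once on the CPU):

```glsl
uniform float u_Near; // near plane n
uniform float u_Far;  // far plane f

// The depth buffer stores d = f*(z - n) / (z*(f - n)) for eye-space
// distance z. Solving for z gives
//   z = ( f*n/(f-n) ) / ( f/(f-n) - d )
// which has exactly the shape Z = DepthParameter.y / (DepthParameter.x - d):
vec2 depthParameter()
{
    return vec2( u_Far / ( u_Far - u_Near ),
                 u_Far * u_Near / ( u_Far - u_Near ) );
}
```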

Vexator
09-23-2007, 12:02 PM
i still didn't get it to work :p


vec4 viewport = vec4( 0, 0, 1024, 768 );
float z = texture2D( u_DepthTexture, v_Coordinates ).r;

vec4 input;
input.x = 2.0 * ( gl_FragCoord.x - viewport.x ) / ( viewport.z - 1.0 );
input.y = 2.0 * ( gl_FragCoord.y - viewport.y ) / ( viewport.w - 1.0 );
input.z = 2.0 * z - 1.0;
input.w = 1.0;

vec3 Position = ( u_Matrix * input ).xyz;

u_Matrix is the inverse of view * projection.

i have two projection matrices, a perspective one (when rendering the geometry) and an orthographic one (when rendering the full screen quads). i need to use the perspective one here, right?

oc2k1
09-23-2007, 12:48 PM
sure... but you should use my code snippet, because it's much faster. And rendering fullscreen quads isn't optimal for most light sources...

Vexator
09-23-2007, 01:01 PM
i would, but i'd first have to get unpro.x/.y and DepthParameter.x/.y right, resulting in more potential sources of error. i don't care much about speed atm, as long as it works :)

does the code look ok to you? the position is still dependent on the camera's position/orientation.

Vexator
10-11-2007, 01:18 PM
ok, i gave another approach a try, a professor at university suggested the following:


initial pass, vs:

vViewPosition = gl_ModelViewMatrix*gl_Vertex;

fs:

vViewPosition /= vViewPosition.w;
float Distance = -vViewPosition.z;

//then i store Distance in the MRT

deferred lighting pass, fs:

// Depth = distance computed in initial pass, read from MRT

vec3 ray;
float invTanHalfFOV = 1.0 / tan( radians( 22.5 ) );
ray = vec3( ( (gl_FragCoord.xy/vec2(1024, 768)) - 0.5 ) * 2.0, -invTanHalfFOV );
ray /= invTanHalfFOV;

Position = vec4( ray * Depth, 1.0 );

y & z are correct, but x is always a bit too small. it cannot be due to the depth stored in the MRT, as only one component is wrong. so there's probably something wrong in the way i calculate the ray. i checked the fov and the resolution.. do you have any clue what might be wrong? thanks :)
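For what it's worth, one thing the ray setup above does not appear to account for is the aspect ratio: at 1024x768 the frustum's x extent is 4/3 of its y extent, which would make the reconstructed x come out too small by roughly that factor. A variant that scales x accordingly could look like this (a sketch, assuming a 45 degree vertical FOV and that Depth holds the eye-space distance from the MRT):

```glsl
uniform vec2 u_Resolution;    // e.g. vec2( 1024.0, 768.0 )
uniform sampler2D u_DepthMap; // distance stored by the initial pass
varying vec2 v_TexCoord;

void main()
{
    float Depth = texture2D( u_DepthMap, v_TexCoord ).r;

    float tanHalfFOV = tan( radians( 22.5 ) );          // 45 degree vertical FOV
    float aspect     = u_Resolution.x / u_Resolution.y; // 4:3 at 1024x768

    vec2 ndc = ( gl_FragCoord.xy / u_Resolution - 0.5 ) * 2.0; // [-1,1]
    vec3 ray = vec3( ndc.x * tanHalfFOV * aspect,
                     ndc.y * tanHalfFOV,
                     -1.0 );

    vec4 Position = vec4( ray * Depth, 1.0 ); // view-space position
    gl_FragColor = Position;                  // or feed it into the lighting
}
```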