Lighting using render to texture

Hi,

In my current project I am trying to enable lighting. To do this I want to test my own light system, which should work like this:

  1. The scene is rendered to a texture (works).
  2. Read the scene, which is stored in a texture, back to the CPU (I don't know how to do that).
  3. For each pixel, calculate the distance to every light and the dot product of the pixel's normal with the direction of every light.
  4. Finally, combine this information and update the scene texture again.

Now my question is how to get the scene's normals, positions, and colors per pixel. I want to have the position (x, y, z), the normal (x, y, z), and the color (r, g, b, a) of every pixel.

I hope my question is understandable :)

It sounds like you’re trying to do deferred rendering. Googling that phrase will tell you everything you need to know, but here’s a quick overview.

The idea with deferred rendering is that you render to textures all of the information you need to compute lighting. A simple implementation would render positions, normals, and whatever colors you might need, along with any other per-pixel information (specular power, etc.). This set of textures is commonly called the G-buffer.
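As a rough illustration, the fragment shader for that first pass could look something like the sketch below. It assumes the vertex shader passes world-space position and normal through; all the variable names are made up, and the position and normal attachments would need floating-point texture formats (e.g. GL_RGBA16F) to store values outside [0, 1].

```glsl
#version 330 core

// Inputs interpolated from the vertex shader (names are illustrative).
in vec3 vWorldPos;    // world-space position
in vec3 vWorldNormal; // world-space normal
in vec2 vTexCoord;

uniform sampler2D uDiffuseMap;

// One output per G-buffer texture, bound via glDrawBuffers on the FBO.
layout (location = 0) out vec4 gPosition; // xyz = position
layout (location = 1) out vec4 gNormal;   // xyz = normal
layout (location = 2) out vec4 gAlbedo;   // rgba = surface color

void main()
{
    gPosition = vec4(vWorldPos, 1.0);
    gNormal   = vec4(normalize(vWorldNormal), 0.0);
    gAlbedo   = texture(uDiffuseMap, vTexCoord);
}
```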

In the lighting pass, you render a full-screen quad, using gl_FragCoord to pick the values from your textures. Then you use those values to do your lighting computations, which you write to your output image.
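Here is a sketch of what that lighting-pass shader could look like, covering steps 3 and 4 of your list (distance and dot product per light). The uniform names and the attenuation formula are just placeholders for whatever your light system uses; it assumes the G-buffer stores world-space positions and normals as in the sketch above.

```glsl
#version 330 core

// G-buffer textures written by the first pass (names are illustrative).
uniform sampler2D gPosition;
uniform sampler2D gNormal;
uniform sampler2D gAlbedo;

#define NUM_LIGHTS 4
uniform vec3 uLightPos[NUM_LIGHTS];
uniform vec3 uLightColor[NUM_LIGHTS];

out vec4 fragColor;

void main()
{
    // gl_FragCoord.xy gives the pixel we are shading on the full-screen quad.
    ivec2 coord = ivec2(gl_FragCoord.xy);

    vec3 pos    = texelFetch(gPosition, coord, 0).xyz;
    vec3 normal = normalize(texelFetch(gNormal, coord, 0).xyz);
    vec4 albedo = texelFetch(gAlbedo, coord, 0);

    vec3 lighting = vec3(0.0);
    for (int i = 0; i < NUM_LIGHTS; ++i)
    {
        vec3 toLight = uLightPos[i] - pos;
        float dist   = length(toLight);

        // Diffuse term: dot product of the normal with the light direction.
        float ndotl = max(dot(normal, toLight / dist), 0.0);

        // Simple inverse-square distance attenuation (placeholder).
        float atten = 1.0 / (1.0 + dist * dist);

        lighting += uLightColor[i] * ndotl * atten;
    }

    fragColor = vec4(albedo.rgb * lighting, albedo.a);
}
```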

A more efficient implementation will not bother to write the position at all. The position can be computed as needed from gl_FragCoord.xy and the depth buffer value (which also has to be available as a texture), by reversing the camera-space to clip-space transform.
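A sketch of that reconstruction, written here to go all the way back to world space so it matches the examples above: sample the depth texture, rebuild the clip-space position, then multiply by the inverse of the view-projection matrix. The uniform names are made up, and it assumes the default OpenGL depth range of [0, 1]. You would drop this into the lighting shader and call it instead of the texelFetch on the position texture.

```glsl
// Reconstruct world-space position from the depth buffer.
uniform sampler2D uDepth;      // depth buffer attached as a texture
uniform mat4 uInvViewProj;     // inverse of the view-projection matrix
uniform vec2 uScreenSize;      // viewport size in pixels

vec3 reconstructPosition()
{
    vec2 uv     = gl_FragCoord.xy / uScreenSize; // screen UV in [0, 1]
    float depth = texture(uDepth, uv).r;         // depth in [0, 1]

    // Back to normalized device coordinates in [-1, 1].
    vec4 ndc = vec4(uv * 2.0 - 1.0, depth * 2.0 - 1.0, 1.0);

    // Reverse the clip-space transform and undo the perspective divide.
    vec4 world = uInvViewProj * ndc;
    return world.xyz / world.w;
}
```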

Hi,

Thanks, that's exactly what I am looking for ;) But I can't find any good tutorials :(
