Hi guys,
The problem comes up when I try to project a point light's volume, a sphere, onto the screen. The process is as follows:
1. I draw a sphere with a vertex shader and compute the vertex position in normalized screen space:
uniform float fViewAspect;
uniform float fTanFOV;
uniform mat4 mat4ModelViewProjection;

varying vec3 vec3EyeScreenRay;

void main()
{
    gl_Position = mat4ModelViewProjection * gl_Vertex;
    // Perspective divide per vertex: clip space -> normalized device coordinates
    vec2 vec2ScreenPosition = gl_Position.xy / gl_Position.w;
    // Pack the screen position (and the tangent of the FOV for ray reconstruction) into a varying
    vec3EyeScreenRay = vec3(vec2ScreenPosition.x * fViewAspect, vec2ScreenPosition.y, fTanFOV);
}
2. In the fragment shader, I compute texture coordinates from the interpolated screen-space position, so I can sample the G-buffer at the pixels that may lie inside the light volume:
vec2 vec2TexCoord = 0.5 * (vec3EyeScreenRay.xy + vec2(1.0));
But the result is weird. When I bind the diffuse-color render target for testing,
gl_FragColor = texture2D(txColor, vec2TexCoord);
the pixels drawn inside the sphere's screen area come out twisted, as if the distortion followed the sphere's polygon tessellation.
So, is my way of projecting the light volume wrong?