I’m trying to make a shader to calculate a 3D gradient vector for a set of 3 images from a continuous video stream. The idea is to represent the video frames as a 3D volume using raycasting, as in this example:
http://vimeo.com/8096416
In the next version, this shader will be used to calculate normals for lighting: the normal's XYZ will be written into the RGB channels of each Z-axis slice of the texture that's input to the raycast shader, with intensity in the A channel.
In the code below, I've used 3 texture inputs, from the previous, current and next frames. I'm attempting to calculate the normal's X and Y values by subtracting neighbouring texels along the X and Y axes in the current frame texture, and its Z value by doing the same with the values at the current coordinates in the previous and next frame textures. Here's the code:
uniform sampler2D PreviousFrame, CurrentFrame, NextFrame;
const float texel = 0.01;

void main()
{
    vec3 s0, s1, norm;

    // X and Y gradient: neighbouring texels in the current frame
    s0.x = texture2D(CurrentFrame, gl_TexCoord[0].xy + vec2(-texel, 0.0)).r;
    s0.y = texture2D(CurrentFrame, gl_TexCoord[0].xy + vec2(0.0, -texel)).r;
    s1.x = texture2D(CurrentFrame, gl_TexCoord[0].xy + vec2(texel, 0.0)).r;
    s1.y = texture2D(CurrentFrame, gl_TexCoord[0].xy + vec2(0.0, texel)).r;

    // Z gradient: same texel in the previous and next frames
    s0.z = texture2D(PreviousFrame, gl_TexCoord[0].xy).r;
    s1.z = texture2D(NextFrame, gl_TexCoord[0].xy).r;

    // Normalize the gradient and remap from [-1, 1] to [0, 1] for storage
    norm = normalize(s1 - s0);
    norm = 0.5 * norm + 0.5;

    // Intensity from the current frame goes in the alpha channel
    float alpha = texture2D(CurrentFrame, gl_TexCoord[0].xy).r;
    gl_FragColor = vec4(norm, alpha);
}
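To make the intended computation concrete, here's a minimal CPU-side sketch of the same central-difference gradient in NumPy (the function name and the three 2D intensity arrays standing in for the frame textures are my own illustration, not part of the shader):

```python
import numpy as np

def gradient_normal(prev_f, cur_f, next_f, y, x):
    """Central differences: X/Y from neighbouring texels in the current
    frame, Z from the same texel in the previous/next frame arrays."""
    s0 = np.array([cur_f[y, x - 1], cur_f[y - 1, x], prev_f[y, x]])
    s1 = np.array([cur_f[y, x + 1], cur_f[y + 1, x], next_f[y, x]])
    g = s1 - s0
    n = g / np.linalg.norm(g)  # like normalize(): undefined if g is all zero
    return 0.5 * n + 0.5       # remap [-1, 1] -> [0, 1] for texture storage
```

Note that, as in the shader, the division is undefined wherever the gradient is exactly zero (flat regions of the volume).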
The problem is that I'm getting a lot of black pixels in the resulting texture.
Am I right in assuming these will be a problem when I try to use the normals to light my volume render, and if so, can anyone suggest a way to fix it?
Thanks a lot,