
Is cubemap depth test with CgFX possible?



kitfox
08-15-2010, 10:18 AM
I'm trying to get shadow mapping for omnidirectional lighting working in an OpenGL/CgFX program. I've hit a wall: I can't figure out what to pass to the texCUBE() function to perform the shadow comparison.

At the moment my code is mostly working. For my omni light, I create an OpenGL cubemap texture; then, for each of the six faces, I bind that face to an FBO, set up a 90-degree perspective camera pointing down the corresponding axis in world space, and render the scene onto the face.

Originally I was using a custom CgFX shader during this stage and rendering to a color buffer. The shader would encode the distance to the point light using pack_ubyte() and return a color. Then when I rendered my scene, I would unpack this value and use it to compare with the world distance of any point to the light. The system worked well, and shadows were drawn in the right place.

Then I learned about PCF (percentage-closer filtering). After getting it working for my spotlights (which use a GL_TEXTURE_2D to hold their shadow info), I decided to get the same thing working for my cubemaps. I'm now declaring my cubemaps as GL_DEPTH_COMPONENT:



glGenTextures(1, &texDepthId);
glBindTexture(GL_TEXTURE_CUBE_MAP, texDepthId);

glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_COMPARE_FUNC, GL_LEQUAL);
glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_DEPTH_TEXTURE_MODE, GL_LUMINANCE);
glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_COMPARE_MODE, GL_COMPARE_R_TO_TEXTURE);

glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_WRAP_R, GL_CLAMP_TO_EDGE);

//Allocate a depth texture for each face. GL_UNSIGNED_INT avoids hinting
//at only 8 bits of depth precision (the data pointer is NULL anyway).
for (int i = 0; i < 6; ++i)
{
    glTexImage2D(GL_TEXTURE_CUBE_MAP_POSITIVE_X + i,
        0, GL_DEPTH_COMPONENT,
        width, height,
        0, GL_DEPTH_COMPONENT, GL_UNSIGNED_INT,
        NULL);
}


I'm rendering to them this way:



glViewport(0, 0, width, height);

glMatrixMode(GL_PROJECTION);
glLoadIdentity();
gluPerspective(90, 1, 1, 1000);
glMatrixMode(GL_MODELVIEW);

glEnable(GL_CULL_FACE);
glCullFace(GL_FRONT); //Only draw back faces

//Render to each face
for (int i = 0; i < 6; ++i)
{
    glFramebufferTexture2DEXT(
        GL_FRAMEBUFFER_EXT,
        GL_DEPTH_ATTACHMENT_EXT,
        GL_TEXTURE_CUBE_MAP_POSITIVE_X + i, texDepthId, 0);
    checkFramebufferStatus();
    checkGLErrors();

    //Clear
    glClear(GL_DEPTH_BUFFER_BIT);

    //Camera
    glLoadIdentity();
    switch (i)
    {
    case 0:
        gluLookAt(cx, cy, cz,
            cx + 1, cy, cz,
            0, -1, 0);
        break;
    ...
    }

    drawMeshes();
}
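For reference, the face orientations that the switch above cycles through can be tabulated. This is the conventional OpenGL cubemap orientation, which matches the case 0 shown (looking down +X with up = (0, -1, 0)); it's a sketch, so double-check the up vectors against your own cases:

```c
/* Conventional OpenGL cubemap face orientations, in the
 * +X, -X, +Y, -Y, +Z, -Z order of GL_TEXTURE_CUBE_MAP_POSITIVE_X + i.
 * Face i looks from the light position down faceDir[i], with faceUp[i] as up. */
static const float faceDir[6][3] = {
    { 1,  0,  0}, {-1,  0,  0},
    { 0,  1,  0}, { 0, -1,  0},
    { 0,  0,  1}, { 0,  0, -1},
};
static const float faceUp[6][3] = {
    { 0, -1,  0}, { 0, -1,  0},
    { 0,  0,  1}, { 0,  0, -1},
    { 0, -1,  0}, { 0, -1,  0},
};
```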


If I examine the faces with glGetTexImage() at this point, they appear to have been rendered correctly. I then bind this to my scene shader (I render one light per pass, so here I attach the cubemap and disable the 2D shadow plane):



CGparameter paramShadowCube = cgGetNamedEffectParameter(cgEffect, "lightShadowCube");
CGparameter paramShadowPlane = cgGetNamedEffectParameter(cgEffect, "lightShadowPlane");
...
cgGLSetupSampler(paramShadowCube, texDepthId);
cgGLSetupSampler(paramShadowPlane, 0);


So far, so good. However, I now have no idea what to write in my shader. This was working before:



texture cubeTex <
    string TextureType = "CUBE";
>;

samplerCUBE lightShadowCube = sampler_state {
    Texture = <cubeTex>;
    minFilter = Linear;
    magFilter = Linear;
    WrapS = ClampToEdge;
    WrapT = ClampToEdge;
    WrapR = ClampToEdge;
};

...

float3 sampDir = pointInWorld - lightPosition;
float4 shadow = texCUBE(lightShadowCube, sampDir);


However, the depth buffer values are now in the range [0, 1]: the projection matrix maps visible depths into [-1, 1] NDC, and the viewport transform then maps that to [0, 1]. I can deal with this for spotlights, because I can just transform each point through the light's projection matrix to get a comparable depth. But how do I do the equivalent with a cubemap?

What does texCUBE(lightShadowCube, sampDir) even mean for a depth cubemap? tex2D(lightShadowPlane, vec.xyz) does a depth comparison: the .z component is compared against the texel at .xy, and a black-or-white value comes back indicating whether the test passed. But in a cubemap, all three components are needed just to form the lookup direction. So what gets returned — the raw depth stored in that direction? In my case, the only thing that ever comes back is float4(1, 1, 1, 1). Am I doing something wrong?

kitfox
08-15-2010, 06:04 PM
Figured it out. There's an overload of texCUBE() that takes a float4 for the lookup. The xyz components are the ordinary cubemap lookup vector, while the w component is the reference depth used for the comparison.
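In other words, the fragment shader lookup might look something like this (a sketch: nearZ and farZ are assumed uniforms matching the gluPerspective(90, 1, 1, 1000) of the shadow pass, and the depth remapping follows the standard perspective formula):

```
float3 v = pointInWorld - lightPosition;

// Window-space depth of this point as seen by the cube face that covers
// it: the largest |component| of v is the eye-space distance to the point
// for that face's camera.
float3 a = abs(v);
float d = max(a.x, max(a.y, a.z));
float ndc = (farZ + nearZ) / (farZ - nearZ)
          - (2.0 * farZ * nearZ) / ((farZ - nearZ) * d);
float ref = ndc * 0.5 + 0.5;

// xyz selects the texel; w is compared against the stored depth.
float shadow = texCUBE(lightShadowCube, float4(v, ref)).x; // 1 = lit, 0 = shadowed
```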