
linear filtering not working for variance shadow mapping



Sobeit
06-27-2014, 01:28 PM
Hi, I'm trying to implement variance shadow mapping, but the result has very sharp shadow edges. I checked the GLSL code and found that the variance (= moments.y - moments.x * moments.x) is always zero, so I suspect that OpenGL is somehow failing to do linear filtering when fetching texels. Does anyone know why?



// Allocate a 512x512 two-channel 32-bit float texture for the depth moments
GL_DEBUG(glTexImage2D(GL_TEXTURE_2D, 0, GL_RG32F, 512, 512, 0, GL_RG, GL_FLOAT, 0));
// Request hardware linear filtering for both magnification and minification
GL_DEBUG(glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR));
GL_DEBUG(glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR));
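
For reference, the shader side looks roughly like this (a sketch rather than my exact code; shadowMap, shadowUV and lightDepth are placeholder names):

// Standard VSM shadow test via Chebyshev's inequality.
#version 400
uniform sampler2D shadowMap;   // the RG32F moments texture allocated above
in vec2 shadowUV;              // projected shadow-map coordinates
in float lightDepth;           // fragment depth in light space
out vec4 fragColor;

float chebyshevUpperBound(vec2 moments, float t)
{
    // With nearest-neighbour fetches, moments.y == moments.x * moments.x
    // exactly at every texel, so this variance collapses to zero.
    float variance = max(moments.y - moments.x * moments.x, 0.00001);
    float d = t - moments.x;
    float pMax = variance / (variance + d * d);  // Chebyshev's inequality
    return (t <= moments.x) ? 1.0 : pMax;
}

void main()
{
    vec2 moments = texture(shadowMap, shadowUV).rg;
    fragColor = vec4(vec3(chebyshevUpperBound(moments, lightDepth)), 1.0);
}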

arekkusu
06-27-2014, 02:24 PM
Very old (~2006) desktop GPUs and most current phone GPUs can't filter 32-bit floats. Try RG16F.
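
For the texture you posted that would be a one-line change (the upload type can stay GL_FLOAT; the driver converts on upload):

// Same allocation, but a half-float internal format that old desktop
// and current mobile hardware can still filter:
GL_DEBUG(glTexImage2D(GL_TEXTURE_2D, 0, GL_RG16F, 512, 512, 0, GL_RG, GL_FLOAT, 0));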


so I suspect
Do an experiment to confirm your suspicions: make a two-pixel texture with black and white, and stretch it across three pixels. Is the middle pixel grey?
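
Something like this (a sketch; it assumes you already have an FBO bound and a textured quad to draw, which I've elided):

// 2x1 texture: one black texel, one white texel, in the same RG32F
// format you are using for the moments.
GLfloat texels[4] = { 0.0f, 0.0f,    // texel 0: black
                      1.0f, 1.0f };  // texel 1: white
GL_DEBUG(glTexImage2D(GL_TEXTURE_2D, 0, GL_RG32F, 2, 1, 0, GL_RG, GL_FLOAT, texels));
GL_DEBUG(glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR));
GL_DEBUG(glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR));

// ... draw a full-screen quad into a 3x1-pixel viewport here ...

GLfloat middle[2];
GL_DEBUG(glReadPixels(1, 0, 1, 1, GL_RG, GL_FLOAT, middle));
// Filtering works: middle[0] is somewhere around 0.5.
// Silent fallback to nearest: middle[0] is exactly 0.0 or 1.0.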

Sobeit
06-27-2014, 07:00 PM
Thank you for your reply. I drew a 512x512 texture with the left side black and the right side red, then rendered it to a 1152x648 framebuffer, and I did see a gradual transition in the middle, so linear filtering works in this case. By the way, I'm using OpenGL 4.0.

Do you know of any other reason that might cause linear filtering to fail?

EDIT: I fixed it. It was a tiny bug; I compared the wrong value. Thanks.