Problem with alpha writes within Fragment Shader



Zulfiqar Malik
01-18-2005, 12:45 AM
I am currently working on real-time shadows with support for translucent objects via projective texture mapping. Since I am writing color values in the fragment shader as well, I cannot use a GL_DEPTH_COMPONENT texture, so I have to store depth values manually in a particular channel of the frame buffer (from where I copy them to a texture). Initially I tried writing the color values in the RGB channels and the depth value in the ALPHA channel of gl_FragColor, but for some reason values don't get written into the ALPHA channel of the fragment color (yes, my texture is GL_RGBA, so that is not the problem). Right now my technique works by writing the depth value in the RED channel and the color in the other three channels, but the ALPHA component of the color gets lost, so the colors of projective textures don't appear correct. Black shadows still come out right, because the depth value in the RED channel is correct.

Apart from that, since I am using a single 8-bit component of the frame buffer to write out depth values (clamped between 0 and 1), I am getting precision problems in the form of shadow-mapping artifacts. Does anybody know a way to get around those (without using nVidia's vendor-specific NV_texture_float extension)?

Relic
01-18-2005, 02:06 AM
"therefore i have to store depth values manually within a particular channel of the frame buffer (from where i copy it to a texture)."

Probably stating the obvious, but make sure your pixel format has a destination alpha channel.
If you use GLUT you must specify GLUT_RGB | GLUT_ALPHA; note that GLUT_RGBA == GLUT_RGB, so GLUT_RGBA by itself does not request destination alpha.
If there is no destination alpha, all readbacks will return alpha == 1.

It doesn't matter that the texture has an alpha channel if you don't have one in the source for the copy-to-texture operation.
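To make that concrete, a hedged setup sketch under GLUT (window title and the extra mode flags are illustrative; this needs a window system, so treat it as a configuration fragment rather than a runnable test):

```c
#include <GL/glut.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    /* GLUT_RGBA == GLUT_RGB, so GLUT_ALPHA must be OR'ed in explicitly
     * to get a destination alpha channel. */
    glutInitDisplayMode(GLUT_RGB | GLUT_ALPHA | GLUT_DOUBLE | GLUT_DEPTH);
    glutCreateWindow("alpha check");

    /* Verify the pixel format actually granted alpha bits. */
    GLint alphaBits = 0;
    glGetIntegerv(GL_ALPHA_BITS, &alphaBits);
    printf("destination alpha bits: %d\n", alphaBits);
    return 0;
}
```

If GL_ALPHA_BITS comes back 0, writes to gl_FragColor.a land nowhere and every readback returns 1, which matches the symptom described above.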


"Since i am writing color values in the fragment shader as well, i cannot use a GL_DEPTH_COMPONENT texture"

I don't understand this restriction. What's the problem with using a color texture and a depth texture simultaneously?

Zulfiqar Malik
01-18-2005, 03:00 AM
Yeah, the pixel format does specify 8 bits for each of the red, green, blue and alpha channels. So the problem is still there!

I cannot use a GL_DEPTH_COMPONENT texture because I want the whole RGBA contents of the back buffer, not just the depth values.

Speaking of problems, there is another issue I ran into while using GLSL for the same shader.
The way the projective texturing works is that for opaque objects I update the depth value (i.e. the RED channel) and leave the color unmodified, whereas for translucent objects I leave the depth unmodified and write the color into the remaining channels. Before rendering translucent objects I disabled writes to the red channel using glColorMask(0, 1, 1, 1) (since I am writing depth in the red channel), but that didn't seem to work: values were still being written into the red channel, which messed up my shadow rendering algorithm. After wasting a few hours I rewrote the depth/color write shaders in Cg and, guess what, the same shaders worked perfectly. Could that be a problem with the GLSL implementation (I am using an nVidia GeForce FX 5700 Ultra with the latest drivers, 66.93)?
Anyway, the original problem still remains. Any help shall be highly appreciated.
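For reference, a sketch of the masking sequence described above, spelled out with the GLboolean constants (the draw functions are hypothetical placeholders, and this needs a current GL context):

```c
#include <GL/gl.h>

/* Hedged sketch of the two passes described above: depth lives in RED,
 * translucent color goes into GREEN/BLUE/ALPHA. */
static void renderShadowMapPasses(void)
{
    /* Opaque pass: write depth into RED only, leave the color channels alone. */
    glColorMask(GL_TRUE, GL_FALSE, GL_FALSE, GL_FALSE);
    /* drawOpaqueObjects();      -- placeholder */

    /* Translucent pass: write color into G/B/A, preserve the depth in RED. */
    glColorMask(GL_FALSE, GL_TRUE, GL_TRUE, GL_TRUE);
    /* drawTranslucentObjects(); -- placeholder */

    /* Restore full color writes for the rest of the frame. */
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
}
```

glColorMask is per-channel framebuffer state, so it should apply regardless of whether the fragment program was compiled from GLSL or Cg; if the red channel is still written with this state set, that points at a driver bug rather than the shader.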

Guido
01-20-2005, 04:24 AM
I'm using GLSL for physical simulation on the GPU... it's a really hard job... in a month I have already found two nVidia driver bugs... If your code seems reasonably solid, please file a bug report with nVidia's developer relations... with my luck, there's a good probability that your bug of today will become one of mine in a week or two ;)