View Full Version : using alpha blending with shaders



HuFlungDu
06-27-2011, 11:58 PM
Simply put, I am working on rendering text using the new OpenGL style, so putting everything in shaders, basically. What I do is build a texture from a bitmap containing all of the glyphs, like this:


glTexImage2D( GL_TEXTURE_2D, 0, GL_ALPHA, Z.shape[1], Z.shape[0], 0, GL_ALPHA, GL_UNSIGNED_BYTE, Z )
where Z is a numpy array containing the bitmap information.
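To illustrate the layout (this is a made-up toy atlas, not my real font data): glTexImage2D takes width before height, while numpy shapes are (rows, columns), which is why the call above passes Z.shape[1] as the width and Z.shape[0] as the height.

```python
import numpy as np

# Hypothetical 2-glyph atlas: each glyph cell is 8x8 pixels, packed
# side by side, so the full bitmap is 8 rows by 16 columns.
cell_h, cell_w, n_glyphs = 8, 8, 2
Z = np.zeros((cell_h, cell_w * n_glyphs), dtype=np.uint8)
Z[2:6, 2:6] = 255      # an opaque square standing in for glyph 0
Z[2:6, 10:14] = 128    # a half-covered square standing in for glyph 1

# numpy shape is (rows, cols) = (height, width), so the glTexImage2D
# call uses Z.shape[1] for width and Z.shape[0] for height:
height, width = Z.shape[0], Z.shape[1]
```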

Then when I render it, I pass the shader the position of the glyph I want to render within the texture so it only grabs that portion of the texture. So far so good.
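For concreteness, here is roughly how the per-glyph texture coordinates can be computed; this assumes the glyphs are packed left to right in a single row of the atlas (my actual packing may differ), with v spanning the full texture height:

```python
# Map a glyph index to the horizontal texture-coordinate range
# [u0, u1) that the fragment shader should sample from the atlas.
def glyph_texcoords(index, n_glyphs):
    u0 = index / float(n_glyphs)
    u1 = (index + 1) / float(n_glyphs)
    return u0, u1

# e.g. glyph 2 of 16 occupies the slice [0.125, 0.1875)
```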

As you can see, it is set up so that the glyphs should blend with glColor, letting you set the font color before rendering. This doesn't work. After some testing with the alpha set explicitly in the shader, it appears that the only alpha value with any effect at all is 0, which makes that pixel completely transparent, as one would expect. Every other value leaves the pixel looking exactly as it would with an alpha of 1 (usually a flat black color, but testing shows that whatever color I use, it stays fully opaque in that color).

So, the only thing I can assume is that it doesn't know what it is supposed to blend the alpha channel with, and so blends it with... nothing, I guess. I thought I had set that up at the beginning of my code with this (Python, by the way):


glTexEnvf( GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE )
glDisable(GL_LIGHTING)
glEnable( GL_DEPTH_TEST )
glEnable( GL_BLEND )
glEnable( GL_COLOR_MATERIAL )
glColorMaterial( GL_FRONT_AND_BACK, GL_AMBIENT_AND_DIFFUSE )
glBlendEquation( GL_FUNC_ADD )
glBlendFunc(GL_SRC_ALPHA,GL_ONE_MINUS_SRC_ALPHA)
but experience is telling me otherwise.
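For reference, what that glBlendFunc/glBlendEquation combination is supposed to compute per channel can be sketched in plain Python (illustrative only, no GL context involved):

```python
# What glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA) with
# glBlendEquation(GL_FUNC_ADD) computes for each color channel:
def blend(src, dst):
    # src, dst: (r, g, b, a) tuples. src alpha weights the incoming
    # fragment; (1 - src alpha) weights what is already in the buffer.
    a = src[3]
    return tuple(a * s + (1.0 - a) * d for s, d in zip(src, dst))

# Black text at 50% alpha over an opaque white background:
# blend((0, 0, 0, 0.5), (1, 1, 1, 1)) -> (0.5, 0.5, 0.5, 0.75)
```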

Just for completeness' sake, here are my vertex and fragment shaders:



//Vertex shader
#version 110

attribute vec4 position;
attribute vec2 relativeposition;

varying vec2 texcoord;

void main()
{
    gl_Position = position;
    texcoord = relativeposition;
}

//Fragment shader
#version 110

uniform sampler2D texture, textures;

varying vec2 texcoord;

void main()
{
    gl_FragColor = texture2D(texture, texcoord);
}

So anyways, does anything jump out that I'm missing for doing this? Thanks.

Alfonse Reinheart
06-28-2011, 12:27 AM
You're using a shader, so the following functions are irrelevant:



glTexEnvf( GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE )
glDisable(GL_LIGHTING)
glColorMaterial( GL_FRONT_AND_BACK, GL_AMBIENT_AND_DIFFUSE )


If you're using a shader, you don't get the texture environment operation. So if you want to have the color multiplied with your texture color, you need to actually do that in the shader.

Also:



gl_FragColor = texture2D(texture, texcoord);


This texture uses a GL_ALPHA internal format. Therefore, the RGBA value you get from sampling the texture is (0, 0, 0, A), where A is the alpha value.

Therefore you're writing black to the screen. If you want the alpha channel broadcast across all four components, then you need to do this:



gl_FragColor = texture2D(texture, texcoord).aaaa;
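What the .aaaa swizzle does can be mimicked in numpy, just as a sanity check of the idea (fancy indexing standing in for GLSL swizzling):

```python
import numpy as np

# A texel sampled from a GL_ALPHA texture comes back as (0, 0, 0, A).
texel = np.array([0.0, 0.0, 0.0, 0.6])

# The .aaaa swizzle replicates component 3 (alpha) into all four
# slots, turning (0, 0, 0, A) into (A, A, A, A).
aaaa = texel[[3, 3, 3, 3]]
```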

HuFlungDu
06-28-2011, 04:05 AM
Aha, thanks. Armed with this knowledge I have created a fragment shader capable of rendering the text using glColor. One issue remains, however: most fonts use a form of anti-aliasing so that they look smoother from a distance or at smaller sizes. Basically it lowers the alpha channel around the edges, which makes those parts lighter, making the glyph "fade" into the background color. I have created a shader that renders that fade correctly from glColor to some other arbitrary color (currently that color is white, but it could just as easily be set with glSecondaryColor). It looks like this:


#version 110

uniform sampler2D texture;
varying vec2 texcoord;

void main()
{
    vec4 alpha = texture2D(texture, texcoord).aaaa;
    vec4 negalpha = alpha * vec4(-1, -1, -1, 1) + vec4(1, 1, 1, 0);
    gl_FragColor = alpha * gl_Color + negalpha * vec4(1, 1, 1, 0);
}

This effect works really well as long as the background is white. However, quite often the text will be rendered on something that isn't a solid color, and I need it to blend with whatever texture it happens to be on top of, which would normally mean sampling that texture. But because the shader programs run in parallel, it's possible that whatever fragment is "below" the one currently being processed hasn't been processed yet, so there is no way to sample it (as far as I know). The only place in the pipeline where I can see this sampling happening is the "testing and blending" stage. Which brings me back to the beginning: why are the fragments I'm drawing with alpha values less than one (other than 0) not being blended during that part of the pipeline?

tksuoran
06-29-2011, 09:07 AM
You have enabled blending and set an alpha blending mode, but your shader is writing only 0 to the alpha channel. Try putting the alpha into gl_FragColor.a; then your blending will actually do something. Blending happens after the fragment shader.

HuFlungDu
06-29-2011, 04:40 PM
No, I'm pretty sure I am writing to the alpha channel just fine. Let's follow the math:


vec4 alpha = texture2D(texture, texcoord).aaaa;
From what I can tell, this makes a four-dimensional vector where every component is the alpha, i.e. the vector (alpha, alpha, alpha, alpha).


vec4 negalpha = alpha*vec4(-1,-1,-1,1)+vec4(1,1,1,0);
This should make a four-dimensional vector as well. First, it multiplies R, G, and B by -1 and the alpha channel by 1, which should give me the vector (-alpha, -alpha, -alpha, alpha). Then I add another vector to it, which should give me (1-alpha, 1-alpha, 1-alpha, alpha).


gl_FragColor = alpha*gl_Color+negalpha*vec4(1,1,1,0);
Finally, we take alpha, the vector (alpha, alpha, alpha, alpha), and multiply it by gl_Color (usually passed in as (0,0,0,1), but it could be anything; I'll say it's (0,0,0,1), since that's what I'm using), which gives me the vector (0,0,0,alpha). Next, it multiplies negalpha, the vector (1-alpha, 1-alpha, 1-alpha, alpha), by the vector (1,1,1,0), giving me (1-alpha, 1-alpha, 1-alpha, 0). Then I add those two vectors together and get (0+1-alpha, 0+1-alpha, 0+1-alpha, alpha+0), or (1-alpha, 1-alpha, 1-alpha, alpha). From what I can tell, I am writing the sampled alpha value to the alpha channel, though if I did my math wrong, let me know. Or is there some other way to write to the alpha channel that I'm unaware of?
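The walkthrough above can be checked numerically; here is the same arithmetic in plain Python, with alpha = 0.25 and gl_Color = (0, 0, 0, 1):

```python
# The .aaaa swizzle replicates the sampled alpha into all channels.
alpha = (0.25, 0.25, 0.25, 0.25)

# negalpha = alpha * vec4(-1,-1,-1,1) + vec4(1,1,1,0)
negalpha = tuple(a * m + b for a, m, b in
                 zip(alpha, (-1, -1, -1, 1), (1, 1, 1, 0)))

# gl_FragColor = alpha * gl_Color + negalpha * vec4(1,1,1,0)
color = (0.0, 0.0, 0.0, 1.0)
frag = tuple(a * c + n * m for a, c, n, m in
             zip(alpha, color, negalpha, (1, 1, 1, 0)))

# negalpha == (0.75, 0.75, 0.75, 0.25)
# frag     == (0.75, 0.75, 0.75, 0.25)
# i.e. (1-alpha, 1-alpha, 1-alpha, alpha): alpha does reach gl_FragColor.a
```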

tksuoran
06-30-2011, 03:05 AM
Sorry, indeed you are right, I did not read your formula correctly.

Your code gl_FragColor = alpha*gl_Color+negalpha*vec4(1,1,1,0); is redundant, blending already does this for you. See GL4 / GL2 specifications:

http://www.opengl.org/registry/doc/glspec21.20061201.pdf

4.1.8 Blending, starting on page 208

See table 4.1. If you use the FUNC_ADD blend equation, blending computes "fragment color after blending = source weight * shader output + destination weight * previous fragment color" for you. You can configure the source weight to be the shader's alpha or set it to one, and you can set the destination weight to be one minus the shader's alpha; see table 4.2.

Blending samples the previous color from the framebuffer as needed, according to how you have set it up. You are not supposed to do that yourself in the shader, and in fact it would not be possible, since there is no standard way to read the previous values from the framebuffer.

Your shader code should look something like what HuFlungDu suggested.

HuFlungDu
06-30-2011, 04:22 AM
Aha! It seems GL_BLEND and GL_DEPTH_TEST were getting disabled somehow; enabling them right before I draw the glyphs fixes the blending issue. Do OpenGL states only stay active for a certain amount of time? If not, I guess the library I'm using to create the rendering context is resetting those states somewhere.

Here's the fragment shader I ended up using:


#version 110

uniform sampler2D texture;
uniform float red;
uniform float green;
uniform float blue;
uniform float calpha;
varying vec2 texcoord;

void main()
{
    vec4 alpha = texture2D(texture, texcoord).aaaa;
    gl_FragColor = alpha * vec4(red, green, blue, calpha);
}

One odd thing is that I could no longer use gl_Color to set the color; I had to pass the color in as separate uniforms. I don't know what's up with that, but if it works, it works, I guess.

Thanks again.

BionicBytes
06-30-2011, 08:16 AM
There should be no problem using gl_Color within a #version 110 shader.