glBlendFunc() confusion

Hello, everybody!

I’ve recently gotten back into programming with OpenGL. Specifically, I’ve been working on a 2D game and using OpenGL to enhance it with rotation, blending and so on.

Anyway, to the point. I use PNG images as textures for the sprites. No problems there. But now I want to be able to set an alpha value for the entire texture in addition to the per-pixel alpha channel that comes with the PNG image.

When I initialize OpenGL, I use the following lines:

glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

This works as I expect it to without any issues.

However, when I go to set an alpha value with glColor4f(), nothing happens.

So I thought that I needed to change the blend function. I used this combination, which seemed to work in a sample application I poked around with:


glColor4f(1.0f,1.0f,1.0f,0.5f);
glBlendFunc(GL_SRC_ALPHA,GL_ONE);

Afterward I set the blend function back to the original method.

This doesn’t work. I get what appears to be some sort of alpha, but it’s not what I expect. Additionally, if I change any of the RGB values in glColor4f, the texture isn’t tinted as I would expect.

Obviously I don’t understand how to use glBlendFunc properly. I’ve looked at the documentation but I’m really not getting anywhere with it. I’ve also searched Google and, again, haven’t gotten anywhere.

Long story short, what I want to be able to do is apply a ‘per-texture’ alpha value to a texture that also has a ‘per-pixel’ alpha channel.

Thanks for taking the time to read over my post.

The first blend func should work, and the second will too if you have premultiplied alpha (which is better; read Tom Forsyth’s blog entry on premultiplied alpha for details).

Be sure the texture env is in the default GL_MODULATE mode:
http://www.opengl.org/resources/faq/technical/texture.htm#text0030
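
A quick sanity check (just a sketch, not necessarily a required change) is to force it right before drawing the quad:

glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE); // modulate the texel by the vertex color
glColor4f(1.0f, 1.0f, 1.0f, 0.5f); // whole quad at 50% alpha
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);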

Can you describe more precisely what you mean by “but it’s not what I expect”?
And post more code: have you disabled all lighting, set the texenv and tex params, etc.?

Thanks for your reply.

I haven’t touched the lighting at all (at least I don’t think I have; relevant code below). glTexEnvf is set to GL_MODULATE (that was one of the first things I checked). Texture parameters for all textures:


glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

Example screenshot

The linked screenshot shows what’s actually being rendered by the ‘game’. There’s no background, and if I want a translucent portion of the image (say, an anti-aliased edge), that works perfectly fine.

What I’d also like is to specify a per-texture alpha value so that the whole texture can ‘fade in’.

When I set up OpenGL, I do the following:


glShadeModel(GL_SMOOTH);

glClearColor(0, 0, 0, 0);

glViewport(0, 0, mScreen->w, mScreen->h);
glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST);
	
glClear(GL_COLOR_BUFFER_BIT);

//glDepthFunc(GL_NEVER);
	
glMatrixMode(GL_TEXTURE);
glLoadIdentity();
	 
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
	 
glOrtho(0.0, mScreen->w, mScreen->h, 0.0, -1.0, 1.0);
	 
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
	
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
	
glPushMatrix();

This works entirely as expected. I can render fonts and images with alpha channels (TGA and PNG images) just fine.

I figured I would just need to change the blend mode, but I ended up with this: multiplied?

It looks like the texture is just being multiplied with what’s already there. On top of that, no matter what values I put in the call to glColor4f, nothing happens. The code looks like this:


glLoadIdentity();
glPushMatrix();

float tX = (image->getWidth() >> 1) * scale;
float tY = (image->getHeight() >> 1) * scale;

glEnable(GL_TEXTURE_2D);
		
glTranslatef((GLfloat)x, (GLfloat)y, 0.0f);
glRotatef(degrees, 0.0f, 0.0f, 1.0f);

glBindTexture(GL_TEXTURE_2D, 0);

glColor4f(1.0, 1.0, 1.0, 0.05);
glBlendFunc(GL_SRC_ALPHA, GL_ONE);

glBegin(GL_QUADS);
	glTexCoord2f(0.0f, 1.0f); glVertex2f(-tX, tY); // Top Left
	glTexCoord2f(1.0f, 1.0f); glVertex2f(tX, tY); // Top Right
	glTexCoord2f(1.0f, 0.0f); glVertex2f(tX, -tY); // Bottom Right
	glTexCoord2f(0.0f, 0.0f); glVertex2f(-tX, -tY); // Bottom Left
glEnd();

glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

glDisable(GL_TEXTURE_2D);
glPopMatrix();

That’s not multiplicative blending.

(GL_SRC_ALPHA, GL_ONE) is additive blending, but anyway…
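
Roughly speaking, the two functions compute:

// (GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA): standard alpha blending
// dst = src * srcAlpha + dst * (1 - srcAlpha)
// (GL_SRC_ALPHA, GL_ONE): additive; the destination is never attenuated
// dst = src * srcAlpha + dst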

Since glColor* isn’t doing anything, it leads me to believe that lighting is somehow enabled. Explicitly disable it and see what happens.

EDIT: Also, why are you enabling texture mapping and then binding a null texture with glBindTexture(GL_TEXTURE_2D,0)?

Heh… or something like that… :)

I explicitly disabled lighting and it’s exactly the same. No change.

I was just trying to simplify the code I actually have. I store an array of texture IDs and then choose from them accordingly.

As a note, I don’t know if this affects anything, but I have the texture code generate GL textures based on whether the source image has an alpha channel or not, so the texture format is either GL_RGB or GL_RGBA. All of my images are 32-bit PNGs, so I assume they all have alpha channels (I ran it through the debugger and I’m not hitting the code that uses GL_RGB).


glTexImage2D(GL_TEXTURE_2D, 0, nColors /* either 3 or 4 */, image->getPixels()->w, image->getPixels()->h, 0, GL_RGBA, GL_UNSIGNED_BYTE, bytebuffer);


So, just to be clear:

calling glColor4f(1.0,0,0,.5) doesn’t cause your ship to turn red? Or is it just the alpha component that does nothing?

Nope. The color doesn’t change and the opacity doesn’t change. It’s the weirdest thing, too, because I had used a NeHe tutorial to experiment with a few things and had no trouble there.

Would having SDL set up the OpenGL render context have anything to do with this? I’m not convinced SDL interacts with OpenGL in any way that could cause it, though.

If the alpha component wasn’t working but the color was, it might have been because no alpha bits were allocated in the context.
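
If it did come down to that, with an SDL-created context you’d request alpha bits before creating the window; something like this (a sketch assuming an SDL 1.2-style setup, with placeholder width/height):

SDL_GL_SetAttribute(SDL_GL_ALPHA_SIZE, 8); // request 8 destination alpha bits
SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
SDL_SetVideoMode(screenWidth, screenHeight, 32, SDL_OPENGL); // screenWidth/screenHeight are placeholders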

However, since the RGB channels aren’t responding to the glColor call and you’re sure that lighting is disabled, it’s a bit of a mystery. I’m currently at a loss.

If your source code isn’t too large, could you link it?

Actually it’s very big but I’ll post the relevant functions on pastebin.

Relevant Code

To give a quick overview, we put together a Renderer class that abstracts away whether we’re using OpenGL for hardware acceleration or SDL’s software mode.

We do a lot of texture conversion on the fly; it’s something we’ve been meaning to clean up. Also, I’m sure the ‘texture manager’ functions could work better.

That all aside, is there something I should be looking for that could cause OpenGL not to change the color components of a textured quad while working correctly for untextured primitives?

Thanks for all your help. I really do appreciate the time everybody’s taking to help me figure this out.

I’m also new to OpenGL, but I checked your code and one thing got my attention.

In your glTexImage2D call, you use nColors to specify the internal format. I looked at the function and nColors is a valid value there, but my own PNGUtility class uses the enum instead:


if(aTransparency)
		glTexImage2D(GL_TEXTURE_2D,0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, (GLvoid*)image_data);
	else
		glTexImage2D(GL_TEXTURE_2D,0, GL_RGB, width, height, 0, GL_RGB, GL_UNSIGNED_BYTE, (GLvoid*)image_data);

It’s probably not that, but maybe you could try this instead in your case:


if(format->Amask)
	glTexImage2D(GL_TEXTURE_2D, 0, textureFormat, image->getPixels()->w, image->getPixels()->h, 0, textureFormat, GL_UNSIGNED_BYTE, image->getPixels()->pixels);
else
	glTexImage2D(GL_TEXTURE_2D, 0, textureFormat, image->getPixels()->w, image->getPixels()->h, 0, textureFormat, GL_UNSIGNED_BYTE, image->getPixels()->pixels);

Also, if I’m not mistaken, your condition here is useless, since the two lines seem to be doing the same thing.

Hope this is the problem, even if I doubt it. :P

Thanks for the input. I’ll try it.

They’re actually not doing the same thing. In the test above there’s a check to see whether there are three color components (RGB) or four (RGBA). Basically, if Amask is anything but 0, it’ll use an RGBA format; otherwise, RGB. It also checks what order the color components are in, RGB or BGR (Mac OS X, which this code also runs on, reverses them for whatever reason).
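
For reference, the selection logic looks roughly like this (a sketch from memory with a placeholder ‘surface’ pointer, not the exact code):

GLenum textureFormat = surface->format->Amask ? GL_RGBA : GL_RGB;
if (surface->format->Rmask != 0x000000ff) // components stored as BGR(A) rather than RGB(A)
	textureFormat = surface->format->Amask ? GL_BGRA : GL_BGR;

glTexImage2D(GL_TEXTURE_2D, 0, surface->format->Amask ? GL_RGBA : GL_RGB,
	surface->w, surface->h, 0, textureFormat, GL_UNSIGNED_BYTE, surface->pixels);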

I doubt switching to the enum constants will make a difference (and in fact, when I change it, it breaks completely… hehe).

Thanks for the suggestion though!

Did a bit of playing around and found something interesting.

If I leave out glEnable(GL_TEXTURE_2D) when rendering the quad, I get a red quad that’s 50% translucent, as expected. That makes me believe there’s something in the 2D texturing state that I need to enable or disable to let the texture also take on the properties of the quad it’s applied to.

I looked in the OpenGL Programming Guide but nothing really stands out. What would cause textured rendering not to take on the properties of the primitive geometry it’s applied to (assuming that’s even the problem)?

Also, I’ve been playing with enabling/disabling a few different modes, such as materials, lighting, and blend modes, but I either get weird results or nothing changes.

Thanks again for the help.

Are you using a GL_MODULATE texture environment?

If vertex alpha (or color) is not working, then the texture is probably clobbering the vertex value with GL_REPLACE.
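
In other words:

// GL_REPLACE: fragment = texel (glColor is ignored entirely)
// GL_MODULATE: fragment = texel * vertexColor (so the alpha from glColor4f scales the texture)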

Try no texture, just vertex color. Then add the texture; if the texture is not working exactly as expected, fix that, then move on to blending. I think your problem has been earlier in the pipeline all along.

I see from page 2 that when you disable texturing it works. Your texture environment is certainly wrong; change it to GL_MODULATE, which should be the pipeline’s default state.

That was actually the first thing I looked at before posting here. If you look at the code listing, you’ll see that it is, indeed, set to GL_MODULATE. I’ve tried all four options and none has any effect.

leeor_net, your problem is indeed with GL_MODULATE.
glTexEnv is not attached to a single texture object, unlike glTexParameter.

So move this line:
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
from OGL_Renderer::getTextureId to OGL_Renderer::drawImageRotated.
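
In other words, set it at draw time, right next to the bind (a rough sketch; the textureId and alpha names are just illustrative):

glBindTexture(GL_TEXTURE_2D, textureId); // whatever ID your texture manager returned
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE); // texture-unit state, set when drawing
glColor4f(1.0f, 1.0f, 1.0f, alpha); // now modulates the texture’s own alpha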

I would be very surprised if that still does not make your ship reddish translucent.

EDIT: sorry, I was wrong about premultiplied alpha. Try replacing glBlendFunc(GL_SRC_ALPHA, GL_ONE) (which is additive blending, as already mentioned) with glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA).
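
For completeness, the premultiplied route means multiplying RGB by A once when you load the pixels, then using that blend func. A rough sketch over an RGBA8 buffer (width/height are whatever your image loader reports):

for (int i = 0; i < width * height * 4; i += 4)
{
	bytebuffer[i + 0] = (bytebuffer[i + 0] * bytebuffer[i + 3]) / 255; // R *= A
	bytebuffer[i + 1] = (bytebuffer[i + 1] * bytebuffer[i + 3]) / 255; // G *= A
	bytebuffer[i + 2] = (bytebuffer[i + 2] * bytebuffer[i + 3]) / 255; // B *= A
}
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA); // premultiplied alpha blending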

Awesome. Thank you very much. I didn’t realize that I needed to set the texture environment every time I actually use it (it seems OpenGL remembers some things, while others have to be specified each time, like colors and such).

So yes, it is now working exactly as expected. I really appreciate the help!
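
In case anyone else runs into this, the draw path that finally works looks roughly like this (pieced together from the snippets above, not the exact source):

glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, textureId); // placeholder for the ID returned by the texture manager
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE); // set here, at draw time
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glColor4f(1.0f, 1.0f, 1.0f, 0.5f); // per-texture fade on top of the per-pixel alpha

glBegin(GL_QUADS);
	// ...textured quad exactly as before...
glEnd();

glDisable(GL_TEXTURE_2D);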