glAlphaFunc & glBlendFunc

Hi,

I need to make my textures transparent. I know how to fake a colour key with glAlphaFunc(GL_GREATER, 0), but what I want is for every alpha value to count: 0 fully transparent, 255 fully opaque, and every value in between accordingly.

I have tried glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA); but it doesn’t work.

I’m working on an isometric-tile game and I need the alpha-texture feature for sprites. Laser Squad Nemesis is the only isometric engine I know of which uses alpha values on sprites (not just fully transparent pixels).

saezee

This should work.

Make sure you issue

glEnable(GL_BLEND);

and disable the mode when you’re done.
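For reference, the blend function named above computes, per colour channel, src*alpha + dst*(1-alpha). A minimal CPU-side sketch of that arithmetic, just to show what the different alpha values do (blend_channel is an illustrative helper, not a GL call):

```c
/* Simulates GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA for one channel.
   src and dst are 0..255 channel values; alpha is the source alpha 0..255. */
static int blend_channel(int src, int dst, int alpha)
{
    return (src * alpha + dst * (255 - alpha)) / 255;
}
```

With alpha 0 you get the destination untouched, with 255 the pure source, and anything in between mixes the two.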

Nope, doesn’t work. Here’s the code snippet:


glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

glBindTexture(GL_TEXTURE_2D,spriteTexture[SprID]);
glBegin(GL_QUADS);
glTexCoord2f(0.0f, 0.0f); glVertex2i(sx, sy);
glTexCoord2f(1.0f, 0.0f); glVertex2i(sx + (tile_w), sy);
glTexCoord2f(1.0f, 1.0f); glVertex2i(sx + (tile_w), sy + (spriteheight2));
glTexCoord2f(0.0f, 1.0f); glVertex2i(sx, sy + (spriteheight2));
glEnd();
glDisable(GL_ALPHA_TEST);
glDisable(GL_BLEND);


saezee

I don’t see your glAlphaFunc call, and the glDisable(GL_ALPHA_TEST) suggests you have alpha test on. Perhaps you are culling all the blended fragments with a high reference value. Turn off alpha test and see what happens.

Also watch your z-buffer. I’m assuming either no depth buffer or that you depth-sort. There are other gotchas that might LOOK like no blending. Make sure you draw your sprites last, after your backdrop, then see how things look.

One last thing: check your texture filter; you probably want GL_LINEAR at least. I don’t know what alpha values you have, but if it’s only 0 and 1 with no filtering there will be nothing in between to blend. Check the texture format you’re using.
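As an aside, this is what GL_LINEAR does with the alpha channel along one axis: neighbouring texel alphas get interpolated, so even a hard 0/255 mask yields intermediate values at sprite edges. A toy sketch of that interpolation (lerp_alpha is an illustrative helper, not part of GL):

```c
/* Linear interpolation between two texel alpha values, as GL_LINEAR
   filtering does along one axis.  t is the sample position in [0,1]. */
static float lerp_alpha(float a0, float a1, float t)
{
    return a0 + (a1 - a0) * t;
}
```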

Okay, I did everything the way you suggested, but still no change. I disabled alpha test and it didn’t have any effect on the result. I checked that I’m using GL_LINEAR filtering. I draw the sprites last. Still the effect remains wrong.

This is how I create my textures:

glGenTextures(1, &textureptr);
glBindTexture(GL_TEXTURE_2D, textureptr);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
glTexImage2D(GL_TEXTURE_2D, 0, 4, w, h, 0, GL_RGBA, GL_UNSIGNED_BYTE, data);


Is there something wrong with that code?

saezee

Does it do what you expect WITH the alpha test enabled and no blending?

I assumed you have the texture in good shape since you said you wanted to improve on a working alpha test. If the alpha test approach didn’t work this would be critical information in determining if your texture is OK or not.

I’d say your unpack alignment call is redundant; leave it at 4. Your texture is 2^n wide in memory and has 4 bytes per pixel, so any way you look at it you’re aligned.
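For what it’s worth, the row padding that GL_UNPACK_ALIGNMENT controls can be computed by hand. This sketch (row_stride is an illustrative helper, not a GL call) shows why a 4-bytes-per-pixel, power-of-two-wide image is aligned under any setting:

```c
/* Bytes per image row after GL_UNPACK_ALIGNMENT padding.
   width in pixels, bpp bytes per pixel, align one of 1, 2, 4, 8. */
static int row_stride(int width, int bpp, int align)
{
    int raw = width * bpp;                      /* unpadded row size   */
    return (raw + align - 1) / align * align;   /* round up to align   */
}
```

For RGBA data the stride is identical at alignment 1 and 4; only odd-sized rows of narrower formats (e.g. 3-byte RGB) pick up padding.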

Aside from that I think it should work.

This makes me wonder if you have the right behaviour but don’t recognize it. I need a screenshot to go any further with this.

I removed the UNPACK alignment line. Didn’t make any difference.

okay, here are shots of the problem:
http://saez1.tripod.com/glprobs.html

I hope those clear up my problem.

saezee

saezee,

looked at your images. Have you tried using a blend function (GL_ONE, GL_ONE_MINUS_SRC_ALPHA)?

That should give more weight to the trees I guess.

HTH

Jean-Marc.

NO, that is a bad idea. You have the correct blend function; don’t change it. That would make your trees brighter but will certainly do strange-looking things (like glow and look transparent in parts of your particular example).

Your problem is that either you have vertex or material alpha in addition to texture alpha, or you have low alpha values in the texture.

So, you need to do a couple of things.

First, make sure that the polygon without the tree texture is opaque. You can do this by disabling texturing and keeping blending on; the polygon should appear solid. If it doesn’t, you know that’s where your problem is. The other way to test this is to set your texture environment to GL_REPLACE, so that you know only the texture fragment is coming through to the blend.
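To see why the texture environment matters here: with the default GL_MODULATE environment the texture alpha is multiplied by the incoming vertex/material alpha, while GL_REPLACE passes the texture alpha through untouched. A toy sketch of the two behaviours (both helpers are illustrative, not GL calls):

```c
/* GL_MODULATE: texture alpha times incoming fragment alpha. */
static float modulate_alpha(float tex_a, float vert_a)
{
    return tex_a * vert_a;
}

/* GL_REPLACE: texture alpha passes through unchanged. */
static float replace_alpha(float tex_a, float vert_a)
{
    (void)vert_a;   /* incoming alpha is ignored */
    return tex_a;
}
```

So a stray vertex alpha of 0.5 halves every texture alpha under GL_MODULATE but has no effect under GL_REPLACE, which is why the latter isolates the texture for this test.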

The next thing to do is check the alpha values in the tree. You can do this by increasing the second argument of glAlphaFunc(GL_GREATER, 0). Make sure the texture environment is GL_REPLACE for this test, and try a range of values between 0.0 and 1.0 for the second argument. This will tell you whether you have a problem with the alpha channel in the texture.
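The ramp test above amounts to counting which texels pass alpha > ref: if raising the reference value makes the tree vanish almost immediately, the alpha channel is mostly near zero. A toy sketch with a made-up four-texel alpha channel (visible_texels and sample_alpha are illustrative, not GL):

```c
/* A made-up sample alpha channel, raw bytes 0..255. */
static const unsigned char sample_alpha[4] = {0, 10, 128, 255};

/* Count how many texels survive glAlphaFunc(GL_GREATER, ref),
   where ref is normalized to [0,1] as GL expects. */
static int visible_texels(float ref)
{
    int kept = 0;
    for (int i = 0; i < 4; ++i)
        if (sample_alpha[i] / 255.0f > ref)
            ++kept;
    return kept;
}
```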

My bet is the second option. I think what you’ve done (and this is a bit of a guess) is copy the RGB channels of a tree against a black background into the alpha channel, instead of having a separately painted alpha channel. That has given you very low alpha for most of the tree, and looking at your screenshot I think I see a correlation between RGB and alpha. Take the alpha bytes in the texture image before the glTexImage2D call and multiply them by a BIG number.
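Scaling the alpha bytes before upload could look something like this (boost_alpha and scaled_alpha are hypothetical helpers; the clamp keeps the result in byte range):

```c
/* Multiply one alpha byte by `factor`, saturating at 255. */
static unsigned char scaled_alpha(unsigned char a, int factor)
{
    int v = a * factor;
    return (unsigned char)(v > 255 ? 255 : v);
}

/* Boost the alpha channel of an interleaved RGBA image in place.
   rgba holds n_pixels pixels; alpha is every 4th byte, offset 3. */
static void boost_alpha(unsigned char *rgba, int n_pixels, int factor)
{
    for (int i = 0; i < n_pixels; ++i)
        rgba[4 * i + 3] = scaled_alpha(rgba[4 * i + 3], factor);
}
```

Run this on the pixel buffer just before the glTexImage2D call; if the trees suddenly look right, the alpha channel was the culprit.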

[This message has been edited by dorbie (edited 06-05-2002).]

I bow my head.
Very nice homepage, Angus!
And excuse me for being off-topic.

Here’s part of some code for loading a tree with alpha test.

Note: the tree texture has an alpha channel (it’s a TGA file).

// **********************************************
int InitGL(GLvoid) // All the OpenGL settings; this is where we initialise
{                  // every parameter OpenGL needs

// Generate the textures from the file named in the function
if(!Load_TGA("Data/Textures_TGA.txt"))
	{
		// An error???
		KillGLWindow();	
		MessageBox(NULL, "TGA initialisation: ERROR!!!", "Error", MB_ICONERROR); 
		exit (0);
	}

// Enable texturing
glEnable(GL_TEXTURE_2D);

// Blue background colour
glClearColor(0.0, 0.5, 1.0, 1.0);	

// Reference value for the alpha-channel test
glAlphaFunc(GL_GREATER, 0.3f);

// Enable alpha testing
glEnable(GL_ALPHA_TEST);

return true;	// No errors, everything is OK

} // End of function

// **********************************************
int DrawGLScene(GLvoid) // Everything we draw goes here
{

glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);	// Clear the screen and the depth buffer
glLoadIdentity();									// Reset all three axes

// Position the image
glTranslatef(0.0f, 0.0f, -3.0f);

// Rotate it around the Y axis
glRotatef(Angle, 0.0f, 1.0f, 0.0f);

// Draw the textured image
glBindTexture  ( GL_TEXTURE_2D, Texture_TGA[0].texID );
glBegin(GL_QUADS);
	glTexCoord2f(1.0f, 1.0f); glVertex3f( 0.75f,  0.75f, 0.0f);	// Top right
	glTexCoord2f(1.0f, 0.0f); glVertex3f( 0.75f, -0.75f, 0.0f); // Bottom right
	glTexCoord2f(0.0f, 0.0f); glVertex3f(-0.75f, -0.75f, 0.0f); // Bottom left
	glTexCoord2f(0.0f, 1.0f); glVertex3f(-0.75f,  0.75f, 0.0f); // Top left
glEnd();

// Increase the rotation angle
Angle += 1.0f;

// When the rotation angle reaches 360°, wrap it back to 0
if ( Angle >= 360.0f ) Angle -= 360.0f;

return true;	// No errors, everything is OK.

}

Okay, now it works. The problem was in my texture-file loader, which didn’t read the image’s R, G, B and alpha masks but reset them instead. Doh! Well, it seems that taking a short break from the computer can be good at times.
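For anyone hitting the same bug: honouring a file’s channel masks instead of resetting them amounts to masking and shifting each 32-bit pixel. A hypothetical sketch (channel is an illustrative helper; the actual mask and shift values depend on the file header):

```c
/* Extract one 8-bit channel from a packed 32-bit pixel using the
   file's channel mask and its bit offset, e.g. 0xFF000000 / 24. */
static unsigned char channel(unsigned int pixel, unsigned int mask, int shift)
{
    return (unsigned char)((pixel & mask) >> shift);
}
```

A loader would call this once per channel with the masks read from the image header, rather than assuming (or zeroing) a fixed layout.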

Thanks everyone!

saezee