Some new blending problems...

Hi!
I’m writing some GL code and am having problems getting alpha blending to work with my textures. I’ve read through the forum, but nothing mentioned there solved the problem.
So I stripped the code down to a minimal example, but I still get the problem. Here is what I’m doing:

<CODE>

#include <Windows.h>
#include <GL/gl.h>
#include <SDL.h>

SDL_Surface *g_srfcScreen;
GLuint glintTexture[1];

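/* 8x8 test texture, one texel per 32-bit word (unsigned long on Win32);
   the high byte is meant as alpha: first column transparent, rest opaque */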
unsigned long Test[] = { 0x00000000,0xFF000000,0xFF000000,0xFF000000,0xFF000000,0xFFFF0000,0xFF00FF00,0xFF000000,
0x00000000,0xFF000022,0xFF000022,0xFF000022,0xFF000022,0xFFFF0022,0xFF00FF22,0xFF000022,
0x00000000,0xFF000022,0xFF000044,0xFF000044,0xFF000044,0xFFFF0044,0xFF00FF44,0xFF000044,
0x00000000,0xFF000022,0xFF000044,0xFF000066,0xFF000066,0xFFFF0066,0xFF00FF66,0xFF000066,
0x00000000,0xFF000022,0xFF000044,0xFF000066,0xFF000088,0xFFFF0088,0xFF00FF88,0xFF000088,
0x00000000,0xFF000022,0xFF000044,0xFF000066,0xFF000088,0xFFFF00AA,0xFF00FFAA,0xFF0000AA,
0x00000000,0xFF000022,0xFF000044,0xFF000066,0xFF000088,0xFFFF00AA,0xFF00FFCC,0xFF0000CC,
0x00000000,0xFF000022,0xFF000044,0xFF000066,0xFF000088,0xFFFF00AA,0xFF00FFCC,0xFF0000FF,
};

int main(int argc,char *argv[])
{
SDL_Event sdlEvent;
Uint32 nStart;

SDL_Init(SDL_INIT_VIDEO|SDL_INIT_AUDIO);
SDL_GL_SetAttribute(SDL_GL_RED_SIZE,8);
SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE,8);
SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE,8);
SDL_GL_SetAttribute(SDL_GL_ALPHA_SIZE,8);
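/* note: many cards cannot provide a 32 bit depth buffer; 16 or 24 is a safer request */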
SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE,32);
SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER,1);
g_srfcScreen=SDL_SetVideoMode(640,480,32,SDL_OPENGL/*|SDL_FULLSCREEN*/);

glPixelStorei(GL_UNPACK_ALIGNMENT,1);
glShadeModel(GL_SMOOTH);
glClearColor(0,0,0,0);
glClearDepth(1.0f);
glViewport(0,0,640,480);
glEnable(GL_DEPTH_TEST);
glEnable(GL_TEXTURE_2D);
glEnable(GL_BLEND);
glDepthFunc(GL_LEQUAL);
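/* standard alpha blending: result = src*srcAlpha + dst*(1-srcAlpha) */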
glBlendFunc(GL_SRC_ALPHA,GL_ONE_MINUS_SRC_ALPHA);

glGenTextures(1,&glintTexture[0]);
glBindTexture(GL_TEXTURE_2D,glintTexture[0]);
glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MIN_FILTER,GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MAG_FILTER,GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_WRAP_S,GL_CLAMP);
glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_WRAP_T,GL_CLAMP);
glTexEnvi(GL_TEXTURE_ENV,GL_TEXTURE_ENV_MODE,GL_MODULATE);

glBindTexture(GL_TEXTURE_2D,glintTexture[0]);
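/* upload the 8x8 RGBA image; GL_UNSIGNED_BYTE takes the texels byte by byte, in memory order */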
glTexImage2D(GL_TEXTURE_2D,0,GL_RGBA,8,8,0,GL_RGBA,GL_UNSIGNED_BYTE,Test);

glDisable(GL_BLEND);

nStart=SDL_GetTicks();
while((SDL_GetTicks()-nStart)<5000)
{
	SDL_PumpEvents();
	glClear(GL_COLOR_BUFFER_BIT|GL_DEPTH_BUFFER_BIT);
	glMatrixMode(GL_PROJECTION);
	glLoadIdentity();
	glOrtho(0,1,1,0,-1,1);

	glMatrixMode(GL_MODELVIEW);

	glDisable(GL_BLEND);
	glDisable(GL_TEXTURE_2D);
	glBegin(GL_QUADS);
		glColor4f(0,0,0,1);	glVertex3f(0,0,0.4);
		glColor4f(1,0,0,1);	glVertex3f(1,0,0.4);
		glColor4f(0,1,0,1);	glVertex3f(1,1,0.4);
		glColor4f(0,0,1,1);	glVertex3f(0,1,0.4);
	glEnd();

	glEnable(GL_TEXTURE_2D);
	glEnable(GL_BLEND);
	glBindTexture(GL_TEXTURE_2D,glintTexture[0]);
	glColor4f(1,1,1,1);
	glBegin(GL_QUADS);
		glTexCoord2f(0,0);	glVertex3f(0.4,0.4,0.5);
		glTexCoord2f(1,0);	glVertex3f(0.6,0.4,0.5);
		glTexCoord2f(1,1);	glVertex3f(0.6,0.6,0.5);
		glTexCoord2f(0,1);	glVertex3f(0.4,0.6,0.5);
	glEnd();

	SDL_GL_SwapBuffers();
	SDL_Delay(1);
}

SDL_Quit();

return (0);

}

</CODE>

To put it briefly: the real code is spread over several procedures, but it is called in this order. I’m initialising a little 8x8 texture from raw memory and drawing it on the screen, with a colored rectangle underneath. All I get is the texture blended with black, no matter where on the screen it is drawn. Any ideas? The blending itself seems to work, because the texture is red all over and appears blended to black…

Thanks in advance,
Andreas Podgurski

I believe the bug is in the line

glTexEnvi(GL_TEXTURE_ENV,GL_TEXTURE_ENV_MODE,GL_BLEND);
(my docs state this to be undefined)

You might want to try GL_MODULATE or GL_DECAL here, depending on what you intend.

GL_BLEND is defined in my header files and seems to work (it looks like subtractive blending if I’m not mistaken). But I agree with Michael, GL_MODULATE is probably the mode you are looking for (blends with srcCol*srcFunc + dstCol*dstFunc).

What I meant was GL_BLEND with a texture of four components (RGBA).
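
For reference, here is roughly how the fixed-function env modes combine the incoming fragment colour with an RGBA texel (a sketch from memory, so check it against the spec):

<CODE>

/* GL_TEXTURE_ENV_MODE with an RGBA texture -- fragment (Cf, Af),
   texel (Ct, At):
   GL_REPLACE:  C = Ct                    A = At
   GL_MODULATE: C = Cf * Ct               A = Af * At
   GL_DECAL:    C = Cf*(1-At) + Ct*At     A = Af
   The framebuffer blend then runs on top of the env result:
   result = srcCol*srcFunc + dstCol*dstFunc */
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);

</CODE>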

Nope, I tried all modes (GL_BLEND, GL_MODULATE and GL_DECAL), but that changed nothing…

Regards,
Andreas Podgurski

Maybe your driver doesn’t support textures that small? (Like 8x8.)

Anyway, can you send me the code so I can play around with it?

A driver must be able to support at least 64x64 textures. 8x8 textures are used more often than you think, e.g. when you generate mipmaps you’re creating textures all the way down to 1x1.
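
e.g. if you specify the levels by hand it looks something like this (just a sketch; mipData is a made-up array with one pre-scaled RGBA image per level):

<CODE>

/* sketch: uploading every mipmap level of a 64x64 texture by hand,
   halving each time down to 1x1 (levels 0..6) */
int level, size;
for (level = 0, size = 64; size >= 1; level++, size /= 2)
    glTexImage2D(GL_TEXTURE_2D, level, GL_RGBA, size, size, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, mipData[level]);

</CODE>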

Well, zed, you’re probably right. Ouch ouch (Michael hits himself)

Hi!
Thanks so far. I’ve replaced the little code fragment with a complete example, which everyone should be able to compile. It would be nice if somebody could have a look at it, because I don’t know why such a simple thing doesn’t work. I thought OpenGL would be easier than DX…
Anyway, the code should draw a colorful background and blend a simple texture over it. In reality, it just blits the texture onto the screen and only the red channel is blended towards black; blue and green aren’t blended at all. Very odd, in my opinion, but maybe I have a fundamental misunderstanding of how OpenGL works…

Thanks in advance,
Andreas Podgurski

P.S.: It is definitely not a driver problem, because the code has been tested on at least four different machines.

Just wanted to tell you that the problem is solved. For some reason, OpenGL was reading the texture in the opposite byte order, ABGR. After changing that, everything worked as expected. This is strange, because three-component textures are read as RGB.
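
In case somebody stumbles over the same thing: my guess is that this comes from the little-endian byte order of x86. GL_UNSIGNED_BYTE reads the texels byte by byte in memory order, and the low byte of an unsigned long is stored first, so a word has to be written as 0xAABBGGRR to arrive as R,G,B,A. A little sketch of the two workarounds:

<CODE>

/* Workaround 1: encode each 32-bit word as 0xAABBGGRR, so the
   little-endian memory layout R,G,B,A matches GL_RGBA/GL_UNSIGNED_BYTE. */
unsigned long texel = 0xFF0000FF;   /* A=FF B=00 G=00 R=FF -> opaque red */

/* Workaround 2: sidestep the word layout and store explicit bytes. */
unsigned char pixel[4];
pixel[0] = 0xFF;   /* R */
pixel[1] = 0x00;   /* G */
pixel[2] = 0x00;   /* B */
pixel[3] = 0xFF;   /* A */

</CODE>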

Thanks for support,
Andreas Podgurski