Texturing problem with transparency (unusual problem)

Hi

I have read numerous threads about other people having similar problems and have tried all the solutions, but none seem to work. I am trying to draw a texture to the screen with one of the texture's colors fully transparent. It works perfectly if I draw the image using glDrawPixels() with GL_RGBA, but when I try to do the same thing with a texture it doesn't. I have passed GL_RGBA as a parameter in the glTexImage2D() call. I have the following setup:

glEnable(GL_TEXTURE_2D);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA,GL_ONE_MINUS_SRC_ALPHA);

I have no idea what the problem is! The alpha bits are obviously set up correctly because, as I say, it works with glDrawPixels(). The texture appears, but the parts I want to be transparent aren't.

Here is what's written:

glEnable(GL_TEXTURE_2D);
glPushMatrix();
glLoadIdentity();
//glTexEnvf(GL_TEXTURE_ENV,GL_TEXTURE_ENV_MODE,GL_REPLACE);
gluOrtho2D(0.0,700.0,0.0,550.0);
glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST);
glTranslatef(200.0f,200.0f,0.0f);
glRotatef(Rot,0.0,0.0,1.0);
glTranslatef(0.0f,0.0f,0.0f);
glPolygonMode( GL_FRONT, GL_FILL );
glBegin(GL_QUADS);
// Front face
glTexCoord2f(0.0f, 0.0f); glVertex3f(-32,-32, 0); // Bottom left of the texture and quad
glTexCoord2f(1.0f,0.0f); glVertex3f( 32,-32, 0); // Bottom right of the texture and quad
glTexCoord2f(1.0f,1.0f); glVertex3f( 32,32, 0); // Top right of the texture and quad
glTexCoord2f(0.0f, 1.0f); glVertex3f(-32,32, 0); // Top left of the texture and quad
glEnd();
glPopMatrix();
glDisable(GL_TEXTURE_2D);
glRasterPos2i(Rb1X-Rb1HWidth,Rb1Y-Rb1HHeight);
glPixelStorei(GL_UNPACK_ALIGNMENT,4);
glDrawPixels(Rb1Width,Rb1Height,GL_RGBA,GL_UNSIGNED_BYTE,Pixels);

Please Help!!

Thanks…

Rohland

Hi again

It seems I have fixed the problem. It was a very stupid mistake and didn't really warrant a post.

glTexImage2D(GL_TEXTURE_2D, 0, 4, RobotImage->GetWidth(), RobotImage->GetHeight(), 0, GL_RGBA, GL_UNSIGNED_BYTE, Pixels);

The mistake was that the GLint components value (the third parameter) was set to 3 instead of 4, as above.

Hope this helps someone else who might be having a similar problem (probably no one :stuck_out_tongue:).

Rohland

The third parameter isn't called 'components' anymore; it's 'internalFormat', and it specifies both the kind of image and a precision hint.

Use one of the following instead of 1, 2, 3, or 4:

GL_ALPHA, GL_ALPHA4, GL_ALPHA8, GL_ALPHA12, GL_ALPHA16, GL_LUMINANCE, GL_LUMINANCE4, GL_LUMINANCE8, GL_LUMINANCE12, GL_LUMINANCE16, GL_LUMINANCE_ALPHA, GL_LUMINANCE4_ALPHA4, GL_LUMINANCE6_ALPHA2, GL_LUMINANCE8_ALPHA8, GL_LUMINANCE12_ALPHA4, GL_LUMINANCE12_ALPHA12, GL_LUMINANCE16_ALPHA16, GL_INTENSITY, GL_INTENSITY4, GL_INTENSITY8, GL_INTENSITY12, GL_INTENSITY16, GL_RGB, GL_R3_G3_B2, GL_RGB4, GL_RGB5, GL_RGB8, GL_RGB10, GL_RGB12, GL_RGB16, GL_RGBA, GL_RGBA2, GL_RGBA4, GL_RGB5_A1, GL_RGBA8, GL_RGB10_A2, GL_RGBA12, and GL_RGBA16

Otherwise you are going to come back and ask why your textures look 16-bit instead of 32-bit. A bare GL_RGBA (or 4) lets the driver choose the format for you, and it often chooses a 16-bit format to save memory (and I guess it's faster). GL_RGBA8 hints to the driver that you want all 32 bits. As far as I know, both ATI and NVIDIA listen to these hints, and I wouldn't be surprised if other vendors do too.