[sigh] RGBA texture problem

Hello,

I’m going crazy with my alpha texture problem. I’m loading a picture that
has an alpha channel, then creating a texture from it. Here’s the code for
creating the texture:

{
GLuint textureptr;

glGenTextures(1, &textureptr);

glBindTexture(GL_TEXTURE_2D, textureptr);

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, 4, w, h, 0, GL_RGBA, GL_UNSIGNED_BYTE, data);

}

This is the result I get:
http://saez1.tripod.com/render.jpg

(btw, if the link doesn’t work, try writing it manually)

The texture on the left is created with above code, the texture on the
right is the same texture, but created without alpha:

glTexImage2D(GL_TEXTURE_2D, 0, 3, w, h, 0, GL_RGB, GL_UNSIGNED_BYTE, data);

See the huge difference? I wonder what I’m doing wrong. I’ve tried
different image loaders written by other people, and they all give the same
result, so my image loader isn’t to blame for this. I searched this forum
for alpha textures and read tons of threads, but none of them provided a
solution to my problem.

I’m using orthographic mode because I’m working on a 2D project.

Can anyone help? Any ideas what I’m doing wrong?

saezzz

The picture doesn’t work. Not at all.

However, if you’re making two texture objects with the code you posted, I’ll bet that one of them looks right and the other one looks funny.
The way you call TexImage2D, it reads three bytes per pixel for RGB and four bytes per pixel for RGBA.

Which of the two looks right in your screenshot (which I can’t see)?
If it’s the RGBA texture, you can fix the RGB one by supplying GL_RGBA instead of GL_RGB (but keep the 3!).

The other way round points to a problem with your image loader, or with your file.

How does your loader store the image data in memory?

Ah, I managed to view your screenshot. It looks fine (even though I think it shouldn’t work the way you posted it).

What exactly do you want to do with your alpha channel?
If it’s basic transparency you’re after, try
glBlendFunc(GL_SRC_ALPHA,GL_ONE_MINUS_SRC_ALPHA);
glEnable(GL_BLEND);
and see if that’s closer to what you expected.

Weird that the picture doesn’t work. I tested it and it loads. Oh well… anyway, the problem is with the RGBA texture. It does blend “correctly” (0 alpha is transparent, and everything in between works as it should), but the texture seems to lose a lot of detail, as if you had converted a truecolor picture to 64 or fewer colors. AGH!

I’m using SDL and SDL_image libraries for loading images. Here’s the code for loading a picture:

void Copybuffer(SDL_Surface *image, Uint8 **buffer)
{
	// Allocate a tightly packed w * h * 4 (RGBA) destination buffer.
	Uint8 *buffertemp = new Uint8[image->w * image->h * 4];
	Uint8 *imagepixels = (Uint8*)image->pixels;

	int j = 0;

	// Walk the surface row by row; pitch may be larger than
	// w * BytesPerPixel if the rows are padded.
	for(int i = 0; i < image->h * image->pitch; i += image->pitch)
	{
		Uint8 *imagerow = imagepixels + i;
		for(Uint8 *imagepixel = imagerow;
			imagepixel < imagerow + image->w * image->format->BytesPerPixel;
			imagepixel += image->format->BytesPerPixel)
		{
			// Let SDL decode the pixel into 8-bit RGBA components.
			Uint8 red, green, blue, alpha;
			Uint32 *currentpixel = (Uint32*)imagepixel;
			SDL_GetRGBA(*currentpixel, image->format, &red, &green, &blue, &alpha);

			buffertemp[j++] = red;
			buffertemp[j++] = green;
			buffertemp[j++] = blue;
			buffertemp[j++] = alpha;
		}
	}
	*buffer = buffertemp;
}

I don’t know what the problem is.

saezzz

[This message has been edited by saezee (edited 10-07-2002).]

Btw, that last parameter of SDL_GetRGBA should be &alpha instead of just alpha. I tried to edit the post, but for some reason it doesn’t apply the changes.

And about the blendfunc, I’m using it. But even if I disable it, I get the crappy result shown in the picture.

[ edit edit edit ]
Laughing my ass off… so I can’t write alpha with a “&” character in front of it…

saezzzz

[This message has been edited by saezee (edited 10-07-2002).]

Don’t use just 3 or 4 as the internal format.

Use GL_RGBA8 or GL_RGB8 to hint to the driver that you want 32-bit and 24-bit textures; otherwise the driver can choose for you… (and with plain GL_RGBA it might choose to downsample to 4 bits per component to save memory)
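In other words, a sketch of the corrected calls (assuming the same w, h, and data as in the original post; this fragment needs a current GL context, so it isn’t standalone):

```cpp
// Request an explicit 8-bits-per-component internal format instead of the
// bare "4"/"3". GL_RGBA8 tells the driver you want 32-bit RGBA storage,
// so it won't silently downsample to something like 4 bits per component.
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, data);

// Likewise for the alpha-less version:
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, w, h, 0,
             GL_RGB, GL_UNSIGNED_BYTE, data);
```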

Thank you, Mazy! Can I kiss you? I can’t believe it… I’ve been struggling with this problem for a long, long time, and the solution was THAT simple, and partly a driver issue! Well, I guess that’s how it usually is… the solution is simple.

Thank you Mazy!!! Thank you to zeckensack too! Always nice to see helpful people.

saezee

Thanx Thanx Thanx…
it was the same problem for me…
(see my topic about 16-bit graphics)
Thanx