
View Full Version : SDL_Image + PNG + Alpha + OGL = Argh!



UncleD
03-19-2005, 05:00 AM
Hi there! I'm rather new to GL, so please don't flame me if I ask something that has been asked before (although I did search the whole place here ;) )

Okay - now my question :)

I'm using SDL & SDL_image for all initialization and texture loading. The problem is that GL seems to ignore the alpha channel of any loaded texture - I'm usually using PNGs with an alpha channel. The image is displayed correctly except for the transparency (even with GL_BLEND enabled).

Here's my code for loading the Image:

SDL_Surface *Tmp, *conv;
Tmp = IMG_Load(File); //load to temporary surface

conv = SDL_CreateRGBSurface(SDL_SWSURFACE, Tmp->w, Tmp->h, 32,
#if SDL_BYTEORDER == SDL_BIG_ENDIAN
0xff000000, 0x00ff0000, 0x0000ff00, 0x000000ff);
#else
0x000000ff, 0x0000ff00, 0x00ff0000, 0xff000000);
#endif
//do some conversion of the byteorder for gl

SDL_BlitSurface(Tmp, 0, conv, 0);

TextureNode *myNode;
myNode = new TextureNode; //allocate memory for the texture :)

myNode->TX = conv->w; //store file dimensions
myNode->TY = conv->h;

glGenTextures(1, &myNode->Data);
glBindTexture(GL_TEXTURE_2D, myNode->Data);

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glPixelStorei(GL_UNPACK_ROW_LENGTH, conv->pitch / conv->format->BytesPerPixel);

glTexImage2D(GL_TEXTURE_2D, 0, 3, conv->w, conv->h, 0, GL_RGBA, GL_UNSIGNED_BYTE, conv->pixels);

glPixelStorei(GL_UNPACK_ROW_LENGTH, 0);


SDL_FreeSurface(Tmp); //delete temp mem.
SDL_FreeSurface(conv);

I guess the problem is somewhere in this line:
glTexImage2D(GL_TEXTURE_2D, 0, 3, conv->w, conv->h, 0, GL_RGBA, GL_UNSIGNED_BYTE, conv->pixels);

As far as I understand glTexImage2D, I need to specify the format of the pixel data for the texture (with 3 being RGB, for example). The problem is that when I put 4 (instead of 3), the image isn't displayed at all. Am I missing some texture env's?

I use the following setup before drawing:

glDepthFunc(GL_LESS);
glEnable(GL_DEPTH_TEST);
glEnable(GL_TEXTURE_2D);
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);

glShadeModel(GL_SMOOTH);
glMatrixMode(GL_PROJECTION);
glMatrixMode(GL_MODELVIEW);

This happens just before drawing:
glEnable(GL_BLEND); glBlendFunc(GL_SRC_ALPHA,GL_ONE_MINUS_SRC_ALPHA);
glDisable(GL_DEPTH_TEST);
glEnable(GL_ALPHA_TEST);
glAlphaFunc(GL_GREATER,0.0f);

and yet it doesn't work :(

Any help would be appreciated :)

Cheers Phil

Overmind
03-19-2005, 06:08 AM
The third parameter of glTexImage2D is the internal format. There should be an enumeration there, probably GL_RGBA in your case, or one of its variations.

The use of 3 or 4 is only for backwards compatibility and should not be used in new applications. Either way, 3 is wrong because your texture has 4 components...
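To illustrate the point about matching the internal format to the component count, here is a minimal standalone sketch (the helper name and struct are mine, not from the thread; the enum values are the standard OpenGL ones, defined locally so the snippet compiles without gl.h):

```cpp
#include <cassert>
#include <cstdint>

// Standard OpenGL enum values, defined locally so this compiles without gl.h.
const uint32_t GL_RGB  = 0x1907;
const uint32_t GL_RGBA = 0x1908;

struct TexFormat { uint32_t internalFormat; uint32_t pixelFormat; };

// Pick a consistent (internalFormat, format) pair for glTexImage2D from the
// surface's bytes per pixel. Mismatching the two is a classic source of
// missing alpha: an internal format of GL_RGB (or the legacy "3") tells GL
// to store only three components, even when the uploaded data is GL_RGBA.
TexFormat pickFormat(int bytesPerPixel) {
    if (bytesPerPixel == 4) return { GL_RGBA, GL_RGBA };
    return { GL_RGB, GL_RGB };
}
```

With a 32-bit surface like conv above, both fields would be GL_RGBA, so the upload becomes glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0, GL_RGBA, GL_UNSIGNED_BYTE, pixels).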

UncleD
03-19-2005, 06:34 AM
Strangely enough - if I put 4 or GL_RGBA there, there's no image.

dvm
03-20-2005, 01:12 AM
Are you sure your texture dimensions are a power of 2?
Also, check the data type of the conv->pixels member - is it unsigned char? Problems like this have usually come down to the data type for me.
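On older hardware without non-power-of-two texture support, glTexImage2D quietly produces an incomplete texture for sizes like 640x480, which can look exactly like "no image at all". A quick standalone check (the helper name is mine):

```cpp
#include <cassert>

// True iff n is a nonzero power of two: a power of two has exactly one bit
// set, so clearing the lowest set bit with n & (n - 1) leaves zero.
bool isPowerOfTwo(unsigned n) {
    return n != 0 && (n & (n - 1)) == 0;
}
```

Checking conv->w and conv->h with this before the upload rules the issue out in one line.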

CWiC
03-21-2005, 12:59 AM
What if you pass 4 as the argument to glTexImage2D instead of 3 and disable alpha test and blending? Is your image still invisible, or does it just show up without the transparency?

03-21-2005, 05:16 AM
Got it to work. After all, it had something to do with the messing around I did with the byte order at the beginning.
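For readers hitting the same wall: the byte-order fix alluded to here usually boils down to making the channel order of the pixel buffer match what glTexImage2D is told. A standalone sketch of the kind of swizzle involved (hypothetical helper, no SDL dependency; whether you need it at all depends on your surface's channel masks and your platform's endianness):

```cpp
#include <algorithm>
#include <cassert>
#include <cstdint>
#include <vector>

// Swap the R and B channels of a tightly packed 4-bytes-per-pixel buffer,
// turning BGRA data into RGBA (or vice versa) before a GL_RGBA upload.
void swapRedBlue(std::vector<uint8_t>& pixels) {
    for (size_t i = 0; i + 3 < pixels.size(); i += 4)
        std::swap(pixels[i], pixels[i + 2]); // alpha (byte 3) is untouched
}
```

Another common culprit with this exact loading pattern in SDL 1.2: SDL_BlitSurface blends using the source's alpha instead of copying it, which can leave the destination fully opaque. Calling SDL_SetAlpha(Tmp, 0, 0) before the blit disables SDL_SRCALPHA so the alpha channel is copied verbatim into conv.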