How to add alpha to a BMP image



Calca
10-04-2002, 09:53 AM
First of all, sorry for my poor English.
I'll try to explain my problem.
I opened a 1024x768x16 window and loaded a BMP image using the NeHe tutorials (AUX_RGBImageRec *TextureImage[]).
Now I want to add an alpha channel, because I need parts of my texture to be transparent.
A friend suggested creating a new variable like this:

GLubyte TexBucata[256 * 256 * 4];

(256 is the dimension of the image.)
The advice is to copy TextureImage->data into the new variable, adding the alpha value (when I find a particular colour in the original image I set alpha to 0; otherwise alpha = 1).

In a 16-bit window it doesn't work: the colours look bad (like 256 colours or fewer).
In a 32-bit window the routine works perfectly.
Where is the problem?

nexusone
10-04-2002, 11:46 AM
An easy way to get an alpha channel is to use the TGA file format for your graphics. The GLUT version of NeHe's tutorials has a TGA file loader.




*Aaron*
10-04-2002, 12:40 PM
BMPs can be 32-bit. You could just use the extra 8 bits for alpha. But if you just want to do masking, this is a bit wasteful. Are you sure the colors don't look bad in 16-bit because it's 16-bit color? 16-bit color means only 32 shades each of red and blue, and either 32 or 64 shades of green.

Calca
10-06-2002, 09:45 AM
Originally posted by *Aaron*:
Are you sure the colors don't look bad in 16-bit because it's 16-bit color?

I am sure, because if I use TextureImage->data the colours are OK; if I use my variable, which has the same data plus the alpha byte, the colours are bad.