View Full Version : Bitmap Image as Texture



WT
03-05-2002, 10:41 AM
I am using a bitmap image as a texture to render some graphics objects. However, the other objects seem to acquire the palette of the bitmap image. How do I prevent this discoloration?


Also, I now want to show a series of bitmaps in quick succession using the above technique. How do I do this without having to load every bitmap into an array, which would be very large?


Many Thanks
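One common way to show many frames without preloading them all (my suggestion, not something from this thread) is to keep only a small window of recently used bitmaps in memory, load each frame on demand, and evict the oldest. A minimal sketch of such a cache in Python, with a hypothetical `load_bitmap` standing in for the real file reader:

```python
from collections import OrderedDict

def load_bitmap(index):
    """Hypothetical loader: read frame `index` from disk and return pixel data."""
    return f"pixels-for-frame-{index}"  # placeholder for real decoding

class FrameCache:
    """Hold at most `capacity` decoded frames, loading on demand."""
    def __init__(self, capacity=4):
        self.capacity = capacity
        self.frames = OrderedDict()  # index -> pixel data, oldest first

    def get(self, index):
        if index in self.frames:
            self.frames.move_to_end(index)  # mark as most recently used
        else:
            self.frames[index] = load_bitmap(index)
            if len(self.frames) > self.capacity:
                self.frames.popitem(last=False)  # evict the oldest frame
        return self.frames[index]

cache = FrameCache(capacity=4)
for i in range(10):        # play 10 frames in succession
    pixels = cache.get(i)  # this is where each frame would be uploaded as a texture
print(len(cache.frames))   # -> 4: never more than 4 frames held at once
```

In an OpenGL program the `get` call would feed the pixel data to the same texture object each frame, so only one texture and a handful of decoded bitmaps exist at any time.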

nexusone
03-05-2002, 10:47 AM
Try:
http://nehe.gamedev.net
and
there was a link to a site here on playing an AVI on an object as a texture.

As for the color change, what color mode are you using? 8-bit, 16-bit, or 32-bit?


Originally posted by WT:
I am using a bitmap image as a texture to render some graphics objects. However, the other objects seem to acquire the palette of the bitmap image. How do I prevent this discoloration?


Also, I now want to show a series of bitmaps in quick succession using the above technique. How do I do this without having to load every bitmap into an array, which would be very large?


Many Thanks

WT
03-05-2002, 10:56 AM
How do you determine what graphics mode you are in? I adapted a program from the internet and just about got it working, so I am not sure whether this is 16-bit or 24-bit. There are, however, some lines which look like this:

For X = 0 To bitmapWidth - 1
    For Y = 0 To bitmapHeight - 1
        c = pctbackground.Point(X, Y) ' Returns the colour as a Long.
        ' Unpack the Long into 8-bit red, green and blue components.
        bitmapImage(0, X, bitmapHeight - Y - 1) = c Mod 256                    ' red (low byte)
        bitmapImage(1, X, bitmapHeight - Y - 1) = (c And 65280) / 256          ' green (middle byte)
        bitmapImage(2, X, bitmapHeight - Y - 1) = (c And 16711680) / 256 / 256 ' blue (high byte)
    Next Y
Next X

To me this seems to suggest that it is in 24-bit mode.
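The loop above unpacks each Long colour value into three separate 8-bit channels, which is exactly what a 24-bit RGB texture needs. The same arithmetic in Python, for illustration (the function and variable names are mine, not from the original program):

```python
def unpack_rgb(c):
    """Split a VB-style Long colour (&HBBGGRR order) into 8-bit R, G, B values."""
    red   = c % 256                  # low byte:    c Mod 256
    green = (c & 0xFF00) // 256      # middle byte: (c And 65280) / 256
    blue  = (c & 0xFF0000) // 65536  # high byte:   (c And 16711680) / 256 / 256
    return red, green, blue

# A colour with red=17, green=34, blue=51, packed the way VB's Point returns it:
c = 17 + 34 * 256 + 51 * 65536
print(unpack_rgb(c))  # -> (17, 34, 51)
```

Each channel comes out in the 0-255 range, so the code is reading full 24-bit colour regardless of what mode the display itself is in.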


Originally posted by nexusone:
Try:
http://nehe.gamedev.net
and
there was a link to a site here on playing an AVI on an object as a texture.

As for the color change, what color mode are you using? 8-bit, 16-bit, or 32-bit?