View Full Version : glDrawPixels() with Alpha Channel

03-03-2009, 11:20 AM
This will be a really newbie post; I'm just looking for general direction/advice.

I want to draw a 32-bit bitmap with a format like ARGB (A8 R8 G8 B8). In glDrawPixels there is no GL_ARGB format, but I do see a GL_UNSIGNED_INT_8_8_8_8 type option.

What's the simplest way to do this? Is there some way to convert from ARGB to RGBA, either in OpenGL or with a specific image editor? (I'm pretty sure DirectX has functions for this type of situation).


03-04-2009, 07:20 AM
The documentation says there is GL_RGBA. So you'll have to change your data in memory from ARGB to RGBA.


03-04-2009, 08:47 AM
The documentation says there is GL_RGBA. So you'll have to change your data in memory from ARGB to RGBA.

I realize there is an RGBA and no ARGB... I was just wondering if OpenGL has built-in functions to convert, since I recall that DirectX can convert from BGR to RGB and back.

Otherwise I was asking if someone happens to know an easy way to convert (perhaps an image editor that can save different forms of 32-bit bmp's???). I can write a program that will go through my bitmaps and swap the channel order, but I have many bitmaps in a folder and this seems like a lot of work.
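The in-place swap is only a few lines of C; a minimal sketch, assuming each pixel is four bytes stored in memory as A,R,G,B (the function name is made up for illustration):

```c
#include <stddef.h>
#include <stdint.h>

/* Rotate each pixel's bytes so A,R,G,B in memory becomes R,G,B,A,
   which glDrawPixels can consume as GL_RGBA / GL_UNSIGNED_BYTE. */
static void argb_to_rgba(uint8_t *px, size_t pixel_count)
{
    for (size_t i = 0; i < pixel_count; ++i, px += 4) {
        uint8_t a = px[0];
        px[0] = px[1];  /* R */
        px[1] = px[2];  /* G */
        px[2] = px[3];  /* B */
        px[3] = a;      /* A */
    }
}
```

Run it once over each bitmap's pixel data after loading (or batch-convert the files and save them back out).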

03-04-2009, 09:20 AM
Do you have a reason to use glDrawPixels and not a textured quad?
Textures can be BGRA and have much better performance.

BTW I had never heard of ARGB.

03-04-2009, 12:06 PM
BTW I had never heard of ARGB

Yeah, I'm realizing that ARGB isn't a standard format, but I acquired these bitmaps in a very strange way (they were custom created by a friend who was doing graphics rendering on an old MC68k processor... long story...).

Anyway, I managed to write a program to convert them so no worries. Thanks for your help.

03-04-2009, 01:25 PM
ARGB (BGRA reversed) seems to be an old DX9 convention: it names the bytes of a little-endian 32-bit word written out in hexadecimal (i.e. 0xAARRGGBB). As Zbuffer pointed out, GL_BGRA is just the ticket for such bitmap layouts.
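Which means that if the data really is packed as 0xAARRGGBB words, no conversion pass should be needed at all: on a little-endian machine those bytes sit in memory as B,G,R,A, which is exactly the layout GL_BGRA names. A quick sketch of the idea (pixel values are arbitrary; the helper is just for illustration):

```c
#include <stdint.h>
#include <string.h>

/* Copy one 0xAARRGGBB word into its raw memory bytes.  On a
   little-endian CPU the result is {B, G, R, A} -- the layout
   GL_BGRA describes -- so the whole buffer can be handed to
   glDrawPixels(w, h, GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV, data)
   with no swapping (the packed type needs OpenGL 1.2 or later). */
static void argb_word_bytes(uint32_t pixel, uint8_t out[4])
{
    memcpy(out, &pixel, sizeof pixel);
}
```

On a big-endian machine (like that MC68k) the same word lands in memory as A,R,G,B instead, so plain GL_UNSIGNED_INT_8_8_8_8 would be the matching packed type there.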