View Full Version : Rendering into DIBSection with alpha channel

07-03-2009, 12:57 AM
I have some code written to do an OpenGL render into a DIBSection, and it is working fine for the RGB channels, but it is not generating the alpha pixels: they all get set to zero on any pixel that is drawn, regardless of alpha, so when I save to a PNG file, it is completely black. Other file types work fine (JPG, GIF, BMP), and if I programmatically set all the alpha pixels to 255 just before saving, the PNG also works.

I've tried clearing the alpha bits before the OpenGL rendering: any pixels that are not touched by OpenGL retain their original alpha, but any pixel rendered by OpenGL (regardless of its alpha value) has its DIBSection alpha set to zero.

I'm guessing it's got to be something in the pixel format descriptor for the DIB, right? Here's the code for setting up the DIB prior to rendering...

// Create the memory DC for the OpenGL context
m_hDC = ::CreateCompatibleDC(NULL);

// Create the DIB to render into
memset(&m_bmi, 0, sizeof(BITMAPINFO));
m_bmi.bmiHeader.biSize = sizeof(BITMAPINFOHEADER);
m_bmi.bmiHeader.biWidth = m_iWidth;
m_bmi.bmiHeader.biHeight = m_iHeight;
m_bmi.bmiHeader.biPlanes = 1;
m_bmi.bmiHeader.biBitCount = 32;
m_bmi.bmiHeader.biCompression = BI_RGB;
m_bmi.bmiHeader.biSizeImage = m_iWidth * m_iHeight * 4;

m_hBitmap = ::CreateDIBSection(m_hDC/*hWindowDC*/, &m_bmi, DIB_RGB_COLORS, &m_bmpBits, NULL, (DWORD)0);

::SelectObject(m_hDC, m_hBitmap);

// direct method of pixel format setting
// Select a software-rendering pixel format that supports drawing to a bitmap
// (Must be 32-bit color because the bitmap is 32-bit color)
PIXELFORMATDESCRIPTOR pfd = {
    sizeof(PIXELFORMATDESCRIPTOR),  /* size of this pfd */
    1,                              /* version number */
    PFD_DRAW_TO_BITMAP |            /* render into a DIB section... */
    PFD_SUPPORT_OPENGL |            /* ...with OpenGL... */
    PFD_SUPPORT_GDI,                /* ...and allow GDI drawing too */
    PFD_TYPE_RGBA,                  /* RGBA pixel type */
    32,                             /* 32-bit color depth */
    8, 24, 8, 16, 8, 8,             /* color bits/shifts (ignored by ChoosePixelFormat) */
    8, 0,                           /* 8 alpha bits, alpha shift */
    0,                              /* no accumulation buffer */
    0, 0, 0, 0,                     /* accum bits (ignored) */
    16,                             /* 16-bit depth buffer */
    0,                              /* no stencil buffer */
    0,                              /* no auxiliary buffers */
    PFD_MAIN_PLANE,                 /* main layer */
    0,                              /* reserved */
    0, 0, 0                         /* no layer, visible, damage masks */
};

int SelectedPixelFormat;
BOOL retVal;

SelectedPixelFormat = ::ChoosePixelFormat(m_hDC, &pfd);
if (SelectedPixelFormat == 0) {
    AfxMessageBox("Failed to find acceptable pixel format.", MB_ICONERROR | MB_OK);
    return; // bail out (adjust to your function's return type)
}

// just to double check, read out the pfd and check it in the debugger...
DescribePixelFormat(m_hDC, SelectedPixelFormat, sizeof(PIXELFORMATDESCRIPTOR), &pfd);

retVal = SetPixelFormat(m_hDC, SelectedPixelFormat, &pfd);
if (retVal != TRUE) {
    AfxMessageBox("Failed to set pixel format.", MB_ICONERROR | MB_OK);
    return; // bail out (adjust to your function's return type)
}

// Initialize the OpenGL context
m_hGLRC = wglCreateContext(m_hDC);

m_hOldDC = wglGetCurrentDC();
m_hOldGLRC = wglGetCurrentContext();
wglMakeCurrent(m_hDC, m_hGLRC);

// Setup the OGL viewport
glViewport(0, 0, m_iWidth, m_iHeight);
gluOrtho2D(0, m_iWidth, m_iHeight, 0);

After that, I call my existing OGL DrawScene() function. Everything seems to work fine except the alpha channel: immediately after calling DrawScene(), any pixel touched by OpenGL has alpha = 0. If I overwrite all the alpha pixels with 255 before saving, the saved PNG looks perfect; if I overwrite with 128, I get a half-transparent image. I just need OpenGL's alpha channel to come out into the DIB correctly!
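
For reference, the force-alpha workaround I describe above can be sketched like this (a minimal sketch; it assumes the DIB bits are tightly packed 32-bit BGRA, as created by the `CreateDIBSection` code above):

```cpp
#include <cstddef>
#include <cstdint>

// Overwrite the alpha byte of every pixel in a tightly packed
// 32-bit BGRA buffer. In BGRA order, alpha is byte 3 of each pixel.
void ForceAlpha(std::uint8_t* bits, std::size_t width, std::size_t height,
                std::uint8_t alpha)
{
    for (std::size_t i = 0; i < width * height; ++i)
        bits[i * 4 + 3] = alpha;  // byte 3 = alpha in BGRA layout
}
```

Calling something like `ForceAlpha(static_cast<std::uint8_t*>(m_bmpBits), m_iWidth, m_iHeight, 255)` just before saving is what makes the PNG come out visible.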

Any ideas?

Simon Mihevc
07-03-2009, 07:27 AM
DIB-related functions don't deal with alpha channels as nicely as you would expect them to. For example, GetDIBits ignores the alpha channel if you specify biBitCount as 32; as MSDN states (http://msdn.microsoft.com/en-us/library/dd183376(VS.85).aspx) for BITMAPINFOHEADER's biBitCount field:

Each DWORD in the bitmap array represents the relative intensities of blue, green, and red, respectively, for a pixel. The high byte in each DWORD is not used.
Although I don't like it, I have to manually copy the alpha channel to convert a bitmap into a texture. I don't know of any other way around it. My advice would be to avoid GDI altogether if you can.
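
The manual copy looks roughly like this (a sketch under the assumption that you have the frame's alpha in a straight RGBA buffer, e.g. one filled by `glReadPixels(0, 0, w, h, GL_RGBA, GL_UNSIGNED_BYTE, rgbaSrc)`, and that the DIB bits are tightly packed BGRA):

```cpp
#include <cstddef>
#include <cstdint>

// Copy the alpha bytes of an RGBA source buffer (e.g. filled by
// glReadPixels) into the alpha bytes of a BGRA destination (DIB bits).
// Both buffers are assumed tightly packed, 4 bytes per pixel; alpha
// happens to be byte 3 of each pixel in both layouts.
void CopyAlpha(const std::uint8_t* rgbaSrc, std::uint8_t* bgraDst,
               std::size_t pixelCount)
{
    for (std::size_t i = 0; i < pixelCount; ++i)
        bgraDst[i * 4 + 3] = rgbaSrc[i * 4 + 3];
}
```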

07-05-2009, 06:53 AM
So, what is the alternative? Essentially, I need to render into a memory bitmap, with RGB+alpha channel. So long as I have access to the pixel data at the end of the render, I can copy the bit planes around and massage them as required.

GDI+ seems to handle the alpha channel fine for saving and loading - if I can get OpenGL to render the alpha pixels into memory somewhere, then I can give that data to the DIBSection, and GDI+ will handle it fine.

So, how do I render into memory with the alpha channel intact?

Simon Mihevc
07-05-2009, 12:08 PM
The alternatives are pixel buffer objects and pbuffers. You could also render into a window, copy the contents to a texture, then get the texture pixel data and store it.

However, I have a hunch that the problem is in the part of the code that gets and saves the pixel data; could you paste it? Also, did you check whether alpha gets zeroed after reading the pixel data with GetDIBits (or similar), or via textures (glCopyTexImage2D and then glGetTexImage)? If you haven't tried the second route, you might want to use it to get at the pixel data. Alpha should show up in there.

07-05-2009, 02:42 PM
An OpenGL context on a DIB section is not accelerated, so you end up with the software rasterizer.
GDI+ internally uses DX7, but not D3D7.
If you want to use OpenGL for offscreen rendering, create an OpenGL context on some window, then create an FBO and render to a texture. Then read the result back and build your bitmap from it.
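
For the "read the result back" step, note that `glReadPixels` with GL_RGBA returns tightly packed RGBA bytes, while a 32-bit DIB stores BGRA, so red and blue need swapping before you build the bitmap (a minimal sketch; a GL_BGRA readback, where the driver supports it, avoids this entirely). No row flipping is needed, since both `glReadPixels` and a positive-biHeight DIB are bottom-up:

```cpp
#include <cstddef>
#include <cstdint>

// Convert a tightly packed RGBA buffer (as returned by glReadPixels)
// to the BGRA byte order expected by a 32-bit DIB, preserving alpha.
void RgbaToBgra(const std::uint8_t* rgba, std::uint8_t* bgra,
                std::size_t pixelCount)
{
    for (std::size_t i = 0; i < pixelCount; ++i) {
        bgra[i * 4 + 0] = rgba[i * 4 + 2];  // B
        bgra[i * 4 + 1] = rgba[i * 4 + 1];  // G
        bgra[i * 4 + 2] = rgba[i * 4 + 0];  // R
        bgra[i * 4 + 3] = rgba[i * 4 + 3];  // A
    }
}
```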