always alpha == 0xFF

Why are the alpha values in the framebuffer always 0xFF in double-buffered mode? They are correct in single-buffered mode.

Probably because when you are in double buffered mode you are in fullscreen using 16-bit color, which doesn’t support a destination alpha channel. In single buffered mode, you are probably in a window using 32-bit color, which does support a destination alpha channel.
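One quick way to confirm this from inside the running program is to ask the current context how many alpha bitplanes it actually has; in a 16-bit mode this will report 0, which is why alpha reads back as 0xFF. A minimal sketch (an OpenGL context is assumed to already be current):

#include <GL/gl.h>
#include <stdio.h>

/* Prints the bitplane counts of the current framebuffer.
   If alphaBits is 0, there is no destination alpha channel. */
void printFramebufferBits(void)
{
    GLint redBits, greenBits, blueBits, alphaBits;
    glGetIntegerv(GL_RED_BITS,   &redBits);
    glGetIntegerv(GL_GREEN_BITS, &greenBits);
    glGetIntegerv(GL_BLUE_BITS,  &blueBits);
    glGetIntegerv(GL_ALPHA_BITS, &alphaBits);
    printf("R%d G%d B%d A%d\n", redBits, greenBits, blueBits, alphaBits);
}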

j

Thank you j. I used the same program to examine this problem; the only difference was changing the PIXELFORMATDESCRIPTOR to request double buffering. Doesn't OpenGL support a destination alpha channel in double-buffered mode? (Hardware: GeForce3, Platform: Win2000)

Use DescribePixelFormat and check whether you actually got 32 bits of pixel depth.
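For example, something along these lines could dump the color and alpha bit counts of the format that was actually set on the DC (a rough sketch; hdc and pixelFormat are assumed to come from your existing setup code):

#include <windows.h>
#include <stdio.h>

/* pixelFormat is the index returned by ChoosePixelFormat / GetPixelFormat. */
void checkPixelFormat(HDC hdc, int pixelFormat)
{
    PIXELFORMATDESCRIPTOR pfd;
    DescribePixelFormat(hdc, pixelFormat, sizeof(pfd), &pfd);
    printf("color bits: %d, alpha bits: %d, double buffered: %s\n",
           pfd.cColorBits, pfd.cAlphaBits,
           (pfd.dwFlags & PFD_DOUBLEBUFFER) ? "yes" : "no");
}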

V-man

GeForce3 is an awesome card; you can bet it supports destination alpha in double-buffered mode. You just need to choose the right pixel format descriptor.
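In other words, explicitly ask for alpha bitplanes when you set up the format. A rough sketch of a PIXELFORMATDESCRIPTOR that requests double buffering plus destination alpha (error handling kept minimal; hdc is assumed to be your window's device context):

#include <windows.h>

/* Request a 32-bit, double-buffered RGBA format with 8 destination alpha bits. */
int setupPixelFormat(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd;
    ZeroMemory(&pfd, sizeof(pfd));
    pfd.nSize      = sizeof(pfd);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 32;
    pfd.cAlphaBits = 8;     /* the important part: ask for destination alpha */
    pfd.cDepthBits = 24;
    pfd.iLayerType = PFD_MAIN_PLANE;

    int format = ChoosePixelFormat(hdc, &pfd);
    if (format == 0 || !SetPixelFormat(hdc, format, &pfd))
        return 0;           /* no suitable format, or SetPixelFormat failed */
    return format;
}

After SetPixelFormat succeeds, calling DescribePixelFormat on the returned index (as V-man suggested) tells you whether the driver actually granted the alpha bits you asked for.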