Color buffer alpha value doesn't get cleared

Hi,

when I do

glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
glClear(GL_COLOR_BUFFER_BIT);

and then read the buffer back with glReadPixels(), I get 0xFF as the alpha value. Each pixel reads back as:

00 00 00 FF … for each pixel.

How can I clear the alpha value to 0x00 as well?

Andreas

Does your buffer have an alpha channel? If it doesn't, the specification states that the alpha value read back should be 1.0. The number of alpha bits can be queried with

glGetIntegerv(GL_ALPHA_BITS, &AlphaBits); //deprecated

or by querying an FBO with

glGetFramebufferAttachmentParameteriv(GL_READ_FRAMEBUFFER,
    GL_COLOR_ATTACHMENT0, GL_FRAMEBUFFER_ATTACHMENT_ALPHA_SIZE, &AlphaSize);
// or, when the default framebuffer is bound:
// glGetFramebufferAttachmentParameteriv(GL_READ_FRAMEBUFFER, GL_BACK_LEFT,
//     GL_FRAMEBUFFER_ATTACHMENT_ALPHA_SIZE, &AlphaSize);

If 0 alpha bits are returned, you need to create a buffer or attachment that actually contains an alpha channel.
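For the FBO case, a minimal sketch of creating a color attachment with an alpha channel might look like the following (assumes an active OpenGL 3.0+ context; the `width`/`height` variables are placeholders). It cannot run headless, so treat it as illustrative only:

```c
/* Sketch: FBO color attachment with an 8-bit alpha channel.
   Assumes an active OpenGL 3.0+ context and GL headers included. */
GLuint fbo, colorTex;
GLsizei width = 640, height = 480;  /* placeholder dimensions */

glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);

glGenTextures(1, &colorTex);
glBindTexture(GL_TEXTURE_2D, colorTex);
/* GL_RGBA8 requests 8 bits per channel, including alpha */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, colorTex, 0);
```

With such an attachment bound, glClear() will clear the alpha channel to the value set by glClearColor().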

Otherwise, have you changed the default color write mask anywhere? Masking out the alpha channel disables all writes to it, including clears:

glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_FALSE);

Thanks a lot for your quick reply!

glGetIntegerv(GL_ALPHA_BITS, &AlphaBits); //deprecated

gives me 0, so my buffer has no alpha channel. I also do not call glColorMask anywhere.

So how do I enable Alpha for my buffer?

Thx, Andreas

You enable alpha through the library you use to create your window and OpenGL context.

For example, if you use GLUT, you have to pass GLUT_ALPHA to glutInitDisplayMode().
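A minimal GLUT setup requesting alpha bitplanes could look like this (a sketch; window size and title are placeholders, and it needs a display to run):

```c
#include <GL/glut.h>

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    /* GLUT_ALPHA requests alpha bitplanes in the color buffer */
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA | GLUT_ALPHA);
    glutInitWindowSize(640, 480);
    glutCreateWindow("alpha test");
    /* ... register display callback, then glutMainLoop(); ... */
    return 0;
}
```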

Thank you very much! I am using SDL, and SDL_GL_SetAttribute() helped!
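For anyone finding this later, a hedged SDL2 sketch of the fix: the alpha size attribute must be set before the OpenGL window is created (it also needs a display, so it is illustrative only):

```c
#include <SDL.h>

int main(void)
{
    SDL_Init(SDL_INIT_VIDEO);
    /* Request 8 alpha bits for the default framebuffer;
       must be set BEFORE SDL_CreateWindow() with SDL_WINDOW_OPENGL. */
    SDL_GL_SetAttribute(SDL_GL_RED_SIZE, 8);
    SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE, 8);
    SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE, 8);
    SDL_GL_SetAttribute(SDL_GL_ALPHA_SIZE, 8);

    SDL_Window *win = SDL_CreateWindow("alpha test",
        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
        640, 480, SDL_WINDOW_OPENGL);
    SDL_GLContext ctx = SDL_GL_CreateContext(win);

    /* ... render, then clean up ... */
    SDL_GL_DeleteContext(ctx);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}
```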

andreas