glColorMask() question

Hi, I have a problem.
In the description of void glClear(GLbitfield mask); I read: “Masking operations, such as glColorMask() and glIndexMask(), are also effective”. So I do this:

glColorMask(GL_FALSE,GL_FALSE,GL_FALSE,GL_TRUE);
glClearColor(0,0,0,0.5); // unsigned byte (0,0,0,127) right?
glClear(GL_COLOR_BUFFER_BIT);
glColorMask(GL_TRUE,GL_TRUE,GL_TRUE,GL_TRUE);
glCopyTexImage2D(GL_TEXTURE_2D,0,GL_RGBA,0,0,WindowWidth,WindowHeight,0); // my window is 1024x768
glGetTexImage(GL_TEXTURE_2D, 0, GL_RGBA, GL_UNSIGNED_BYTE, pTextureData); // it's a rectangular texture, same resolution 1024x768, believe me :)

Then I look at the alpha component, and, for example, pTextureData[3] is not equal to 127. Can anyone help?

  1. “it's not equal to 127”: so what value(s) do you actually get?
  2. When creating your texture, use GL_RGBA8 as the internal format (I mean in the glTexImage2D call). If you just create the texture with GL_RGBA (or with 4 components), you'll get the default texture precision, and that could be 16-bit.
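To make point 2 concrete, here is a sketch of specifying the texture with an explicit 8-bit-per-channel internal format before copying the framebuffer into it; WindowWidth and WindowHeight are the names from the original post, and the texture id is a hypothetical local:

```c
/* Sketch: allocate the texture with an explicit GL_RGBA8 internal
   format, then copy the framebuffer contents into it. */
GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8,      /* GL_RGBA8, not GL_RGBA or 4 */
             WindowWidth, WindowHeight, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL); /* no initial data needed */
/* glCopyTexSubImage2D keeps the GL_RGBA8 allocation intact: */
glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0,
                    0, 0, WindowWidth, WindowHeight);
```

Alternatively, keep the original glCopyTexImage2D call but pass GL_RGBA8 as its internalformat argument; either way the point is that an unsized GL_RGBA leaves the per-channel precision up to the driver.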

All your rendering to alpha will only have an effect if you have selected a pixel format with alpha bits.
Check glGetIntegerv(GL_ALPHA_BITS, &alphaBits). If it returns zero, all readbacks of alpha will return 1.0; change your pixel format selection in that case.
Mind that GLUT, for some unknown reason, defines
#define GLUT_RGBA GLUT_RGB
so you need to add GLUT_ALPHA to really get destination alpha planes.
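A minimal sketch of both steps with GLUT (assuming the poster is using GLUT at all, which the original thread doesn't confirm; window size taken from the post):

```c
#include <GL/glut.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    /* GLUT_RGBA alone is just GLUT_RGB; adding GLUT_ALPHA requests
       destination alpha planes in the pixel format. */
    glutInitDisplayMode(GLUT_RGBA | GLUT_ALPHA | GLUT_DOUBLE);
    glutInitWindowSize(1024, 768);
    glutCreateWindow("alpha check");

    GLint alphaBits = 0;
    glGetIntegerv(GL_ALPHA_BITS, &alphaBits);
    printf("destination alpha bits: %d\n", alphaBits);
    /* If this prints 0, every alpha readback will return 1.0. */
    return 0;
}
```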

GL_RGBA8, yes, that was it. Thanks!