glColorMask() question

10-08-2006, 09:34 PM
Hi, I have one problem.
In the description of void glClear(GLbitfield mask); I read: "Masking operations, such as glColorMask() and glIndexMask(), are also effective". So I do this:

glClearColor(0,0,0,0.5); // unsigned byte (0,0,0,127) right?
glCopyTexImage2D(GL_TEXTURE_2D,0,GL_RGBA,0,0,WindowWidth,WindowHeight,0); // my window is 1024x768
glGetTexImage(GL_TEXTURE_2D, 0, GL_RGBA, GL_UNSIGNED_BYTE, pTextureData); // it's a rectangular texture, same resolution 1024x768, believe me :)
And then when I look at the alpha component, for example pTextureData[3], it's not equal to 127. Can someone help me, please?
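(For reference, a minimal sketch of the whole sequence, assuming the clear itself happens between setting the clear color and the copy; the glClear call and the texture bind are not shown in the snippet above, and 'tex' is a placeholder texture object:)

glClearColor(0.0f, 0.0f, 0.0f, 0.5f);    // clear alpha of 0.5 -> roughly 127/128 as an unsigned byte
glClear(GL_COLOR_BUFFER_BIT);             // the clear itself; glColorMask()/glIndexMask() apply here
glBindTexture(GL_TEXTURE_2D, tex);        // placeholder texture object
glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 0, 0, WindowWidth, WindowHeight, 0);
glGetTexImage(GL_TEXTURE_2D, 0, GL_RGBA, GL_UNSIGNED_BYTE, pTextureData);  // read back the RGBA bytes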

10-08-2006, 10:32 PM
1. "its not equal to 127" - so what value(s) do you get?
2. when creating your textre use GL_RGBA8 as internal format (I mean the glTexImage2D call). If you just create texture with GL_RGBA (or with 4 components) then you'll get default texture precision and that could be 16-bit.
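A minimal sketch of what that might look like (the texture object is a placeholder and the 1024x768 size is taken from the post above; note that glCopyTexImage2D also takes an internalformat argument where GL_RGBA8 can be passed):

GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
// Sized internal format GL_RGBA8 asks for 8 bits per channel instead of the driver's default precision.
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 1024, 768, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
// Or, since the texture is filled from the framebuffer anyway:
glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 0, 0, 1024, 768, 0);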

10-09-2006, 12:12 AM
All your rendering to alpha will only have an effect if you have selected a pixelformat with alpha bits.
Check glGetIntegerv(GL_ALPHA_BITS, &alphaBits). If it returns zero, all readbacks of alpha should return 1.0. Change your pixelformat selection in that case.
Mind that GLUT, for some unknown reason, defines GLUT_RGBA to be the same as GLUT_RGB, and you need to add GLUT_ALPHA to really get destination alpha planes.
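A minimal sketch of both points, assuming GLUT is used for the window setup (the exact display-mode flags are just an example):

// Request destination alpha planes explicitly; GLUT_RGBA alone does not add them.
glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_ALPHA | GLUT_DEPTH);

// After the context exists, verify that the pixel format really has alpha bits.
GLint alphaBits = 0;
glGetIntegerv(GL_ALPHA_BITS, &alphaBits);
printf("destination alpha bits: %d\n", alphaBits);   // 0 means alpha readbacks will just return 1.0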

10-15-2006, 02:46 AM
GL_RGBA8 yes thx