glReadPixels on GeForce2

Hi,

I am doing some glReadPixels on a GeForce2 card, drawing RGBA and reading the pixels back. With the latest driver (6.31) the readback always returns 255 in the alpha channel, while with driver 6.18 the alpha channel is read back correctly.

I was wondering if anyone has any suggestions.

Thanks for all your suggestions.

–x

PS: here is some simple code I made up to test this.

#include <GL/glut.h>

// buffer that receives the readback and is drawn back out
unsigned char buffer[256*256*4];

void draw() {
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

// set up matrices
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0,1,0,1,-1,1);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();

// Setup rendering params
glPolygonMode(GL_FRONT_AND_BACK, GL_FILL);
glShadeModel(GL_FLAT);

// draw a simple quad with an alpha value
glColor4ub(255,0,255,128);
glBegin(GL_QUADS);
	glVertex2f(0,0);
	glVertex2f(0.2,0);
	glVertex2f(0.2,0.2);
	glVertex2f(0,0.2);
glEnd();
glFlush();

// read pixels back
glReadPixels(0,0,256,256,GL_RGBA,GL_UNSIGNED_BYTE,buffer);

// copy the alpha channel into all the color channels
for(int i = 0; i < 256*256; i ++) {
	buffer[i*4+0] = buffer[i*4+3];
	buffer[i*4+1] = buffer[i*4+3];
	buffer[i*4+2] = buffer[i*4+3];
	buffer[i*4+3] = buffer[i*4+3];
}

// draw the read-back pixels in the upper-right quadrant of the window
glRasterPos2d(0.5,0.5);
glDrawPixels(256,256,GL_RGBA,GL_UNSIGNED_BYTE,buffer);

// swap buffers
glutSwapBuffers();

}

int main(int argc, char **argv) {
glutInit(&argc, argv);

glutInitDisplayMode(
	GLUT_RGBA | 
	GLUT_DOUBLE |
	GLUT_DEPTH
);

glutInitWindowSize(512,512);
glutCreateWindow("TestAlphaReadBack");

// set the clear state once the GL context exists
glClearColor(0,0,0,0);
glClearDepth(0);

glutDisplayFunc(&draw);

glutMainLoop();

return 0;

}

Sounds like you’ve discovered a bug fix of mine.

Note that you are requesting a GLUT_RGBA visual. This is a misleading enumerant in GLUT. In fact, GLUT_RGBA is defined as the same thing as GLUT_RGB. Right now, you’re probably getting a visual that has no alpha channel. You can confirm this by doing a glGetIntegerv(GL_ALPHA_BITS, &alphaBits). alphaBits should be either 0 or 8.
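For example, a quick check along these lines (run after the window/context has been created; needs <stdio.h> for the printf) will tell you what you actually got:

GLint alphaBits = 0;
glGetIntegerv(GL_ALPHA_BITS, &alphaBits);
printf("alpha bits in current visual: %d\n", alphaBits); // 0 means no destination alpha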

If the visual has no alpha channel, OpenGL specifies that when you read back alpha, we pretend the buffer's alpha is effectively 1.0 (255 in unsigned-byte terms).

We didn’t have this check in, and so we were always reading back the alpha channel, even when it wasn’t “supposed to exist”.

I fixed this bug on Aug. 24, and the first version to include the fix should be 6.22.

The fix for your app is to specify GLUT_ALPHA as one of the features you’re requesting.
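That is, something like this in your glutInitDisplayMode call:

glutInitDisplayMode(
	GLUT_RGBA |
	GLUT_ALPHA |   // explicitly request destination alpha
	GLUT_DOUBLE |
	GLUT_DEPTH
);

You can then re-run the GL_ALPHA_BITS check above to verify that you got an 8-bit alpha channel.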

  • Matt

Thanks, this works fine!

–x