Dear OpenGL Community,

I have a strange problem with glReadPixels. Maybe you have a suggestion for me.

I was playing around with this function a bit and noticed a strange effect as soon as I display objects that aren't a single solid color. On screen they render fine, but the pixels I read back show strange patterns. To get closer to the source of the problem I wrote a quick-and-dirty demo program that just displays one single quad and then uses glReadPixels to read all pixels of the viewport. Those I write to a TGA file.

As soon as I apply a gradient to that quad I get the following effect. On top is a screenshot of the window; at the bottom are the read pixels. It's vertically mirrored since I was too lazy to change the origin before writing the TGA.

If I use a solid color for the quad, all works fine - also for multiple quads. But as soon as I have a gradient, the read pixels are screwed up. It gets even worse if I map a texture with an uneven pattern onto the quad - which is what I eventually want to do.

Code:
#include <windows.h>
#include <gl\glew.h>
#include <gl\glut.h>
#include <gl\freeglut_ext.h>
#include <iostream>
#include <string>
#include <stdio.h>
#include <string.h>
#include <stdlib.h>
void renderScene(void)
{
  glClear(GL_COLOR_BUFFER_BIT);
  glBegin(GL_QUADS);
    glColor3f(1.0f, 0.0f, 0.0f);
    glVertex2f(10, 10);
    glVertex2f(10, 512 + 10);
    glVertex2f(512 + 10, 512 + 10);
    /* glReadPixels will work if a solid color is used on this quad,
     * so commenting out the following line will give correct behaviour */
    glColor3f(0.0f, 0.0f, 1.0f);
    glVertex2f(512 + 10, 10);
  glEnd();
  glutSwapBuffers();
}
void changeSize(GLsizei w, GLsizei h)
{
  //Reset the coordinate system
  glViewport(0, 0, w, h);
  glMatrixMode(GL_PROJECTION);
  glLoadIdentity();
  glOrtho(0, w, 0, h, 1, -1);
  glMatrixMode(GL_MODELVIEW);
  glLoadIdentity();
}
bool writeTGA(const std::string & filename, unsigned char * texture, unsigned int width, unsigned int height, unsigned int bpp)
{
  FILE *file = fopen(filename.c_str(), "w");
  if (file == NULL)
    return false;
  //18-byte header: uncompressed true-color image, top-left origin
  GLubyte header[] = {
    00, 00, 02, 00, 00, 00, 00, 00, 00, 00, 00, 00,
    (GLubyte)(0xff & width),  (GLubyte)(0xff & (width >> 8)),
    (GLubyte)(0xff & height), (GLubyte)(0xff & (height >> 8)),
    (GLubyte)bpp, 0x20
  };
  fwrite(header, sizeof(header), 1, file);
  fwrite(texture, width * height * bpp / 8, 1, file);
  fclose(file);
  return true;
}
int main(int argc, char ** argv)
{
  unsigned int windowW  = 800;
  unsigned int windowH  = 600;
  int          windowId = 0;

  std::cout << "ReadPixels Test" << std::endl;
  std::cout << "-----------------------------\n" << std::endl;

  //Setup FreeGlut
  glutInit(&argc, argv);
  // Note: glutSetOption is only available with freeglut
  glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
  glutInitWindowSize(windowW, windowH);
  windowId = glutCreateWindow("glReadPixels Test");
  glutDisplayFunc(renderScene);
  glutReshapeFunc(changeSize);

  //setup some opengl states
  glClearColor(0.5, 0.5, 0.5, 1.0);
  glPixelStorei(GL_PACK_ALIGNMENT, 1);

  //adjust window size
  changeSize(windowW, windowH);

  //main loop
  int counter = 200;
  while (counter-- > 0)
  {
    glutMainLoopEvent();  //freeglut extension: process pending events
    renderScene();
    if (50 == counter)
    {
      unsigned char * texture = (unsigned char *)malloc(windowW * windowH * 3);
      memset(texture, 0x00, windowW * windowH * 3);
      glPixelStorei(GL_PACK_ALIGNMENT, 1);
      glReadPixels(0, 0, windowW, windowH, GL_BGR, GL_UNSIGNED_BYTE, texture);
      //write pixels to file
      writeTGA("testout.tga", texture, windowW, windowH, 24);
      free(texture);
    }
  }
  glutDestroyWindow(windowId);
  return 0;
}

This seems like one of those problems where I stare at my code for too long and just don't see the obvious. At least I hope it's that trivial and some of you can point me to my error. I have looked through the forum and the internet, and this problem doesn't seem to be very common - I found no references describing this behaviour.

Querying the GL version string returns 3.3.0 on my system. I'm using freeglut to set up the OpenGL window, in case you want to test the code.

I have played around with all kinds of OpenGL settings, but I just don't seem to find the right switch here. GL_PACK_ALIGNMENT doesn't seem to be the problem, though I still set it to 1 just to make sure. I also tried moving glutSwapBuffers after the glReadPixels call, since I read that somewhere on the forum, but it changes nothing.