Problem with glReadPixels

Dear OpenGL Community,

I have a strange problem with glReadPixels. Maybe you have a suggestion for me.

I was playing around with this function a bit and noticed a strange effect as soon as I display objects that don't have a solid color. On screen they render fine, but the pixels I read back show strange patterns. To get closer to the source of the problem I wrote a quick and dirty demo program that just displays a single quad and then uses glReadPixels to read all pixels of the viewport, which I then write to a TGA file.

As soon as I apply a gradient to that quad I get the following effect. On top is a screenshot of the window; at the bottom are the read pixels. It's vertically mirrored since I was too lazy to change the origin before writing the TGA.

If I use a solid color for the quad, everything works fine - also for multiple quads. But as soon as I have a gradient, the read pixels get screwed up. It gets even worse if I map a texture with an uneven pattern onto the quad - which is what I eventually want to do.

#include <windows.h>
#include <gl/glew.h>
#include <gl/glut.h>
#include <gl/freeglut_ext.h>

#include <iostream>
#include <string>
#include <stdio.h>
#include <string.h>   //memset
#include <stdlib.h>   //malloc, free

//helpers
void renderScene(void)
{
  glClear    (GL_COLOR_BUFFER_BIT );  
 
  glBegin(GL_QUADS);

  glColor3f(1,0,0);
  glVertex2f(10 , 10);
  glVertex2f(10, 512 +10);
  glVertex2f(512 + 10 , 512 +10 );

  /** glReadPixels will work if a solid color is used on this quad,
   * so commenting out the following line gives correct behaviour */
  glColor3f(0,1,0);

  glVertex2f(512 + 10  , 10);

  glEnd();

  glFlush();
}

void changeSize(GLsizei w, GLsizei h)
{
  glViewport(0,0,w,h);

  //Reset the coordinate system
  glMatrixMode(GL_PROJECTION);
  glLoadIdentity();
                                        
  glOrtho(0, w , 0, h , 1,-1);

  glMatrixMode(GL_MODELVIEW );
  glLoadIdentity();
}

bool writeTGA(const std::string & filename, unsigned char * texture, unsigned int  width, unsigned int  height, unsigned int  bpp)
{
  FILE *file = fopen(filename.c_str(), "w");	

  if( file == NULL)
    return false;

  GLubyte header[] = {
    00,00,02, 00,00,00, 00,00,00, 00,00,00,
    0xff & width, 0xff & width >> 8,
    0xff & height, 0xff & height >> 8,
    bpp, 0x20 
  }; 

  fwrite( header, sizeof(header), 1, file);

  fwrite(texture, width * height * bpp / 8, 1, file);

  fclose(file);

  return true;
}

//main
int main(int argc, char ** argv)
{
  unsigned int windowW  = 800;
  unsigned int windowH  = 600;
  int          windowId = 0;
 
  std::cout << "ReadPixels Test" << std::endl;
  std::cout << "-----------------------------
" << std::endl;
 
  //Setup FreeGlut
  glutInit(&argc, argv);
  
  // Note: glutSetOption is only available with freeglut
  glutSetOption(GLUT_ACTION_ON_WINDOW_CLOSE, GLUT_ACTION_GLUTMAINLOOP_RETURNS);        
  glutInitDisplayMode( GLUT_DOUBLE | GLUT_RGB);
  glutInitWindowSize(windowW, windowH);

  windowId = glutCreateWindow("glReadPixels Test");

  glutDisplayFunc(renderScene);
  glutIdleFunc(renderScene);
  glutReshapeFunc(changeSize);

  //setup some opengl states
  glClearColor (0.5, 0.5, 0.5, 1.0);
  glPixelStorei(GL_PACK_ALIGNMENT, 1);   

  //adjust window size 
  changeSize(windowW, windowH);

  //main loop
  int counter = 200;
  while(counter--)
  {
    glutPostRedisplay();

    glutMainLoopEvent();

    glutSwapBuffers();

    if(50 == counter )
    {
      unsigned char * texture = (unsigned char *)malloc(windowW * windowH * 3);
            
      memset(texture, 0x00, windowW * windowH * 3);
    
      glPixelStorei(GL_PACK_ALIGNMENT, 1);     
      glReadPixels(0, 0, windowW , windowH, GL_BGR, GL_UNSIGNED_BYTE, texture);

      //write pixels to file
      writeTGA("testout.tga", texture, windowW, windowH,  24);

      free(texture);
    }

    Sleep(25);
  }

  return 0;
}

This seems like one of those problems where I stare at my code for too long and just don't see the obvious. At least I hope it's that trivial and some of you can point me to my error. I have looked through the forum and the internet, and this problem doesn't seem to be very common - I found no references describing this behaviour.

Querying the GL version string returns 3.3.0 on my system. I'm using freeglut to set up the OpenGL window, in case you want to test the code.

I have played around with all kinds of OpenGL settings, but I just don't seem to find the right switch. GL_PACK_ALIGNMENT doesn't seem to be the problem, though I still set it to 1 just to make sure. I also tried moving glutSwapBuffers behind the glReadPixels call, since I read that suggestion somewhere in the forum, but it changes nothing.

Thanks,
Michael

Looks like you are using double buffering => glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB)

And in renderScene you call glFlush(). I suggest you actually swap buffers with glutSwapBuffers().
You should read the buffer with glReadPixels before swapping.
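A minimal sketch of that ordering, assuming the read stays in the display callback and that windowW/windowH from your test program are made globals (std::vector here is just a scratch buffer):

#include <vector>

// Hypothetical display callback: draw, read the back buffer, then swap.
void renderSceneAndRead(void)
{
  glClear(GL_COLOR_BUFFER_BIT);

  // ... draw the gradient quad exactly as in renderScene() ...

  // On a double-buffered context the default read buffer is GL_BACK,
  // so read while it still holds the frame that was just rendered.
  std::vector<unsigned char> pixels(windowW * windowH * 3);
  glPixelStorei(GL_PACK_ALIGNMENT, 1);
  glReadPixels(0, 0, windowW, windowH, GL_BGR, GL_UNSIGNED_BYTE, pixels.data());

  // Swap only after the read; the back buffer contents are undefined
  // after the swap.
  glutSwapBuffers();
}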

Thanks for your reply. Unfortunately that doesn't do it (also tested on another PC, just to make sure) :frowning: I had already tested single buffering and had also changed the swap to be called after glReadPixels. I also had glFlush removed in one version. I now tested your suggestion again: still not working. There are quite a few switches, and I hope I'm just not getting the combination right.

PS: I have now also tested all of this with an offscreen framebuffer and read back from that. Exact same behaviour. What am I missing? I'm sure this has to work somehow.
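For reference, the offscreen test was essentially the following - a rough sketch from memory, assuming glewInit() has been called so the GL 3.0 framebuffer entry points are available; the format and attachment are just what I happened to use:

// Rough sketch of the offscreen test: render into a renderbuffer-backed
// FBO and read the pixels straight from it.
GLuint fbo = 0, rbo = 0;
glGenFramebuffers(1, &fbo);
glGenRenderbuffers(1, &rbo);

glBindRenderbuffer(GL_RENDERBUFFER, rbo);
glRenderbufferStorage(GL_RENDERBUFFER, GL_RGB8, windowW, windowH);

glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                          GL_RENDERBUFFER, rbo);

// ... render the quad as usual, now targeting the FBO ...

glPixelStorei(GL_PACK_ALIGNMENT, 1);
glReadBuffer(GL_COLOR_ATTACHMENT0);
glReadPixels(0, 0, windowW, windowH, GL_BGR, GL_UNSIGNED_BYTE, texture);

glBindFramebuffer(GL_FRAMEBUFFER, 0);
glDeleteRenderbuffers(1, &rbo);
glDeleteFramebuffers(1, &fbo);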

Thanks in advance.

cheers

My guess is you are writing an RGBA output format, or you have the output size wrong somehow. Shifting like that makes me think you are reading 4 bytes per pixel instead of 3.

Hi dukey. I also think that something goes wrong with the output format. The strange thing, though: when I render solid, single-color rectangles, for example, those are read back exactly as they should be. Only when I apply a gradient (or a texture, or I guess any pattern) does it go wrong. Maybe I'll find other cases where it fails, but for now I just wanted to get the basics working. It also seems to read the pixels correctly until it encounters a gradient.

PS: The output type is set to unsigned byte, and the color format I'm reading is three-component (GL_BGR in the code above). I also tested RGBA, but that looks even stranger. If it were somehow reading RGBA right now I'd get a crash in the program, since I only allocated memory for three bytes per pixel. There are also the packed formats, but I didn't request any of those here.

I found the error. A fresh look at the code revealed a missing letter when opening the file for the TGA writing: it has to be "wb", not "w". Quite obvious, but since I had copy-pasted this part of the code and it appeared to work for solid shapes, I wrongly assumed it was correct and was looking at the wrong parts of my code.
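For anyone who finds this thread later: on Windows, a file opened in text mode ("w") translates every 0x0A byte written to it into 0x0D 0x0A, so any pixel component that happens to equal 10 gets an extra byte inserted and everything after it shifts - which would explain the shifting dukey noticed. The solid-color quads only ever produced byte values like 0, 255 and the grey background, so they survived the text-mode write. The fix in writeTGA is just the file mode:

  // Open the output file in binary mode so Windows does not expand
  // 0x0A bytes in the pixel data to 0x0D 0x0A:
  FILE *file = fopen(filename.c_str(), "wb");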

Sorry for the disturbance. In the end it was no OpenGL problem with glReadPixels after all. Sometimes a good night's sleep is all it takes to find a bug :slight_smile:

cheers