glReadPixels() placement in program

Hi,

I was trying to use some code posted here to capture the screen as a TGA image (as it seemed the most straightforward approach).

My problem is with calling the screendump() method: when it is called, the display does not update (e.g. if I alter the rotation, the change doesn't show), and the program seems to hang indefinitely. I am working remotely, using ssh and xterm to log in to a Linux machine (if that makes any difference). Also, please note that, as far as I am aware, the program works fine until I make the glReadPixels call.

Here is some of the code:

 
void screendump(void)
{
	FILE *out = fopen("screenshot.tga","wb");
	char pixel_data[3*500*500];
	short TGAhead[] = {0,2,0,0,0,0,500,500,24};

	glReadBuffer(GL_FRONT);
	glReadPixels(0,0,w,h,GL_BGR,GL_UNSIGNED_BYTE,pixel_data);

	fwrite(&TGAhead,sizeof(TGAhead),1,out);
	fwrite(pixel_data,3*500*500,1,out);
	fclose(out);
}

void display(void) {

    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glLoadIdentity();
//  gluLookAt( 36.0, 15.0, 42.0, 0.0, 10.0, 0.0, 0.0, 1.0, 0.0 );
    gluLookAt(0.0, 0.0, 20.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0);

    glColorMaterial(GL_FRONT, GL_AMBIENT_AND_DIFFUSE);
    glEnable(GL_COLOR_MATERIAL);

    glEnable(GL_DEPTH_TEST);
    glShadeModel(GL_SMOOTH);

    glPushMatrix();
    // Rotate
    glRotatef(rotX, 1.0, 0.0, 0.0);
    glRotatef(rotY, 0.0, 1.0, 0.0);
    glRotatef(rotZ, 0.0, 0.0, 1.0);
    // Translate
    glTranslatef(transX, 0.0, 0.0);
    glTranslatef(0.0, transY, 0.0);
    glTranslatef(0.0, 0.0, transZ);
    // Scale
    glScalef(sc, sc, sc);

    // Render the polygon!
    render_poly();
    // Draw the axes!
    axis();

    glPopMatrix();

    glFlush();

    glutSwapBuffers();
    screendump();
    glutPostRedisplay();
}

Just an observation, and i don't know if this will fix the problem, but try placing screendump() before you swap the buffers.

if you put screendump() before swapping buffers, then you will want to read from GL_BACK, as that is what is rendered to by default.
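roughly this ordering (just a sketch, keeping your existing functions; i'm assuming screendump() is changed to read GL_BACK instead of GL_FRONT):

    /* end of display(): capture the freshly rendered frame, then show it */
    glPopMatrix();
    glFlush();

    screendump();        /* glReadPixels from GL_BACK here */
    glutSwapBuffers();   /* now the frame goes to the front buffer */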

You might want to check to make sure that the framebuffer you requested from glut (GLUT_RGB) is RGB and not GLUT_RGBA by accident.

hope this helps.

i'm not too familiar with glut, but doesn't glutPostRedisplay make glut call the display func? if so, i don't think it's a good idea to place it at the end of the display func itself, since the display func will be called over and over again, and each time a screenshot is created, which might cause your prog to hang.

Hi again,

Thanks for the replies, all! The reason I have glutPostRedisplay() there is because my keyboard function alters the values rotX, transX, sc, etc. and then updates the screen based on the transformation. I wondered myself whether that was why the program was hanging, but shouldn't it only redisplay on a keyboard interrupt? I'm quite new to this, so this is only a guess.

“glReadPixels(0,0,w,h,GL_BGR,GL_UNSIGNED_BYTE,pixel_data);”
What values are in w and h?

Reading GL_BGR is dangerous with the default pack alignment of 4. Consult the glPixelStore manual. Set it to 1.

You should give yourself a good slap on the fingers for allocating 732 kB for the image on the stack. Use malloc.
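Roughly like this (a sketch only; I'm assuming your 500x500 window, keeping the TGA header trick, and it needs <stdio.h> and <stdlib.h>):

void screendump(int width, int height)
{
    FILE *out = fopen("screenshot.tga", "wb");
    unsigned char *pixel_data = malloc(3 * width * height);   /* heap, not stack */
    short TGAhead[] = { 0, 2, 0, 0, 0, 0, (short)width, (short)height, 24 };

    if (!out || !pixel_data) {
        if (out) fclose(out);
        free(pixel_data);
        return;
    }

    glPixelStorei(GL_PACK_ALIGNMENT, 1);   /* rows tightly packed, no 4-byte row padding */
    glReadBuffer(GL_FRONT);                /* or GL_BACK if you call this before the swap */
    glReadPixels(0, 0, width, height, GL_BGR, GL_UNSIGNED_BYTE, pixel_data);

    fwrite(TGAhead, sizeof(TGAhead), 1, out);
    fwrite(pixel_data, 3 * width * height, 1, out);
    fclose(out);
    free(pixel_data);
}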

To brtnrdr:

>>You might want to check to make sure that the framebuffer you requested from glut (GLUT_RGB) is RGB and not GLUT_RGBA by accident.<<

GLUT_RGBA actually is defined as GLUT_RGB. Look in glut.h.
To request destination alpha planes in the pixelformat you need to add GLUT_ALPHA to the bitfield.
Anyway, that has absolutely nothing to do with the amount of data glReadPixels reads. That's specified with its format parameter only.
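For example (assuming a typical GLUT init; pick whatever other bits you need):

    /* GLUT_RGBA == GLUT_RGB in glut.h; only GLUT_ALPHA requests destination alpha planes */
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_ALPHA | GLUT_DEPTH);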

in that case i think the glutPostRedisplay belongs at the end of your keyboard func.
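something like this (just a sketch; the key bindings and increments are made up, not from your code):

void keyboard(unsigned char key, int x, int y)
{
    switch (key) {
    case 'x': rotX += 5.0f; break;   /* your existing transformation updates */
    case 's': screendump(); break;   /* capture only when explicitly asked for */
    }
    glutPostRedisplay();             /* one redraw per key press */
}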

by the way, relic: don't blame davehh for that stack thing. his screendump function looks like what I posted in this forum some time ago :stuck_out_tongue: is it certain that local variables are always put on the stack, independent of their size, compiler optimization flags etc.?

Hi again,

I've switched my code around a bit based on the advice here (thanks!) and it's working, after a fashion. I moved screendump() to a keyboard function and put glutPostRedisplay() at the end of that function. I'm getting the image now, although it does take a long time to write the TGA file out (probably due to me using remote login).

Thanks for the help, much appreciated.

“by the way, relic: don't blame davehh for that stack thing. his screendump function looks like what I posted in this forum some time ago. is it certain that local variables are always put on the stack, independent of their size, compiler optimization flags etc.?”

Shame on you then. :wink:
I didn't care about compiler switches messing with local allocations so far. In my old-school-C-programming world, local variables feed from the stack, and it can be severely limited in space depending on platform and ring. 732 kB on the stack is completely out of the question, coding-style-wise. :slight_smile:

hm. maybe i’ll be a bit wiser when i’ve had 2050 posts here, too :stuck_out_tongue: btw: i’ve had some mysterious seg faults recently. the code was something like

void ExposeFunc() {
    char c[256];
    sprintf(c, "whatever...");
    glCallLists(strlen(c), GL_UNSIGNED_BYTE, c);

    ...
}

very funny: the seg fault did not occur when i changed the size of the character string (maybe to c[128]).

since gl puts its commands into a queue, is it possible that glCallLists was executed at a time when my ExposeFunc had already been completed and therefore the local character string was not valid anymore?

“since gl puts its commands into a queue, is it possible that glCallLists was executed at a time when my ExposeFunc had already been completed and therefore the local character string was not valid anymore?”

No, the GL calls must not return until all user data has been either completely buffered or processed.
256 or 128 bytes wouldn't make a difference to GL here if the strlen is the same; both local variables would leave scope at the same speed.
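In other words, something like this is fine (a sketch only; fontBase is a made-up display-list base, not from your code, and it needs <stdio.h> and <string.h>):

void draw_string(const char *msg)
{
    char c[256];
    snprintf(c, sizeof(c), "%s", msg);   /* bounded, unlike sprintf */
    glListBase(fontBase);
    glCallLists((GLsizei)strlen(c), GL_UNSIGNED_BYTE, c);
}   /* c goes out of scope here; GL has already consumed it */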
Run it through a debugger and catch the segfault to see where it comes from.
If you suspect stack corruption, does it also go away if you make the char c[256] global?

“If you suspect stack corruption, does it also go away if you make the char c[256] global?”

i can’t check this now, cause i’m at work. but i’m pretty curious if that works; maybe i’ll leave work early today :smiley:

i’ve tried some different configurations now:

  1. local char c[256], compiler optimization -O3 : crash

  2. local char c[256], compiler -O2: no problem

  3. local char c[512], compiler -O3: no problem

  4. global char c[256] or global char c[512], compiler optimization -O3 : no problem

that's all a bit confusing. i should mention that the seg fault doesn't occur immediately; it happens about 2 seconds after starting the prog.

well, for my own sake, i’ve decided to blame the compiler (instead of having to admit that my code is a mess :wink: ), since the problem occurs only with optimization flag -O3.

What compiler would that be and what does -O3 do in it?

my system is suse 10.0 linux, the compiler is gcc 4.0.2. ‘-Ox’ is a flag for code optimization (as i wrote above :wink: ). from the man page: -O1 : optimize, -O2 : optimize even more, -O3 : optimize yet more.

the man page says that -O2 means that the following compiler flags are used:

-fdefer-pop -fdelayed-branch -fguess-branch-probability -fcprop-registers -floop-optimize -fif-conversion -fif-conversion2 -ftree-ccp -ftree-dce -ftree-dominator-opts -ftree-dse -ftree-ter -ftree-lrs -ftree-sra -ftree-copyrename -ftree-fre -ftree-ch -fmerge-constants

but i have to admit that this stuff is far beyond the horizons that i ever wanted to reach :stuck_out_tongue:

at this moment, i get around the problem by splitting my source code into several files and compiling them with different -Ox flags. the file which contains the troublesome code is compiled with -O2, the others with -O3. and, of course, i've made the character string global (thanks :slight_smile: for the tip).

BULLSHIET!!!

finally, i've found the problem. somewhere in my display func i had some obsolete lines which accessed the character string like this:

sprintf(c, "%9.3f, %9.3f", value1, value2);

value1 and value2 are calculated by an obviously unstable algorithm. i needed them in a former program version, but not in the current one. they keep growing towards infinity, becoming so big that after 2 seconds "%9.3f, %9.3f" results in a string much longer than 256 bytes. that's all. i didn't notice it because i print them to the string but never output it on the screen. i removed the line, compiled everything with the highest optimization level -O3, and it works now.
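the core of it was roughly this (values and names made up, just to illustrate):

    char c[256];
    double value1 = 1e200, value2 = 1e200;        /* runaway results of the old algorithm */
    sprintf(c, "%9.3f, %9.3f", value1, value2);   /* %9.3f is only a minimum field width:
                                                     this writes hundreds of characters and
                                                     smashes the stack */
    /* a bounded call would at least truncate instead of corrupting memory: */
    snprintf(c, sizeof(c), "%9.3f, %9.3f", value1, value2);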

it's still a bit mysterious to me, because it happened only with compiler flag -O3, but i think i'll get me a beer now, watch Brazil kick Japan away, and forget about all that stuff :wink:

Relic, you reckon GL_RGB and GL_RGBA are defined the same!???

#define GL_RGB 0x1907
#define GL_RGBA 0x1908

i’ll leave that one to you, relic :smiley:

I’m all over it. :wink:
Read my posts again and you’ll find that I said:
“GLUT_RGBA actually is defined as GLUT_RGB. Look in glut.h.”
Cool, I found a manual for the RTFM answer:
http://www.opengl.org/resources/libraries/glut/glut-3.spec.pdf
Page 49 at the bottom. :smiley:

Should be more like RTFQ; I was thinking of GL rather than GLUT definitions.