View Full Version : read stencil buffer

07-22-2011, 09:28 AM

I try to read the values of the stencil buffer for all pixels of the viewport, so I do:

unsigned char *data = new unsigned char[w * h];
glReadPixels(0, 0, w, h, GL_STENCIL_INDEX, GL_UNSIGNED_BYTE, data);

Where w and h are the width and height of the viewport.

However, I get a segmentation fault when calling glReadPixels. Why?

07-22-2011, 09:44 AM
unsigned char *data = new unsigned char[screenw * screenh];
glReadPixels(0, 0, screenw, screenh, GL_STENCIL_INDEX, GL_UNSIGNED_BYTE, data);
delete [] data;

This doesn't crash in my case, so check your allocation and deallocation, and double-check the "w" and "h" values. If those are OK, something is wrong with the surrounding code (and you should probably post some of it to get help).

07-22-2011, 12:20 PM
It's very weird; it seems that calling glReadPixels clobbers all my variables.

Before calling the function, sizeof(data) > 0, but afterwards sizeof(data) is 0...

07-22-2011, 05:00 PM
You might need some padding for the width, or adjust the packing with glPixelStorei. Data returned by glReadPixels isn't byte-packed by default.

Read item 7, "Watch Your Pixel Store Alignment" here (http://www.opengl.org/resources/features/KilgardTechniques/oglpitfall/).

07-23-2011, 01:51 AM
And as usual when a question about stencil not working is asked, we should always ask: did you make sure you created a window with stencil capabilities?
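A minimal sketch of that check, assuming GLFW 3 (not used in the original thread; the GLFW identifiers are real API, but the window title and size are arbitrary): request a stencil buffer explicitly before creating the window, then query what the driver actually granted. Note that if GL_STENCIL_BITS reports 0, glReadPixels with GL_STENCIL_INDEX will fail with GL_INVALID_OPERATION rather than return data.

```c
#include <GLFW/glfw3.h>
#include <stdio.h>

int main(void) {
    if (!glfwInit()) return 1;

    /* Ask for an 8-bit stencil buffer; drivers may still ignore the hint. */
    glfwWindowHint(GLFW_STENCIL_BITS, 8);
    GLFWwindow *win = glfwCreateWindow(640, 480, "stencil test", NULL, NULL);
    if (!win) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(win);

    /* Verify what the context actually has before reading the stencil buffer. */
    GLint bits = 0;
    glGetIntegerv(GL_STENCIL_BITS, &bits);
    printf("stencil bits: %d\n", bits);

    glfwDestroyWindow(win);
    glfwTerminate();
    return 0;
}
```

The same idea applies to whatever windowing layer you use (WGL/GLX pixel formats, SDL attributes): the stencil buffer must be requested at context-creation time; it cannot be added afterwards.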