glReadPixels GL_DEPTH_COMPONENT

I am having a problem retrieving the value of the depth buffer with glReadPixels. I am not calling glReadPixels inside a glBegin/glEnd pair, and I am positive I am doing everything right. I am porting the code exactly as it is from another Windows app. This time I am using GLUT, and it consistently fails on both Mac OS X and Windows: the value pointed to by the last parameter ( GLvoid* pixels ) never changes, and glReadPixels does not generate an error either when I call glGetError.

It is not my video card either, because I can get it to work when I do not use a GLUT rendering context. In short, then, does anyone know of problems with reading the depth buffer under GLUT? I have tried explicitly reading the front and back buffers as well, but to no avail.

Thanks.

Hello,

In my experience, reading the depth buffer with glReadPixels works fine under GLUT.
Perhaps you have forgotten to request a depth buffer with GLUT_DEPTH in your glutInitDisplayMode call, since it is not part of the default display mode.
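For example, something like:

//request a depth buffer when setting the display mode (illustrative only)
glutInitDisplayMode( GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH );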

Sumpfratte

No, it’s in there. I have been looking at this for literally a week and I cannot figure it out.

//init display mode
glutInitDisplayMode (GLUT_DOUBLE | GLUT_DEPTH | GLUT_RGB | GLUT_ACCUM );

//depth test is enabled
glClearDepth( 1.0f );
glEnable(GL_DEPTH_TEST);
glDepthFunc( GL_LEQUAL );

//the read pixels line
glReadPixels( _nX, _nY , 1, 1, GL_DEPTH_COMPONENT, GL_FLOAT, &dZ );

I MUST be doing something wrong, because it does not work anywhere.

Hmmm… could you perhaps post some more code?
The question is where you call glReadPixels().

In GLUT you always have to call glutCreateWindow before calling any OpenGL function, and you have to register a display function, but I don’t think you have forgotten that, because you get no errors.
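The usual setup order is roughly this (just a sketch; Display here is a stand-in for your own drawing callback):

#include <GL/glut.h>   // <GLUT/glut.h> on Mac OS X

//minimal display callback, just clears the buffers
void Display( void )
{
    glClear( GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT );
    glutSwapBuffers();
}

int main( int argc, char** argv )
{
    glutInit( &argc, argv );
    glutInitDisplayMode( GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH );
    glutCreateWindow( "depth read test" );   //the GL context exists only after this call
    glEnable( GL_DEPTH_TEST );               //so GL calls like this must come after it
    glutDisplayFunc( Display );
    glutMainLoop();
    return 0;
}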

May I ask what the values of x and y are?
And, by the way, is dZ exactly a GLfloat or some other type?
Also, what are your system specs (especially the OS + graphics card + driver version combination)?

Yes, it’s a float, and I have tried it with a double and an unsigned int as well, but I cannot get any value returned; it is always unchanged. This has nothing to do with the OS/graphics card combination: one, because I can get the same code to work without using GLUT, and two, because I have the same problem under GLUT across different OSes and video cards (I have tried four different machines, mainly OS X and Windows, and cannot get it to work).

I am calling glReadPixels in a mouse-down handler. The values of x and y are perfectly valid as well; I have checked that. It is not being called within a glBegin/glEnd pair, and I get no error when calling glGetError immediately after the call.
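Stripped down, the handler is roughly this (the real code uses _nX/_nY, but it amounts to the same thing):

//registered with glutMouseFunc( OnMouse )
void OnMouse( int button, int state, int x, int y )
{
    if ( button == GLUT_LEFT_BUTTON && state == GLUT_DOWN )
    {
        GLfloat dZ = 0.0f;
        //not inside glBegin/glEnd; x and y are valid window coordinates
        glReadPixels( x, y, 1, 1, GL_DEPTH_COMPONENT, GL_FLOAT, &dZ );
        //glGetError() returns GL_NO_ERROR here, yet dZ never changes
    }
}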

Thanks.

Maybe it has to do with where you call glReadPixels, as Sumpfratte suggested.
No, it’s really weird. The simple fact that you changed the library can’t change the behaviour of glReadPixels. There MUST be something else…

Ah, yes, I am a fool.

I do not think I ever actually had the float + GL_FLOAT combination in there; that works. A double does not.
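In other words, the type of the destination variable has to match the type enum passed to glReadPixels. Roughly (x and y being the pixel to read):

GLfloat fZ;
glReadPixels( x, y, 1, 1, GL_DEPTH_COMPONENT, GL_FLOAT, &fZ );         //fine

GLuint  uZ;
glReadPixels( x, y, 1, 1, GL_DEPTH_COMPONENT, GL_UNSIGNED_INT, &uZ );  //fine

double  dZ;
glReadPixels( x, y, 1, 1, GL_DEPTH_COMPONENT, GL_FLOAT, &dZ );         //wrong: a 4-byte float lands in the first bytes of an 8-byte double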

I DO have another Windows project, though, which does not use GLUT, that works while specifying GL_FLOAT but passing an uncast double.

I have it working with GLUT, though.

Thanks.

Doh!
A friend of mine had the same problem with ATi drivers under Linux…