no depth information while picking objects

Hello all,

I have some problems with picking. I hope this is the right place to post it.

The picking mechanism basically works, but I get no depth information in the
selection buffer. The picked objects are correct, but they always have a
z-min and z-max of 2147483648. I marked the relevant lines in the source-code fragment below.

My environment is MinGW and Eclipse on Windows.

This is the relevant code of my program:

glEnable(GL_DEPTH_TEST);
glDepthFunc(GL_LEQUAL);

glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

glSelectBuffer(PICKING_BUFSIZE, m_arPickingBuffer);
glRenderMode(GL_SELECT);

glMatrixMode(GL_PROJECTION);
glLoadIdentity();
gluPickMatrix(dPickX, dPickY, iPickRadius, iPickRadius, arTibasVp);
gluPerspective(45, dRatio, 1, 1000);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
gluLookAt(vEye.x,  vEye.y,  vEye.z,   // eye position
          vView.x, vView.y, vView.z,  // look-at position
          0.0f,    1.0f,    0.0f);    // up vector

DrawTheScene()…

iHits = glRenderMode(GL_RENDER);
ProcessHits(iHits);
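
Inside DrawTheScene() every pickable object gets a name on the name stack; a simplified sketch of that part (the loop and DrawObject() are placeholders, not my real code):

glInitNames();
for (GLuint uiObj = 0; uiObj < uiObjectCount; uiObj++) // placeholder loop over pickable objects
{
    glPushName(uiObj); // the name that shows up in the hit records
    DrawObject(uiObj); // placeholder draw call
    glPopName();
}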

void TibasVpOgl::ProcessHits(const GLint _iHits)
{
  // Layout of one hit record in the selection buffer:
  // pActHit[0] := name count for this hit
  // pActHit[1] := minimum depth for this hit
  // pActHit[2] := maximum depth for this hit
  // pActHit[3] := name of the 1st object in this group
  // pActHit[4] := name of the 2nd object in this group… up to the name count
  // The next hit record follows immediately, until _iHits records have been read.
  s32              s32ActHit;
  s32              s32ActName;
  GLuint           *pActValue = m_arPickingBuffer;
  GLuint           uiNamesCount;
  GLuint           uiZ1, uiZ2, uiName;
  RefPtr<TreeItem> pTI;

  qDebug() << "//////////////// hits:" << _iHits; // !!! result is correct !!!
  // iterate over all hits
  for (s32ActHit = 0; s32ActHit < _iHits; s32ActHit++)
  {
    // number of names for this hit
    uiNamesCount = *pActValue++; // !!! result is correct !!!
    qDebug() << "number of names for this hit: " << uiNamesCount;
    // z1
    uiZ1 = *pActValue++; // !!! result is wrong, always 2147483648 !!!
    qDebug() << "z1: " << uiZ1;
    // z2
    uiZ2 = *pActValue++; // !!! result is wrong, always 2147483648 !!!
    qDebug() << "z2: " << uiZ2; // !!! always 2147483648 !!!
    // iterate over all names
    qDebug() << "names for hit " << s32ActHit << ": ";
    for (s32ActName = 0; s32ActName < (s32) uiNamesCount; s32ActName++)
    {
      // get the next name
      uiName = *pActValue++;
      qDebug() << "uiName: " << uiName; // !!! result is correct !!!
    }
  }
} // TibasVpOgl::ProcessHits
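
For reference, the two depth values are window-space z scaled to the full GLuint range, so once they come through correctly they can be mapped back to [0, 1] like this (a minimal sketch):

// map a selection-buffer depth back to window-space [0, 1]
float fZMin = (float) uiZ1 / 4294967295.0f;
float fZMax = (float) uiZ2 / 4294967295.0f;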

Does anybody have an idea?

Thanks in advance for your help,
Andreas

I don’t use GL’s selection mode for picking, so I can’t really comment on the code.
Are you sure you have a depth buffer attached and working as expected?
Also, do you have enough depth precision for your scene within the 1-1000 range?

gluPerspective(45, dRatio, 1, 1000);

What do you mean by “depth buffer attached”?

I just have a little scene in the range from
x=[-10…10]
y=[-1…8]
z=[-5…2]

The scene has multiple cubes, spheres, and cones, and I can rotate the scene. The display is always correct: objects in the foreground hide the objects in the background.
Therefore I believe the depth information is calculated correctly by OpenGL, but I’m still not an expert.

objects in the foreground hide the objects in the background.

That does not prove you have a depth buffer, as the order in which you draw the objects could produce the same appearance for entirely different reasons.

What do you mean by “depth buffer attached”?

I mean: how did you create the context? What options did you use to ensure there is a depth buffer?
Where’s the code to create the context and set the initial state?

I just have a little scene in the range from
x=[-10…10]
y=[-1…8]
z=[-5…2]

Finally, where’s the code that sets up the projection matrix?

I don’t create the context myself. I use the Qt library and its QGLWidget class, which does it automatically. Perhaps I shouldn’t use this library?
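
A depth buffer could also be requested explicitly through QGLFormat; a minimal sketch (MyGLWidget stands in for my QGLWidget subclass):

QGLFormat fmt;
fmt.setDepth(true);         // ask for a depth buffer
fmt.setDepthBufferSize(24); // request 24 depth bits, if available
MyGLWidget *pWidget = new MyGLWidget(fmt); // MyGLWidget is a placeholder name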

I believe the code in my initial post sets up the projection matrix:

glMatrixMode(GL_PROJECTION);
glLoadIdentity();
gluPickMatrix(dPickX, dPickY, iPickRadius, iPickRadius, arTibasVp);
gluPerspective(45, dRatio, 1, 1000);

…and this code sets up the modelview matrix:

glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
gluLookAt(vEye.x,  vEye.y,  vEye.z,   // eye position
          vView.x, vView.y, vView.z,  // look-at position
          0.0f,    1.0f,    0.0f);    // up vector

I don’t create the context myself. I use the Qt library and its QGLWidget class, which does it automatically. Perhaps I shouldn’t use this library?

I’m not saying you should or should not use whatever library.
The point is, though, that you cannot be sure what your context actually is, which seems bad to me.

Try glGetIntegerv(GL_DEPTH_BITS, &iDepthBits) to see how many bits are assigned to your depth buffer.
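
For example (a minimal sketch, run while the GL context is current):

GLint iDepthBits = 0;
glGetIntegerv(GL_DEPTH_BITS, &iDepthBits); // 0 would mean no depth buffer at all
qDebug() << "depth bits:" << iDepthBits;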

Yes, you didn’t say it. I said it because I like to have control over my program :)

Try glGetIntegerv(GL_DEPTH_BITS, &iDepthBits) to see how many bits are assigned to your depth buffer.

I wrote a class which reports all the states of OpenGL. That is much more information than I need at the moment, but it’s for future use (a sketch of the kind of queries it makes follows the lists below).
Here are some of the states related to this topic:

States which are true:

GL_COLOR_MATERIAL: true
GL_COLOR_WRITEMASK: true
GL_CURRENT_RASTER_POSITION_VALID: true
GL_DEPTH_TEST: true
GL_DEPTH_WRITEMASK: true
GL_DITHER: true
GL_DOUBLEBUFFER: true
GL_EDGE_FLAG: true
GL_RGBA_MODE: true
GL_SCISSOR_TEST: true

Bits values:

GL_ACCUM_ALPHA_BITS: 16
GL_ACCUM_BLUE_BITS: 16
GL_ACCUM_GREEN_BITS: 16
GL_ACCUM_RED_BITS: 16
GL_ALPHA_BITS: 8
GL_BLUE_BITS: 8
GL_DEPTH_BITS: 24
GL_GREEN_BITS: 8
GL_INDEX_BITS: 32
GL_RED_BITS: 8
GL_STENCIL_BITS: 8
GL_SUBPIXEL_BITS: 4

States which are false:

GL_ALPHA_TEST: false
GL_AUTO_NORMAL: false
GL_BLEND: false
GL_COLOR_ARRAY: false
GL_COLOR_LOGIC_OP: false
GL_CULL_FACE: false
GL_EDGE_FLAG_ARRAY: false
GL_FOG: false
GL_INDEX_ARRAY: false

As you can see, GL_DEPTH_BITS is 24 and GL_DEPTH_TEST is true.
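
Essentially, the report class boils down to queries like these (a minimal sketch, not the actual class code):

GLboolean bDepthTest = glIsEnabled(GL_DEPTH_TEST); // boolean states
GLint     iDepthBits = 0;
glGetIntegerv(GL_DEPTH_BITS, &iDepthBits);         // bit values
qDebug() << "GL_DEPTH_TEST:" << (bDepthTest == GL_TRUE)
         << "GL_DEPTH_BITS:" << iDepthBits;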

Hi,
I have the same problem and I’m slowly going out of my mind…
Can anybody help?