Hidden surface removal does not work

Hi,
After a day of googling I still have no solution, so I stripped my app down to concentrate on the problem.
There is no real depth: when rotating, the red triangle always stays in front of the blue triangle. (This also happened with textures, which I removed for clarity.)
Maybe this is a wxWidgets issue. If so, I hope to hear from someone that at least the OpenGL code is valid.

Thanks.

Linux (Ubuntu), wxWidgets.
Code:
…setting up the window
int attribList[] = {WX_GL_RGBA, WX_GL_DOUBLEBUFFER, WX_GL_DEPTH_SIZE, 8, 0}; // also tried 16, 24 and 32
test3DWindow = new Test3DWindow(this, -1, wxPoint(0,0), wxSize(1250,1250), wxVSCROLL|wxHSCROLL, wxString(wxT("oglw")), &attribList[0]);

…init and drawing
bool Test3DWindow::initGL()
{
glClearColor(0.0f, 0.0f, 0.0f, 1.0f); // Black Background
glClearDepth(1.0f); // Depth Buffer Setup
glEnable(GL_DEPTH_TEST); // Enables Depth Testing
glDepthFunc(GL_LEQUAL); // The Type Of Depth Testing To Do
glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST);
//glDepthMask(GL_TRUE);
glEnable(GL_CULL_FACE);
glCullFace(GL_FRONT);
return true;
}

void Test3DWindow::draw(double par, long lightx, long lighty, float deltaAngleX, float deltaAngleY, float deltaAngleZ)
{
wxPaintDC PaintDc(this);
SetCurrent();
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
anglex = anglex + deltaAngleX;
angley = angley + deltaAngleY;
anglez = anglez + deltaAngleZ;


glViewport (0, 0, 1000, 1000);//can be anywhere: all unit values are relative to these width and height values

glMatrixMode(GL_MODELVIEW);
glLoadIdentity(); // reset so (0,0,0) is back in the center
glTranslatef(0, 0, -400);

glRotatef(anglex, 1, 0, 0);
glRotatef(angley, 0, 1, 0);
glRotatef(anglez, 0, 0, 1);

glColor3f(0.0f, 0.0f, 1.0f); // color components are clamped to [0,1]
glLineWidth(3);
glBegin(GL_LINES); // GL_LINES is the line primitive; GL_LINE is a polygon mode
glVertex3f(0, 400, 0);
glVertex3f(0, -400, 0);
glEnd();
glBegin(GL_LINES);
glVertex3f(400, 0, 0);
glVertex3f(-400, 0, 0);
glEnd();
glColor3f(1.0f, 0.0f, 0.0f);
glBegin(GL_LINES);
glVertex3f(0, 0, 400);
glVertex3f(0, 0, -400);
glEnd();
glColor3f(0.0f, 1.0f, 1.0f);
glBegin(GL_LINES);
glVertex3f(-50, 0, 50);
glVertex3f(50, 0, 50);
glEnd();

glColor3f(0.0f, 0.0f, 1.0f); // blue triangle at z = 0
glBegin(GL_TRIANGLES); // Drawing Using Triangles
glVertex3f(0, 0, 0);
glVertex3f(0, 100, 0);
glVertex3f(100, 100, 0);
glEnd();

glColor3f(1.0f, 0.0f, 0.0f); // red triangle at z = 20
glBegin(GL_TRIANGLES); // Drawing Using Triangles
glVertex3f(10, 0, 20);
glVertex3f(10, 100, 20);
glVertex3f(110, 100, 20);
glEnd();

glMatrixMode(GL_PROJECTION); // subsequent matrix operations act on the projection matrix
// note: set this late in the frame, it only takes effect for the next frame's draw calls
glLoadIdentity(); // reset the axis system to this view(port); (0,0,0) is in the center (e.g. glLookAt orients itself on that easily)
// gluOrtho2D(500, 500, 500, 500);
//glOrtho(-500, 500, -500, 500, 0.00001, 1000);
//glFrustum(-500, 500, -500, 500, 0.00001, 1000000); // when a large object tilts it can end up outside the box and no longer be visible
gluPerspective(120.0f, 1.0f, 200.0f, 100000.0f);

glMatrixMode(GL_MODELVIEW);

glFlush ();

SwapBuffers();

}
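As an aside on what the depth test is supposed to do here: with glDepthFunc(GL_LEQUAL), a fragment only replaces what is already stored at its pixel if its depth is less than or equal to the stored value, otherwise it is discarded. A minimal CPU sketch of that comparison (not wx/GL code, just the logic; names are mine) showing why draw order stops mattering once the test works:

```cpp
#include <cassert>

// A 1-pixel "framebuffer": stored color index and stored depth.
struct Pixel {
    int   color = 0;     // 0 = background
    float depth = 1.0f;  // glClearDepth(1.0f)
};

// Model of the GL_LEQUAL depth test for one incoming fragment.
// With the test disabled, every fragment overwrites the pixel.
void writeFragment(Pixel& p, int color, float depth, bool depthTestEnabled) {
    if (!depthTestEnabled || depth <= p.depth) {  // GL_LEQUAL comparison
        p.color = color;
        p.depth = depth;  // depth writes on, as with the default glDepthMask
    }
}
```

With the test enabled, drawing a far fragment and then a near one (or the other way around) leaves the near one visible either way; with it disabled, whatever was drawn last wins, which is exactly the "red always in front of blue" symptom.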

Try a test: render solid cubes and check. If they are not rendered with correct depth, then perhaps you need to check that your rendering context actually includes a depth buffer (and/or that you have specified a supported format).

Agree with BionicBytes. Most probably you don’t have a depth buffer. Besides that, GL_CULL_FACE may not have any effect since you don’t specify surface normals.

Thanks. Solid cube renders also quite messy.
My assumption is that the depth buffer is set via wxWidgets:
int attribList[] = {WX_GL_RGBA, WX_GL_DOUBLEBUFFER, WX_GL_DEPTH_SIZE, 8, 0}; // also tried 16, 24 and 32
test3DWindow = new Test3DWindow(this, -1, wxPoint(0,0), wxSize(1250,1250), wxVSCROLL|wxHSCROLL, wxString(wxT("oglw")), &attribList[0]);

Do you know an OGL function to check the (existence of) depth buffer?
What do you mean by “specified a supported format”?
Thanks.

You can query GL_DEPTH_BITS with a standard glGetIntegerv call. It returns 0 if you don’t have a depth buffer.
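A related aside: even with a depth buffer present, the near/far choice matters for precision. For a standard perspective projection (as set up by gluPerspective, with the default glDepthRange of 0 to 1), an eye-space distance z between near n and far f maps to window depth d = f·(z − n) / (z·(f − n)), which is strongly non-linear. With n = 200 and f = 100000 as in the code above, half of the depth range is already used up by roughly z = 2n. A small sketch of that mapping (the formula is the standard one; the function name is mine):

```cpp
#include <cassert>
#include <cmath>

// Window-space depth in [0,1] for an eye-space distance z in [n, f],
// as produced by a gluPerspective-style projection matrix followed by
// the perspective divide and the default glDepthRange(0, 1).
double windowDepth(double z, double n, double f) {
    return (f * (z - n)) / (z * (f - n));
}
```

This is why a huge far/near ratio combined with a small depth buffer (e.g. the 8 bits requested above) can still produce z-fighting even when the depth test itself works.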

Thanks guys!
I did the glGetIntegerv GL_DEPTH_BITS test at the end of the initGL function, result was 0.
Then I did the same test in the draw function, just after
wxPaintDC PaintDc(this);
SetCurrent();

result: 24, but still bad rendering.
When I also call glEnable(GL_DEPTH_TEST) in the draw function (just after those two lines), everything renders fine!
I thought that glEnable(GL_DEPTH_TEST) only needed to be set once…
Anyway, it works now!
Many thanks!
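For anyone finding this thread later: that last observation is the key. glEnable(GL_DEPTH_TEST) does only need to be called once, but OpenGL state lives in the rendering context, and a GL call made while no context is current (or while a different one is current) doesn’t land where you expect. If initGL() ran before the canvas’s context was first made current with SetCurrent(), the glEnable was simply lost. A rough CPU model of that “current context” indirection (hypothetical names, just to illustrate the behaviour):

```cpp
#include <cassert>

// Toy model: GL state lives per-context, and commands are routed to
// whichever context is *current*, if any.
struct Context {
    bool depthTest = false;
};

Context* g_current = nullptr;            // no context current yet

void makeCurrent(Context& c) {           // like wxGLCanvas::SetCurrent()
    g_current = &c;
}

void enableDepthTest() {                 // like glEnable(GL_DEPTH_TEST)
    if (g_current)                       // with no current context the
        g_current->depthTest = true;     // call is effectively dropped
}
```

So the fix is not that glEnable has to be repeated every frame, but that it must be issued after the context is current, e.g. once after the first SetCurrent().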