ATI HWA Depth Testing

I used to program OpenGL in RealBasic, but I’m learning C++ now. To initialize OGL I simply ported the code from RB, and now depth testing doesn’t work. I use this block of code right after I initialize an AGLContext and set everything up, maybe the problem’s in here:

void BaseglInitializeViewport (void)
{
    glClearColor(0.0, 0.0, 0.0, 0.0);          // clear to opaque black
    glEnable(GL_DEPTH_TEST);                   // turn on depth testing
    glClearDepth(1.0);                         // clear depth to the far plane
    glDepthFunc(GL_LEQUAL);                    // pass fragments at or nearer than stored depth
    glDepthMask(GL_TRUE);                      // allow writes to the depth buffer
    glEnable(GL_COLOR_MATERIAL);               // let glColor drive the material
    glShadeModel(GL_SMOOTH);                   // Gouraud shading
    glLightModeli(GL_LIGHT_MODEL_TWO_SIDE, 1); // light both faces of polygons
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA); // standard alpha blending
    glColorMaterial(GL_FRONT_AND_BACK, GL_AMBIENT);    // glColor sets the ambient material
}

Also, when I use AGL_RENDERER_ID, AGL_RENDERER_GENERIC_ID to get a software OGL renderer, depth testing works fine, but under my ATI HWA renderer, with the same code, it doesn’t.
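For reference, this is roughly how I force the software renderer (just a sketch of the attribute list I mean; the rest of my setup code is unchanged):

```c
// Sketch: attribute list that requests Apple's generic software renderer.
GLint software_attrib[] = {
    AGL_RENDERER_ID, AGL_RENDERER_GENERIC_ID, // pin AGL to the software renderer
    AGL_RGBA,
    AGL_DOUBLEBUFFER,
    AGL_NONE
};
// NULL/0 lets AGL consider all displays.
AGLPixelFormat fmt = aglChoosePixelFormat(NULL, 0, software_attrib);
```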

Help!!

That all looks fine. How are you setting up your rendering contexts and pixel formats? The bug is probably somewhere in there.

Pixel Format is set up like so:

GLint attrib[] = { AGL_RGBA, AGL_ACCELERATED, AGL_DOUBLEBUFFER, AGL_NONE };

The rendering context is taken straight from the Apple developer sample code project called “OpenGL DrawSprocket.”

Ahh, using Apple sample code, eh? Well, there’s the problem, hehe. You need to specify depth buffer parameters in your pixel format when you are going to be using the depth buffer, like this:

// if using a 32-bit display ...
GLint pixel_attributes_32bit[] = {
    AGL_ACCELERATED,
    AGL_RGBA,
    AGL_DOUBLEBUFFER,
    AGL_DEPTH_SIZE, 32,
    AGL_NONE
};

// if using a 16-bit display ...
GLint pixel_attributes_16bit[] = {
    AGL_ACCELERATED,
    AGL_RGBA,
    AGL_DOUBLEBUFFER,
    AGL_DEPTH_SIZE, 16,
    AGL_NONE
};
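Once the attributes include AGL_DEPTH_SIZE, the rest is the usual choose/create sequence. A minimal sketch of what I mean (the helper name CreateContextWithDepth is made up, error handling is trimmed, and the aglSetDrawable/window setup is assumed to exist elsewhere in your DrawSprocket code):

```c
#include <AGL/agl.h>

// Sketch: create an accelerated context with a depth buffer,
// falling back from a 32-bit to a 16-bit depth buffer if needed.
static AGLContext CreateContextWithDepth(void)
{
    GLint attribs_32[] = { AGL_ACCELERATED, AGL_RGBA, AGL_DOUBLEBUFFER,
                           AGL_DEPTH_SIZE, 32, AGL_NONE };
    GLint attribs_16[] = { AGL_ACCELERATED, AGL_RGBA, AGL_DOUBLEBUFFER,
                           AGL_DEPTH_SIZE, 16, AGL_NONE };

    // NULL/0 lets AGL consider all displays.
    AGLPixelFormat fmt = aglChoosePixelFormat(NULL, 0, attribs_32);
    if (fmt == NULL)
        fmt = aglChoosePixelFormat(NULL, 0, attribs_16);
    if (fmt == NULL)
        return NULL; // no accelerated format with a depth buffer available

    AGLContext ctx = aglCreateContext(fmt, NULL);
    aglDestroyPixelFormat(fmt); // the context keeps its own reference
    return ctx;
}
```

After you attach the context to your drawable and make it current, you can sanity-check with glGetIntegerv(GL_DEPTH_BITS, &bits) to confirm the renderer actually gave you a depth buffer.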

Wow, thanks! It’s all good now.

All you Mac OpenGL programmers out there should check out Lunar Siege 3D at:
http://homepage.mac.com/g4soon/Downloads.html

It’s the game I’m currently porting from RealBasic to C++. Tell me what you think of it!

G4Soon@Mac.com
