Unaccelerated on NT

Hi,
My question is: what can cause a program to never get an accelerated rendering context on NT? I have a program that seems to follow the normal route for OpenGL setup, but I can't get an accelerated rendering context on NT. It works fine on 9x and other operating systems. I can't post the code because it is spread out over a hundred different places and there is some sensitive code wrapped up in it.
Thanks
Trotsky

Do your drivers have hardware acceleration on NT?

yep.

Maybe it helps if you set the dwFlags in the PIXELFORMATDESCRIPTOR to include PFD_GENERIC_ACCELERATED.
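One way to see what you actually got is to call DescribePixelFormat on the format index ChoosePixelFormat returned and look at the dwFlags it fills in: PFD_GENERIC_FORMAT alone means Microsoft's generic software renderer, PFD_GENERIC_FORMAT together with PFD_GENERIC_ACCELERATED means a (partially accelerated) MCD driver, and neither flag means a full vendor ICD. A small sketch of that classification (the flag values are copied from wingdi.h so the snippet stands alone; in a real program you'd just include <windows.h>):

```cpp
// Flag values as defined in wingdi.h.
const unsigned long PFD_GENERIC_ACCELERATED = 0x00001000;
const unsigned long PFD_GENERIC_FORMAT      = 0x00000040;

enum Acceleration { FULL_ICD, MCD, SOFTWARE };

// Classify a pixel format from the dwFlags that DescribePixelFormat
// reports for it.
Acceleration Classify(unsigned long dwFlags) {
    if (!(dwFlags & PFD_GENERIC_FORMAT))
        return FULL_ICD;   // vendor ICD: fully hardware accelerated
    if (dwFlags & PFD_GENERIC_ACCELERATED)
        return MCD;        // generic format, but hardware accelerated (MCD)
    return SOFTWARE;       // Microsoft's generic software implementation
}
```

You'd run this on the pfd that DescribePixelFormat fills in for the format you actually got, not on the pfd you passed to ChoosePixelFormat.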

I remember that the original OpenGL screen saver in NT wasn't accelerated on my machine (TNT), but I never had a problem with my own GL apps. Maybe it helps to look at their source (in the MSDN documentation).

Kilam.

Could you post the initialization code that you use? Only the stuff that pertains to OpenGL, you know, the CreateWindow and pixel format stuff?
Thanks.

Here it is:

void OpenGLPane::Init()
// Construct the rendering context and define the pixel format.
{
    PIXELFORMATDESCRIPTOR pfd = {
        sizeof(PIXELFORMATDESCRIPTOR),  // nSize
        1,                              // nVersion
        PFD_DRAW_TO_WINDOW |
        PFD_SUPPORT_OPENGL |
        PFD_DOUBLEBUFFER,               // dwFlags
        PFD_TYPE_RGBA,                  // iPixelType
        32,                             // cColorBits
        0,0,0,0,0,0,                    // red/green/blue bits and shifts (let Windows choose)
        0,0,0,0,0,0,0,                  // alpha bits/shift and accumulation buffer bits
        32,                             // cDepthBits
        0,0,                            // cStencilBits, cAuxBuffers
        PFD_MAIN_PLANE,                 // iLayerType
        0,                              // bReserved
        0,0,0                           // layer, visible, and damage masks
    };

    valid_draw = false;

    hDC = GetDC((HWND)*this);
    hPalette = NULL;

    int pixelFormat = ChoosePixelFormat(hDC, &pfd);
    BOOL success = SetPixelFormat(hDC, pixelFormat, &pfd);

    hglrc = wglCreateContext(hDC);

    show();
}

Kilam.

Well, the only thing I can see that's different is that you use 32 bits for the depth buffer and I'm using 16. Now the question is: could that cause ChoosePixelFormat to pick an unaccelerated pixel format?
The card that I'm using is a GeForce(2) card.
I think they support 32-bit depth buffers, but I'm not sure.
-Trotsky