Double buffering problem

Is there something that could prevent using double buffering? ChoosePixelFormat chooses the requested pixel format but SetPixelFormat returns an error (pixel format unsupported).

I checked all I could and ran out of ideas. It’s especially difficult to debug since the code base is huge (it’s an OpenGL gadget that is part of an interface library for a 3D renderer, so it’s very difficult to keep track of what’s going on).
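
For reference, the setup boils down to something like this (a heavily simplified sketch, not the actual library code, which is spread across several modules):

[code]
#include <windows.h>

static BOOL SetupDoubleBufferedFormat(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd;
    int format;

    ZeroMemory(&pfd, sizeof(pfd));
    pfd.nSize      = sizeof(pfd);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 24;
    pfd.cDepthBits = 24;

    /* ChoosePixelFormat returns a valid, double-buffered format index... */
    format = ChoosePixelFormat(hdc, &pfd);
    if (format == 0)
        return FALSE;

    /* ...but SetPixelFormat fails with "pixel format unsupported". */
    if (!SetPixelFormat(hdc, format, &pfd))
    {
        /* GetLastError() gives the actual failure code. */
        return FALSE;
    }
    return TRUE;
}
[/code]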

Rich.

Originally posted by DaaZ:
[b]Is there something that could prevent using double buffering? ChoosePixelFormat chooses the requested pixel format but SetPixelFormat returns an error (pixel format unsupported).

I checked all I could and ran out of ideas. It’s especially difficult to debug since the code base is huge (it’s an OpenGL gadget that is part of an interface library for a 3D renderer, so it’s very difficult to keep track of what’s going on).

Rich.[/b]

If you cannot obtain a double-buffered pixel format, you might try writing to the front buffer instead (glDrawBuffer(GL_FRONT)).
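
Something along these lines (a rough sketch, assuming a single-buffered context is already current):

[code]
#include <windows.h>
#include <GL/gl.h>

/* Rough sketch: draw straight to the front buffer of a single-buffered
   context (no SwapBuffers needed). Assumes a valid context is current. */
static void DrawFrameToFrontBuffer(void)
{
    glDrawBuffer(GL_FRONT);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    /* ... the renderer's usual drawing calls go here ... */

    glFlush();   /* push the commands out so they show up on screen */
}
[/code]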

HTH

Jean-Marc

The problem is that I should be able to get a double buffer without any trouble. The video card in this machine should have no problem at all handling it (Evans & Sutherland…)

Rich.

Actually I found one weird thing…

It seems I can’t change the double-buffer property on the window; it must be destroyed and then re-created with the new pixel format.
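
To be concrete, toggling double buffering ends up looking roughly like this (a simplified sketch; CreateGadgetWindow stands in for the library’s own window creation, and error checking is omitted):

[code]
#include <windows.h>

HWND CreateGadgetWindow(void);   /* hypothetical: the library's own window creation */

/* Simplified sketch: to change the double-buffer property I have to tear the
   window down and rebuild it, then set the new pixel format on the fresh HDC. */
static HGLRC RecreateWithDoubleBuffer(HWND *wnd, HGLRC oldRc, BOOL doubleBuffered)
{
    PIXELFORMATDESCRIPTOR pfd;
    HGLRC newRc;
    HDC   hdc;
    int   format;

    /* Tear down the old context and window. */
    wglMakeCurrent(NULL, NULL);
    wglDeleteContext(oldRc);
    DestroyWindow(*wnd);

    /* Re-create the window and describe the new format. */
    *wnd = CreateGadgetWindow();
    hdc  = GetDC(*wnd);

    ZeroMemory(&pfd, sizeof(pfd));
    pfd.nSize      = sizeof(pfd);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL
                   | (doubleBuffered ? PFD_DOUBLEBUFFER : 0);
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 24;
    pfd.cDepthBits = 24;

    /* SetPixelFormat only succeeds on this freshly created window. */
    format = ChoosePixelFormat(hdc, &pfd);
    SetPixelFormat(hdc, format, &pfd);

    newRc = wglCreateContext(hdc);
    wglMakeCurrent(hdc, newRc);
    return newRc;
}
[/code]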

What confuses me the most is that other parameters, even ones in the dwFlags field like the stereo buffer, do work.

I can add or remove a stereo buffer on my context on the fly without a hitch, but I can’t add or remove a double buffer without re-creating the context. I mean, it would be fine if it were a restriction on buffers, since allocating graphics memory that isn’t contiguous is not the best thing to do.

And isn’t a stereo buffer at least as big as a double buffer anyway? So why should adding or removing a double buffer be a problem when a stereo buffer isn’t?

Rich.