View Full Version : Flashing in Full Screen?

05-04-2004, 10:26 AM
My OpenGL app uses the scissor test to selectively clear certain regions of the screen, so on a given frame only part of the screen may be redrawn. This works fine in a window (Linux and Windows, NVIDIA cards), but as soon as I go full screen I get flashing. It looks like only one buffer is being updated, so every other frame you see an old version of the display. Any idea what this is, how to fix it, or why it only shows up when I'm running full screen?


05-04-2004, 01:15 PM
Maybe you have a single buffer in windowed mode and a double buffer in fullscreen?

05-04-2004, 04:41 PM
More specifically, after a SwapBuffers call, the status of the back buffer is undefined.

05-04-2004, 06:27 PM
Most cards select SWAP_COPY when you're in windowed mode, and may switch to flip mode when full-screen. If you depend on the contents of the back buffer after SwapBuffers(), you'll get undefined behavior.

You can try to force SWAP_COPY even in full-screen by explicitly setting that bit when you set your pixel format.

05-04-2004, 08:00 PM
I haven't explicitly changed any mode between full-screen and windowed with respect to buffer swapping. I'll give it a shot; I've tried all sorts of things to remedy this. The undefined behavior is the explanation I was looking for. Thanks! Any other suggestions on possible causes would be more than welcome. I know it must be tied to the windowing mode.

05-04-2004, 08:03 PM
One more thing: where can I find info on how to set this buffer swap mode? A quick check of my books and the net turned up nothing for "swap copy".

Thanks again!

05-04-2004, 09:31 PM
It's known as PFD_SWAP_COPY, and is a flag for the pixel format descriptor.
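For anyone else who hits this, requesting the flag looks something like the following Win32 sketch (the field values other than the flags are typical choices, not required ones):

```c
#include <windows.h>

/* Request a double-buffered pixel format whose SwapBuffers copies the
 * back buffer instead of flipping it, so its contents survive the swap. */
static BOOL set_swap_copy_format(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd = {0};
    pfd.nSize      = sizeof(pfd);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL
                   | PFD_DOUBLEBUFFER   | PFD_SWAP_COPY;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 32;
    pfd.cDepthBits = 24;

    int format = ChoosePixelFormat(hdc, &pfd);
    return format != 0 && SetPixelFormat(hdc, format, &pfd);
}
```

Keep in mind that PFD_SWAP_COPY is only a hint; the driver is free to ignore it, so you can call DescribePixelFormat on the chosen format to check whether the flag actually stuck.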