vista swapbuffers

Unlike Windows XP, it seems to me that Vista does not perform a Blt when SwapBuffers is called.
Needless to say, this is critical for my application, since I redraw only those regions of the back buffer that have changed, which obviously will not work with page flipping.
I have tried disabling desktop composition (i.e. the Desktop Window Manager) and running in XP compatibility mode, but neither helped.
Does anybody have a solution for this?
I forgot to mention that this happens on both a Radeon X800 and a GeForce 8600, so I guess it is not a driver issue.

As usual, thanks for any help you can provide.

Can you specify the problem more precisely?

Do you mean that the back buffer becomes undefined after the swap? That is actually the default behaviour, and you may simply have been lucky that it worked on XP.

Thanks for your reply.

Yes, that is exactly what I mean.
As for being lucky, I'd rather say that, as far as my experience goes, Blt (copy) is the default behaviour (even on Apple's Tiger OSes).
In the '90s, Blt ('block transfer' at that time, if memory serves) was an option in the TNT's control panel.
Is there any way I can enable this on all cards?
I know I could throw an FBO in, but I'd like to stick with traditional double buffering.

That might be true, but the specification isn't very clear on this point. Basically, the default behaviour is "undefined". If you look at http://msdn2.microsoft.com/en-us/library/ms537569.aspx , there are some flags that specify the swap behaviour, but they are only hints. I am afraid you will have to use a render-to-texture approach.
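To make the "hints" point concrete, here is a minimal sketch of how the classic PIXELFORMATDESCRIPTOR path lets you request swap-copy behaviour. The function name `setup_pixel_format` and the bit depths are illustrative choices, not from the thread; the key flag is PFD_SWAP_COPY, and as noted above the driver is free to ignore it.

```c
/* Sketch: hinting swap-copy behaviour via the classic
 * PIXELFORMATDESCRIPTOR path. PFD_SWAP_COPY is only a hint;
 * the driver may still use page flipping. */
#include <windows.h>

BOOL setup_pixel_format(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd = {0};
    pfd.nSize      = sizeof(pfd);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL
                   | PFD_DOUBLEBUFFER
                   | PFD_SWAP_COPY;     /* hint: SwapBuffers should blit */
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 24;
    pfd.cDepthBits = 24;

    int fmt = ChoosePixelFormat(hdc, &pfd);
    if (fmt == 0)
        return FALSE;
    return SetPixelFormat(hdc, fmt, &pfd);
}
```

After SetPixelFormat succeeds, you can call DescribePixelFormat on the chosen format and check whether PFD_SWAP_COPY is actually set in dwFlags, since the driver may have picked a format without it.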

Thanks again.

I'm currently having a look at the page you pointed me to.
On a different note, I haven't tried single buffering on Vista yet; do you believe it could be worth a try in my case?
I mean, since the screen is now composited, will tearing still be present?

unlike windows xp, it seems to me that vista does not perform a Blt when SwapBuffers is called

The OpenGL specification does not state what the contents of the back buffer are when you do a SwapBuffers. It is implementation-defined.

You have been relying on implementation-defined behavior in your application. You cannot force a GL implementation to use a particular swap method. If you need this behavior, you have to emulate it: render into an FBO, blit that to the screen's back buffer, and then call SwapBuffers.
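A rough sketch of that emulation, assuming EXT_framebuffer_object and EXT_framebuffer_blit are available (e.g. through GLEW); the function names `create_offscreen`/`present` and the renderbuffer setup are illustrative, not from the thread:

```c
/* Sketch: emulating swap-copy with a persistent FBO. All rendering
 * goes into the FBO; each frame the FBO is blitted to the (possibly
 * undefined) back buffer just before the swap. */
#include <windows.h>
#include <GL/glew.h>

static GLuint fbo, color_rb;

void create_offscreen(int width, int height)
{
    glGenRenderbuffersEXT(1, &color_rb);
    glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, color_rb);
    glRenderbufferStorageEXT(GL_RENDERBUFFER_EXT, GL_RGBA8, width, height);

    glGenFramebuffersEXT(1, &fbo);
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
    glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                                 GL_RENDERBUFFER_EXT, color_rb);
}

void present(HDC hdc, int width, int height)
{
    /* Blit the FBO into the window back buffer... */
    glBindFramebufferEXT(GL_READ_FRAMEBUFFER_EXT, fbo);
    glBindFramebufferEXT(GL_DRAW_FRAMEBUFFER_EXT, 0);
    glBlitFramebufferEXT(0, 0, width, height, 0, 0, width, height,
                         GL_COLOR_BUFFER_BIT, GL_NEAREST);
    /* ...then swap; the FBO contents survive whatever the swap does. */
    SwapBuffers(hdc);
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);  /* draw into FBO again */
}
```

This way partial redraws keep working: the FBO plays the role the back buffer used to play under swap-copy.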

is there any way i can enable this on all cards?

No.

There’s some more info in the Pipeline newsletter Volume 3:
http://www.opengl.org/pipeline/article/vol003_7/

It's just a matter of your pixel format descriptor (swap copy vs. swap exchange).

http://oss.sgi.com/projects/ogl-sample/registry/ARB/wgl_pixel_format.txt
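For reference, a sketch of how you would ask for swap-copy through the WGL_ARB_pixel_format path the spec above describes. The constants and the PFNWGLCHOOSEPIXELFORMATARBPROC type come from wglext.h; the function pointer has to be fetched with wglGetProcAddress on a dummy context first, and the helper name `choose_swap_copy_format` plus the bit depths are illustrative:

```c
/* Sketch: requesting WGL_SWAP_COPY_ARB via WGL_ARB_pixel_format. */
#include <windows.h>
#include <GL/wglext.h>

int choose_swap_copy_format(HDC hdc,
                            PFNWGLCHOOSEPIXELFORMATARBPROC wglChoosePixelFormatARB)
{
    const int attribs[] = {
        WGL_DRAW_TO_WINDOW_ARB, TRUE,
        WGL_SUPPORT_OPENGL_ARB, TRUE,
        WGL_DOUBLE_BUFFER_ARB,  TRUE,
        WGL_PIXEL_TYPE_ARB,     WGL_TYPE_RGBA_ARB,
        WGL_COLOR_BITS_ARB,     24,
        WGL_DEPTH_BITS_ARB,     24,
        WGL_SWAP_METHOD_ARB,    WGL_SWAP_COPY_ARB,  /* the interesting bit */
        0                                           /* terminator */
    };

    int  format = 0;
    UINT count  = 0;
    if (!wglChoosePixelFormatARB(hdc, attribs, NULL, 1, &format, &count)
        || count == 0)
        return 0;  /* no swap-copy format available */
    return format;
}
```

Note that even here the spec only guarantees the returned formats match the request; whether a swap-copy format exists at all is up to the driver.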

Or download this and go through the supported pixel formats.

http://majorgeeks.com/GLInfo_d3640.html

Thanks Mark, that worked perfectly.

Though there seems to be a problem with the current ATI Vista drivers: wglChoosePixelFormatARB / wglGetPixelFormatAttribivARB erroneously expects/returns values like 0 or 1 instead of WGL_SWAP_COPY_ARB / WGL_SWAP_EXCHANGE_ARB / WGL_SWAP_UNDEFINED_ARB. No similar problem on NVIDIA so far.
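One way to see exactly what a driver reports (presumably what GLview does internally) is to enumerate every pixel format and read back WGL_SWAP_METHOD_ARB directly. This is a sketch; `dump_swap_methods` is an illustrative name, and the function pointer again comes from wglGetProcAddress:

```c
/* Sketch: dumping WGL_SWAP_METHOD_ARB for every pixel format,
 * handy for spotting drivers that return bogus values like 0 or 1. */
#include <stdio.h>
#include <windows.h>
#include <GL/wglext.h>

void dump_swap_methods(HDC hdc,
                       PFNWGLGETPIXELFORMATATTRIBIVARBPROC wglGetPixelFormatAttribivARB)
{
    int attrib = WGL_NUMBER_PIXEL_FORMATS_ARB;
    int total  = 0;
    wglGetPixelFormatAttribivARB(hdc, 0, 0, 1, &attrib, &total);

    for (int i = 1; i <= total; ++i) {
        int query  = WGL_SWAP_METHOD_ARB;
        int method = 0;
        if (!wglGetPixelFormatAttribivARB(hdc, i, 0, 1, &query, &method))
            continue;
        const char *name =
            method == WGL_SWAP_EXCHANGE_ARB  ? "exchange"  :
            method == WGL_SWAP_COPY_ARB      ? "copy"      :
            method == WGL_SWAP_UNDEFINED_ARB ? "undefined" : "bogus?";
        printf("format %3d: swap method 0x%04x (%s)\n", i, method, name);
    }
}
```

Anything that prints "bogus?" is a value outside the three the extension defines, which would confirm the driver issue described above.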

The weird part is that the small GLview utility identifies swap methods correctly even on ATI + Vista, so I was wondering whether its sources are available.

Thanks all for your feedback.