
View Full Version : Changing video modes



marcus256
04-07-2002, 07:02 AM
I haven't found any good description of what you are "allowed" to do regarding video mode changes under Windows without destroying the GL context and creating a new one.

As far as I understand, it is not a good idea to change the BPP without creating a new GL context.

In windowed mode, it seems to be OK to resize a window without creating a new GL context, which leads me to believe that it would also be OK to change the resolution of a fullscreen window (by switching to a video mode with the same BPP as the current mode) without destroying the GL context.
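For concreteness, a minimal Win32 sketch of the kind of switch being described here, assuming the window and HGLRC already exist and are left untouched (error handling and mode enumeration omitted):

```c
#include <windows.h>

/* Switch the display to width x height at the CURRENT bit depth, keeping
 * the existing window and GL context alive. Returns nonzero on success.
 * Sketch only, not a robust implementation. */
static int SwitchFullscreenMode(int width, int height)
{
    DEVMODE dm;
    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize       = sizeof(dm);
    dm.dmPelsWidth  = width;
    dm.dmPelsHeight = height;
    /* Deliberately NOT setting dmBitsPerPel / DM_BITSPERPEL: changing the
     * BPP is exactly what risks invalidating the pixel format/context. */
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT;

    return ChangeDisplaySettings(&dm, CDS_FULLSCREEN) == DISP_CHANGE_SUCCESSFUL;
}
```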

What is safe on all possible video cards? (I assume some things will work on some cards but not on others.)

davepermen
04-07-2002, 07:08 AM
On my GF2MX I can switch to and from fullscreen without problems, but I always run at 32 bits per pixel, so I don't know about BPP changes..

Ozzy
04-07-2002, 10:15 PM
Hello Marcus,

Well, you know, I have been doing video mode changes without destroying/creating the GL context for a long time now, and on every (non-professional) board with acceptable drivers I didn't run into any trouble.
Mind you, I was also *not* using a stencil buffer, which simplifies many things!! ;)
As an example, changing the BPP from 32 to 16 with a stencil buffer will cause terrible problems on GeForce boards (it switches to software emulation).
So, if this is GLFW related (I'm pleased you're still thinking about video mode changes without destroying the context ;) ), well, I still don't know what is 100% safe to do! :( (other than destroying/creating the GL context)
Moreover, as you said, there could be compatibility problems with other platforms (Unix-based and/or Mac, and so on...).
In conclusion: supposing a GLFW user only needs to do restricted things with video mode changes (keeping the same BPP, or not using stencil on Win32, for instance), one possibility would be for him to patch the GLFW sources, that's all.. ;)
By the way, it would be interesting to know how other systems behave when switching to a different BPP during a video mode change (without stencil and other stuff).. :)

marcus256
04-10-2002, 10:12 PM
Hello Ozzy,

Yes, this is for GLFW, of course. I have (almost) working code (many cleanups remain to be done), but I want to know exactly what effects it will have on different cards. What I am really looking for is some kind of specification or statement on this issue, coming from either some M$ spec or some driver manufacturer. GLFW needs to be robust, and I need to know what to write in the docs, so that GLFW users know what the functionality does and how well it works on various cards.

Michael Steinberg
04-11-2002, 01:34 AM
It's always good to stay on the safe side of things. For my part, I recreate the context every time the window or desktop settings change. Somehow, switching resolution with the same DC almost never worked for me, but maybe I did something wrong.

jwatte
04-11-2002, 10:36 AM
To avoid having to re-allocate textures and stuff, you probably want to create a second context, wglShareLists() into that, and then destroy the first (old) context. Some people create one context up-front which they always keep hanging around but never activate, just to keep the lists always available.
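A rough sketch of the trick jwatte describes, assuming `hOldRC` is the context that currently owns the textures/display lists and `hDC` is the device context to render to (function name and error handling are illustrative):

```c
#include <windows.h>

/* Replace an old rendering context with a fresh one that shares its
 * display lists and texture objects, then delete the old context.
 * Note: wglShareLists() must be called while the NEW context is still
 * empty (no lists or textures created in it yet), or it will fail. */
HGLRC ReplaceContext(HDC hDC, HGLRC hOldRC)
{
    HGLRC hNewRC = wglCreateContext(hDC);
    if (hNewRC == NULL)
        return NULL;

    /* Share hOldRC's lists/textures into hNewRC. */
    if (!wglShareLists(hOldRC, hNewRC)) {
        wglDeleteContext(hNewRC);
        return NULL;
    }

    wglMakeCurrent(hDC, hNewRC);
    wglDeleteContext(hOldRC);   /* shared objects live on in hNewRC */
    return hNewRC;
}
```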

V-man
04-11-2002, 04:46 PM
Originally posted by Michael Steinberg:
It's always good to stay on the safe side of things. For my part, I recreate the context every time the window or desktop settings change. Somehow, switching resolution with the same DC almost never worked for me, but maybe I did something wrong.


You recreate the window every time the desktop changes? This has been on my mind for a long time, but I never bothered with it. If the bit depth or resolution changes, does it corrupt your window or something? Isn't the driver supposed to handle it?

V-man

zeckensack
04-11-2002, 05:19 PM
Just a little status report about What Works For Me (TM):
1) I can safely resize windows (duh)
2) I can switch between borderless and 'normal' window style (via SetWindowLong)
3) If I combine both of the above with resolution switches (standard GDI calls again), I can switch between a desktop-friendly windowed mode and fullscreen.
This is all without ever destroying and recreating contexts/windows.
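The windowed/fullscreen toggle in #2 and #3 can be sketched with standard Win32 calls like this; `hWnd` is assumed to be the existing OpenGL window (which keeps its DC and context throughout), and the windowed position/size are arbitrary placeholders:

```c
#include <windows.h>

/* Toggle between a borderless fullscreen-style window and a normal
 * desktop window without touching the DC or the GL context.
 * SWP_FRAMECHANGED forces the style change to take effect. */
void SetFullscreenStyle(HWND hWnd, int fullscreen, int width, int height)
{
    if (fullscreen) {
        SetWindowLong(hWnd, GWL_STYLE, WS_POPUP | WS_VISIBLE);
        SetWindowPos(hWnd, HWND_TOP, 0, 0, width, height,
                     SWP_FRAMECHANGED | SWP_SHOWWINDOW);
    } else {
        SetWindowLong(hWnd, GWL_STYLE, WS_OVERLAPPEDWINDOW | WS_VISIBLE);
        SetWindowPos(hWnd, HWND_NOTOPMOST, 100, 100, width, height,
                     SWP_FRAMECHANGED | SWP_SHOWWINDOW);
    }
}
```

For the full fullscreen path, a ChangeDisplaySettings call for the resolution switch would go alongside this.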

#3 works on Radeon and GeForce cards. On an old Savage2000 (yeah, quite unsupported nowadays; I haven't tested the new driver that was released just yesterday, after some one and a half years without updates ;) ) I get a black screen, but no crash, when switching from windowed to fullscreen. If I switch back to windowed, everything shows up again just fine.
I remember having a workaround for this at some point, but I no longer know exactly what was required. I haven't been 'working' on the S2K for a good amount of time.

Michael Steinberg
04-11-2002, 08:37 PM
Originally posted by V-man:

You recreate the window every time the desktop changes? This has been on my mind for a long time, but I never bothered with it. If the bit depth or resolution changes, does it corrupt your window or something? Isn't the driver supposed to handle it?

V-man

It's not that I ever tested it. I just did it because I don't really like testing things out and using special code paths for different platforms and 3D accelerators/drivers. And it works great, btw. :) At least for me it is the safe side; I know it will work.

Michael Steinberg
04-11-2002, 08:39 PM
Originally posted by jwatte:
To avoid having to re-allocate textures and stuff, you probably want to create a second context, wglShareLists() into that, and then destroy the first (old) context. Some people create one context up-front which they always keep hanging around but never activate, just to keep the lists always available.

As my engine (was, and is again as I'm recoding it now) transparent to reloading resources, I didn't have to work with these "tricks". It might save some time to do it, but for me it's not worth it.

marcus256
04-14-2002, 09:57 PM
So, the impression I get is that it "should" work (to switch the fullscreen resolution without changing contexts), but it's really up to the driver? If the BPP changes, I suppose the driver could be forced into software rendering (e.g. if one BPP mode supports something in hardware that the other BPP mode does not).
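One way to detect whether a mode change silently dropped you to Microsoft's software OpenGL implementation is to query GL_RENDERER after the switch; the generic software implementation reports itself as "GDI Generic". A small sketch (the string check is plain C; the only assumption is that `glGetString` is called with a current context):

```c
#include <string.h>

/* Returns 1 if the renderer string identifies Microsoft's generic
 * software OpenGL implementation ("GDI Generic"), 0 otherwise. */
int IsSoftwareRenderer(const char *renderer)
{
    return renderer != NULL && strstr(renderer, "GDI Generic") != NULL;
}

/* Usage, with a current GL context:
 *   if (IsSoftwareRenderer((const char *) glGetString(GL_RENDERER)))
 *       ... warn the user or fall back ...
 */
```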

What I'm really interested in is: is this supported by "wgl" (not really OpenGL), so that it's up to the driver to support it, or is it the case that some manufacturers (NVIDIA, ATI) support more than wgl requires?