Win32 ChangeDisplaySettings question

Hi Folks,
is it safe to use ChangeDisplaySettings() to switch between windowed and fullscreen mode?
Or may it happen that I lose some display lists/textures?
Please let me know and thanks in advance…

Dunno what that has to do with advanced OpenGL, but yes, it is safe (the only way, even), and yes, it will require that you reinitialize your OpenGL system…

I think I’ll have to correct Mihail121, as his info isn’t right.
As long as you don’t destroy your render context, textures, display lists, VBOs, vertex/fragment programs (and most other GL things) will stay valid. And as ChangeDisplaySettings won’t destroy your render context, you also won’t have to reinitialize your OpenGL subsystem. The only thing necessary would be to adjust your viewport size.
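A minimal sketch of what that looks like (my own illustration, not code from this thread; window and context creation are assumed to have happened already, and error handling is omitted):

```c
#include <windows.h>
#include <GL/gl.h>

/* Switch to a fullscreen mode; the GL context is left untouched. */
void GoFullscreen(int width, int height, int bpp)
{
    DEVMODE dm;
    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize       = sizeof(dm);
    dm.dmPelsWidth  = width;
    dm.dmPelsHeight = height;
    dm.dmBitsPerPel = bpp;   /* keep this equal to the desktop depth (see below) */
    dm.dmFields     = DM_PELSWIDTH | DM_PELSHEIGHT | DM_BITSPERPEL;

    if (ChangeDisplaySettings(&dm, CDS_FULLSCREEN) == DISP_CHANGE_SUCCESSFUL)
        glViewport(0, 0, width, height);   /* the only GL state that needs updating */
}

/* Switch back to the desktop mode stored in the registry. */
void GoWindowed(int width, int height)
{
    ChangeDisplaySettings(NULL, 0);
    glViewport(0, 0, width, height);
}
```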

I do this all the time and have had very few troubles (on both NVIDIA and ATI). However, you may lose p-buffers (I think if you check the spec you have to account for this; see the sketch below).
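For the p-buffer case, the WGL_ARB_pbuffer extension exposes exactly this: you can query WGL_PBUFFER_LOST_ARB after a mode change to find out whether the contents were destroyed. A hedged sketch (the function pointer is assumed to have been fetched via wglGetProcAddress elsewhere):

```c
#include <windows.h>
#include <GL/gl.h>
#include <GL/wglext.h>

extern PFNWGLQUERYPBUFFERARBPROC wglQueryPbufferARB; /* loaded elsewhere */

/* Returns TRUE if the pbuffer’s contents survived the last mode switch. */
BOOL PbufferSurvived(HPBUFFERARB pbuffer)
{
    int lost = 0;
    wglQueryPbufferARB(pbuffer, WGL_PBUFFER_LOST_ARB, &lost);
    return lost == 0;   /* nonzero means the contents were destroyed */
}
```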

Originally posted by XFire:
Hi Folks,
is it safe to use ChangeDisplaySettings() to switch between windowed and fullscreen mode?

You can check this thread for information on why you should shut your gl context and gl window down before mode changing.

As mentioned in the thread you linked:

Do buffer objects survive screen resolution changes, etc.?

RESOLVED: YES. This is not mentioned in the spec, so by default they behave just like other OpenGL state, like texture objects – the data is unmodified by external events like modeswitches, switching the system into standby or hibernate mode, etc.

If the spec explicitly says VBOs, like any other GL state, should survive mode switches, I’m willing to rely on this, and if it doesn’t work in practice, I will send a bug report.
Implementors are responsible for producing implementations that conform to the spec (which they contributed to, btw. It’s not like the spec fell from the sky. They designed it. And please don’t tell me OpenGL is old. VBO is no legacy feature.).

don’t forget that if you change colour depth you’ll need to get another pixel format, which means releasing the GL context (and losing all resident textures, etc.)…

True, switching fullscreen off/on can change the pixel format without you noticing it. Which is baaaaaaad, and requires complete OpenGL reinitialization (hell, even entry points might not be valid anymore).
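One way to catch this (a sketch under the assumption that you saved the PIXELFORMATDESCRIPTOR you originally requested) is to re-read the DC’s format after the switch and compare the bit depths:

```c
#include <windows.h>

BOOL PixelFormatStillMatches(HDC hdc, const PIXELFORMATDESCRIPTOR *wanted)
{
    PIXELFORMATDESCRIPTOR now;
    int index = GetPixelFormat(hdc);
    if (index == 0)
        return FALSE;                    /* no format selected at all */
    DescribePixelFormat(hdc, index, sizeof(now), &now);
    return now.cColorBits   == wanted->cColorBits &&
           now.cDepthBits   == wanted->cDepthBits &&
           now.cStencilBits == wanted->cStencilBits;
}
```

If this returns FALSE after a mode change, the only sane option is the full teardown described below.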

Originally posted by kehziah:
If the spec explicitly says VBOs, like any other GL state, should survive mode switches, I’m willing to rely on this, and if it doesn’t work in practice, I will send a bug report.

Read the rest of the thread. From my point of view that part of the spec is an overstatement:

  1. OpenGL has nothing to do with mode changes; behaviour in that case should be specified by the platform-dependent glue API, not by the OpenGL spec.
  2. Your pixelformat may no longer be valid or have the same index in the new resolution.
  3. Due to win32 OS issues, there are implementations which preclude OpenGL resources surviving resolution changes of any type.

I think that it is a bad habit to rely on GL context survival after a video mode change. Consider:

  • Onboard memory may not suffice for your previously allocated resources (textures, buffers, display lists, etc.).
  • Similarly, the onboard memory may become fragmented - how should the driver rearrange memory blocks?
  • The available pixel formats in one video mode may not be the same as those in another video mode (e.g. you may lose a buffer that you previously had, such as stencil or alpha); see the sketch after this list.
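You can see the third point for yourself by enumerating the formats the current mode offers; a small sketch (illustration only; DescribePixelFormat with a NULL descriptor returns the number of formats):

```c
#include <windows.h>
#include <stdio.h>

void ListPixelFormats(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd;
    int count = DescribePixelFormat(hdc, 1, sizeof(pfd), NULL);
    for (int i = 1; i <= count; ++i) {
        DescribePixelFormat(hdc, i, sizeof(pfd), &pfd);
        if (!(pfd.dwFlags & PFD_SUPPORT_OPENGL))
            continue;
        printf("format %d: color %d, alpha %d, depth %d, stencil %d\n",
               i, pfd.cColorBits, pfd.cAlphaBits, pfd.cDepthBits, pfd.cStencilBits);
    }
}
```

Run it before and after a mode change and the lists will not necessarily match.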

…and a number of other technical issues that the driver may be facing, I’m sure.

Even if “most modern, decently behaving cards” can handle it, it’s hardly enough to think of it as a hard fact or a spec.

In other words: be nice, reinitialize your context! At the very least you should design your engine to be able to do both. With portability issues, different OSes, OS versions, cards, drivers etc in mind, the only safe way is to kill/recreate the context.
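For reference, the kill/recreate path looks roughly like this (a sketch, not anyone’s shipping code; CreateGLWindow and ReloadResources are hypothetical engine hooks, and error checking is omitted). Note that Win32 only lets you call SetPixelFormat once per window, which is why the window goes down with the context:

```c
#include <windows.h>

extern HWND CreateGLWindow(void);   /* hypothetical: recreates the app window */

void ReinitializeGL(HWND *hwnd, HDC *hdc, HGLRC *hrc,
                    const PIXELFORMATDESCRIPTOR *pfd)
{
    /* 1. Tear down context, DC and window. */
    wglMakeCurrent(NULL, NULL);
    wglDeleteContext(*hrc);
    ReleaseDC(*hwnd, *hdc);
    DestroyWindow(*hwnd);

    /* 2. ...perform the mode switch (ChangeDisplaySettings) here... */

    /* 3. Rebuild everything against the new mode. */
    *hwnd = CreateGLWindow();
    *hdc  = GetDC(*hwnd);
    SetPixelFormat(*hdc, ChoosePixelFormat(*hdc, pfd), pfd);
    *hrc  = wglCreateContext(*hdc);
    wglMakeCurrent(*hdc, *hrc);

    /* 4. All GL objects died with the old context: re-upload textures,
       display lists, VBOs, programs. */
    /* ReloadResources(); */            /* hypothetical engine hook */
}
```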

From my point of view: color depth should stay the same during the app’s lifetime. Corollary: an app should not change color depth dynamically (otherwise it is responsible for all the trouble of making sure that the pixel format is still valid, etc.).

Screen res change should be ok though. Yes, it can lead to onboard memory fragmentation; yes, some resident resources can be moved to system memory. That’s no big deal. The driver should be up to the task.

I agree that WGL is by far under-developed (there is no WGL ‘spec’ AFAIK). But we all know where that comes from.

Regarding the Win32 OS issues, I can’t imagine major IHVs approving statements like the one cited if they knew there was an irremediable obstacle on their major target platform that wouldn’t allow them to implement it. There must be a workaround; I think they thought of it beforehand.

Sure, “be nice and re-init everything” will always work. But if the spec says something, I should be allowed to rely on it. Especially regarding an “Issue” that is “resolved”. Otherwise, modify the spec. That’s no big deal either. But maintaining some coherency between theory (spec) and real world (shipping implementation) is not a bad thing IMO.

Originally posted by kehziah:
Screen res change should be ok though. Yes, it can lead to onboard memory fragmentation; yes, some resident resources can be moved to system memory. That’s no big deal. The driver should be up to the task.

Sure? Do you think the driver can move the back/zbuffer to system memory if there isn’t enough onboard memory for it?
And this is not an impossible situation: imagine how much memory you need for 4x FSAA, and even more if you are using stereo.
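A quick back-of-the-envelope calculation (my own numbers, assuming 32-bit color, one shared 24/8 depth-stencil buffer, and four color buffers for front/back times left/right) shows why:

```c
#include <stdio.h>

int main(void)
{
    long long px    = 1600LL * 1200 * 4;      /* pixels times 4 samples */
    long long color = px * 4 * 4;             /* 4 bytes each, 4 color buffers */
    long long depth = px * 4;                 /* one shared depth-stencil */
    printf("%.0f MB\n", (color + depth) / (1024.0 * 1024.0));  /* ~146 MB */
    return 0;
}
```

Roughly 146 MB before a single texture is uploaded; there is simply nowhere for the driver to put that on an older card.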

Originally posted by kehziah:
But if the spec says something, I should be allowed to rely on it

That’s why the spec shouldn’t say anything, and strictly speaking it doesn’t, as the issues section is not a formal part of the spec (Do buffer objects survive screen resolution changes, etc.? RESOLVED: YES. This is not mentioned in the spec).

I don’t think the spec writers thought deeply enough about the implications of that paragraph (which not only affects VBO, but OpenGL implementations on ALL OSs). And that’s something I already complained about in this forum when VBO spec was first issued.


So guys, you want to tell me that you are going to change resolutions, switch to fullscreen mode and vice versa, without reinitializing the system?! No, that’s not very safe… it’s ULTRA unsafe…

Hi Folks,
thanks for the great support!
In that case I’ll reinitialize the whole app.
Thanks again!

Originally posted by evanGLizr:
Sure? Do you think the driver can move the back/zbuffer to system memory if there isn’t enough onboard memory for it?
And this is not an impossible situation: imagine how much memory you need for 4x FSAA, and even more if you are using stereo.

True. But I guess you won’t try 1600*1200 4xFSAA stereo on a 16-MB TNT, will you?

Originally posted by evanGLizr:
I don’t think the spec writers thought deeply enough about the implications of that paragraph (which not only affects VBO, but OpenGL implementations on ALL OSs). And that’s something I already complained about in this forum when VBO spec was first issued.

In this thread, mcraighead states that the context survives mode switches (including color depth) with their driver under all Win32 OSes, and cass states that the only trouble is mapped buffers. If they can do it, others can as well.

Originally posted by kehziah:
True. But I guess you won’t try 1600*1200 4xFSAA stereo on a 16-MB TNT, will you?

This is exactly my point: I don’t think that assuming that the user won’t make “insane” selections is a good policy. That is begging for instability and p*ssed-off users. Most non-technical users will surely think “this app is dead slow”, rather than “oh, perhaps the driver was forced into software rendering since my card does not support an on-board stencil buffer at the resolution I have selected”.

Originally posted by kehziah:
In this thread, mcraighead states that the context survives mode switches (including color depth) with their driver under all Win32 OSes, and cass states that the only trouble is mapped buffers. If they can do it, others can as well.

Well, the context survives, but that does not really say what happens to it. Which buffers are kept? At what precision? Which textures are flushed? Etc. etc. Not only will you get better stability (potentially) by recreating the context, you also get better control over your resources and potentially better performance.

I agree that it’s a good thing that a context survives a mode switch (regardless of performance and resource issues) for windowed applications. I also think that it is reasonable to assume that the following sequence (or similar) should not affect the context for a fullscreen window (see the sketch after the list):

  1. Iconify window
  2. Mode switch (e.g. to desktop mode)
  3. Mode switch to fullscreen mode
  4. Restore window
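For completeness, a sketch of how such a sequence is often wired up (my illustration only; g_fullscreenMode is a hypothetical global holding the DEVMODE the app switched to at startup):

```c
#include <windows.h>

extern DEVMODE g_fullscreenMode;   /* assumed: the mode chosen at startup */

/* Called from the window procedure on WM_ACTIVATE. */
LRESULT HandleActivate(HWND hwnd, WPARAM wParam)
{
    if (LOWORD(wParam) == WA_INACTIVE) {
        ChangeDisplaySettings(NULL, 0);   /* back to the desktop mode */
        ShowWindow(hwnd, SW_MINIMIZE);    /* iconify */
    } else {
        ChangeDisplaySettings(&g_fullscreenMode, CDS_FULLSCREEN);
        ShowWindow(hwnd, SW_RESTORE);     /* restore the window */
    }
    return 0;
}
```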

But for other kinds of fullscreen mode switches (i.e. user-selected resolutions) I don’t really see a need for not recreating the context. How often does the user change the video mode? Once or twice after installing the application, and perhaps after installing a new video card… Hardly enough to justify cutting corners by not reinitializing the context.