Pixel Format vs. Display Format

It is clear (and unfortunate) that the pixel formats you can enumerate depend on the device context you query with. This means that the only way to enumerate the backbuffer pixel formats available for a given display (frontbuffer) format is to actually set the display to that mode and then query.
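
For reference, the switch-then-query approach boils down to something like the following on Win32. This is only a rough sketch (not from any particular tool), with minimal error handling:

#include <windows.h>
#include <stdio.h>

/* Sketch: switch the display to the target mode, then enumerate the
   pixel formats exposed for a DC in that mode.  Assumes a valid HWND
   (hWnd) already exists. */
void EnumerateFormatsForMode(HWND hWnd, DWORD width, DWORD height, DWORD bpp)
{
    DEVMODE dm;
    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize       = sizeof(dm);
    dm.dmPelsWidth  = width;
    dm.dmPelsHeight = height;
    dm.dmBitsPerPel = bpp;
    dm.dmFields     = DM_PELSWIDTH | DM_PELSHEIGHT | DM_BITSPERPEL;

    if (ChangeDisplaySettings(&dm, CDS_FULLSCREEN) != DISP_CHANGE_SUCCESSFUL)
        return;

    HDC hDC = GetDC(hWnd);
    PIXELFORMATDESCRIPTOR pfd;

    /* DescribePixelFormat returns the highest pixel format index for this DC. */
    int count = DescribePixelFormat(hDC, 1, sizeof(pfd), &pfd);
    for (int i = 1; i <= count; ++i)
    {
        DescribePixelFormat(hDC, i, sizeof(pfd), &pfd);
        printf("format %d: color %d, depth %d, flags 0x%08lx\n",
               i, pfd.cColorBits, pfd.cDepthBits, pfd.dwFlags);
    }

    ReleaseDC(hWnd, hDC);
    ChangeDisplaySettings(NULL, 0);   /* restore the registry-default mode */
}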

This raised a big question for me: what would happen if I created a window with a pixel format only available in one display format and then switched to another display format? Using nVidia's pixel format tool I created a window with a pixel format for a 32-bit display mode, then switched to 16-bit mode. To my surprise, it appeared to work just fine.

So now I am left to wonder whether this is defined behavior, or whether it works only because nVidia decided to make it work. Also, is the backbuffer pixel format still 32-bit, or was the pixel format actually changed when I switched modes (and would a call to GetPixelFormat still be accurate)?
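
From the application side, about all you can inspect is what the driver reports through GetPixelFormat/DescribePixelFormat before and after the switch. A rough sketch (this only shows what the ICD claims, not what the backbuffer physically is; SwitchTo16Bit is a hypothetical helper wrapping ChangeDisplaySettings):

#include <windows.h>
#include <stdio.h>

/* Sketch: print what the driver currently reports for a window's pixel
   format.  Call it before and after the mode switch and compare. */
void ReportCurrentFormat(HDC hDC, const char *label)
{
    PIXELFORMATDESCRIPTOR pfd;
    int fmt = GetPixelFormat(hDC);   /* the index stored by SetPixelFormat */
    DescribePixelFormat(hDC, fmt, sizeof(pfd), &pfd);
    printf("%s: format index %d, cColorBits %d\n", label, fmt, pfd.cColorBits);
}

/* usage:
     ReportCurrentFormat(hDC, "before switch");
     SwitchTo16Bit();                 // hypothetical helper, not shown
     ReportCurrentFormat(hDC, "after switch");
*/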

It seemed to work with both the EXCHANGE and COPY swap methods, so I am betting that the pixel format was silently modified when I switched. Of course, exchange and copy are just hints, so that might not mean anything.
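
For what it's worth, PFD_SWAP_EXCHANGE and PFD_SWAP_COPY are only hint bits in the PIXELFORMATDESCRIPTOR; ChoosePixelFormat may ignore the requested bit, but DescribePixelFormat does report which hint (if either) the driver attaches to the format it picked. A rough sketch:

#include <windows.h>
#include <stdio.h>

/* Sketch: request a swap-exchange format, then check which swap-method
   hint bits the driver actually reports for the format it chose. */
void CheckSwapHints(HDC hDC)
{
    PIXELFORMATDESCRIPTOR pfd;
    ZeroMemory(&pfd, sizeof(pfd));
    pfd.nSize      = sizeof(pfd);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL |
                     PFD_DOUBLEBUFFER | PFD_SWAP_EXCHANGE;   /* request is a hint only */
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 32;
    pfd.cDepthBits = 24;

    int fmt = ChoosePixelFormat(hDC, &pfd);
    DescribePixelFormat(hDC, fmt, sizeof(pfd), &pfd);

    if (pfd.dwFlags & PFD_SWAP_EXCHANGE) printf("driver reports: exchange hint\n");
    if (pfd.dwFlags & PFD_SWAP_COPY)     printf("driver reports: copy hint\n");
}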

Do not rely on behavior like this. It is not safe as per the documentation. I can’t say whether the behavior is planned or lucky.

-Evan

These are my experiences with it:

It only behaves correctly sometimes (and not with every chipset).
I did this some time ago, and at first I thought I didn't need to do anything special to change the current resolution/bpp. But this turned out not to be safe; I got strange effects
when I switched more than once in an application.
For example, when I switched from 16-bit to 32-bit and then back to 16-bit, dithering didn't work anymore (even if I called glEnable for it).
In some cases, switching from 16 to 32 bits had no noticeable effect…

The only stable solution I found was to create a new window every time I change the color depth (bpp). (I simply created a second child window over my main application window.) This works fine in my app.
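
A minimal sketch of that recreate-the-window approach (the "GLChild" class name, sizes, and parent HWND are placeholders, and error checking is omitted):

#include <windows.h>

/* Sketch: tear down the old GL child window + context and build fresh ones
   after the display mode (bpp) has changed.  "GLChild" is assumed to be a
   window class registered elsewhere. */
HWND RecreateGLWindow(HWND hParent, HWND hOldWnd, HGLRC *phRC, int width, int height)
{
    /* destroy the old context and child window */
    wglMakeCurrent(NULL, NULL);
    if (*phRC)   wglDeleteContext(*phRC);
    if (hOldWnd) DestroyWindow(hOldWnd);

    /* create a fresh child window over the main application window */
    HWND hWnd = CreateWindow(TEXT("GLChild"), TEXT(""), WS_CHILD | WS_VISIBLE,
                             0, 0, width, height, hParent, NULL,
                             GetModuleHandle(NULL), NULL);

    /* pick a pixel format valid for the *current* display mode */
    PIXELFORMATDESCRIPTOR pfd;
    ZeroMemory(&pfd, sizeof(pfd));
    pfd.nSize      = sizeof(pfd);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 32;   /* or 16, to match the new mode */
    pfd.cDepthBits = 24;

    HDC hDC = GetDC(hWnd);
    int fmt = ChoosePixelFormat(hDC, &pfd);
    SetPixelFormat(hDC, fmt, &pfd);   /* legal: this DC never had a format set */

    *phRC = wglCreateContext(hDC);
    wglMakeCurrent(hDC, *phRC);
    return hWnd;
}

Keep in mind the new context starts out empty, so textures and display lists have to be reloaded (or shared via wglShareLists) after the switch.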