m_iDepthBits is set to 24
m_iStencilBits is set to 8
If I set m_iColorDepth to 32, it works fine.
If I set it to 16, I get the Microsoft software renderer and my program crashes (though that could be something else).
Do GeForces (with 45.25 drivers) only support 32-bit framebuffers?
If I remember correctly, I once used a 16-bit buffer, but that only worked without an alpha channel. Since I request an alpha channel, it does not work.
You won't get 16-bit colour if you need destination alpha and/or stencil, IIRC. With 24-bit colour and 24-bit z you get nice 32-bit alignment once the 8-bit alpha and 8-bit stencil are added, which the memory controller probably likes.
Of course I want good quality, but only if the PC is fast enough. Isn't that exactly why people use a 16-bit color buffer, because their PC is not fast enough? I just want another option to trade quality for speed, and in a fillrate-limited situation the color depth can make a big difference.
It says that my GeForce 4 supports only two 16-bit color formats with alpha. However, those two formats use 16 bits for RGB plus an additional 8 bits for alpha. On top of that, they can only render to bitmaps (not to a window) and are handled purely in software, so there is no hardware acceleration.
So it seems there is no hardware-accelerated 16-bit color mode with alpha.