Color depth woes...

Hi,

I have the binaries (but as yet no source) for a particular OpenGL application, and I need to know what color depth the application is trying to use. It does NOT have a true full-screen capability that I know of, so I can’t use the monitor to tell me what mode it is in. The program runs much faster and looks better when I am in 32-bit mode on my TNT2 than in 16-bit, but some of the eventual users of this program will be using cards that only support 24-bit color (no 8-bit alpha channel).

So, is there an easy way to snoop out what sort of pixel format descriptor the application is using? Or, failing that, is there a way to force 24-bit color (with no alpha channel) on my TNT2, which usually only gives the option of 8/16/32 bpp?

Thanks!

Originally posted by Jared@ETC:

So, is there an easy way to snoop out what sort of pixel format descriptor the application is using,

Use a debugger that can set breakpoints on the system functions that take a PFD as input, such as ChoosePixelFormat and SetPixelFormat (and their wgl* equivalents). Don’t forget about the WGL_ARB_pixel_format extension; applications that use it pass attribute lists to wglChoosePixelFormatARB instead of a PFD.
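As a complement to the debugger approach, you can enumerate the pixel formats your own driver exposes and see whether a 24-bit, zero-alpha format exists at all. A minimal sketch, assuming a plain Win32 build linked against gdi32 (these are standard GDI calls, nothing specific to the application in question):

```c
/* Sketch: list the pixel formats the current display driver offers,
   printing color/alpha/depth bits, so you can check for a 24-bit,
   no-alpha format. Standard Win32 GDI calls only. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    HDC hdc = GetDC(NULL);  /* the screen DC is enough for enumeration */
    PIXELFORMATDESCRIPTOR pfd;

    /* Passing NULL for the descriptor returns the number of formats. */
    int count = DescribePixelFormat(hdc, 1, sizeof(pfd), NULL);

    for (int i = 1; i <= count; ++i) {
        if (DescribePixelFormat(hdc, i, sizeof(pfd), &pfd)) {
            printf("format %3d: color %2d bits, alpha %d bits, depth %2d, %s\n",
                   i, pfd.cColorBits, pfd.cAlphaBits, pfd.cDepthBits,
                   (pfd.dwFlags & PFD_SUPPORT_OPENGL) ? "OpenGL" : "GDI only");
        }
    }
    ReleaseDC(NULL, hdc);
    return 0;
}
```

Once you know which format index the application ends up with (from the breakpoint), you can feed that index to DescribePixelFormat the same way to see exactly what it got.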

[This message has been edited by roffe (edited 07-09-2003).]