Why does my OpenGL program use the Microsoft GDI renderer instead of my GeForce FX 5200?



ioquan
06-27-2004, 11:53 AM
I just installed Windows XP and a GeForce FX 5200. My OpenGL engine used to render with my old video card (GeForce4 MX), but with the new card and OS it is falling back to the Microsoft GDI renderer. What can cause this? Is there anything I should change in my code?
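
Here's roughly how I'm detecting it, in case it matters (a minimal check in C, run right after the context is made current):

    #include <stdio.h>
    #include <string.h>
    #include <GL/gl.h>

    /* Print the active renderer; Microsoft's software fallback
       reports "Microsoft Corporation" / "GDI Generic". */
    void report_renderer(void)
    {
        const char *vendor   = (const char *)glGetString(GL_VENDOR);
        const char *renderer = (const char *)glGetString(GL_RENDERER);
        printf("GL_VENDOR:   %s\nGL_RENDERER: %s\n", vendor, renderer);
        if (renderer && strstr(renderer, "GDI Generic"))
            printf("Warning: unaccelerated GDI renderer in use!\n");
    }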

tfpsly
06-27-2004, 12:09 PM
You should upgrade your drivers: the default drivers that ship with XP are poor and don't handle OpenGL well.

rd2
06-27-2004, 12:50 PM
The color depth of your desktop can cause this; I've seen it happen when the desktop is set to 16-bit color instead of 32.

/D

zeckensack
06-27-2004, 01:26 PM
Right. On a 16 bpp desktop you won't get alpha bits in the framebuffer, and you won't get stencil. If your application requests either of these, try without them.
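
Something like this should stay accelerated on a 16 bpp desktop (an untested sketch; assumes hdc is your window DC):

    #include <windows.h>

    BOOL set_basic_pixel_format(HDC hdc)
    {
        PIXELFORMATDESCRIPTOR pfd;
        ZeroMemory(&pfd, sizeof(pfd));
        pfd.nSize        = sizeof(pfd);
        pfd.nVersion     = 1;
        pfd.dwFlags      = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
        pfd.iPixelType   = PFD_TYPE_RGBA;
        pfd.cColorBits   = 16;  /* match the desktop depth */
        pfd.cAlphaBits   = 0;   /* don't request destination alpha */
        pfd.cDepthBits   = 16;
        pfd.cStencilBits = 0;   /* don't request stencil */

        int fmt = ChoosePixelFormat(hdc, &pfd);
        if (!fmt || !SetPixelFormat(hdc, fmt, &pfd))
            return FALSE;

        /* PFD_GENERIC_FORMAT set without PFD_GENERIC_ACCELERATED means
           you landed on Microsoft's software renderer. */
        DescribePixelFormat(hdc, fmt, sizeof(pfd), &pfd);
        return !((pfd.dwFlags & PFD_GENERIC_FORMAT) &&
                 !(pfd.dwFlags & PFD_GENERIC_ACCELERATED));
    }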

SirKnight
06-27-2004, 06:21 PM
Yeah, you're not going to get alpha with a 16-bit color format, but you can still get 8 bits of stencil with an accelerated 16-bit format. It's format type 210 (R5G6B5A0 / Z24S8).

You might want to get a program I found on nehe.gamedev.net in the downloads section called "iGL", an OpenGL information viewer. It can show you all the pixel format types and which are accelerated and which are not. It's a neat program; it can tell you just about everything you'd want to know about your graphics system (as far as OpenGL goes).

EDIT: Actually, there's more than just format type 210 that gives 8 stencil bits at 16 bpp. Thanks to that program I was able to find that out. :D
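
For the curious, the enumeration such a viewer does boils down to something like this (a rough sketch, not iGL's actual source):

    #include <windows.h>
    #include <stdio.h>

    /* Walk every pixel format on a DC and print which are accelerated. */
    void list_pixel_formats(HDC hdc)
    {
        PIXELFORMATDESCRIPTOR pfd;
        int count = DescribePixelFormat(hdc, 1, sizeof(pfd), &pfd);

        for (int i = 1; i <= count; ++i) {
            DescribePixelFormat(hdc, i, sizeof(pfd), &pfd);
            BOOL generic = (pfd.dwFlags & PFD_GENERIC_FORMAT) != 0;
            BOOL accel   = (pfd.dwFlags & PFD_GENERIC_ACCELERATED) != 0;
            printf("format %3d: color %2d alpha %d depth %2d stencil %d  %s\n",
                   i, pfd.cColorBits, pfd.cAlphaBits, pfd.cDepthBits,
                   pfd.cStencilBits,
                   (!generic || accel) ? "accelerated" : "software");
        }
    }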

-SirKnight

al_bob
06-27-2004, 11:27 PM
Originally posted by SirKnight:

It's format type 210

AFAIK pixel format numbers are implementation-specific. Two video cards will likely not have the same list of supported pixel formats or enumerants. Pixel format numbers are also OS-specific (i.e., Windows has them, Linux/X does not).

ioquan
06-28-2004, 10:53 AM
Thanks for all of your replies; the problem is solved now.

SirKnight
06-28-2004, 10:59 AM
Originally posted by al_bob:

AFAIK pixel format numbers are implementation-specific. Two video cards will likely not have the same list of supported pixel formats or enumerants. Pixel format numbers are also OS-specific (i.e., Windows has them, Linux/X does not).

Exactly, but since he's using Windows and an NVIDIA FX card, there wouldn't be any problem looking up the format by the number I gave. But I should also have mentioned what you did.

EDIT: Looking it up in that program I talked about, might I add. :)

-SirKnight