Renderer Detection, GDI Generic & hardware accel.

Hello,

I’ve been experiencing a strange problem: OpenGL Extensions Viewer correctly detects my renderer as my video card (NVIDIA GeForce Go 7950 GTX), but some other benchmarks/demos/games do not, instead falling back to “GDI Generic”.

OpenGL Extensions Viewer runs its tests and verifies the card up to OpenGL 2.1 with good performance and 100% compatibility.

FurMark, another benchmark, does not detect OpenGL 2.0 and defaults to GDI Generic. I have been told that FurMark uses the same initialization code as this NeHe lesson ( http://nehe.gamedev.net/data/lessons/lesson.asp?lesson=24 ) and that this detection/initialization process is quite standard. Neither of them sees the hardware…

I am wondering if there is something in particular here that could be going wrong?

I have hardware acceleration set to full in my Windows display settings, and I have tried uninstalling the drivers, running Driver Sweeper, and re-installing. I have verified that I have the latest ForceWare drivers (x64 for my system).

I am wondering if this is related to my system: I am running Windows XP Pro 64-bit with two GeForce 7950s in SLI.

My drivers are up to date and x64.

Here is an excerpt from OpenGL Extensions Viewer:

Renderer: GeForce Go 7950 GTX/PCI/SSE2
Vendor: NVIDIA Corporation
Memory: 512 MB
Version: 2.1.2
Shading language version: 1.20 NVIDIA via Cg compiler


Max texture size: 4096 x 4096
Max texture coordinates: 8
Max vertex texture image units: 4
Max texture image units: 16
Max geometry texture units: 0
Max anisotropic filtering value: 16
Max number of light sources: 8
Max viewport size: 4096 x 4096
Max uniform vertex components: 1024
Max uniform fragment components: 1024
Max geometry uniform components: 0
Max varying floats: 32
Max samples: 16
Max draw buffers: 4


Extensions: 137

...

Core features
v1.1 (100 % - 7/7)
v1.2 (100 % - 8/8)
v1.3 (100 % - 9/9)
v1.4 (100 % - 15/15)
v1.5 (100 % - 3/3)
v2.0 (100 % - 10/10)
v2.1 (100 % - 3/3)
v3.0 (34 % - 8/23)
v3.1 (25 % - 2/8)
v3.2 (0 % - 0/9)

OpenGL driver version check (Current: 6.14.11.7948, Latest known: 6.14.11.7948):
Latest version of display drivers found
According the database, you are running the latest display drivers for your video card.

Compiled vertex array support
This feature improves OpenGL performance by using video memory to cache transformed vertices.

Multitexture support
This feature accelerates complex rendering such as lightmaps or environment mapping.

Secondary color support
This feature provides an alternate method of coloring specular highlights on polygons.

S3TC compression support
This feature improves texture mapping performance in some applications by using lossy compression.

Vertex array range support
This feature improves performance in some applications by using AGP for dynamic vertex transformation.

Texture edge clamp support
This feature improves texturing quality by adding clamping control to edge texel filtering.

Vertex program support
This feature enables a wide variety of effects via flexible vertex programming (equivalent to DX8 Vertex Shader.)

Fragment program support
This feature enables a wide variety of effects via per pixel programming (equivalent to DX9 Pixel Shader.)

Texture anisotropic filtering support
This feature improves the quality of texture mapping on oblique surfaces.

Occlusion test support
This feature provides hardware accelerated culling for objects.

Point sprite support
This feature improves performance in some particle systems.

OpenGL Shading Language support
This feature enables high level shading language for shaders.

Frame buffer object support
This feature enables render to texture functionality.

But then there is the output from a prototype of a game I am trying to run (it also detects GDI Generic, like FurMark and like the sample code linked above):

============================================================================
Log file of 'C:\Program Files (x86)\Infinity\ICP\InfinityClientProto.exe'
============================================================================

Free disk space: 45207.9 Mb
Total disk space: 190771.8 Mb
System time is 9/27/2009 16:02:16.

C:\Program Files (x86)\Infinity\ICP\InfinityClientProto.exe, run by Administrator.

Operating system:  Windows 2003 Server (5.2.3790).

4 processor(s), type 586.

20% memory in use.

4094 MBytes physical memory.

3255 MBytes physical memory free.

0 MBytes paging file.

0 MBytes paging file free.

2048 MBytes user address space.

1988 MBytes user address space free.

Vendor: GenuineIntel
Number of CPUs: 4
CPU(s) name: Unknown processor
CPU(s) speed: 2660 Mhz
Features1: FPU VME DE PSE TSC MSR PAE MCE CX8 APIC MTRR PGE MCA CMOV PAT PSE36  
Features2: MMX FXSR SSE SSE2  
Features3: AMD3DNOW  

[516 ms, -140402%/97%] Installing IKernel
[516 ms, -140400%/97%] Installing ISound
[516 ms, -140400%/97%] Installing IPhysics
[562 ms, -140398%/97%] Checking for setup2.ini file existance
[562 ms, -140398%/97%] Thread 3996 set to cpu 0
[1064 ms, -140395%/96%] Client thread installed
[21639 ms, -140847%/96%] Waiting queue...
[21639 ms, -140847%/96%] Accepted...
[21657 ms, -140846%/96%] Setting shader level: 2, real colors: 0, texture level: 2, preloading: 0
[21657 ms, -140846%/96%] Initializing
[21796 ms, -140842%/96%]     Creating renderer
[21868 ms, -140810%/96%]         Installing IOpenGLRenderer
[21868 ms, -140810%/96%]         Installing IRenderer
[21868 ms, -140810%/96%]         Installing IInput
[21892 ms, -140809%/96%]         Initializing OpenGL..
[22486 ms, -140743%/96%]         Color bits: 24 asked, got 24
[22486 ms, -140743%/96%]         Alpha bits: 8 asked, got 8
[22487 ms, -140743%/96%]         Depth bits: 24 asked, got 24
[22487 ms, -140743%/96%]         Stencil bits: 8 asked, got 8
[22609 ms, -140707%/96%]         OpenGL renderer installed
[22626 ms, -140697%/96%]         Vendor String: Microsoft Corporation
[22627 ms, -140697%/96%]         Renderer String: GDI Generic
[22627 ms, -140697%/96%]         Version String: 1.1.0
[22627 ms, -140695%/96%]         Detected an ATI video card
[22655 ms, -140683%/96%]         EXCEPTION DETECTED
[22655 ms, -140682%/96%]         Exception:
[22655 ms, -140682%/96%]         Exception in:
[22655 ms, -140682%/96%]         - File: F:\Projects\I-Novae\Src\Engine\IOpenGLRenderer\COpenGLRenderer.cpp
[22655 ms, -140682%/96%]         - Line: 192
[22655 ms, -140682%/96%]         The operation is not supported by your system/configuration
[22655 ms, -140681%/96%]         
[22655 ms, -140681%/96%]         Details:
[22655 ms, -140681%/96%]         InfinityClientProto requires hardware acceleration. Make sure your video card
[22655 ms, -140681%/96%]         drivers are up-to-date. You can get the most recent drivers at:
[22655 ms, -140681%/96%]         www.nvidia.com for NVidia card (TNT/Geforce series)
[22655 ms, -140680%/96%]         www.ati.com for ATI cards (Radeon series)
[22655 ms, -140680%/96%]         
[22655 ms, -140680%/96%]         If InfinityClientProto still doesn't run, try to change your desktop settings
[22655 ms, -140680%/96%]         (16 to 32 bits colors), disable antialiasing or other unusual settings.
[22655 ms, -140680%/96%]         You can press OK to ignore the error and continue at your own risks, or
[22655 ms, -140680%/96%]         CANCEL to stop the program now.
[22655 ms, -140680%/96%]         
[23851 ms, -140656%/96%]         IPhysics uninstalled
[23851 ms, -140656%/96%]         ISound uninstalled
[23851 ms, -140656%/96%]         IScene uninstalled

I would really appreciate any feedback / alternatives / fixes to my problem!!

Thanks a bunch for your time!!

I think the problem is with ChoosePixelFormat, because this function takes a PIXELFORMATDESCRIPTOR and gives you back an integer that identifies the pixel format that best matches what you asked for. Obviously it is choosing a bad pixel format here, since it should prefer a hardware-accelerated format.

The best thing to do is to not use ChoosePixelFormat.
Use DescribePixelFormat to find out about all the possible formats and choose one with your OWN CODE.
Then call SetPixelFormat.
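
For example, a rough sketch of that approach could look like the following (my own hypothetical helper, not code from any of the programs discussed; the hdc and the minimum bit depths are assumptions): enumerate every format with DescribePixelFormat, skip anything that has PFD_GENERIC_FORMAT set without PFD_GENERIC_ACCELERATED (i.e. the pure software formats), and only then call SetPixelFormat.

#include <windows.h>

// Hypothetical helper: pick a hardware-accelerated pixel format by hand
// instead of trusting ChoosePixelFormat. Returns 0 if nothing suitable is found.
int PickAcceleratedFormat( HDC hdc, int minColorBits, int minDepthBits )
{
    PIXELFORMATDESCRIPTOR pfd;
    int count = DescribePixelFormat( hdc, 1, sizeof( pfd ), &pfd );  // return value = highest format index
    for ( int i = 1; i <= count; ++i )
    {
        if ( !DescribePixelFormat( hdc, i, sizeof( pfd ), &pfd ) )
            continue;

        // must be an OpenGL, window-drawable, double-buffered RGBA format
        DWORD need = PFD_SUPPORT_OPENGL | PFD_DRAW_TO_WINDOW | PFD_DOUBLEBUFFER;
        if ( ( pfd.dwFlags & need ) != need || pfd.iPixelType != PFD_TYPE_RGBA )
            continue;

        // reject pure software formats: generic without generic-accelerated
        if ( ( pfd.dwFlags & PFD_GENERIC_FORMAT ) && !( pfd.dwFlags & PFD_GENERIC_ACCELERATED ) )
            continue;

        if ( pfd.cColorBits >= minColorBits && pfd.cDepthBits >= minDepthBits )
            return i;   // first acceptable hardware format
    }
    return 0;
}

// usage (error handling omitted):
//   int fmt = PickAcceleratedFormat( hdc, 24, 24 );
//   DescribePixelFormat( hdc, fmt, sizeof( pfd ), &pfd );
//   SetPixelFormat( hdc, fmt, &pfd );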

Also, yes, it is standard code. You will have problems with a lot of OpenGL programs. I hope the source code is available :)

Sorry to necro/bump that topic, but I’m the creator of that program, and the problem still hasn’t been fixed :)

It happens for a very, very small minority of users, and it’s not specific to my program. I asked the user to test NeHe’s program (I use the same ChoosePixelFormat initialization code) and GLEW’s “visualinfo”, and both have the same problem: they only detect GDI Generic formats.

A few months ago, I implemented the workaround of DescribePixelFormat and checking for compatible formats myself, and none of the formats matched.

Now, the really weird part for me is that OpenGL Extensions Viewer works on the OP’s machine, which means it’s not a hardware or driver problem. I have no idea what OpenGL Extensions Viewer does differently during its initialization.

Y.

What values do you set in the PixelFormatDescriptor?

If DescribePixelFormat does not list formats which you know should be available, then perhaps the problem is in the window DC.
What window styles do you specify in the dwStyle parameter of CreateWindowEx?

In your RegisterClassEx call, do you set CS_OWNDC?
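
For reference, a typical setup for those pieces looks roughly like this (a minimal sketch of my own; the class name, title and size are placeholders, and DefWindowProc stands in for a real window procedure):

#include <windows.h>

// Minimal sketch of an OpenGL-friendly window class and window.
WNDCLASSEX wc = { 0 };
wc.cbSize        = sizeof( wc );
wc.style         = CS_HREDRAW | CS_VREDRAW | CS_OWNDC;   // CS_OWNDC: one private DC per window
wc.lpfnWndProc   = DefWindowProc;                         // placeholder; a real app installs its own WndProc
wc.hInstance     = GetModuleHandle( NULL );
wc.lpszClassName = TEXT( "MyGLWindow" );
RegisterClassEx( &wc );

HWND hwnd = CreateWindowEx(
    0, TEXT( "MyGLWindow" ), TEXT( "GL test" ),
    WS_OVERLAPPEDWINDOW | WS_CLIPCHILDREN | WS_CLIPSIBLINGS,  // clip styles commonly recommended for OpenGL
    CW_USEDEFAULT, CW_USEDEFAULT, 640, 480,
    NULL, NULL, GetModuleHandle( NULL ), NULL );

HDC hdc = GetDC( hwnd );   // with CS_OWNDC this DC can be kept for the window's lifetime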

Hi!

I have the exact same problem on several machines. My custom software can’t get hardware-accelerated OpenGL, but OpenGL Extensions Viewer detects two renderers on my machine: the GDI Generic one and the GeForce 9400M that I would like my software to choose. I can even run the tests in OpenGL Extensions Viewer on both renderers and clearly see that the GDI one fails most tests… so why does it get picked when using ChoosePixelFormat and then SetPixelFormat?

My OpenGL initialization is almost entirely the same as NeHe’s code, and NeHe’s original code fails as well…

My PixelFormatDescriptor looks like this:

PIXELFORMATDESCRIPTOR pfd;
int iFormat;

// fill out the pixel format we would like for the DC
ZeroMemory( &pfd, sizeof( pfd ) );
pfd.nSize = sizeof( pfd );
pfd.nVersion = 1;
pfd.dwFlags = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
pfd.iPixelType = PFD_TYPE_RGBA;
pfd.cColorBits = 24;
pfd.cDepthBits = 16;
pfd.iLayerType = PFD_MAIN_PLANE;

// hdc is the window DC obtained earlier (e.g. via GetDC);
// let Windows pick the closest match and apply it
iFormat = ChoosePixelFormat( hdc, &pfd );
SetPixelFormat( hdc, iFormat, &pfd );

I haven’t tried implementing the DescribePixelFormat way yet but from the earlier posts I doubt that it will solve my problems.

My code sets the CS_OWNDC flag as well.

Any more ideas? Please help! :slight_smile:

Sincerely
/ Mattias

I would recommend always using:
pfd.cColorBits = 32;
pfd.cDepthBits = 24;
as these are the values most commonly supported by hardware.

Set window style flag WS_CLIPCHILDREN.

What do you get if you call DescribePixelFormat for pixel format number 1 on a problem machine?
The OpenGL formats are usually listed first so this one should return a PIXELFORMATDESCRIPTOR with PFD_SUPPORT_OPENGL set.

Is it happening only on particular types of hardware or particular driver versions?

I just ran GLInfo on a GeForce 8800 GTX with driver 190.38, and it listed a lot of pixel formats with neither PFD_SUPPORT_OPENGL nor PFD_GENERIC_FORMAT set, which seems odd; maybe ChoosePixelFormat is returning one of these.

Try using DescribePixelFormat to read all the available pixel formats on a problem machine and, for each one, print the returned ColorBits and DepthBits along with the PFD_SUPPORT_OPENGL and PFD_GENERIC_FORMAT flags.
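
Something along these lines, for example (a rough diagnostic sketch, assuming you already have the window DC in hdc):

#include <windows.h>
#include <stdio.h>

// Diagnostic sketch: dump every pixel format the DC exposes so you can see
// which are ICD (hardware) formats and which are generic/software ones.
void DumpPixelFormats( HDC hdc )
{
    PIXELFORMATDESCRIPTOR pfd;
    int count = DescribePixelFormat( hdc, 1, sizeof( pfd ), &pfd );
    for ( int i = 1; i <= count; ++i )
    {
        if ( !DescribePixelFormat( hdc, i, sizeof( pfd ), &pfd ) )
            continue;
        printf( "format %2d: color %2d depth %2d stencil %d  opengl=%d generic=%d accelerated=%d\n",
                i, pfd.cColorBits, pfd.cDepthBits, pfd.cStencilBits,
                ( pfd.dwFlags & PFD_SUPPORT_OPENGL )      ? 1 : 0,
                ( pfd.dwFlags & PFD_GENERIC_FORMAT )      ? 1 : 0,
                ( pfd.dwFlags & PFD_GENERIC_ACCELERATED ) ? 1 : 0 );
    }
}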

You can see the result of visualinfo here:
http://www.infinity-universe.com/Infinit…67175#msg267175

This is my PFD structure:
static PIXELFORMATDESCRIPTOR pfd =
{
    sizeof(PIXELFORMATDESCRIPTOR),  // nSize
    1,                              // nVersion
    PFD_DRAW_TO_WINDOW |
    PFD_SUPPORT_OPENGL |
    PFD_DOUBLEBUFFER |
    PFD_SWAP_EXCHANGE,              // dwFlags
    PFD_TYPE_RGBA,                  // iPixelType
    24,                             // cColorBits
    0, 0, 0, 0, 0, 0,               // per-channel color bits and shifts (ignored)
    8,                              // cAlphaBits
    0,                              // cAlphaShift
    0,                              // cAccumBits
    0, 0, 0, 0,                     // accumulation RGBA bits
    24,                             // cDepthBits
    8,                              // cStencilBits
    0,                              // cAuxBuffers
    PFD_MAIN_PLANE,                 // iLayerType
    0,                              // bReserved
    0, 0, 0                         // dwLayerMask, dwVisibleMask, dwDamageMask
};

The styles for the window class are CS_HREDRAW | CS_VREDRAW | CS_OWNDC.

It happens so rarely that I cannot say if it happens on a specific hardware/driver version.

Y.

Hi again! I tried changing to a 24-bit depth buffer, but with no luck. Same thing with DescribePixelFormat with argument 1. And the window style flag was already set.

I tried the code from the bottom of this page (http://www.wischik.com/lu/programmer/wingl.html#accelerated) to go through all pixel formats available, and none of them are marked as ICD, and all of them are marked as software modes.

Just for information, so that I haven’t left anything out: I’m running Windows 7 on a 13" MacBook Pro with a GeForce 9400M graphics card.

No matter what I do, it selects the GDI Generic mode… One more interesting thing I found was to search for the OpenGLDrivers key in regedit using this path: “SOFTWARE\Microsoft\Windows NT\CurrentVersion\OpenGLDrivers”, and it is indeed not set at all, which I guess might suggest that I just need to download new drivers for my graphics card. BUT: how in the world does OpenGL Extensions Viewer manage to create a test mode where it runs accelerated graphics, and how does that piece of software allow me to switch between devices/renderers… is there a select/enumerate device capability somewhere in OpenGL that I just haven’t found?

This is driving me crazy :)

The only thing that I can think of is that OpenGL Extensions Viewer may be bypassing OpenGL32.dll and instead searching for known GPU installable client driver (ICD) DLLs and loading them directly itself.

For NVIDIA GPUs there should be the registry key:
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\OpenGLDrivers\RIVATNT
which contains the values:
Dll REG_SZ ‘nvoglnt’
DriverVersion REG_DWORD 0x00010000
Flags REG_DWORD 0x00000003
Version REG_DWORD 0x00000002

In Windows XP 32-bit, the NVIDIA ICD DLL should be found at:
C:\WINDOWS\system32\nvoglnt.dll

In Windows Vista 64-bit there are two versions:
C:\Windows\SysWOW64\nvoglv32.dll
C:\Windows\System32\nvoglv64.dll

ATI should be similar to this, just with different file names.
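
If you want to check this programmatically, something like the following rough sketch should do it (key and value names taken from above; hard-coding the "RIVATNT" subkey is a simplification, a real check would enumerate the subkeys, and a 32-bit process on 64-bit Windows may additionally need KEY_WOW64_64KEY to see the 64-bit view):

#include <windows.h>
#include <stdio.h>

// Rough sketch: read the ICD DLL name registered under the OpenGLDrivers key.
int main( void )
{
    HKEY key;
    char dllName[MAX_PATH] = "";
    DWORD size = sizeof( dllName );

    if ( RegOpenKeyExA( HKEY_LOCAL_MACHINE,
            "SOFTWARE\\Microsoft\\Windows NT\\CurrentVersion\\OpenGLDrivers\\RIVATNT",
            0, KEY_READ, &key ) == ERROR_SUCCESS )
    {
        if ( RegQueryValueExA( key, "Dll", NULL, NULL,
                               (LPBYTE)dllName, &size ) == ERROR_SUCCESS )
            printf( "OpenGL ICD registered: %s\n", dllName );
        RegCloseKey( key );
    }
    else
    {
        printf( "No OpenGL ICD key found - driver probably not installed properly.\n" );
    }
    return 0;
}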

If most OpenGL programs don’t work, then the driver is not properly installed (especially if the registry key is missing).
Note that most laptops don’t let you install the standard drivers; you have to use the drivers provided by the laptop manufacturer (although there are ways around this).

Interesting; when looking through the registry I found values called OpenGLDriverName and OpenGLDriverNameWoW, containing nvoglv64 and nvoglv32, under both HKEY_LOCAL_MACHINE\SYSTEM\ControlSet001\Control\Class\<long serial no>\0000 and HKEY_LOCAL_MACHINE\SYSTEM\ControlSet001\Control\Video\<long serial no>\0000.

So maybe that makes sense, and GL Extensions Viewer loads those DLLs by itself. Really weird… but I guess the only sane thing for me to do is to download new drivers and see what happens, of course…

… to follow up a bit on V-man’s suggestion to avoid ChoosePixelFormat, the MSDN states only that the “closest” match is returned (as if a PFD were a point in some space) and guarantees uniqueness of function pointers per pixel format …

…yet another thing I recently tripped over in some DLL ass-hattery of my own is that you apparently need to load opengl32.dll before you call SetPixelFormat on your DC; otherwise, function pointers acquired from it, like wglCreateContext, will fail with an invalid pixel format error. Several other GDI procs seem to have wgl counterparts hooked in the same way. Nothing officially documented that I could find…
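
In other words, the ordering that worked for me looks roughly like this (just a sketch of the dynamic-loading case described above; hdc and pfd are assumed to be set up as in the earlier posts, and the typedef is my own):

#include <windows.h>

// Sketch of the ordering described above: opengl32.dll must be loaded before
// SetPixelFormat, or wglCreateContext obtained from it can fail with an
// "invalid pixel format" error. Linking against opengl32.lib normally gives
// you this load order for free.
typedef HGLRC ( WINAPI *PFNWGLCREATECONTEXT )( HDC );

HMODULE gl = LoadLibraryA( "opengl32.dll" );               // load BEFORE SetPixelFormat
PFNWGLCREATECONTEXT pWglCreateContext =
    (PFNWGLCREATECONTEXT)GetProcAddress( gl, "wglCreateContext" );

int fmt = ChoosePixelFormat( hdc, &pfd );
SetPixelFormat( hdc, fmt, &pfd );

HGLRC rc = pWglCreateContext( hdc );                       // context creation now succeeds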

I remember reading something of the sort when I started learning GL, but that was a long time ago. I don’t even remember what it was about, and I never bothered to do anything about it since I never had problems, even on crappy Matrox and Intel cards.