Windows Problems

Sorry for the crosspost (and if this isn't an advanced topic), but I've had no response on the beginner board for a week and the problem may be more complex than I think.

Anyway, I've written a program that sets up a window and creates a GL rendering context. It works fine on my machine at work (Win2K, Matrox dual-monitor card), but on my home machine (Win98 / GeForce 2 GTS) it dies when it tries to get a suitable pixel format (i.e. ChoosePixelFormat() returns 0, which is invalid).

The pixel format isn't anything special and I've tried multiple combinations, but nothing works. I've checked my device context and the window handle and everything's fine (and, as I've said, it works at work).

Does anybody have any idea what the problem may be?

Cheers

Allan

The only obvious reason I can think of is that your PIXELFORMATDESCRIPTOR structure is asking for a pixel format that cannot be found…

Is it possible for you to e-mail me your initialization code (e-mail in my profile)? It is very difficult to find the problem without seeing that part of the code… You may as well post it here.

Regards.

Eric

Have you tried enumerating the available pixel formats to check what is available on the different machines?
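Something like this would do it (just a rough sketch, assuming a valid device context in a variable called hdc and that you have <windows.h> and <stdio.h> available):

PIXELFORMATDESCRIPTOR pfd;
int count = DescribePixelFormat(hdc, 1, sizeof(pfd), NULL); /* NULL descriptor -> returns the number of formats */
int i;

for (i = 1; i <= count; i++)
{
    DescribePixelFormat(hdc, i, sizeof(pfd), &pfd);
    printf("format %d: flags=0x%08lx color=%d depth=%d\n",
           i, (unsigned long)pfd.dwFlags, pfd.cColorBits, pfd.cDepthBits);
}

Run it on both machines and compare the output.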

Also, from the code snippet in the beginner-board post I saw you were requesting a format that uses the GLSCREEN_DEPTH constant for both cColorBits and cDepthBits… I don't know if that is really what you want.

Just my .02

Jean-Marc

Yeah - sorry, people.
GLSCREEN_DEPTH is a personal #define that was set to either 16 or 32. As I said, I've tried loads and loads of combinations and it always fails.

Here's the code I've been using:

#define GLSCREEN_DEPTH 16

//---------------------------------

PIXELFORMATDESCRIPTOR pfd;
memset(&pfd, 0, sizeof(PIXELFORMATDESCRIPTOR));

pfd.nSize = sizeof(PIXELFORMATDESCRIPTOR);
pfd.nVersion = 1;
pfd.iPixelType = PFD_TYPE_RGBA;
pfd.dwFlags = (PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER | PFD_DRAW_TO_WINDOW);
pfd.cDepthBits = GLSCREEN_DEPTH;

int PixelFormat = 0;

if(!(PixelFormat = ChoosePixelFormat(g_Hdc, &pfd)))
{
MessageBox(NULL, TEXT("ERROR: Could Not Choose Suitable Pixel Format"), sClassName, MB_ICONERROR);
return FALSE;
}
//-----------------------------------

Apologies if the spacing is broken.

I'll try doing the enumeration code in my lunch hour and see if that works when I get home. Additionally, I've downloaded examples from NeHe and GameTutorials.com and they both compile and run fine (I actually copied the pixel format they use and tried that, and it still didn't run).

Cheers for everyone’s help

Allan

Actually, in the snippet you sent by e-mail and the one that is posted here, you never set the cColorBits member of the PIXELFORMATDESCRIPTOR structure. Is it a mistake with cut’n’paste or is it actually like this in your code?
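Just so we are comparing the same thing, I would expect the initialization to look more or less like this (only a sketch, your exact bit depths may of course differ):

PIXELFORMATDESCRIPTOR pfd;
memset(&pfd, 0, sizeof(PIXELFORMATDESCRIPTOR));

pfd.nSize      = sizeof(PIXELFORMATDESCRIPTOR);
pfd.nVersion   = 1;
pfd.dwFlags    = PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER | PFD_DRAW_TO_WINDOW;
pfd.iPixelType = PFD_TYPE_RGBA;
pfd.cColorBits = GLSCREEN_DEPTH;   /* the member missing from your snippet */
pfd.cDepthBits = GLSCREEN_DEPTH;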

Regards.

Eric

That may be a stupid question but are you sure your HDC is valid? I mean, if you copied the init code from NeHe or somewhere else and it does not work, then I wonder whether you do something that makes the HDC valid on Win2K and invalid on Win98…

Can’t say for sure though…

Is it possible to get a sample app that shows the bug? (I mean, something that we can compile and run).

Regards.

Eric

I've just found something on the net that may be the problem. My program doesn't have any OpenGL calls in it at the minute, so the compiler apparently doesn't link to opengl32.lib. When ChoosePixelFormat tries to use it internally, it fails and returns 0. I can't say for sure, but Win2K and Win98 probably treat library loading in a different way (?)

Anyway, it sounds plausible; I'll try it tonight and post a reply tomorrow. Thanks for all the help, everybody.

(and it was a cut ‘n’ paste error - Doh!)

Cheers

Allan

ChoosePixelFormat, DescribePixelFormat and those other Windows-specific functions (not wgl) are not in opengl32.dll.

Your pixel format descriptor has an important member not set:

pixelDesc.iLayerType=PFD_MAIN_PLANE;

You should just copy someone else's code.

Someone had something as stupid as this in his tutorial:

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR_MIPMAP_LINEAR);

so even tutorials have mistakes!
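(For the record, GL_TEXTURE_MAG_FILTER only accepts GL_NEAREST or GL_LINEAR; the mipmap filters only make sense for minification, so it should have been something like:

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);

assuming mipmaps have actually been generated for the texture.)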

V-man

Originally posted by V-man:
Your pixel format descriptor has an important member not set:

pixelDesc.iLayerType=PFD_MAIN_PLANE;

V-Man,

This is from the latest documentation files coming with Visual Studio .NET:

“iLayerType
Ignored. Earlier implementations of OpenGL used this member, but it is no longer used.”

I thought about that as well and checked the docs just to make sure… One thing I don't get is what they call an "earlier" implementation… If the "later" one is the one that ships with XP (kidding), that means you should set this member all the time (well, I do anyway!).

Perhaps this is actually where the problem is…

As a side note, I agree with you on tutorials: you should really check 2 or 3 when copying such code, as some of them can have really big mistakes!

Regards.

Eric

I was curious so I just tried getting a pixelformat here without linking to opengl32.lib and things worked fine. My guess is that your problem lies elsewhere – I’d have to go with the theory that there’s something wrong with your DC.

(I’m on Win2K for what it’s worth.)

Have you tried downloading the PixelFormat utility to check which pixel formats are available on your system? That might be a good place to start.

Thanks for the help everyone, but it's sorted. The problem was that the "helpful" compiler wasn't linking to opengl32.lib, as I wasn't making any GL calls, so ChoosePixelFormat (which uses it internally) was always failing.

I put a glGetString call in the program and it worked fine.
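For anyone who hits this later, all it took was a dummy GL call somewhere before the pixel format code, along these lines (the return value isn't used for anything; there is no context yet, so it may well be NULL - the call just forces the linker to pull in opengl32.lib):

/* Dummy GL call so opengl32.lib actually gets linked and the DLL loaded */
glGetString(GL_VENDOR);

PixelFormat = ChoosePixelFormat(g_Hdc, &pfd);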

Thanks again everyone

Allan

Huh?
So the DLLMain in opengl32.dll wasn’t being called so it didn’t load the ICD for ChoosePixelFormat to query?
Bloody hell - that’s a useful thing to know.

Originally posted by knackered:
Huh?
So the DLLMain in opengl32.dll wasn’t being called so it didn’t load the ICD for ChoosePixelFormat to query?
Bloody hell - that’s a useful thing to know.

I experienced this problem yesterday. I downloaded Microsoft's pixel format enumeration utility, glenum, and looked through its code. Then I found this just before the pixel format operations:

/* Dummy call to force loading of opengl32.dll… */
glFlush();

When I added glFlush before the ChoosePixelFormat() call, it worked. Even Microsoft knows this behaviour.

People usually don't have this problem because they create the window before choosing the pixel format, like this:

WM_CREATE:
    /* ... */
WM_SIZE:
    glViewport(...);   /* code to size the viewport */

When the window is created it receives both messages, so an OpenGL command is executed before ChoosePixelFormat, and that causes opengl32.dll to be loaded.

But this time I wasn’t using any GL commands before ChoosePixelFormat, so it always failed.

secnuop, you say that you were able to get a valid pixelformat without linking against opengl32.dll, but maybe you didn’t use the PFD_SUPPORT_OPENGL flag.

[This message has been edited by Azdo (edited 06-27-2002).]

Originally posted by Azdo:
secnuop, you say that you were able to get a valid pixelformat without linking against opengl32.dll, but maybe you didn’t use the PFD_SUPPORT_OPENGL flag.

That's strange. I'm almost 99% positive I do have the PFD_SUPPORT_OPENGL flag (it's in a part of my code that I don't think I would have changed). Additionally, I know the pixel format that was returned matched the format I requested, so it seemed as though ChoosePixelFormat was actually doing what I expected it to do even though I wasn't linking to opengl32.lib (to be perfectly accurate, I never was linking to opengl32.dll).

I was assuming GDI was loading opengl32.dll as needed in order to query the available pixel formats and to choose the right one. I guess I need to investigate a little more, though.

Just to confirm…

My pixel format descriptor does have the PFD_SUPPORT_OPENGL flag on and I am not linking against opengl32.lib. I'm not even including gl.h or anything that should be including gl.h, so I'm positive there are no OpenGL functions being called in my program.

But I can see from the Visual Studio debug “output” window that opengl32.dll is being loaded and the driver OpenGL dlls are being loaded.

I’ll clean up the code a bit if somebody else would like to see it, but it’s really just window creation and ChoosePixelFormat.

[This message has been edited by secnuop (edited 06-28-2002).]

Originally posted by Azdo:

/* Dummy call to force loading of opengl32.dll… */
glFlush();

When I added glFlush before the ChoosePixelFormat() call, it worked. Even Microsoft knows this behaviour.

People usually don't have this problem because they create the window before choosing the pixel format, like this:

WM_CREATE:
    /* ... */
WM_SIZE:
    glViewport(...);   /* code to size the viewport */

When the window is created it receives both messages, so an OpenGL command is executed before ChoosePixelFormat, and that causes opengl32.dll to be loaded.

It receives WM_CREATE first, in which I create my OpenGL context before any GL calls, then comes WM_SIZE.

I also remember a Microsoft example that had that "dummy call" business in it.

From what I remember, a DLL is loaded as soon as a call is made that uses a function of that DLL. How does this situation get explained?

V-man

This entire issue seems quite possible. When you are not manually loading a DLL yourself, it is up to the compiler and the static (import) library to load the DLL, and frankly it can do so whenever it feels like it - most likely whenever the first call to a function in that library is made. The initial call (I am guessing) fails internally (in the static lib), the DLL is then loaded, and the call should then succeed. Again, this is all internal to the lib; nothing you should see. For OpenGL this is a little silly, in that if you are linking the static lib you are definitely going to be using the opengl32 DLL at some point. For Windows in general it makes good sense to defer DLL loading, since there are so many DLLs that your code path may end up never needing half of them.
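If you don't want to rely on a dummy GL call, you could presumably also pull the DLL in by hand before touching the pixel format stuff, something like this (only a sketch, I haven't actually tried it on 98):

/* Explicitly load opengl32.dll so its DllMain runs before ChoosePixelFormat.
   Untested; the dummy glGetString() call mentioned above does the same job. */
HMODULE hGL = LoadLibrary(TEXT("opengl32.dll"));
if (hGL == NULL)
{
    MessageBox(NULL, TEXT("Could not load opengl32.dll"), TEXT("Error"), MB_ICONERROR);
    return FALSE;
}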

On a side note: after writing a couple of DLLs/plugins for 3D Studio Max, I understand its plugin system quite well. They are just ordinary DLLs. 3D Studio Max has its own DLL/plugin manager which actually auto-defers loading of plugins. One of the required exported functions in the DLL simply returns 1 or 0 depending on whether you want your plugin auto-deferred. This is useful since big plugins like a file importer/exporter or Character Studio, for example, tend to use a lot of memory but don't really need to be running all the time. Character Studio won't load until you try to use it, and the file exporters, I believe, are actually unloaded after the file is written. Load, unload, load, unload. But if you have debugged a DLL in Max and seen how many DLLs it touches and loads, you can understand why it works the way it does. The list of DLLs just goes on and on and on in the debugger. It's ridiculous.

I have to say this has been a very interesting post. Very useful and very informative.

Devulon

Glad it's been of interest to people (and I got rid of that damn problem).

Keep up the good work people

Allan

Originally posted by Devulon:
This entire issue seems quite possible. When you are not manually loading a DLL yourself, it is up to the compiler and the static (import) library to load the DLL.
Devulon

In fact, GetLastError() after my call to ChoosePixelFormat() returns

126L ERROR_MOD_NOT_FOUND

which could be referring to opengl32.dll…
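In other words, roughly this (just a sketch of how I'm checking it; variable names are only for illustration):

int pf = ChoosePixelFormat(hdc, &pfd);
if (pf == 0)
{
    DWORD err = GetLastError();   /* comes back as 126 (ERROR_MOD_NOT_FOUND) here */
    printf("ChoosePixelFormat failed, GetLastError() = %lu\n", err);
}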

I haven’t run into this before, but something funky is happening on Win9X. MS knowledge base entry Q233390 has the following to say:

OpenGL applications calling ChoosePixelFormat may encounter Access Violations (AV) during debugging. After calling ChoosePixelFormat, the debug window will display one or more AVs in Gdi32.dll. The number of AVs correlates to the number of pixel formats for the device being used. This is a benign behavior that appears only with some video drivers.

Maybe there’s some related behavior that isn’t quite as “benign”.

Good that there’s an easy workaround.