Software rendering after a few destroy/create cycles

When my program recreates (destroys and creates) the rendering window, after a few such cycles I get software rendering. It looks like some resource on the card is leaking, but I can't see why. BTW, the pixel format I get still claims to be hardware-accelerated, so I guess it's the ICD giving me software rendering, not Windows.

I verified that the window creation, GetDC and wglCreateContext calls each have a corresponding DestroyWindow, ReleaseDC and wglDeleteContext, and that they all succeed (none returns an error).
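For reference, the paired setup/teardown described above can be sketched like this (a hedged outline, not the poster's actual code: the window creation, error handling, and names like `CreateGLOnWindow` are illustrative, and the `wglMakeCurrent(NULL, NULL)` line reflects the usual recommendation to unbind a context before deleting it):

```c
#include <windows.h>
#include <GL/gl.h>

/* Illustrative globals; real code would keep these per-window. */
static HDC   hDC;
static HGLRC hRC;

void CreateGLOnWindow(HWND hWnd)
{
    PIXELFORMATDESCRIPTOR pfd = { sizeof(pfd) };
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 32;

    hDC = GetDC(hWnd);
    SetPixelFormat(hDC, ChoosePixelFormat(hDC, &pfd), &pfd);
    hRC = wglCreateContext(hDC);
    wglMakeCurrent(hDC, hRC);
}

void DestroyGLOnWindow(HWND hWnd)
{
    /* Release in strict reverse order of acquisition. */
    wglMakeCurrent(NULL, NULL);   /* make the context not current first */
    wglDeleteContext(hRC);
    ReleaseDC(hWnd, hDC);
    DestroyWindow(hWnd);
}
```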

Although this OpenGL window is only one of several windows being created, I'm fairly sure the problem is with it and not with other window resources not being freed: when I reduced the size of the OpenGL window, it took many more creation/deletion cycles before rendering fell back to software.

Any idea what could be causing this? I’m running on Windows 2000 with a 32MB GeForce SDR with 6.31 reference drivers.

It would be great if someone could answer; I have exactly the same problem.

many thanx,
Nils

Are you deleting your textures and then reloading them? When you destroy the HGLRC you lose access to the textures.

Are the threads associated with the windows’ message loops also being shut down?

  • Matt

Are you using wglMakeCurrent(NULL,NULL) before deleting the thread’s context?

WhatEver - this happens even without textures.

mcraighead, I’ll have to check this. Why does this matter?

DFrey, no. Didn’t think it was necessary from the documentation. Update: tried it, didn’t help (why would it even be helpful?).

[This message has been edited by ET3D (edited 05-03-2001).]

Have you checked if this problem occurs with no GL calls other than the WGL stuff?

It's not that difficult to test if you replace gl.h with a dummygl.h that contains an empty #define for each function you use (you can get the list by compiling without the GL library and copy-pasting the linker errors):

#define glGetIntegerv(_a, _b)
#define glMatrixMode(_a)
#define glDrawRangeElements(_a, _b, ...)

#define glXXX_func(_param)
etc.

Then you can remove your GL calls without modifying your code.

Thanks for the idea, V. I can’t implement it as you suggest, because WGL always reports that the context is in hardware, and only the slow rendering speed tells me that it’s not. But I can probably disable rendering for the first few window creations (by having the rendering thread not call any rendering functions) and see if that makes a difference.

[b]

DFrey, no. Didn’t think it was necessary from the documentation. Update: tried it, didn’t help (why would it even be helpful?).
[/b]
Why? Because as far as I knew, it could have been due to a buggy OpenGL implementation (one not always releasing resources when it should). It's also called a shot in the dark. It was just an idea tossed out. Ever heard of a brainstorming session? You don't have to be so rude simply because you didn't understand my motivation for suggesting something.

Sorry, didn’t mean to be rude. I really wanted to understand why it could be useful. I really should start taking the medication for my foot in mouth disease.

Problem solved. Looks like creating and deleting the rendering context in the rendering thread caused the problem. Once I moved the creation and deletion to the main thread, the problem disappeared. I still have no idea why this happens, but at least the problem is solved.
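For anyone hitting the same thing, the working arrangement can be sketched as follows (a hedged outline under the thread's description, not the original code: the main thread owns the context's lifetime, and only the current-context binding moves into the render thread, since an HGLRC can be current in at most one thread at a time).

```c
#include <windows.h>

/* Created and deleted by the main thread via wglCreateContext /
   wglDeleteContext; the render thread only makes it current. */
static HGLRC hRC;

DWORD WINAPI RenderThread(LPVOID param)
{
    HDC hDC = (HDC)param;
    wglMakeCurrent(hDC, hRC);      /* bind the context in this thread */
    /* ... render loop ... */
    wglMakeCurrent(NULL, NULL);    /* unbind before the thread exits  */
    return 0;
}

/* Main thread, in outline:
     hRC = wglCreateContext(hDC);
     hThread = CreateThread(NULL, 0, RenderThread, hDC, 0, NULL);
     ...
     WaitForSingleObject(hThread, INFINITE);
     wglDeleteContext(hRC);   // delete only after the render thread is done
*/
```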