View Full Version : OpenGL Corruption?

03-10-2002, 04:26 AM
I'm having a strange problem that occurs very rarely, and I'm 99% sure it isn't my fault, but I want to be absolutely positive about it, so I'd appreciate whatever feedback anyone can give me.

First, take a look at this picture: http://www.ronfrazier.net/corrupt.gif

This is my scene after the corruption has occurred. This very rarely happens. I first noticed it several months ago. It happened again a few days ago, and again yesterday. What happens is that everything runs fine, and then I switch to a graphics or modeling program, switch back, and everything looks like this.

Now, analyzing the scene, it's quite interesting what happens. I can definitely see a hierarchy to the jumbled mess. As outlined in the image:

Item #1 is my landscape for the whole scene. I draw the entire landscape with coordinates relative to the global origin, which is in the bottom left corner of the image. So, the landscape quads which should cover about 75% of that image are all jumbled near the origin.

Items 2, 3, and 4 are all models in the scene. All models are drawn with coordinates relative to their local origin, and each model seems to be in the correct place. #2 is a mostly box-shaped object at the front of the scene (the front of the scene runs up from the bottom left corner at roughly a 45° angle). #3 is a perfect box located slightly behind and slightly higher than #2. #4 is a bunch of tree models located across the back of the scene.

So it seems that OpenGL is getting the projection and worldview transformations correct, but every object is jumbled around its local origin. The vertex coordinates are "somewhat correct", in that the two box-shaped objects do indeed look approximately box shaped, and the landscape tiles (item #1) are also relatively correct with respect to each other (the red brick is in the left of the scene, the grass is in the middle and back right, and the light tan dirt is in the front right corner of the scene). The correct textures are also used for each object, though the texture coords just "seem" wrong (as best as I can interpret from this jumbled mess).

For the details of what I'm doing: I have a GeForce 2 64MB running the 23.11 drivers (I think that's correct; that's what nvoglnt.dll says). However, I think (but am not positive) that the first time this happened several months ago I was running a different set of drivers. All applications, including mine, are running windowed on a 32-bit desktop. All of my primitives are rendered from separate system-memory vertex buffers using glDrawElements (I'm not using VAR or CVA). It seems almost like my vertex arrays are getting corrupted, but that is highly unlikely. All vertex buffers are allocated separately, and they all deallocate just fine, which means there wasn't just a sequential overwrite of the vertex buffers. I didn't think to try it, but if it happens again, I will try to create a new object with a new vertex buffer and see if that object shows the same errors.
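For concreteness, here's a minimal sketch of the rendering setup described above: each object keeps its own separately heap-allocated vertex array in system memory and is drawn with glDrawElements. The `Mesh` type and `makeQuad` helper are illustrative assumptions, not the actual code; the GL calls are shown in comments so the data layout (which is what would be getting corrupted) stays the focus.

```cpp
#include <cassert>
#include <vector>

// Hypothetical sketch: one system-memory vertex buffer per object,
// rendered as an indexed triangle list via glDrawElements.
struct Mesh {
    std::vector<float>          verts;    // x,y,z per vertex, relative to the local origin
    std::vector<unsigned short> indices;  // triangle list
};

Mesh makeQuad() {
    Mesh m;
    m.verts   = { 0,0,0,  1,0,0,  1,1,0,  0,1,0 };  // 4 vertices
    m.indices = { 0,1,2,  0,2,3 };                  // 2 triangles
    return m;
}

// Drawing would look roughly like this (requires a live GL context):
// void draw(const Mesh& m) {
//     glEnableClientState(GL_VERTEX_ARRAY);
//     glVertexPointer(3, GL_FLOAT, 0, &m.verts[0]);
//     glDrawElements(GL_TRIANGLES, (GLsizei)m.indices.size(),
//                    GL_UNSIGNED_SHORT, &m.indices[0]);
// }
```

Since every buffer is its own `new[]`/`std::vector` allocation, corrupting all of them at once would require many independent overwrites, which is what makes app-side corruption seem so unlikely here.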

Just for the record, the reason it happened once several months ago and now twice in one week probably corresponds to the fact that I've been doing a bunch of graphics and modelling lately. So I don't think the problem is getting worse, just that it tracks my increased usage of other programs. I really don't think this is my fault, and from what I've seen I don't think this will ever be a problem on an end user's machine, but I'm just a little paranoid.

So what do you think?

03-10-2002, 07:01 AM
Perhaps you have memory corruption in your program, and switching in/out is one of the triggers? Have you turned on run-time memory checking? Do you have access to BoundsChecker or Purify or something like that?

03-10-2002, 11:44 AM
That was my first thought, but it seems highly unlikely. First, no, I don't have either of those, but I am using Midnight's memory manager (if you aren't familiar, look here: http://www.flipcode.com/cgi-bin/msg.cgi?showThread=22August2000-MemoryManagement&forum=askmid&id=-1). It allocates additional memory before and after all allocations, and it checks those blocks on deallocation to verify they weren't modified. It is reporting no errors.

It also seems unlikely because each vertex buffer is a separate call to new[], so we couldn't just have a linear overwrite across all the vertex buffers (that would corrupt the verification blocks and trigger the memory manager to hit a breakpoint). Since every object in the scene gets trashed, every vertex buffer would have to be overwritten from head to tail individually, which also seems highly unlikely, especially since I never touch any of these vertex buffers after they are loaded (I fill them once and am done with them).

Additionally, I have run this program probably 1000 times, and I often switch in/out of programs while running (text editors and such), and never have a problem. It only seems to happen when switching between other programs using OpenGL, and even then only rarely (after it happened the last time, I spent 15 minutes trying to force it to happen again, and it didn't).
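For readers unfamiliar with the technique, the guard-block scheme described above works roughly like this. This is a simplified, hypothetical version (not the actual Midnight memory manager): a sentinel pattern is written before and after every allocation, and freeing verifies both sentinels, so any overwrite that runs past either end of a buffer is caught.

```cpp
#include <cstdlib>
#include <cstring>

// Simplified guard-block allocator (illustrative only).
// Layout of each raw allocation: [size_t n][front guard][n user bytes][rear guard]
static const unsigned char GUARD_BYTE = 0xFD;
static const size_t        GUARD_SIZE = 16;

void* guardedAlloc(size_t n) {
    unsigned char* raw = (unsigned char*)
        std::malloc(sizeof(size_t) + GUARD_SIZE + n + GUARD_SIZE);
    std::memcpy(raw, &n, sizeof(size_t));                       // stash size for the check on free
    std::memset(raw + sizeof(size_t), GUARD_BYTE, GUARD_SIZE);  // front guard
    std::memset(raw + sizeof(size_t) + GUARD_SIZE + n,
                GUARD_BYTE, GUARD_SIZE);                        // rear guard
    return raw + sizeof(size_t) + GUARD_SIZE;                   // pointer handed to the caller
}

// Frees the allocation; returns false if either guard block was modified.
bool guardedFree(void* p) {
    unsigned char* user = (unsigned char*)p;
    unsigned char* raw  = user - GUARD_SIZE - sizeof(size_t);
    size_t n;
    std::memcpy(&n, raw, sizeof(size_t));
    bool ok = true;
    for (size_t i = 0; i < GUARD_SIZE; ++i) {
        if (raw[sizeof(size_t) + i] != GUARD_BYTE) ok = false;  // front guard damaged
        if (user[n + i]             != GUARD_BYTE) ok = false;  // rear guard damaged
    }
    std::free(raw);
    return ok;
}
```

The key limitation, relevant to this thread: guards only catch overwrites that cross an allocation's boundary. A corruption confined entirely to the interior of each buffer (e.g. done by a driver while the array is bound) would pass these checks cleanly.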

Next time it happens, I will definitely be sure to break and inspect the vertex buffers' contents to be sure there isn't any corruption. But in the meantime, I'm almost absolutely certain that's not it (unless, of course, the driver is doing it while I have the buffer bound, but again, that seems highly unlikely).
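One cheap way to automate that inspection, sketched below under assumed names (`TrackedBuffer`, `seal`, `intact` are not from the original code): record a checksum of each vertex buffer right after it is filled, then re-verify after switching back to the app. A mismatch would prove the system-memory copy itself was trashed; a clean check would point toward the driver or its copy of the data instead.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// FNV-1a hash over a byte range; any single flipped byte changes the result.
uint32_t checksum(const void* data, size_t bytes) {
    const uint8_t* p = (const uint8_t*)data;
    uint32_t h = 2166136261u;
    for (size_t i = 0; i < bytes; ++i) {
        h ^= p[i];
        h *= 16777619u;
    }
    return h;
}

// Hypothetical wrapper: seal() after filling the buffer, intact() after a task switch.
struct TrackedBuffer {
    std::vector<float> verts;
    uint32_t           sum;
    void seal()         { sum = checksum(verts.empty() ? nullptr : &verts[0],
                                         verts.size() * sizeof(float)); }
    bool intact() const { return sum == checksum(verts.empty() ? nullptr : &verts[0],
                                                 verts.size() * sizeof(float)); }
};
```

Since the buffers are filled once and never touched again, `intact()` should always return true; a false result on switching back would be hard evidence of outside corruption.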