Weird VAR observation
I'm probably doing something in the wrong order, or something...
My app works like this:
- The OpenGL context is created and extensions are initialised, etc.
- On the first pass through the render loop, a static bool flag tells me I must call wglAllocateMemoryNV() to allocate 4 MB of fast memory (I've copied the allocate_memory() function from the learning_var demo). The flag is then set to indicate the memory has been allocated, so the app never calls the allocation function again.
- Render the mesh using VAR and fences.
- Loop back to the start.
I'm getting reasonable frame rates, but not as good as the learning_var demo's.
Now, if I switch to full screen when the app was started in windowed mode (or to windowed mode when it started full screen), the GL window and context are destroyed and re-created. Because of the static bool flag, which I never reset, the AGP/video memory is not reallocated. And I get a massive performance boost: around double the frame rate!
What would explain this?