View Full Version : Uniform buffer clobbered on XRandR mode change

10-20-2014, 03:20 PM
I have a program that runs in X11 and uses GLX. After adding a key to toggle fullscreen mode at runtime, I noticed a strange problem: my models would disappear. After poking around a bit, I disabled blending and the models reappeared, but were all black. This indicates that the uniform buffer holding material parameters was clobbered and effectively replaced with zeroes. After some more debugging I isolated the problem to my set_video_mode function, which uses several XRandR calls to change the monitor configuration to one suitable for the requested fullscreen mode. Using the xrandr command-line utility to modify the monitor configuration while the program is running has the same effect.

Is this something I should have expected, or is it a driver bug? I thought it was the OpenGL implementation's job to maintain the validity of the context and all objects within it. Any idea why only that single, tiny uniform buffer gets clobbered while the much larger vertex and index buffers are left intact?

I have a GeForce GTX 660 with driver version 340.32 and X.org version 1.16.1.
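For what it's worth, the way I confirmed the buffer itself was being clobbered (rather than, say, a stale binding) was to keep a CPU-side shadow copy of the material block and compare it against what glGetBufferSubData returns after the mode switch. A minimal sketch of the comparison helper (the function name is mine, not from any API):

```c
#include <stddef.h>

/* Compare a CPU-side shadow copy of a uniform block against the contents
 * read back from the GL. In the real program, read_back is filled with:
 *   glBindBuffer(GL_UNIFORM_BUFFER, material_ubo);
 *   glGetBufferSubData(GL_UNIFORM_BUFFER, 0, size, read_back);
 * Returns the byte offset of the first mismatch, or -1 if the contents
 * are identical. */
long first_mismatch(const void *shadow, const void *read_back, size_t size)
{
    const unsigned char *a = shadow;
    const unsigned char *b = read_back;
    for (size_t i = 0; i < size; ++i)
        if (a[i] != b[i])
            return (long)i;
    return -1;
}
```

Calling this right after set_video_mode returns reports a mismatch starting at offset 0, which is how I know the whole block is zeroed.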

11-03-2014, 02:21 AM
Do you use the extensions GLX_ARB_create_context_robustness and GL_ARB_robustness?

See this quote from the GL_ARB_robustness spec:

If the reset notification behavior is NO_RESET_NOTIFICATION_ARB,
then the implementation will never deliver notification of reset
events, and GetGraphicsResetStatusARB will always return
NO_ERROR [fn1: In this case it is recommended that implementations
should not allow loss of context state no matter what events occur.
However, this is only a recommendation, and cannot be relied
upon by applications.]

It's not entirely clear to me, but I'll assume this only applies to contexts that were created with the GLX_CONTEXT_ROBUST_ACCESS_BIT_ARB flag.
Otherwise, if outside events were allowed to damage the contexts of unsuspecting (GL_ARB_robustness-unaware) applications, that would be extremely rude and would break many old applications.

So I'd say that if you are not creating the GLX context with GLX_CONTEXT_ROBUST_ACCESS_BIT_ARB, losing the contents of your buffer points to a driver bug.
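To make the robustness side concrete, here is a sketch of the attribute list you would pass to glXCreateContextAttribsARB to opt into robust access. The enum values are copied from the GLX_ARB_create_context and GLX_ARB_create_context_robustness specs so the snippet stands alone; in a real program you would include <GL/glx.h> (which pulls in glxext.h) instead, and the 3.3 version request is just an example:

```c
/* Values from the GLX_ARB_create_context and
 * GLX_ARB_create_context_robustness extension specs. */
#define GLX_CONTEXT_MAJOR_VERSION_ARB               0x2091
#define GLX_CONTEXT_MINOR_VERSION_ARB               0x2092
#define GLX_CONTEXT_FLAGS_ARB                       0x2094
#define GLX_CONTEXT_ROBUST_ACCESS_BIT_ARB           0x00000004
#define GLX_CONTEXT_RESET_NOTIFICATION_STRATEGY_ARB 0x8256
#define GLX_LOSE_CONTEXT_ON_RESET_ARB               0x8252

/* Attribute list requesting a GL 3.3 context with robust buffer access
 * that delivers reset notifications (terminated by None/0). */
static const int robust_context_attribs[] = {
    GLX_CONTEXT_MAJOR_VERSION_ARB, 3,
    GLX_CONTEXT_MINOR_VERSION_ARB, 3,
    GLX_CONTEXT_FLAGS_ARB,         GLX_CONTEXT_ROBUST_ACCESS_BIT_ARB,
    GLX_CONTEXT_RESET_NOTIFICATION_STRATEGY_ARB,
                                   GLX_LOSE_CONTEXT_ON_RESET_ARB,
    0
};

/* Helper (name is mine): returns 1 if the attribute list asks for robust
 * access via the context-flags bit, 0 otherwise. */
int attribs_request_robust_access(const int *attribs)
{
    for (int i = 0; attribs[i] != 0; i += 2)
        if (attribs[i] == GLX_CONTEXT_FLAGS_ARB &&
            (attribs[i + 1] & GLX_CONTEXT_ROBUST_ACCESS_BIT_ARB))
            return 1;
    return 0;
}
```

With such a context (created via glXCreateContextAttribsARB(dpy, fbconfig, NULL, True, robust_context_attribs) after confirming GLX_ARB_create_context_robustness is in the extension string), the application would poll glGetGraphicsResetStatusARB after the mode change to learn whether state was lost. Without that flag, as said above, state loss shouldn't happen at all.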