Driver issues with GL 3.2 core on Mac Mini 2010 (OS X 10.8.2)

I’m developing a native app on a 2010 Mac Mini (with a GeForce 320M video card) under OS X 10.8.2. One specific shader causes the application to be unstable.
Symptoms:
In about 50% of launches, the first frame is half-drawn (not all objects are present) and blinks with black.
In the other 50% of launches it runs fine, but there is still a 5-10% chance of it going into this blinking mode about 10 seconds after startup.
When in blinking mode, the app is semi-responsive. Most often my message loop is still running (and I can close the app normally), but sometimes (roughly 20% of cases) the whole system freezes for about 5 seconds. One time it didn’t unfreeze at all, forcing me to reboot. I suspect these freezes are driver-reset pauses (I’ve experienced similar behavior with Catalyst drivers on a Windows machine as well).

No symptoms appear if I don’t use the shader, though I haven’t tested this extensively. The shader is relatively simple: it derives the normal from a height map (without using dFd* instructions) and uses it for tangent-space lighting with a single point light and no shadows. The shader compile/link logs report nothing. I tried cutting shader features away to find out what exactly the driver doesn’t like, but it turned out there is no single thing it dislikes; removing features just gradually reduces the chance of the bug happening.
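Roughly, the idea is something like this (a heavily simplified sketch, not the actual shader — uniform/varying names are just placeholders):

```glsl
#version 150
// Illustrative sketch only, NOT the actual shader: one common way to derive a
// tangent-space normal from a height map without dFdx/dFdy, then shade with a
// single point light. All names here are made up for the example.
uniform sampler2D heightMap;
uniform vec2  texelSize;   // 1.0 / height map dimensions
uniform float bumpScale;   // height-to-slope scale
uniform vec3  lightColor;

in vec2 uv;
in vec3 tsLightDir;        // light direction already in tangent space
out vec4 fragColor;

vec3 normalFromHeight(vec2 st)
{
    // Central differences on neighbouring texels instead of dFd*.
    float hL = texture(heightMap, st - vec2(texelSize.x, 0.0)).r;
    float hR = texture(heightMap, st + vec2(texelSize.x, 0.0)).r;
    float hD = texture(heightMap, st - vec2(0.0, texelSize.y)).r;
    float hU = texture(heightMap, st + vec2(0.0, texelSize.y)).r;
    return normalize(vec3((hL - hR) * bumpScale, (hD - hU) * bumpScale, 1.0));
}

void main()
{
    vec3 n = normalFromHeight(uv);
    float nDotL = max(dot(n, normalize(tsLightDir)), 0.0);
    fragColor = vec4(lightColor * nDotL, 1.0);
}
```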

I tried using OpenGL Profiler (bundled with Xcode) to investigate, but it crashes randomly when the app is in that blinking mode. When the app runs OK, the profiler also works fine for me.

My question is: how do I investigate this? What debug information can I look at?
I could provide a test case for Apple/Nvidia, but I’m not sure who to contact about it.

Do you use discard or return statements in the fragment shader? I had a shader that used discard and return without writing anything to the outputs, and the Nvidia/OS X combo produced black for the entire shader. Once I assigned a dummy value to the output before the discard/return, everything worked. Granted, this was a long time ago (10.6.2, I think) on the old GL 2.1 driver, but your situation sounded a bit familiar, so I thought I’d throw it out there.
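From memory, the problematic pattern was roughly this (a simplified reconstruction, not the exact shader):

```glsl
#version 120
// Simplified reconstruction from memory, not the exact shader.
uniform sampler2D tex;
varying vec2 uv;

void main()
{
    vec4 c = texture2D(tex, uv);
    if (c.a < 0.5) {
        // Bailing out here with a bare "discard;" (or "return;") gave black
        // output for the whole shader on that Nvidia/OS X GL 2.1 driver.
        // Writing a dummy value to the output first worked around it:
        gl_FragColor = vec4(0.0);
        discard;
    }
    gl_FragColor = c;
}
```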

Thanks for looking into it, malexander. My shader doesn’t use discard, and it unconditionally assigns the only output vector.

I have found that Apple’s GLSL clang-based compiler is very sensitive to uninitialized variables. If you have locals that you are conditionally assigning, try giving them a default, even if you know they won’t be used. If that is not the case, can you post the shaders?
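To illustrate the kind of default I mean (just a sketch, not your code):

```glsl
#version 150
// Illustrative only: a conditionally-assigned local with an explicit default,
// which Apple's GLSL compiler handles more reliably than leaving it
// uninitialized. Names are placeholders.
uniform vec3 lightColor;

in vec3 normal;
in vec3 lightDir;
out vec4 fragColor;

void main()
{
    float nDotL = dot(normalize(normal), normalize(lightDir));

    vec3 diffuse = vec3(0.0);   // default, even if the branch should always run
    if (nDotL > 0.0) {
        diffuse = lightColor * nDotL;
    }
    fragColor = vec4(diffuse, 1.0);
}
```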

I don’t think there are any conditionals in the code…

Please see the shader source here:
https://gist.github.com/4322513
