Nvidia's GPU affinity extension is a complete failure

I can’t believe how murky and complicated this spec is:
http://developer.download.nvidia.com/opengl/specs/WGL_nv_gpu_affinity.txt

All I need is a setting to indicate “don’t use the damn Intel integrated chip”.

AFAIK, WGL_NV_gpu_affinity only applies when you have multiple NVIDIA GPUs; it has nothing to do with mixed-vendor setups like an integrated Intel GPU paired with a discrete NVIDIA one.
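
To see why, here is a rough sketch of what the extension actually does, using the entry points from the spec linked above (this assumes <GL/wglext.h> from the Khronos registry and an already-current OpenGL context, since wglGetProcAddress won't resolve anything otherwise):

#include <windows.h>
#include <GL/gl.h>
#include <GL/wglext.h>  // PFNWGLENUMGPUSNVPROC, PFNWGLCREATEAFFINITYDCNVPROC, HGPUNV, ...

void listNvidiaGpus()
{
    auto enumGpus         = (PFNWGLENUMGPUSNVPROC)wglGetProcAddress("wglEnumGpusNV");
    auto createAffinityDC = (PFNWGLCREATEAFFINITYDCNVPROC)wglGetProcAddress("wglCreateAffinityDCNV");
    auto deleteDC         = (PFNWGLDELETEDCNVPROC)wglGetProcAddress("wglDeleteDCNV");
    if (!enumGpus || !createAffinityDC || !deleteDC)
        return;  // extension not exposed by the driver

    // Only GPUs driven by the NVIDIA driver are enumerated here;
    // an Intel integrated chip never shows up in this list.
    HGPUNV gpu;
    for (UINT i = 0; enumGpus(i, &gpu); ++i)
    {
        HGPUNV gpuList[2] = { gpu, nullptr };   // NULL-terminated GPU list
        HDC dc = createAffinityDC(gpuList);     // GL contexts created on this DC run on 'gpu'
        // ... call wglCreateContext(dc) here if you actually want to render on it ...
        deleteDC(dc);
    }
}

So the whole mechanism is about pinning contexts to one of several NVIDIA GPUs; it can't select between vendors.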

Check page 3 of the Optimus Rendering Policies guide at https://developer.nvidia.com/optimus to see how to enable the discrete GPU for your app.

According to that document you can do this:

Starting with the Release 302 drivers, application developers can direct the Optimus driver at runtime to use the High Performance Graphics to render any application, even those applications for which there is no existing application profile. They can do this by exporting a global variable named NvOptimusEnablement. The Optimus driver looks for the existence and value of the export. Only the LSB of the DWORD matters at this time. A value of 0x00000001 indicates that rendering should be performed using High Performance Graphics. A value of 0x00000000 indicates that this method should be ignored.

extern "C" {
_declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
}
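
Two caveats: the extern "C" is needed so the symbol isn't C++-mangled, and as far as I can tell the export has to live in the .exe itself; exporting it from a DLL your app loads doesn't trigger the driver's check.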

Is there something equivalent for AMD cards?
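
As far as I know, yes: AMD's switchable-graphics (PowerXpress) drivers look for a similarly-named export, AmdPowerXpressRequestHighPerformance. I'm going from memory of AMD's documentation here, so treat this as a sketch:

#include <windows.h>

extern "C" {
    // AMD's counterpart to NvOptimusEnablement; a nonzero value requests the discrete GPU.
    __declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
}

Many projects export both symbols side by side so the same binary picks the discrete GPU on either vendor's switchable-graphics setup.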