The GL spec's treatment of double-precision attributes is odd. It says that all `double` and `dvec*` types take up only one attribute location for assignment purposes. But when counting attributes against the implementation's limit, the driver may count `dvec3` and `dvec4` attributes twice.
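For concreteness, here's a hypothetical vertex shader sketch illustrating the discrepancy. The attribute names are made up; the point is that locations advance by one per `dvec4`, yet on a double-counting implementation each declaration eats two slots against `GL_MAX_VERTEX_ATTRIBS`:

```glsl
#version 410 core

// Each dvec4 occupies exactly one location for assignment,
// so consecutive locations are legal here.
layout(location = 0) in dvec4 a0;
layout(location = 1) in dvec4 a1;
layout(location = 2) in dvec4 a2;
// ...continue up toward GL_MAX_VERTEX_ATTRIBS. If the limit is 16
// and the driver counts each dvec4 twice, a shader with more than
// 8 of these may fail to link, despite the locations being valid.

void main()
{
    gl_Position = vec4(a0 + a1 + a2);
}
```

Whether the link actually fails past the halfway point is exactly the implementation-dependent behavior in question.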

I'm curious: on what hardware do these attributes actually count twice? I don't have access to any GL 4.x hardware, so I'd be interested to know whether it's an AMD or NVIDIA thing, so I know whom to blame for yet another OpenGL WTF moment.