Hi,

According to the document below (*), GK110 GPUs (e.g. the GTX 780) offer atomicMin/Max/And/Or/Xor operations on 64-bit unsigned integers.
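To be concrete, here is a minimal CUDA sketch of the kind of operation I mean (my own illustration, not taken from the whitepaper; the kernel and buffer names are just placeholders). The 64-bit atomicMin/Max/And/Or/Xor overloads require compute capability 3.5, i.e. GK110:

```cuda
// 64-bit atomicMin on GK110 (compile with e.g. `nvcc -arch=sm_35 atomics64.cu`).
// Illustrative only: names and launch configuration are placeholders.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void min64(unsigned long long *result,
                      const unsigned long long *data, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        atomicMin(result, data[i]);   // 64-bit atomicMin, sm_35 and up
    // atomicMax / atomicAnd / atomicOr / atomicXor have the same
    // unsigned long long overloads on sm_35.
}

int main()
{
    const int n = 256;
    unsigned long long h[n], init = ~0ULL;   // start from the maximum value
    for (int i = 0; i < n; ++i) h[i] = 1000ULL + i;

    unsigned long long *d_data, *d_result;
    cudaMalloc(&d_data, n * sizeof(unsigned long long));
    cudaMalloc(&d_result, sizeof(unsigned long long));
    cudaMemcpy(d_data, h, n * sizeof(unsigned long long), cudaMemcpyHostToDevice);
    cudaMemcpy(d_result, &init, sizeof(unsigned long long), cudaMemcpyHostToDevice);

    min64<<<(n + 127) / 128, 128>>>(d_result, d_data, n);

    unsigned long long out;
    cudaMemcpy(&out, d_result, sizeof(unsigned long long), cudaMemcpyDeviceToHost);
    printf("min = %llu\n", out);   // expect 1000

    cudaFree(d_data);
    cudaFree(d_result);
    return 0;
}
```

I would like to do the equivalent from a shader.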

How can I access this functionality from GLSL? It is defined in neither ARB_shader_image_load_store nor NV_shader_buffer_store.

Thanks,
Fred

(*) http://www.nvidia.com/content/PDF/ke...Whitepaper.pdf