OpenGL.org

Thread: GeForce second GPU usage

  1. #1
    Newbie Newbie
    Join Date
    Nov 2013
    Posts
    3

    GeForce second GPU usage

    I have a PC with two GeForce GTX 660 GPUs. I want to use both GPUs to render each frame (similar to the old split-frame-rendering SLI mode).

    DirectX lets me use the second GPU via the adapter enumerator, but in OpenGL I cannot find a way to use the second card without SLI. Whatever we try, the second GPU is never used for rendering; we have verified this with GPU-load tools such as FurMark.

    We have also tried the monitor-enumeration functions in the Windows API and forced the context to be created on the second monitor, which is connected to the second GPU. Yet rendering is still done on the first GPU.

    NVIDIA slides suggest using GPU affinity (WGL_NV_gpu_affinity), but this functionality is only available on Quadro graphics cards.

    How can I use the second card for rendering in OpenGL?
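    For reference, the GPU-affinity path the NVIDIA slides describe looks roughly like the sketch below. This is a hypothetical illustration, not a tested program: the function pointer types come from the WGL_NV_gpu_affinity extension spec, they must be fetched with wglGetProcAddress after a dummy context is current, and (as noted above) the driver only exposes the extension on Quadro cards, so on GeForce the wglGetProcAddress calls will simply return NULL.

```c
/* Sketch of creating an OpenGL context pinned to one GPU via
   WGL_NV_gpu_affinity. Quadro-only; on GeForce the extension is absent. */
#include <windows.h>
#include <GL/gl.h>

DECLARE_HANDLE(HGPUNV);  /* normally comes from wglext.h */
typedef BOOL (WINAPI *PFNWGLENUMGPUSNVPROC)(UINT iGpuIndex, HGPUNV *phGpu);
typedef HDC  (WINAPI *PFNWGLCREATEAFFINITYDCNVPROC)(const HGPUNV *phGpuList);

HGLRC create_context_on_gpu(UINT gpu_index)
{
    /* Requires a current (dummy) context for wglGetProcAddress to work. */
    PFNWGLENUMGPUSNVPROC wglEnumGpusNV =
        (PFNWGLENUMGPUSNVPROC)wglGetProcAddress("wglEnumGpusNV");
    PFNWGLCREATEAFFINITYDCNVPROC wglCreateAffinityDCNV =
        (PFNWGLCREATEAFFINITYDCNVPROC)wglGetProcAddress("wglCreateAffinityDCNV");
    if (!wglEnumGpusNV || !wglCreateAffinityDCNV)
        return NULL;                     /* extension not exposed (GeForce) */

    HGPUNV gpus[2] = { 0, 0 };           /* NULL-terminated GPU list */
    if (!wglEnumGpusNV(gpu_index, &gpus[0]))
        return NULL;                     /* no GPU at that index */

    HDC affinity_dc = wglCreateAffinityDCNV(gpus);
    if (!affinity_dc)
        return NULL;

    PIXELFORMATDESCRIPTOR pfd = { sizeof(pfd), 1,
        PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER, PFD_TYPE_RGBA, 32 };
    int fmt = ChoosePixelFormat(affinity_dc, &pfd);
    SetPixelFormat(affinity_dc, fmt, &pfd);

    /* Every draw through this context runs on the selected GPU. */
    return wglCreateContext(affinity_dc);
}
```

    Rendering through an affinity context is typically done offscreen (into an FBO), since the affinity DC is not tied to a window.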

  2. #2
    Senior Member OpenGL Pro
    Join Date
    Jan 2012
    Location
    Australia
    Posts
    1,109
    I don't believe you can use two NVIDIA cards independently without SLI or a Quadro card in the mix

  3. #3
    Senior Member OpenGL Guru Dark Photon's Avatar
    Join Date
    Oct 2004
    Location
    Druidia
    Posts
    3,126
    On Linux you can easily. I guess this is just a limitation on the Windows side.

  4. #4
    Newbie Newbie
    Join Date
    Nov 2013
    Posts
    3
    Quote Originally Posted by Dark Photon View Post
    On Linux you can easily. I guess this is just a limitation on the Windows side.
    How can we use the second GPU on Linux? Does the NVIDIA driver distribute the work regardless of our context, or do we need to do something special when creating the context in order to use the second GPU?

  5. #5
    Senior Member OpenGL Guru Dark Photon's Avatar
    Join Date
    Oct 2004
    Location
    Druidia
    Posts
    3,126
    The latter (since you're presuming no SLI). You designate which GPU you're talking to.
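    Concretely, this usually means configuring the X server so each GPU drives its own separate X screen (no Xinerama, no SLI). A hypothetical xorg.conf fragment might look like the following; the BusID values are examples, so check yours with `lspci | grep VGA`:

```
# One X screen per GPU -- BusIDs are illustrative, not yours.
Section "Device"
    Identifier "GPU0"
    Driver     "nvidia"
    BusID      "PCI:1:0:0"
EndSection

Section "Device"
    Identifier "GPU1"
    Driver     "nvidia"
    BusID      "PCI:2:0:0"
EndSection

Section "Screen"
    Identifier "Screen0"
    Device     "GPU0"
EndSection

Section "Screen"
    Identifier "Screen1"
    Device     "GPU1"
EndSection

Section "ServerLayout"
    Identifier "Layout0"
    Screen 0 "Screen0"
    Screen 1 "Screen1" RightOf "Screen0"
EndSection
```

    With this layout, contexts created on screen 0 render on the first GPU and contexts created on screen 1 render on the second.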

  6. #6
    Newbie Newbie
    Join Date
    Nov 2013
    Posts
    3
    Thanks for your replies. I found that on Linux we can select the GPU we want via XOpenDisplay. On Windows, however, EnumDisplayMonitors does not do the same thing. The only option on Windows seems to be GPU affinity, but that is only available on Quadro cards, not GeForce.
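    The XOpenDisplay approach mentioned above can be sketched as follows, assuming the two GPUs are configured as separate X screens: the display string ":0.1" selects screen 1, i.e. the second GPU. The display name is an assumption that must match your X setup, and this is a minimal untested sketch, not a complete renderer.

```c
/* Minimal sketch: create a GLX context on the second GPU by opening
   X screen 1 directly. Assumes one X screen per GPU (no Xinerama). */
#include <stdio.h>
#include <X11/Xlib.h>
#include <GL/glx.h>

int main(void)
{
    Display *dpy = XOpenDisplay(":0.1");    /* screen 1 = second GPU */
    if (!dpy) {
        fprintf(stderr, "cannot open display :0.1\n");
        return 1;
    }

    int attribs[] = { GLX_RGBA, GLX_DOUBLEBUFFER, GLX_DEPTH_SIZE, 24, None };
    int screen = DefaultScreen(dpy);        /* screen 1 for ":0.1" */
    XVisualInfo *vi = glXChooseVisual(dpy, screen, attribs);
    if (!vi) {
        fprintf(stderr, "no suitable visual on screen %d\n", screen);
        return 1;
    }

    /* Everything rendered through this context runs on the second GPU. */
    GLXContext ctx = glXCreateContext(dpy, vi, NULL, True);
    printf("context created on screen %d\n", screen);

    glXDestroyContext(dpy, ctx);
    XCloseDisplay(dpy);
    return 0;
}
```

    A frame can then be split across the cards by running one such context per GPU, each in its own thread, and compositing the results, which is essentially a hand-rolled version of split-frame rendering.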
