
Thread: Can one actually select the GPU on which to create a GL context? (for Windows)


  1. #1
    Junior Member Newbie | Join Date: Oct 2012 | Posts: 12

    Can one actually select the GPU on which to create a GL context? (for Windows)

    Hi Everyone,

    I am working on a research project where we need to ship our prototype (a small game that uses OpenGL) to our participants. Needless to say, this application has to be robust and work with many different GPU configurations. Because we need better control over the messages the OS sends to the app, we decided to abandon GLUT and do plain Win32 development instead. Moreover, in the case described below, GLUT didn't work either.

    However, yesterday I ran into a laptop with a somewhat unusual spec, which I believe will become the norm pretty soon: a Core i7 paired with an NVIDIA chip. This effectively gives Windows two GPUs: an Intel HD 3000 and an NVIDIA 5xx.

    I found that to control which GPU is used we need GPU affinity (WGL_NV_gpu_affinity, for NVIDIA) or GPU association (WGL_AMD_gpu_association, for AMD/ATI). However, when you create a GL context on Windows it ends up on the Intel chip, so the affinity extension functions (the ones that matter here, since the discrete GPU is NVIDIA) are not available. I tried to load the functions both manually and through GLEW.
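
    For reference, here is roughly the dummy-context check I ran (a trimmed-down sketch, not my exact code; link against opengl32.lib, gdi32.lib and user32.lib). The vendor string comes back as Intel and neither of the GPU-selection extensions shows up:

    Code:
    #include <windows.h>
    #include <GL/gl.h>
    #include <cstring>
    #include <cstdio>

    // Declared by hand so the sketch does not need wglext.h.
    typedef const char* (WINAPI *PFNWGLGETEXTENSIONSSTRINGARBPROC)(HDC);

    int main()
    {
        // Dummy window + basic pixel format, just to get a current GL context.
        HWND wnd = CreateWindowA("STATIC", "dummy", WS_OVERLAPPEDWINDOW,
                                 0, 0, 64, 64, NULL, NULL, GetModuleHandleA(NULL), NULL);
        HDC dc = GetDC(wnd);

        PIXELFORMATDESCRIPTOR pfd = { (WORD)sizeof(pfd), 1,
            PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER,
            PFD_TYPE_RGBA, 32 };
        SetPixelFormat(dc, ChoosePixelFormat(dc, &pfd), &pfd);

        HGLRC rc = wglCreateContext(dc);   // this lands on the Intel chip
        wglMakeCurrent(dc, rc);

        printf("GL_VENDOR: %s\n", (const char*)glGetString(GL_VENDOR));

        // Check the WGL extension string for the GPU-selection extensions.
        PFNWGLGETEXTENSIONSSTRINGARBPROC wglGetExtensionsStringARB =
            (PFNWGLGETEXTENSIONSSTRINGARBPROC)wglGetProcAddress("wglGetExtensionsStringARB");
        const char* ext = wglGetExtensionsStringARB ? wglGetExtensionsStringARB(dc) : "";
        printf("WGL_NV_gpu_affinity:     %s\n",
               strstr(ext, "WGL_NV_gpu_affinity") ? "yes" : "no");
        printf("WGL_AMD_gpu_association: %s\n",
               strstr(ext, "WGL_AMD_gpu_association") ? "yes" : "no");

        wglMakeCurrent(NULL, NULL);
        wglDeleteContext(rc);
        ReleaseDC(wnd, dc);
        DestroyWindow(wnd);
        return 0;
    }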

    Does anyone know how to select the NVIDIA GPU in this case?

    Does anyone have a similar setup but with an ATI/AMD chip? If so, do you have the same problem?

    Thanks in advance for any help!

  3. #3
    Junior Member Newbie | Join Date: Oct 2012 | Posts: 12
    Thanks,

    I wonder if this will change, given that new Intel CPUs are shipping with HD 3000/4000 graphics: every new laptop that also has discrete graphics will present two devices to the OS, and by default the Intel one will be used.

  4. #4
    Super Moderator OpenGL Guru dorbie | Join Date: Jul 2000 | Location: Bay Area, CA, USA | Posts: 3,946
    AMD has an extension for this; here is the whitepaper describing it:

    http://developer.amd.com/tools/gpu/A...WhitePaper.pdf
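
    In case it is useful, here is a rough, untested sketch of what the association path from that paper looks like. It assumes an ordinary GL context is already current (so wglGetProcAddress() returns valid pointers) and that the driver actually exposes WGL_AMD_gpu_association; the helper name is just for illustration:

    Code:
    #include <windows.h>
    #include <GL/gl.h>
    #include <GL/wglext.h>   // typedefs and tokens for WGL_AMD_gpu_association
    #include <cstdio>

    void createContextOnChosenAmdGpu()
    {
        PFNWGLGETGPUIDSAMDPROC wglGetGPUIDsAMD =
            (PFNWGLGETGPUIDSAMDPROC)wglGetProcAddress("wglGetGPUIDsAMD");
        PFNWGLGETGPUINFOAMDPROC wglGetGPUInfoAMD =
            (PFNWGLGETGPUINFOAMDPROC)wglGetProcAddress("wglGetGPUInfoAMD");
        PFNWGLCREATEASSOCIATEDCONTEXTAMDPROC wglCreateAssociatedContextAMD =
            (PFNWGLCREATEASSOCIATEDCONTEXTAMDPROC)wglGetProcAddress("wglCreateAssociatedContextAMD");
        PFNWGLMAKEASSOCIATEDCONTEXTCURRENTAMDPROC wglMakeAssociatedContextCurrentAMD =
            (PFNWGLMAKEASSOCIATEDCONTEXTCURRENTAMDPROC)wglGetProcAddress("wglMakeAssociatedContextCurrentAMD");

        if (!wglGetGPUIDsAMD || !wglGetGPUInfoAMD ||
            !wglCreateAssociatedContextAMD || !wglMakeAssociatedContextCurrentAMD)
            return;   // driver does not expose the extension

        // Enumerate the GPUs the driver knows about and print their names.
        UINT ids[8] = { 0 };
        UINT count = wglGetGPUIDsAMD(8, ids);
        for (UINT i = 0; i < count; ++i) {
            char renderer[256] = { 0 };
            wglGetGPUInfoAMD(ids[i], WGL_GPU_RENDERER_STRING_AMD,
                             GL_UNSIGNED_BYTE, (UINT)sizeof(renderer), renderer);
            printf("GPU %u: %s\n", ids[i], renderer);
        }

        // Create a context tied to a specific GPU id and make it current,
        // without needing a window DC that lives on that GPU.
        HGLRC rc = wglCreateAssociatedContextAMD(ids[0]);
        if (rc)
            wglMakeAssociatedContextCurrentAMD(rc);
    }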

  5. #5
    Junior Member Newbie
    Join Date
    Oct 2012
    Posts
    12
    Thanks Dorbie, I had already seen that paper, but it is not what I was looking for. I was looking for a hardware-independent API that lets you select one GPU from a set. For instance, in my case I have two GPUs, one from Intel (HD 3000) and one from NVIDIA, and it appears that on Windows the driver decides which one to use for the context. Users can override this logic themselves, but that is not an option for me.

    In such heterogeneous cases (hardware from several manufacturers) we need EXT- or ARB-type extensions, not vendor-specific NV or AMD ones.

    And I am strongly convinced that my case will become prevalent in the near future: every Intel CPU now ships with a GPU, and I do not think users will be satisfied with it, so they will seek out discrete cards from AMD and NVIDIA.

    Best,
    Ildar

  6. #6
    Senior Member OpenGL Guru
    Join Date
    May 2009
    Posts
    4,948
    Quote: "I do not think that users will be satisfied with it"

    How much do you think Microsoft cares? It's their Installable Client Driver model, which they developed at a time when having multiple GPUs made no sense. They never updated it, and I guarantee you they don't plan to. I guarantee you that Intel doesn't plan to put any back-doors into their driver to be able to select some non-Intel graphics driver either.

    So that's what users are stuck with on Windows.
