Part of the Khronos Group
OpenGL.org



Thread: GLX issue since Ubuntu 16 with Python 3 extension (glXChooseFBConfig returns NULL)

  1. #1
    Newbie
    Join Date: Sep 2017
    Posts: 2

    GLX issue since Ubuntu 16 with Python 3 extension (glXChooseFBConfig returns NULL)

    I've spent the day tracking down this issue and I'm running out of places to look for answers...

    I'm building a CPython extension which uses OpenGL. The core library works fine when tested outside of Python (as a straight C++ executable).
    However, when calling the exact same code from the Python extension (a .so) the glXChooseFBConfig call returns NULL.

    I've tried modifying my initialization code with no luck.
    - I've traced all shared-object load events under GDB to confirm that the working C++ version and the failing extension load the same libraries, and they do.
    - I'm turning to this forum because the issue happens on two very different machines, both running Ubuntu 16: one Intel x64 and the other a Tegra X2 devkit.
    - Even with export DISPLAY=:0, it still fails on GetX11Display().

    Any idea on where I could look next?
    Thanks

  2. #2
    OpenGL Guru Dark Photon
    Join Date: Oct 2004
    Location: Druidia
    Posts: 4,167
    Quote Originally Posted by papaours View Post
    I'm building a CPython extension which uses OpenGL. The core library is working fine when tested out of Python (straight C++ exe).
    Ok, that's good.

    However, when calling the exact same code from the Python extension (a .so) the glXChooseFBConfig call returns NULL.
    From that, I infer that:
    1) You have called XOpenDisplay(), and
    2) It succeeded.

    Correct?

    What "display_name" did you pass to "XOpenDisplay()"?

    ...the issue happens on two very different machines both running Ubuntu 16. One Intel X64 and the other a Tegra X2 devkit.
    Ok. Run "glxinfo" on both of these boxes and post the output inside of [code]...[/code] tags. Please also report what $DISPLAY setting you're using on each of these two boxes. This should correspond to the "display_name" that you've been passing "XOpenDisplay()".

    export DISPLAY=:0 fails on GetX11Display().
    What do you mean? The export command just sets a shell variable and shouldn't fail. Are you instead saying that when you set DISPLAY to :0 and then run (some program), then it fails on GetX11Display()? What app were you running? Also, what is GetX11Display() and what is it doing? Is this some function you wrote?

  3. #3
    Newbie
    Join Date: Sep 2017
    Posts: 2
    Thanks for taking the time to answer; I just solved my problem...
    It was (as usual) a stupid oversight on my end.

    The X11 display variable was unfortunately stored in a static global, and the .so was setting it back to NULL when loading the renderer (the variable lives in the window system, which is a core component).

    What really threw me off was that the problem was not there when running the executable directly. I'm not sure why, but meh...
