GLX issue since Ubuntu 16 with Python 3 extension (glXChooseFBConfig returns NULL)

I’ve spent the day tracking down this issue and I’m running out of places to look for answers…

I’m building a CPython extension which uses OpenGL. The core library works fine when tested outside of Python (as a straight C++ exe).
However, when calling the exact same code from the Python extension (a .so), the glXChooseFBConfig call returns NULL.

I’ve tried modifying my initialization code with no luck.
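For context, the initialization path boils down to something like this (a minimal sketch, not my exact code; the attribute list is representative):

    #include <GL/glx.h>   // pulls in X11/Xlib.h as well

    // Representative FBConfig attributes; my real list is similar.
    static const int attribs[] = {
        GLX_X_RENDERABLE,  True,
        GLX_RENDER_TYPE,   GLX_RGBA_BIT,
        GLX_DRAWABLE_TYPE, GLX_WINDOW_BIT,
        GLX_RED_SIZE,   8,
        GLX_GREEN_SIZE, 8,
        GLX_BLUE_SIZE,  8,
        GLX_DOUBLEBUFFER, True,
        None
    };

    Display* dpy = XOpenDisplay(nullptr);  // nullptr => use $DISPLAY
    int count = 0;
    GLXFBConfig* configs = dpy
        ? glXChooseFBConfig(dpy, DefaultScreen(dpy), attribs, &count)
        : nullptr;  // this comes back NULL in the extension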

  • I’ve traced all shared-object load events under GDB to confirm that the working C++ version and the failing extension load the same libraries, and they do.
  • I’m turning to this subreddit as the issue happens on two very different machines, both running Ubuntu 16: one Intel x64 and the other a Tegra X2 devkit.
  • export DISPLAY=:0 fails on GetX11Display().

Any idea on where I could look next?
Thanks

The core library works fine when tested outside of Python (as a straight C++ exe).

Ok, that’s good.

However, when calling the exact same code from the Python extension (a .so), the glXChooseFBConfig call returns NULL.

From that, I infer that:

  1. You have called XOpenDisplay(), and
  2. It succeeded.

Correct?

What “display_name” did you pass to “XOpenDisplay()”?
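For reference, passing NULL as display_name makes Xlib fall back to $DISPLAY; a minimal sketch of the two cases:

    #include <X11/Xlib.h>

    Display* dpy = XOpenDisplay(nullptr);  // nullptr: Xlib reads $DISPLAY
    if (!dpy) {
        // No connection: bad/unset DISPLAY, or no reachable X server.
    }
    Display* dpy2 = XOpenDisplay(":0");    // explicit display name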

…the issue happens on two very different machines, both running Ubuntu 16: one Intel x64 and the other a Tegra X2 devkit.

Ok. Run “glxinfo” on both of these boxes and post the output inside of [noparse]…[/noparse] tags. Please also report what $DISPLAY setting you’re using on each of these two boxes. This should correspond to the “display_name” that you’ve been passing to “XOpenDisplay()”.

export DISPLAY=:0 fails on GetX11Display().

What do you mean? The export command just sets an environment variable and shouldn’t fail. Are you instead saying that when you set DISPLAY to :0 and then run (some program), it fails in GetX11Display()? What app were you running? Also, what is GetX11Display() and what is it doing? Is this some function you wrote?

Thanks for taking the time to answer; I just solved my problem…
It was (as usual) a stupid oversight on my end.

The X11 display pointer was unfortunately stored in a static global, and the .so was resetting it to NULL when loading the renderer (the variable lives in the window system, which is a core component).
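In other words, the bug looked roughly like this (a sketch with illustrative names, not my actual source):

    // window_system.cpp -- core component that owns the display handle.
    #include <X11/Xlib.h>

    // Static global: when the renderer .so loaded, this was being
    // re-initialized back to NULL, clobbering the live connection.
    static Display* g_display = nullptr;

    Display* GetX11Display() { return g_display; }

    void InitWindowSystem() {
        if (!g_display)
            g_display = XOpenDisplay(nullptr);
    }

So the extension ended up working with a NULL display handle, and the FBConfig lookup failed.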

What really threw me off was that the problem didn’t occur when running the executable directly. I’m not sure why, but meh… :)
