Hello board,
I am working on a complex Qt/OpenGL application.
Xorg starts leaking VRAM while I'm using the application and never releases the memory until I restart X, of course.
$ nvidia-smi
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 390.48                 Driver Version: 390.48                    |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|===============================+======================+======================|
|   0  GeForce GTX 105...  Off  | 00000000:01:00.0 Off |                  N/A |
| N/A   46C    P8     4W /  N/A |     50MiB /  4040MiB |      0%      Default |
+-------------------------------+----------------------+----------------------+
+-----------------------------------------------------------------------------+
| Processes:                                                       GPU Memory |
|  GPU       PID   Type   Process name                             Usage      |
|=============================================================================|
|    0     29628      G   /usr/lib/xorg-server/Xorg                     47MiB |
+-----------------------------------------------------------------------------+
$ ./myOpenGLQtBasedApp ... doing graphics stuff, then exiting
$ nvidia-smi
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 390.48                 Driver Version: 390.48                    |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|===============================+======================+======================|
|   0  GeForce GTX 105...  Off  | 00000000:01:00.0 Off |                  N/A |
| N/A   46C    P8     4W /  N/A |    110MiB /  4040MiB |      0%      Default |
+-------------------------------+----------------------+----------------------+
+-----------------------------------------------------------------------------+
| Processes:                                                       GPU Memory |
|  GPU       PID   Type   Process name                             Usage      |
|=============================================================================|
|    0     29628      G   /usr/lib/xorg-server/Xorg                    107MiB |
+-----------------------------------------------------------------------------+
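I have not managed to reduce this to a minimal test case yet, so to make the pattern concrete, here is a hypothetical sketch of the kind of OpenGL usage the app performs between the two nvidia-smi snapshots above (not our actual code): create a context, upload some textures, render, exit.

// Hypothetical minimal example -- NOT our real application, just the same
// pattern of GL usage: create a context, upload textures, render, exit.
#include <QGuiApplication>
#include <QImage>
#include <QOffscreenSurface>
#include <QOpenGLContext>
#include <QOpenGLTexture>

int main(int argc, char *argv[])
{
    QGuiApplication app(argc, argv);

    QOpenGLContext context;
    context.create();

    QOffscreenSurface surface;
    surface.setFormat(context.format());
    surface.create();

    context.makeCurrent(&surface);

    QImage img(2048, 2048, QImage::Format_RGBA8888);
    img.fill(Qt::green);

    for (int i = 0; i < 16; ++i) {
        QOpenGLTexture tex(img);   // uploads ~16 MiB to VRAM
        // ... draw with tex ...
    }                              // destructor calls glDeleteTextures()

    context.doneCurrent();
    return 0;
}

The real application does much more than this, but that is the general shape of what runs before the process exits cleanly.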
The version of Xorg does not matter; I tested a few.
The version of the driver does not matter, as long as it's NVIDIA; I tested 340, 384, and 390.
The Linux distribution does not matter; I tested Ubuntu 16.04, 18.04, and Fedora.
The DE (desktop environment) does not matter; I tested Unity, GNOME Shell, Xfce, LXDE + Compton, and Openbox + Compton.
Which compositor is used does not matter, but the leak disappears without a compositor.
I did not test Wayland.
Do you know what could cause this behavior?
Could this be due to OpenGL context sharing?
If so, where and how is it set up in our application or in Qt? Could we force OpenGL not to share anything between processes?
If not, what in our code could create this behavior?
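In case it helps whoever looks into this: as far as I can tell from the Qt docs, context sharing is opt-in and only ever happens between contexts of the same process (GLX has no cross-process sharing that I know of). A sketch of the two places where sharing is switched on or off in Qt 5 (Qt 5.4+; Qt::AA_ShareOpenGLContexts and QOpenGLContext::setShareContext() are the standard APIs):

// Where OpenGL context sharing is controlled in Qt 5. Note that both
// mechanisms share only between contexts of the SAME process.
#include <QDebug>
#include <QGuiApplication>
#include <QOpenGLContext>

int main(int argc, char *argv[])
{
    // (1) Global sharing between all Qt-created contexts. Off by default,
    //     and it must be set BEFORE the application object is constructed:
    // QCoreApplication::setAttribute(Qt::AA_ShareOpenGLContexts);

    QGuiApplication app(argc, argv);

    QOpenGLContext first;
    first.create();

    // (2) Explicit per-context sharing; must be set before create().
    //     Passing nullptr (the default) means "share nothing".
    QOpenGLContext second;
    second.setShareContext(&first);
    second.create();

    qDebug() << "sharing:" << QOpenGLContext::areSharing(&first, &second);
    return 0;
}

So unless we set that attribute or call setShareContext() ourselves somewhere, nothing should be shared in the first place, if I read the docs right.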