Avoiding the tearing effect in a camera acquisition project

Hello,

I have developed an application that connects to a camera, grabs images, and displays them as a real-time video stream.
The application is written in C++, the GUI is built with wxWidgets, the OS is Raspbian, and the rendering is based on OpenGL.
I have followed the steps to activate the OpenGL driver and selected the “GL (Fake KMS) Desktop Driver”.

The CPU usage dropped quite a lot but is still high: it starts at around 17% and increases over time, reaching 50%.
The problem is that the video shows a tearing effect, which is visible in the top part of the screen.

Are there any options I can enable to avoid this effect?
Thanks!

A SwapInterval of 1 is the usual default for most window system interface (WSI) layers that OpenGL is stacked on top of (for example: WGL, GLX, AGL, EGL). If implemented properly by the WSI, this should give you no tearing.
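If you want to verify what you are actually getting, here is a minimal sketch of querying the effective swap interval, assuming your wxWidgets build goes through GLX on an X11 desktop and that the GLX_EXT_swap_control extension is available; the helper name and the EGL alternative mentioned in the comment are only illustrative:

```cpp
// Sketch: query the effective swap interval via GLX_EXT_swap_control.
// Assumes an X11/GLX stack with a current OpenGL context; if your setup
// creates the context through EGL instead, the equivalent information is
// whatever interval you last passed to eglSwapInterval().
#include <GL/glx.h>
#include <GL/glxext.h>
#include <cstdio>

void printSwapInterval(Display* dpy, GLXDrawable drawable)
{
    unsigned int interval = 0;
    // GLX_SWAP_INTERVAL_EXT is only valid when GLX_EXT_swap_control is exposed,
    // so check the extension string in real code before relying on this.
    glXQueryDrawable(dpy, drawable, GLX_SWAP_INTERVAL_EXT, &interval);
    std::printf("Current swap interval: %u (0 means tearing is allowed)\n", interval);
}
```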

That said, it’s not uncommon for drivers to allow you to override the OpenGL application’s SwapInterval setting. So there may be some configuration you need to do on your RPi to ensure that the driver honours an application’s request for no tearing.

A web search turns up plenty of hits from others who had to do some tweaking on their RPi to avoid tearing issues. A few searches you may want to try: “rpi opengl vsync” and “rpi opengl tearing”. Also see the OpenGL wiki page on Swap Interval for more info.

Also, for what it’s worth, these are the calls you might make to set the swap interval in an OpenGL app, depending on your WSI: eglSwapInterval() (EGL), glXSwapIntervalEXT() (GLX), wglSwapIntervalEXT() (WGL), etc. You might web search the first two together with “rpi” as well.
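For illustration, here is a minimal sketch of the EGL variant, assuming your context is created through EGL (common on the RPi); the helper name is made up and error handling is reduced to a bare check:

```cpp
// Sketch: request vsync (swap interval 1) through EGL after the context has
// been made current. Assumes you already have a valid EGLDisplay.
#include <EGL/egl.h>
#include <cstdio>

bool enableVsync(EGLDisplay display)
{
    // 1 = wait for one vertical retrace per eglSwapBuffers() call (no tearing),
    // 0 = swap immediately (tearing possible, lower latency).
    if (eglSwapInterval(display, 1) != EGL_TRUE)
    {
        std::fprintf(stderr, "eglSwapInterval failed: 0x%04x\n", eglGetError());
        return false;
    }
    return true;
}
```

Whether this actually prevents tearing still depends on the driver honouring the request, per the override caveat above.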

Thank you for your reply. In the next few days I will try that function and post the result!