Currently, all touch gestures are recognized relative to the device's initial orientation. This is problematic for users who rotate their device and expect gestures to adjust accordingly.
As far as I'm aware, libinput does not expose accelerometer or device-orientation data that could be used to adjust gestures.
One way to fix this would be to read accelerometer data through X11, as https://github.com/GuLinux/ScreenRotator does.
Implementing this is above my skill level for now.
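As a rough illustration of what the fix might look like once orientation data is available (from an accelerometer or otherwise), the gesture layer could remap swipe directions before matching them against configured actions. This is only a sketch with made-up names; the orientation strings mirror the convention used by xrandr, and the exact rotation sign for "left"/"right" would need to be verified against real hardware.

```python
# Hypothetical sketch: remap a physical swipe direction into the logical
# direction for the current screen orientation. Names and mapping are
# illustrative, not part of any existing API.

ROTATION_STEPS = {"normal": 0, "left": 1, "inverted": 2, "right": 3}
DIRECTIONS = ["up", "right", "down", "left"]  # clockwise order

def remap_swipe(direction: str, orientation: str) -> str:
    """Rotate a swipe direction by the screen's current orientation.

    The sign convention (whether "left" means +90 or -90 degrees)
    is an assumption here and may need flipping in practice.
    """
    steps = ROTATION_STEPS[orientation]
    idx = DIRECTIONS.index(direction)
    return DIRECTIONS[(idx + steps) % 4]
```

With something like this in place, a gesture recognized as "up" on a device rotated 180 degrees would be dispatched as "down", matching what the user physically performed.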