Summary:
Device is created from Connection::processEvents, which runs in the
main GUI thread, while Connection itself is in a different thread. Thus
passing *this* as parent is wrong.
This change removes the parent, moves the created Device into the
Connection's thread and deletes it properly by using deleteLater.
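A minimal sketch of the intended lifetime handling, assuming illustrative
names like nativeDevice, connectionThread and m_devices:

// Create the Device without a parent (we are in the main GUI thread here)
// and push it into the Connection's thread, so its deletion is handled by
// that thread's event loop rather than by an object in another thread.
Device *device = new Device(nativeDevice);   // nativeDevice: libinput_device*
device->moveToThread(connectionThread);
m_devices << device;

// On removal, schedule deletion in the device's own thread instead of
// deleting it directly from the calling thread.
m_devices.removeOne(device);
device->deleteLater();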
Reviewers: #plasma
Subscribers: plasma-devel
Tags: #plasma
Differential Revision: https://phabricator.kde.org/D1746
Summary:
The signals emitted by LibInput::Connection carry the Device for which
the input event was received. This Device is passed to the input handlers.
Custom event classes are added which extend QMouseEvent, QKeyEvent and
QWheelEvent respectively and expose the Device. The Device is only passed
around as a forward-declared pointer, so the code still compiles even
without libinput support.
Event handlers which need to get access to the Device can now just cast
the event pointer to the custom class and access it. This can be used in
the future to handle device-specific key codes, etc.
As we don't have proper event classes for touch events, the event
handlers do not yet have access to the Device. Here the internal API
needs to be adjusted in the future.
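A hedged sketch of what such a class might look like for the mouse case;
the name and constructor arguments are illustrative, not the actual KWin API:

#include <QMouseEvent>

namespace LibInput { class Device; }  // forward declaration is sufficient

// QMouseEvent subclass that additionally carries the originating Device.
class MouseEvent : public QMouseEvent
{
public:
    MouseEvent(QEvent::Type type, const QPointF &pos, Qt::MouseButton button,
               Qt::MouseButtons buttons, Qt::KeyboardModifiers modifiers,
               LibInput::Device *device)
        : QMouseEvent(type, pos, button, buttons, modifiers)
        , m_device(device)
    {}
    LibInput::Device *device() const { return m_device; }

private:
    LibInput::Device *m_device;
};

// An event handler which knows it received such an event can downcast:
//   auto me = static_cast<MouseEvent*>(event);
//   LibInput::Device *dev = me->device();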
Reviewers: #plasma
Subscribers: plasma-devel
Tags: #plasma
Differential Revision: https://phabricator.kde.org/D1667
Summary:
The Event class now holds a pointer to the Device and not only to the
native libinput_device.
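A reduced sketch of the resulting accessors, with assumed member names:

// The wrapper now exposes the owning Device next to the raw libinput handles.
class Event
{
public:
    Device *device() const { return m_device; }
    libinput_device *nativeDevice() const { return libinput_event_get_device(m_event); }

private:
    Device *m_device = nullptr;
    libinput_event *m_event = nullptr;
};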
Reviewers: #plasma
Subscribers: plasma-devel
Tags: #plasma
Differential Revision: https://phabricator.kde.org/D1666
Summary: Similar to the already existing hasKeyboard, hasTouch and hasPointer.
Reviewers: #plasma
Subscribers: plasma-devel
Tags: #plasma
Differential Revision: https://phabricator.kde.org/D1671
Summary:
The LibInput::Device provides a way to enable/disable the device.
This is used by the Connection to toggle all touchpad devices on/off
when the touchpad key is pressed. For this KWin "steals" the global
shortcuts from the touchpad kded.
Detecting what is a touchpad is unfortunately not trivial. The code
uses the following approach:
* it's a pointer
* it's not also a keyboard or touch screen
* it's at least one of the following:
** supports multiple tap fingers
** supports disable while typing
** supports disable on external mouse
If the code finds a touchpad and changes its state successfully,
it triggers touchpadEnabledChanged on Plasma's osdService.
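A hedged sketch of that heuristic in terms of the public libinput API (not
necessarily the exact code in Connection):

#include <libinput.h>

// A device counts as a touchpad if it is a pointer, is not also a keyboard
// or touch screen, and supports at least one touchpad-typical option.
static bool isTouchpad(libinput_device *device)
{
    const bool pointer  = libinput_device_has_capability(device, LIBINPUT_DEVICE_CAP_POINTER);
    const bool keyboard = libinput_device_has_capability(device, LIBINPUT_DEVICE_CAP_KEYBOARD);
    const bool touch    = libinput_device_has_capability(device, LIBINPUT_DEVICE_CAP_TOUCH);
    if (!pointer || keyboard || touch) {
        return false;
    }
    const bool multiFingerTap = libinput_device_config_tap_get_finger_count(device) > 1;
    const bool disableWhileTyping = libinput_device_config_dwt_is_available(device);
    const bool disableOnExternalMouse = libinput_device_config_send_events_get_modes(device)
        & LIBINPUT_CONFIG_SEND_EVENTS_DISABLED_ON_EXTERNAL_MOUSE;
    return multiFingerTap || disableWhileTyping || disableOnExternalMouse;
}

Enabling/disabling then maps to libinput's send-events configuration, e.g.
libinput_device_config_send_events_set_mode(device,
LIBINPUT_CONFIG_SEND_EVENTS_ENABLED or LIBINPUT_CONFIG_SEND_EVENTS_DISABLED).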
Test Plan: Tested on a notebook with a touchpad toggle button
Reviewers: #plasma
Subscribers: plasma-devel
Projects: #plasma
Differential Revision: https://phabricator.kde.org/D1545
Summary:
The configuration file kcminput, group Mouse, is parsed to decide whether
pointer devices should be in left-handed mode. The config is applied
whenever a new device is added.
In addition the Connection listens to KGlobalSettings for a mouse
settings changed signal which might be emitted by the mouse KCM.
When such a signal is emitted, all pointer devices are reconfigured.
This allows changing the mouse handedness setting on Wayland.
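A hedged sketch of applying the setting to a single pointer device; the
config key name "MouseButtonMapping" is an assumption for illustration,
the real KCM key may differ:

#include <KConfig>
#include <KConfigGroup>
#include <libinput.h>

static void applyHandedness(libinput_device *device)
{
    if (libinput_device_config_left_handed_is_available(device) == 0) {
        return;
    }
    KConfig config(QStringLiteral("kcminput"));
    const KConfigGroup mouseGroup = config.group(QStringLiteral("Mouse"));
    const bool leftHanded =
        mouseGroup.readEntry(QStringLiteral("MouseButtonMapping"), QStringLiteral("RightHanded"))
        == QLatin1String("LeftHanded");
    libinput_device_config_left_handed_set(device, leftHanded ? 1 : 0);
}

On the KGlobalSettings signal this would simply be re-run for all known
pointer devices.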
Reviewers: #plasma
Subscribers: plasma-devel
Projects: #plasma
Differential Revision: https://phabricator.kde.org/D1543
Summary:
The Device class wraps all the information we can get from libinput
about the device, like whether it's a keyboard, pointer, touch, etc.
In addition, some more information is queried to figure out how "useful"
a device is. For a keyboard the existence of all alphanumeric keys is
checked; for a pointer all (normal) buttons are queried.
All the information is exposed as Q_PROPERTY and used by the
DebugConsole. The DebugConsole gained a new tab "Input Devices" which
renders all devices and their properties in a tree view. When a device is
plugged in or removed, the model gets reset, so it is always up to date.
The new Device class can be used in the future to configure the device,
e.g. disabling the touchpad, setting mouse acceleration, etc.
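A heavily reduced sketch of such a wrapper; the real class exposes far more
properties and the names here are illustrative:

#include <QObject>
#include <libinput.h>

class Device : public QObject
{
    Q_OBJECT
    Q_PROPERTY(QString name READ name CONSTANT)
    Q_PROPERTY(bool keyboard READ isKeyboard CONSTANT)
    Q_PROPERTY(bool pointer READ isPointer CONSTANT)
    Q_PROPERTY(bool touch READ isTouch CONSTANT)
public:
    explicit Device(libinput_device *device)
        : m_device(device)
        , m_name(QString::fromLocal8Bit(libinput_device_get_name(device)))
        , m_keyboard(libinput_device_has_capability(device, LIBINPUT_DEVICE_CAP_KEYBOARD))
        , m_pointer(libinput_device_has_capability(device, LIBINPUT_DEVICE_CAP_POINTER))
        , m_touch(libinput_device_has_capability(device, LIBINPUT_DEVICE_CAP_TOUCH))
    {}
    QString name() const { return m_name; }
    bool isKeyboard() const { return m_keyboard; }
    bool isPointer() const { return m_pointer; }
    bool isTouch() const { return m_touch; }

private:
    libinput_device *m_device;
    QString m_name;
    bool m_keyboard;
    bool m_pointer;
    bool m_touch;
};

The DebugConsole model can then iterate the meta object's properties to
render each device generically.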
Reviewers: #plasma
Subscribers: plasma-devel
Projects: #plasma
Differential Revision: https://phabricator.kde.org/D1538
The Connection thread fills the event queue, which is read from the
main thread. In order to properly support the threaded approach, the
setup is changed to delegate into its own thread.
When the Connection is created we move it into a dedicated thread
so that event processing happens in that thread. Currently all events
are still queued directly.
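A minimal sketch of that setup, with assumed member and slot names:

m_thread = new QThread(this);
m_connection = new Connection();   // no parent: a parented QObject cannot change threads
m_connection->moveToThread(m_thread);
m_thread->start();
// Further setup runs in the Connection's thread via queued invocation
// ("setup" is an assumed invokable slot).
QMetaObject::invokeMethod(m_connection, "setup", Qt::QueuedConnection);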
This change is motivated by the fact that we need to suspend libinput
before switching the virtual terminal. Also we don't want to take over
libinput if we do not have a VirtualTerminal created - in windowed mode
we don't want libinput to be started. So binding it to the backends which
create the VirtualTerminal makes sense.
The KWin::Application gains a new signal virtualTerminalCreated which is
emitted from VirtualTerminal once it's properly set up. This is used by
Input to create Libinput integration instead of binding it to logind.
Furthermore Libinput gets suspended when the VirtualTerminal reports that
it is no longer active. For re-activation we still just use logind's
session active property.
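Roughly, the wiring could look like the following sketch; apart from the
virtualTerminalCreated signal mentioned above, all names (kwinApp,
setupLibInput, VirtualTerminal::self, activeChanged, deactivate) are
assumptions:

// Create the libinput integration only once a VirtualTerminal exists.
connect(kwinApp(), &Application::virtualTerminalCreated,
        this, &InputRedirection::setupLibInput);

// Suspend libinput when the VT reports that it is no longer active.
connect(VirtualTerminal::self(), &VirtualTerminal::activeChanged, this,
    [this](bool active) {
        if (!active && m_libInput) {
            m_libInput->deactivate();   // assumed suspend entry point
        }
    }
);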
When our session becomes inactive, libinput loses all devices, thus our
Seat would have neither keyboard, pointer nor touch. To not confuse
the connected clients we block updates while libinput is suspended. After
resume we check whether something actually changed and emit the
corresponding signals to ensure everything is up to date.
We handle device added/removed events to monitor whether we have keyboard,
pointer and touch devices and emit signals. Those are used to update
the SeatInterface from InputRedirection.
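On the InputRedirection side this could look roughly like the following
sketch; the signal names and the findSeat helper are assumptions, only the
SeatInterface setters are existing KWayland API:

connect(m_libInput, &LibInput::Connection::hasKeyboardChanged, this,
    [this](bool set) {
        if (auto seat = findSeat()) {   // assumed helper returning SeatInterface*
            seat->setHasKeyboard(set);
        }
    }
);
connect(m_libInput, &LibInput::Connection::hasPointerChanged, this,
    [this](bool set) {
        if (auto seat = findSeat()) {
            seat->setHasPointer(set);
        }
    }
);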
LogindIntegration starts monitoring the Active property on the session
and emits a signal when the state changes. The LibInput::Connection
connects to this signal during the setup and uses it to suspend/resume
the libinput context.
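A hedged sketch of the DBus side of that monitoring; the logind service and
interface names are standard, while getSessionActive is an assumed slot that
re-reads the Active property and emits the mentioned signal:

// Subscribe to property changes on our logind session object.
QDBusConnection::systemBus().connect(
    QStringLiteral("org.freedesktop.login1"),
    m_sessionPath,                                   // session object path
    QStringLiteral("org.freedesktop.DBus.Properties"),
    QStringLiteral("PropertiesChanged"),
    this,
    SLOT(getSessionActive()));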
Libinput is an optional dependency for getting low-level input events.
As opening the input devices requires root privileges, this is rather
pointless in the current state. But a small test app is added which
can be executed with root privileges to demonstrate the functionality. To
properly get input events we need a wrapper like the one used in Weston.
So far the following is set up:
* opening devices found by udev
* forwarding keyboard events to InputRedirection
* forwarding pointer button events to InputRedirection
* forwarding pointer axis events to InputRedirection
* signals emitted for pointer motion events
Pointer motion events need some further work as they are provided
as delta events. We need to track that and map them properly.
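One plausible way to handle this (a sketch, not the actual code): keep an
absolute position, add each delta and clamp it to the screen geometry.

void InputRedirection::processPointerMotion(const QSizeF &delta)
{
    QPointF pos = m_globalPointer + QPointF(delta.width(), delta.height());
    // Clamp to the combined screen geometry (assumed helper).
    const QRectF geometry = screens()->geometry();
    pos.setX(qBound(geometry.left(), pos.x(), geometry.right()));
    pos.setY(qBound(geometry.top(), pos.y(), geometry.bottom()));
    m_globalPointer = pos;
    emit globalPointerChanged(m_globalPointer);
}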
Touch events are also still missing, as I do not have a touch screen.
It should be fairly simple to set up the touch events, though.
Also hotplugging of devices is not yet implemented.
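For reference, the udev-backed setup the list above refers to corresponds
roughly to the following self-contained sketch of the public libinput API
(not the actual KWin code):

#include <libinput.h>
#include <libudev.h>
#include <fcntl.h>
#include <unistd.h>
#include <errno.h>

// open/close callbacks required by libinput; without a privileged helper
// this only works when running as root.
static int openRestricted(const char *path, int flags, void *)
{
    const int fd = open(path, flags);
    return fd < 0 ? -errno : fd;
}
static void closeRestricted(int fd, void *)
{
    close(fd);
}
static const struct libinput_interface s_interface = {
    openRestricted,
    closeRestricted,
};

static struct libinput *createContext()
{
    struct udev *udev = udev_new();
    struct libinput *context = libinput_udev_create_context(&s_interface, nullptr, udev);
    if (!context) {
        return nullptr;
    }
    if (libinput_udev_assign_seat(context, "seat0") != 0) {
        libinput_unref(context);
        return nullptr;
    }
    return context;
}

// Reading events: once the fd from libinput_get_fd(context) becomes
// readable, dispatch and drain the queue, handing each event over to
// InputRedirection:
//   libinput_dispatch(context);
//   while (libinput_event *event = libinput_get_event(context)) {
//       // ... forward key/button/axis/motion events ...
//       libinput_event_destroy(event);
//   }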