Summary:
How drag'n'drop works on Wayland:
When a surface has a pointer grab with a button pressed on it (an
implicit grab), the client can initiate a drag'n'drop operation on the
data device. For this the client needs to provide a data source
describing the data to be transmitted with the drag'n'drop operation.
While a drag'n'drop operation is active the pointer device is grabbed
and all pointer events are interpreted as part of the operation.
Pointer events are no longer sent to the focused pointer but to the
focused data device. When the pointer moves to another surface, an
enter event is sent to the data device for that surface and a leave
event is sent to the previously focused data device. An enter event
carries a data offer which is created from the data source for the
operation.
During pointer motion there is a feedback mechanism: the data offer
can signal to the data source whether it can accept the data at the
current pointer position. The client the drag originated from can use
this to update the cursor.
The drag'n'drop operation ends when the implicit grab is removed, that
is, when the pointer button which triggered the operation is released.
The server then sends a drop event to the focused data device.
The data transfer can now be started. For that the receiving client
creates a pipe and passes the write end through the data offer to the
sending data source. The sending client writes into that file
descriptor and closes it to finish the transfer.
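As an illustration, a minimal sketch of the receiving side at the
protocol level, using the raw wayland-client API (the client side of
this library does not exist yet); the mime type and the surrounding
event handling are placeholder assumptions:

    // Sketch of the receiving client's part of the transfer, using the raw
    // wayland-client API; "text/plain" is only an example mime type.
    #include <wayland-client.h>
    #include <unistd.h>

    void receiveDropData(struct wl_display *display, struct wl_data_offer *offer)
    {
        int fds[2];
        if (pipe(fds) != 0) {
            return;
        }
        // Hand the write end to the source client through the data offer;
        // the source writes the data into it and closes it when done.
        wl_data_offer_receive(offer, "text/plain", fds[1]);
        close(fds[1]);
        // Make sure the request actually reaches the compositor before
        // blocking on the read end.
        wl_display_flush(display);
        char buffer[4096];
        ssize_t n;
        while ((n = read(fds[0], buffer, sizeof(buffer))) > 0) {
            // process n bytes of dropped data ...
        }
        close(fds[0]);
    }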
Drag'n'drop could also be initiated through a touch device grab, but
this is not yet implemented.
The implementation in this change focuses on the adjustments for the
pointer. For the user of the library drag'n'drop is implemented in the
SeatInterface. Signals are emitted whenever a drag is started or ended.
The interaction for pointer events hardly changes. Motion, button press
and button release can still be indicated in the same way. If a button
release removes the implicit grab, the drop is performed automatically,
without the user of the library having to do anything.
The only change during drag and drop for the library user is that
setFocusedPointerSurface is blocked. To update the current drag target
the library user should use setDragTarget; the enter/leave events on
the data devices are then sent automatically.
The data device which triggered the drag and drop operation is exposed
in the SeatInterface. The user of the library should make sure to render
the additional drag icon provided on the data device. At least
QtWayland-based applications will freeze during drag and drop if the
icon does not get rendered.
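As a rough sketch of the library user's side, under the assumption that
the signals are called dragStarted/dragEnded and that the data device
and its drag icon are reachable via dragSource() and icon() (names not
spelled out above):

    // Hypothetical compositor-side hookup of the drag'n'drop support.
    #include <QObject>
    #include <KWayland/Server/seat_interface.h>
    #include <KWayland/Server/datadevice_interface.h>
    #include <KWayland/Server/surface_interface.h>

    using namespace KWayland::Server;

    void setupDragHandling(SeatInterface *seat)
    {
        QObject::connect(seat, &SeatInterface::dragStarted, [seat]() {
            // While the drag is active setFocusedPointerSurface is blocked;
            // the compositor updates the target via setDragTarget instead,
            // e.g. seat->setDragTarget(surfaceUnderPointer, position, transform)
            // with arguments taken from its own scene lookup.

            // Render the drag icon attached to the data device which started
            // the drag; QtWayland clients freeze if it is never rendered.
            if (DataDeviceInterface *dataDevice = seat->dragSource()) {
                if (SurfaceInterface *icon = dataDevice->icon()) {
                    // hand the icon surface to the compositor's renderer ...
                    Q_UNUSED(icon)
                }
            }
        });
        QObject::connect(seat, &SeatInterface::dragEnded, []() {
            // the drop was performed or the drag was cancelled; normal
            // pointer focus handling applies again
        });
    }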
The implementation is still lacking the client side and, due to that,
also an auto test. It is currently only tested with QtWayland clients.
Reviewers: #plasma, sebas
Subscribers: plasma-devel
Projects: #plasma
Differential Revision: https://phabricator.kde.org/D1046
Summary:
The signal gets emitted whenever the focused PointerInterface is newly
set or reset to nullptr. This is needed to better track the current
cursor image in the compositor.
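A small sketch of how a compositor might use this; the signal name
focusedPointerChanged and its PointerInterface pointer argument are
assumptions based on the description above:

    #include <QObject>
    #include <KWayland/Server/seat_interface.h>
    #include <KWayland/Server/pointer_interface.h>

    using namespace KWayland::Server;

    void trackCursorImage(SeatInterface *seat)
    {
        QObject::connect(seat, &SeatInterface::focusedPointerChanged,
                         [](PointerInterface *pointer) {
            if (!pointer) {
                // focus was reset to nullptr: fall back to a default cursor
                return;
            }
            // inspect the cursor set on this PointerInterface and update the
            // compositor's cursor image accordingly ...
        });
    }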
Reviewers: #plasma, sebas
Subscribers: plasma-devel
Projects: #plasma
Differential Revision: https://phabricator.kde.org/D1007
So far we only supported mapping global to surface-local coordinates
using a 2D offset. With this change it is possible to register a
QMatrix4x4 describing the transformation from global to surface-local
coordinates in full 3D space.
The existing 2D offset is converted to the new matrix-based variant,
expressed as a pure translation.
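For illustration, a sketch of how the plain 2D offset case looks with
the matrix-based variant; the overload of setFocusedPointerSurface
taking a QMatrix4x4 is an assumption, the text only says such a matrix
can be registered:

    #include <QMatrix4x4>
    #include <QPointF>
    #include <KWayland/Server/seat_interface.h>
    #include <KWayland/Server/surface_interface.h>

    using namespace KWayland::Server;

    void focusSurfaceAt(SeatInterface *seat, SurfaceInterface *surface,
                        const QPointF &globalPosition)
    {
        // For a surface placed at globalPosition the mapping from global to
        // surface-local coordinates is a pure translation by -globalPosition;
        // a rotated or otherwise transformed surface would supply a full 3D
        // matrix here instead.
        QMatrix4x4 globalToLocal;
        globalToLocal.translate(-globalPosition.x(), -globalPosition.y());
        seat->setFocusedPointerSurface(surface, globalToLocal);
    }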
REVIEW: 126271
* Raises wl_seat supported version to 4 in both server and client
* Raises the supported wl_keyboard version to 4
* wl_pointer and wl_touch are still on version 3
* Raises minimum Wayland version to 1.6
The selection is supposed to be sent to the DataDeviceInterface just
before its client gets keyboard focus. In order to do that the
SeatInterface keeps track of the DataDeviceInterface which holds the
current selection and of the DataDeviceInterface of the focused
keyboard client.
SeatInterface friends DataDeviceManagerInterface so that the latter
can register each created DataDevice for the SeatInterface.
The button state is a seat-global state and not a per-pointer state.
All press/release and axis events are moved to the SeatInterface and
just invoke the related method on the pointer of the focused surface.
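A sketch of what feeding input through the seat looks like for a
compositor; the method names used here (setTimestamp, setPointerPos,
pointerButtonPressed/Released, pointerAxis) are assumed from this
description, and the seat forwards each call to the pointer of the
focused surface:

    #include <QPointF>
    #include <KWayland/Server/seat_interface.h>

    using namespace KWayland::Server;

    void forwardPointerEvent(SeatInterface *seat, quint32 timestamp,
                             const QPointF &globalPosition, bool pressed)
    {
        seat->setTimestamp(timestamp);
        seat->setPointerPos(globalPosition);
        if (pressed) {
            seat->pointerButtonPressed(Qt::LeftButton);
        } else {
            seat->pointerButtonReleased(Qt::LeftButton);
        }
        // axis (scroll) events go through the seat as well, e.g.:
        // seat->pointerAxis(Qt::Vertical, delta);
    }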
Makes PointerInterface more like other Interface classes wrapping
wl_resource. The most important change is the handling of the
focused surface. This is now kept in the SeatInterface and can also
be set if there is no PointerInterface for the client yet.
The unit tests had to be adjusted and some are also disabled as the
button events are not yet moved into SeatInterface.
This method is supposed to return the PointerInterface for the
currently focused surface. At the moment it just creates the one global
PointerInterface. The existing SeatInterface::pointer method got
removed as it encouraged incorrect usage.
There can only be one focused surface per Seat, thus the information
should be held in the seat.
This only adjusts the API; the actual data is still held in the
PointerInterface. This still needs adjustment.
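A sketch of the resulting usage, assuming the seat-level methods are
called setFocusedPointerSurface and focusedPointer (the latter may
return nullptr if the focused surface's client has not created a
pointer yet):

    #include <QPointF>
    #include <KWayland/Server/seat_interface.h>
    #include <KWayland/Server/pointer_interface.h>
    #include <KWayland/Server/surface_interface.h>

    using namespace KWayland::Server;

    void movePointerFocus(SeatInterface *seat, SurfaceInterface *surface,
                          const QPointF &surfacePosition)
    {
        // The seat tracks which surface has pointer focus, independently of
        // whether the client has already created a wl_pointer for it.
        seat->setFocusedPointerSurface(surface, surfacePosition);
        // The PointerInterface bound by that surface's client, if any.
        if (PointerInterface *pointer = seat->focusedPointer()) {
            // interact with the client's wl_pointer resource ...
            Q_UNUSED(pointer)
        }
    }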
New base class KWayland::Server::Global which all Interface classes
for a wl_global inherit. Furthermore there is a shared base class
for all the Private classes of that type.
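Roughly, the shape of the new base class as described; the exact
members beyond what is stated above are assumptions:

    #include <QObject>
    #include <QScopedPointer>

    namespace KWayland
    {
    namespace Server
    {
    class Display;

    class Global : public QObject
    {
        Q_OBJECT
    public:
        ~Global() override;
        // registers the wl_global on the Display
        void create();
        Display *display();

    protected:
        class Private;
        explicit Global(Private *d, QObject *parent = nullptr);
        QScopedPointer<Private> d;
    };

    }
    }

    // Interface classes wrapping a wl_global (e.g. SeatInterface) then
    // inherit from Global, and their Private classes share a common base.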
Qt::MouseButton is mapped to the Linux input buttons. The mapping does
not match the one used in the QtWayland module [1], as that one seems
incorrect to me: e.g. there Qt::BackButton is not mapped to BTN_BACK.
[1] qtwayland/src/client/qwaylandinputdevice.cpp
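For illustration, a hedged sketch of such a mapping; only the
Qt::BackButton/BTN_BACK pairing is stated above, the remaining rows
simply follow linux/input.h and may differ from the actual table:

    #include <linux/input.h>
    #include <QtCore/qnamespace.h>  // Qt::MouseButton
    #include <QtCore/qglobal.h>     // quint32

    quint32 qtToLinuxButton(Qt::MouseButton button)
    {
        switch (button) {
        case Qt::LeftButton:    return BTN_LEFT;
        case Qt::RightButton:   return BTN_RIGHT;
        case Qt::MiddleButton:  return BTN_MIDDLE;
        case Qt::BackButton:    return BTN_BACK;    // same value as Qt::XButton1
        case Qt::ForwardButton: return BTN_FORWARD; // same value as Qt::XButton2
        case Qt::TaskButton:    return BTN_TASK;
        default:                return 0;           // unknown/unsupported button
        }
    }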
This makes it possible to install and then use it. Installation is
still commented out since we can't give enough stability guarantees
for now.
In detail:
- do not actually install headers
- generate the export header into Wayland/Server
- include it from there
REVIEW: 120579
So far the Seat interface is provided together with pointer and
keyboard. As always, touch is not yet implemented. The pointer
interface is still lacking the set_cursor callback. Keyboard, on the
other hand, is complete.
Both Keyboard and Pointer have the concept of a focused surface, and
events are only sent to the bound interface belonging to the same
client as the focused surface.
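A sketch of how a compositor would announce such a seat, assuming the
factory and capability setters are named as below (createSeat,
setHasPointer, setHasKeyboard):

    #include <QString>
    #include <KWayland/Server/display.h>
    #include <KWayland/Server/seat_interface.h>

    using namespace KWayland::Server;

    SeatInterface *announceSeat(Display *display)
    {
        SeatInterface *seat = display->createSeat(display);
        seat->setName(QStringLiteral("seat0"));
        seat->setHasPointer(true);
        seat->setHasKeyboard(true);
        // touch is not implemented yet
        seat->create();
        return seat;
    }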
The change comes with a set of new auto tests which also verify the
client side; this wasn't possible before as we couldn't fake events.