If the compositor cancels a touch sequence, we need to ignore subsequent
touch motion and touch up events until the user initiates a new sequence.
Previously, this was handled implicitly by clearing the mapping table
between touch slots and the touch ids generated by kwayland-server.
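A minimal sketch of the idea, with hypothetical class and method names:
a cancelled flag suppresses motion and up events until the next touch
down starts a new sequence.

    class TouchFilter
    {
    public:
        void cancel()
        {
            m_sequenceCancelled = true;
        }

        void touchDown(int id)
        {
            m_sequenceCancelled = false; // a new sequence begins
            // forward the event...
        }

        void touchMotion(int id)
        {
            if (m_sequenceCancelled) {
                return; // sequence was cancelled; drop the event
            }
            // forward the event...
        }

        void touchUp(int id)
        {
            if (m_sequenceCancelled) {
                return;
            }
            // forward the event...
        }

    private:
        bool m_sequenceCancelled = false;
    };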
Once in a while, we receive complaints from fellow KDE developers
about the file organization of kwin. This change addresses some of those
complaints by moving all source code into a separate directory, src/,
thus making the project structure more traditional. Things such as tests
are kept in their own toplevel directories.
This change may wreak havoc on merge requests that add new files to kwin,
but if a patch modifies an existing file, git should be smart enough to
figure out that the file has been relocated.
We may potentially split the src/ directory further to make navigating
the source code easier, but hopefully this is good enough already.
Occasionally, I see complaints about the file organization of kwin,
which is fair enough.
This change makes the source code more relocatable by removing relative
paths from includes.
CMAKE_CURRENT_SOURCE_DIR was added to the interface include directories
of the kwin library. This means that as long as you link against the kwin
target, the actual location of the library's source code doesn't matter.
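A hedged sketch of the relevant build-script idea (the exact invocation
in kwin may differ):

    # PUBLIC populates the target's interface include directories, so
    # consumers that link against kwin inherit its source directory as
    # an include path and includes keep working if the sources move.
    target_include_directories(kwin PUBLIC ${CMAKE_CURRENT_SOURCE_DIR})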
With autotests, things are not as convenient as with the kwin target.
Some tests use cpp files from kwin core. If we move all source code into
a src/ directory, they will need to be adjusted, but mostly only in build
scripts.
This allows running Xwayland apps as root. Xwayland is started with an
empty Xauthority file; after kwin has received the display number, the
file is updated with an actual authority entry.
BUG: 432625
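A rough sketch of writing such an entry with libXau (illustrative, not
kwin's actual code; the helper name is hypothetical and the program must
link against libXau):

    #include <X11/Xauth.h>
    #include <cstdio>
    #include <cstring>
    #include <random>
    #include <string>
    #include <unistd.h>

    // Writes a single MIT-MAGIC-COOKIE-1 entry for ":display" into
    // the given Xauthority file.
    static bool writeXauthorityEntry(const char *fileName, int display)
    {
        char hostname[256] = {};
        if (gethostname(hostname, sizeof(hostname) - 1) != 0) {
            return false;
        }
        const std::string number = std::to_string(display);
        char cookieName[] = "MIT-MAGIC-COOKIE-1";

        // 16 random bytes form the cookie payload.
        char cookie[16];
        std::random_device rd;
        for (char &byte : cookie) {
            byte = static_cast<char>(rd());
        }

        Xauth entry = {};
        entry.family = FamilyLocal; // matches local-socket connections
        entry.address = hostname;
        entry.address_length = static_cast<unsigned short>(std::strlen(hostname));
        entry.number = const_cast<char *>(number.c_str());
        entry.number_length = static_cast<unsigned short>(number.size());
        entry.name = cookieName;
        entry.name_length = sizeof(cookieName) - 1;
        entry.data = cookie;
        entry.data_length = sizeof(cookie);

        FILE *file = std::fopen(fileName, "w");
        if (!file) {
            return false;
        }
        const bool ok = XauWriteAuth(file, &entry) == 1;
        std::fclose(file);
        return ok;
    }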
As pointed out in 6acf35e4cc, it is better to return raw pointers than
QPointers because returning a QPointer is equivalent to constructing a
new one.
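For illustration (a hypothetical class, not the actual kwin code), the
difference between the two getters looks like this:

    #include <QObject>
    #include <QPointer>

    class Workspace
    {
    public:
        // Each call copy-constructs a QPointer, which registers with
        // the QObject's shared guard (atomic reference counting).
        QPointer<QObject> activeClientGuarded() const { return m_active; }

        // Returning the raw pointer is free; callers that need
        // lifetime tracking can still wrap it in a QPointer themselves.
        QObject *activeClient() const { return m_active; }

    private:
        QPointer<QObject> m_active;
    };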
Warning messages are not the kind of messages that should be ignored;
they indicate that something is off or wrong.
Also, this makes triaging bugs easier as we no longer have to ask people
to run kwin with the QT_LOGGING_RULES environment variable set.
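For example, a Qt logging category can enable warnings by default while
still filtering debug output; the category name here is illustrative:

    #include <QDebug>
    #include <QLoggingCategory>

    // The third argument sets the minimum severity enabled by default,
    // so warnings show up without any QT_LOGGING_RULES tweaking.
    Q_LOGGING_CATEGORY(KWIN_EXAMPLE, "kwin_example", QtWarningMsg)

    void report()
    {
        qCWarning(KWIN_EXAMPLE) << "something is off"; // printed by default
        qCDebug(KWIN_EXAMPLE) << "verbose detail";     // still filtered out
    }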
This logs to the tracefs filesystem, which can be viewed in tools such as
gpuvis to see precise timings of activities in relation to other trace
markers from X or graphics drivers.
This patch is loosely based on D23114, though modified to add thread
safety, support for string building, and a RAII pattern for durations,
which ultimately expanded it somewhat.
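A simplified sketch of the approach, assuming the common tracefs mount
point; the names and marker format are illustrative:

    #include <QByteArray>
    #include <QFile>
    #include <QMutex>
    #include <QMutexLocker>

    // Writes a message into the trace marker file. On some systems
    // tracefs is mounted at /sys/kernel/debug/tracing instead.
    static void writeTraceMarker(const QByteArray &message)
    {
        static QMutex mutex;
        QMutexLocker locker(&mutex); // serialize access across threads

        static QFile file(QStringLiteral("/sys/kernel/tracing/trace_marker"));
        if (!file.isOpen() && !file.open(QIODevice::WriteOnly)) {
            return;
        }
        file.write(message);
        file.flush();
    }

    // RAII duration marker: emits begin/end markers so the enclosed
    // scope shows up as a span next to other events in gpuvis.
    class TraceDuration
    {
    public:
        explicit TraceDuration(const QByteArray &name)
            : m_name(name)
        {
            writeTraceMarker("begin_" + m_name);
        }
        ~TraceDuration()
        {
            writeTraceMarker("end_" + m_name);
        }

    private:
        const QByteArray m_name;
    };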
At the moment, our frame scheduling infrastructure is still heavily
based on Xinerama-style rendering. Specifically, we assume that painting
is driven by a single timer, etc.
This change introduces a new type, RenderLoop. Its main purpose is to
drive compositing on a specific output or, in the case of X11, on the
overlay window.
With RenderLoop, compositing is synchronized to vblank events. It
exposes the last and the next estimated presentation timestamp. The
expected presentation timestamp can be used by effects to ensure that
animations are synchronized with the upcoming vblank event.
On Wayland, every output has its own render loop. On X11, per-screen
rendering is not possible, so the platform exposes the render loop for
the overlay window. Ideally, the Scene should expose the RenderLoop, but
as a first step towards better compositing scheduling, this is good
enough for the time being.
The RenderLoop tries to minimize latency by delaying compositing until as
close as possible to the next vblank event. One tricky aspect is that if
compositing runs too close to the next vblank event, animations may
become a little choppy; increasing the latency reduces the choppiness.
Given that there is no "silver bullet" solution for the choppiness
issue, a new option has been added in the Compositing KCM to specify the
amount of latency. By default, it's "Medium," but if a user is not
satisfied with the upstream default, they can tweak it.
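As a rough illustration of the scheduling idea (not the actual
RenderLoop code), compositing starts "estimated render time plus safety
margin" before the predicted vblank, where the margin corresponds to the
configured latency:

    #include <chrono>

    using namespace std::chrono;

    // Returns how long to wait before painting the next frame. A larger
    // safety margin means more latency but less risk of choppiness.
    nanoseconds delayUntilCompositing(nanoseconds nextVblank,
                                      nanoseconds now,
                                      nanoseconds expectedRenderTime,
                                      nanoseconds safetyMargin)
    {
        const nanoseconds start = nextVblank - expectedRenderTime - safetyMargin;
        return start > now ? start - now : nanoseconds::zero();
    }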
At the moment, the Screens class is littered with ifdefs because of
MockScreens.
The goal of this change is to reduce the number of usages of the
MockScreens class so it is possible to get rid of the ifdefs.
Since the Screens class is a convenience wrapper around AbstractOutput
objects that come from the Platform, it should not be platform-specific.
By dropping createScreens(), output-related code becomes simpler.
This change introduces a new component, ColorManager, that is
responsible for color management.
At the moment, it is quite naive: it is useful only for updating gamma
ramps, but in the future it will be extended with more CMS-related
features.
The ColorManager depends on the lcms2 library; this dependency is
optional. If lcms2 is not installed, the color manager won't be built.
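As a hedged sketch of what updating gamma ramps with lcms2 can look like
(the helper is illustrative and assumes a profile carrying a VCGT tag
and a rampSize of at least 2):

    #include <lcms2.h>
    #include <cstdint>
    #include <vector>

    // Fills per-channel gamma ramps from the profile's VCGT tag.
    bool fillGammaRamps(const char *profilePath, int rampSize,
                        std::vector<uint16_t> &red,
                        std::vector<uint16_t> &green,
                        std::vector<uint16_t> &blue)
    {
        cmsHPROFILE profile = cmsOpenProfileFromFile(profilePath, "r");
        if (!profile) {
            return false;
        }

        // The VCGT tag stores one tone curve per channel (R, G, B).
        auto vcgt = static_cast<cmsToneCurve **>(cmsReadTag(profile, cmsSigVcgtTag));
        if (!vcgt) {
            cmsCloseProfile(profile);
            return false;
        }

        red.resize(rampSize);
        green.resize(rampSize);
        blue.resize(rampSize);
        for (int i = 0; i < rampSize; ++i) {
            // Sample each curve at evenly spaced 16-bit input values.
            const uint16_t input = uint16_t((i * 0xffff) / (rampSize - 1));
            red[i] = cmsEvalToneCurve16(vcgt[0], input);
            green[i] = cmsEvalToneCurve16(vcgt[1], input);
            blue[i] = cmsEvalToneCurve16(vcgt[2], input);
        }

        cmsCloseProfile(profile);
        return true;
    }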
This also fixes the issue where colord and nightcolor overwrite each
other's gamma ramps. With this change, the ColorManager will resolve the
conflict between the two.
Effects are given the interval between two consecutive frames. The main
flaw of this approach is that if the Compositor transitions from the idle
state to the "active" state, i.e. when there is something to repaint,
effects may see a very large interval between the last painted frame and
the current one. In order to address this issue, the Scene invalidates
the timer that is used to measure time between consecutive frames before
the Compositor is about to become idle.
While this works perfectly fine with Xinerama-style rendering, with
per-screen rendering, determining whether the compositor is about to idle
is a rather tedious task, mostly because a single output can't be used
for the test.
Furthermore, since the Compositor schedules pointless repaints just to
ensure that it's idle, it might take several attempts to figure out
whether the scene timer must be invalidated if (true) per-screen
rendering is used.
Ideally, all effects should use a timeline helper that is aware of the
underlying render loop and its timings. However, this option is off the
table because it would involve a lot of work to implement.
An alternative and much simpler option is to pass the expected
presentation time to effects rather than the time between consecutive
frames. This means that effects are responsible for determining how much
their animation timelines have to be advanced. Typically, an effect would
store the presentation timestamp provided in one prePaint{Screen,Window}
call and use it in the subsequent prePaint{Screen,Window} call to
estimate the amount of time that has passed between the two frames.
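A sketch of that pattern (a hypothetical effect class; the actual API
may differ):

    #include <chrono>

    class ExampleEffect
    {
    public:
        void prePaintScreen(std::chrono::milliseconds presentTime)
        {
            // Derive the delta from the previously stored expected
            // presentation timestamp; the first frame has no history.
            std::chrono::milliseconds delta{0};
            if (m_lastPresentTime.count() != 0) {
                delta = presentTime - m_lastPresentTime;
            }
            m_lastPresentTime = presentTime;

            m_animationTime += delta; // advance the animation timeline
        }

    private:
        std::chrono::milliseconds m_lastPresentTime{0};
        std::chrono::milliseconds m_animationTime{0};
    };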
Unfortunately, this is an API incompatible change. However, it shouldn't
take a lot of work to port third-party binary effects, which don't use the
AnimationEffect class, to the new API. On the bright side, we no longer
need to be concerned about the Compositor getting idle.
We do still try to determine whether the Compositor is about to idle,
primarily because the OpenGL render backend swaps buffers on present,
but that will change with the ongoing compositing timing rework.
There were multiple other cases of placing the mouse between screens at
the start of tests; it seems to all be copy-paste.
Only maximise and pointerConstraints were failing before this, but we
may as well fix all of them.