The EffectLoader is a subclass of AbstractEffectLoader delegating all
methods to instances of:
* BuiltInEffectLoader
* ScriptedEffectLoader
* PluginEffectLoader
It's used by the EffectsHandlerImpl and replaces the complete Effect
loading mechanism previously found there. This also means that KLibrary
is no longer needed to load the Effects, as the PluginEffectLoader uses
the KPluginTrader, which removes lots of deprecated functionality.
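A minimal sketch of the delegation idea (the loader list and the exact
method names are illustrative, not the actual code):

    // Sketch only: m_loaders holds a BuiltInEffectLoader, a
    // ScriptedEffectLoader and a PluginEffectLoader; the first loader
    // that knows the effect wins.
    bool EffectLoader::loadEffect(const QString &name)
    {
        for (AbstractEffectLoader *loader : m_loaders) {
            if (loader->loadEffect(name)) {
                return true;
            }
        }
        return false;
    }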
REVIEW: 117054
All the connections in EffectsHandlerImpl are converted to the new
connect syntax. Where it makes sense the wrapping slot method is turned
into a lambda and the slot method is removed.
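As a self-contained illustration of the pattern (plain Qt, not the actual
KWin connections):

    #include <QCoreApplication>
    #include <QDebug>
    #include <QTimer>

    int main(int argc, char **argv)
    {
        QCoreApplication app(argc, argv);
        QTimer timer;
        // Old string-based syntax needed a QObject subclass with a dedicated
        // wrapper slot:
        //   connect(&timer, SIGNAL(timeout()), receiver, SLOT(slotTimeout()));
        // New syntax: the wrapper becomes a lambda, the extra slot goes away.
        QObject::connect(&timer, &QTimer::timeout, &app, [&app]() {
            qDebug() << "timeout";
            app.quit();
        });
        timer.start(100);
        return app.exec();
    }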
REVIEW: 117076
Implemented in KWin core to forward to the new global shortcut system.
This method should be extended/changed once we move to Qt5/KF5 to make
the usage easier (no more KAction).
Each global shortcut in the effects makes use of this new method.
InputRedirection forwards pointer events (currently motion, press and
release) through the EffectsHandlerImpl for the case that an effect has
intercepted pointer events.
If the KWin operation mode is not X11-only, the window for intercepting
the mouse events is no longer created.
EffectsHandlerImpl::isEffectsSupported checks whether the effect with the
given name is supported by the current compositor.
The check works as follows (a sketch follows the list):
* if effect is already loaded, it is supported
* if the effect cannot be found, it is not supported
* if it's a scripted effect, it's always supported
* if it's a built-in effect, we ask BuiltInEffects::supported
* for all other effects we resolve the library and the supported
method
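A rough, self-contained sketch of that cascade (the EffectInfo struct and
the two containers are stand-ins for illustration, not KWin's actual data
structures):

    #include <QHash>
    #include <QSet>
    #include <QString>
    #include <functional>

    struct EffectInfo {
        bool scripted = false;
        // For a built-in effect this would call BuiltInEffects::supported,
        // for a plugin effect the "supported" function resolved from its
        // library.
        std::function<bool()> supportedFunc;
    };

    static QSet<QString> loadedEffects;              // effects already running
    static QHash<QString, EffectInfo> knownEffects;  // everything that can be found

    bool isEffectsSupported(const QString &name)
    {
        if (loadedEffects.contains(name)) {
            return true;        // already loaded -> supported
        }
        if (!knownEffects.contains(name)) {
            return false;       // effect cannot be found
        }
        const EffectInfo info = knownEffects.value(name);
        if (info.scripted) {
            return true;        // scripted effects are always supported
        }
        return info.supportedFunc ? info.supportedFunc() : true;
    }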
The idea behind providing this functionality in the DBus interface is
to allow filtering in the effects KCM for the effects which are
supported by the current compositor.
In addition an areEffectsSupported method is added which takes a
list of names and returns a list of bools.
REVIEW: 116665
Screens provides a size which is constructed from the size of
the bounding geometry of all screens and provides an overload taking
an int to return the size of a specified screen. For geometry() a new
overload is added without an argument, which is just a convenience wrapper
for QRect(QPoint(0, 0), size()).
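In code the new overload boils down to the following (size() is shown as a
sketch of the description; count() and the int overload of geometry() are
assumed accessors, and the real size() returns the cached value described
below):

    QRect Screens::geometry() const
    {
        // convenience wrapper as described above
        return QRect(QPoint(0, 0), size());
    }

    QSize Screens::size() const
    {
        // bounding geometry of all screens
        QRect bounding;
        for (int i = 0; i < count(); ++i) {
            bounding |= geometry(i);
        }
        return bounding.size();
    }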
Both new methods are exported to effects and scripting as new
properties there called virtualScreenSize and virtualScreenGeometry.
The (virtual) size gets cached in screens and is updated whenever the
count or geometry changes.
Construction of Screens is slightly changed by moving the init code
from the ctor into a virtual method init(). The reason is that we ended
up in a loop accessing the singleton pointer before it was set.
REVIEW: 116114
Loading all effects during startup can take some time[1] and during
that time the screen is frozen as the loading blocks the compositor.
This change doesn't load effects directly but puts them into a queue.
The loading is controlled by invoking the dequeue through a queued
connection. Thus the compositing timer can fire in between, ensuring
that a frame is rendered when needed, and X events can be handled during
the loading.
[1] On my high-end system the set of effects I use takes about 200 msec
to load.
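A sketch of the queueing idea (illustrative class, not the actual KWin
code):

    #include <QObject>
    #include <QQueue>
    #include <QString>

    class QueuedEffectLoader : public QObject
    {
    public:
        void queue(const QString &name)
        {
            m_queue.enqueue(name);
            scheduleDequeue();
        }

    private:
        void scheduleDequeue()
        {
            // Going through the event loop lets the compositing timer fire
            // and X events be processed between two blocking effect loads.
            QMetaObject::invokeMethod(this, [this]() { dequeue(); },
                                      Qt::QueuedConnection);
        }
        void dequeue()
        {
            if (m_queue.isEmpty()) {
                return;
            }
            loadEffectBlocking(m_queue.dequeue());
            scheduleDequeue();
        }
        void loadEffectBlocking(const QString &name) { Q_UNUSED(name) }

        QQueue<QString> m_queue;
    };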
REVIEW: 115297
As all effects have always been compiled into the same .so file it's
questionable whether resolving the effects through a library is useful
at all. By linking against the built-in effects we gain the following
advantages:
* don't have to load/unload the KLibrary
* don't have to resolve the create, supported and enabled functions
* no version check required
* no dependency resolving (effects don't use it)
* remove the KWIN_EFFECT macros from the effects
All the effects are now registered in an effects_builtins file which
maps the name to a factory method and to supported and enabled-by-default
methods.
When loading an effect we first check whether there is a built-in effect
with the given name and, if so, take a shortcut to create it through that.
If that's not possible the normal plugin loading is used.
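A sketch of what such a registry entry looks like (field names are
assumptions based on the description above):

    #include <QString>
    #include <functional>

    class Effect;   // KWin's effect base class

    struct BuiltInEffectData {
        QString name;                                   // e.g. "blur"
        std::function<Effect *()> createFunction;       // replaces KWIN_EFFECT
        std::function<bool()> supportedFunction;
        std::function<bool()> enabledByDefaultFunction;
    };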
Completely unscientific testing [1] showed an improvement of almost 10
msec when loading all the effects I use.
[1] QElapsedTimer around the loading code, start kwin five times, take
average.
REVIEW: 115073
Effects can access the QPainter used by SceneQPainter to directly render
into the back buffer.
Obviously this is only available with the compositing type
QPainterCompositing.
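A sketch from an effect's point of view (assuming the accessor is called
scenePainter(); signatures simplified):

    void MyEffect::paintScreen(int mask, QRegion region, ScreenPaintData &data)
    {
        effects->paintScreen(mask, region, data);
        // Only valid when the compositing type is QPainterCompositing.
        if (QPainter *painter = effects->scenePainter()) {
            painter->setPen(Qt::red);
            painter->drawRect(10, 10, 100, 50);
        }
    }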
Client used to have dedicated methods for different icon sizes instead
of combining all pixmaps into one QIcon. This resulted in various parts
of KWin having different access to the icons:
* effects only got one pixmap of size 32x32
* decorations only got the 16x16 and 32x32 pixmaps combined into a QIcon
* tabbox could request all icon sizes, but only as pixmap
Now all sizes are available in one QIcon, allowing easy access to the
best fitting icon in a given UI.
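A plain Qt illustration of the pattern (not the actual Client code):

    #include <QIcon>
    #include <QPixmap>

    QIcon combinedIcon(const QPixmap &px16, const QPixmap &px32, const QPixmap &px48)
    {
        QIcon icon;
        icon.addPixmap(px16);   // 16x16
        icon.addPixmap(px32);   // 32x32
        icon.addPixmap(px48);   // 48x48
        return icon;
    }

    // A consumer simply asks for the best fitting size for its UI:
    //   const QPixmap best = icon.pixmap(QSize(22, 22));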
With QtQuick2 it's possible that the scene graph rendering context either
lives in its own thread or uses the main GUI thread. In the latter case
it's the same thread as our compositing OpenGL context lives in. This
means our basic assumption that between two rendering passes the context
stays current does not hold.
The code already ensured that before we start a rendering pass the
context is made current, but there are many more possible cases. If we
use OpenGL in areas not triggered by the rendering loop but in response
to other events the context needs to be made current. This includes the
loading and unloading of effects (some effects use OpenGL in the static
effect check, in the ctor and dtor), background loading of texture data,
lazy loading after first usage invoked by a shortcut, and so on.
To properly handle these cases new methods are added to EffectsHandler
to make the compositing OpenGL context current. These calls delegate down
into the scene. On non-OpenGL scenes they are noop, but on OpenGL they go
into the backend and make the context current. In addition they ensure
that Qt doesn't think that its QOpenGLContext is current by calling
doneCurrent() on the QOpenGLContext::currentContext(). This unfortunately
causes an additional call to makeCurrent with a null context, but there
is no other way to tell Qt - it doesn't notice when a different context
is made current with low level API calls. In the multi-threaded
architecture this doesn't matter as ::currentContext() returns null.
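A sketch of the delegation described above (method names follow the
description; the exact signatures and member names are assumptions):

    void EffectsHandlerImpl::makeOpenGLContextCurrent()
    {
        m_scene->makeOpenGLContextCurrent();   // noop for non-OpenGL scenes
    }

    void SceneOpenGL::makeOpenGLContextCurrent()
    {
        // Qt does not notice when a context is made current through low
        // level GLX/EGL calls, so first clear its idea of a current context.
        if (QOpenGLContext *qtContext = QOpenGLContext::currentContext()) {
            qtContext->doneCurrent();   // the extra makeCurrent(null) mentioned above
        }
        m_backend->makeCurrent();       // GLX or EGL, depending on the backend
    }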
A short evaluation showed that a transition to QOpenGLContext doesn't
seem feasible. Qt only supports either GLX or EGL while KWin supports
both and when entering the transition phase for Wayland, it would become
extremely tricky if our native platform is X11, but we want a Wayland
EGL context. A future solution might be to have a "KWin-QPA plugin" which
uses either xcb or Wayland and hides everything from Qt.
The API documentation is extended to describe when the effects-framework
ensures that an OpenGL context is current. The effects are changed to
make the context current in cases where it's not guaranteed. This has
been done by looking for creation or deletion of GLTextures and Shaders.
If there are other OpenGL usages outside the rendering loop or the
ctor/dtor, this needs to be changed, too.
Button Press/Release no longer fall through to motion notify as
there is no shared mouse event in xcb. Also the methods in Effects and
TabBox are adjusted to process only button press/release or motion
notify.
ScreenEdges are no longer checked for button press/release. They don't
react to button press/release, so there is no need to check them.
This provides some sort of synthetic XSYNC support
for unmanaged clients and allows them to do an initial
update after mapping and before being painted (preventing
flicker).
It also helps with Unmanaged clients performing quick
map/unmap/map cycles, which also seems to induce the black
window issue on the nvidia blob.
CCBUG: 284888
BUG: 319184
FIXED-IN: 4.11
REVIEW: 111292
E.g. gtk+ alters the modality after mapping and
before unmapping the window.
Therefore the former implementation had a wrong idea
about the modality until the window was activated and
again had a wrong idea when the dialog closed, keeping
the main client dimmed.
Modality changes at runtime are uncommon but legal and can
happen anytime.
BUG: 321340
FIXED-IN: 4.11
REVIEW: 111154
Cross fading with the previous pixmap is achieved by referencing the old
window pixmap. WindowPaintData has a cross-fade factor which interpolates
between 0.0 (completely old pixmap) and 1.0 (completely new pixmap).
If a cross fading factor is set and a previous pixmap is valid this one
is rendered on top of the current pixmap with opacity adjusted. This
results in a smoother fading.
To simplify the setup the AnimationEffect is extended and also takes care
of correctly (un)referencing the previous window pixmap. The maximize
effect is adjusted to make use of these new capabilities.
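From the maximize effect's point of view this roughly becomes (the
CrossFadePrevious attribute name and the animate() signature are
assumptions based on the AnimationEffect API):

    // Let AnimationEffect reference the previous window pixmap and drive the
    // cross-fade factor from 0.0 (old pixmap) to 1.0 (new pixmap).
    void MaximizeEffect::slotMaximizedStateChanged(EffectWindow *w)
    {
        animate(w, CrossFadePrevious, 0, animationTime(250), FPx2(1.0));
    }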
Unfortunately this setup has a huge problem with the case that the window
decoration gets smaller (e.g. from normal to maximized state). In this
situation it can happen that the old window is rendered with parts outside
the content resulting in video garbage being shown. To prevent this a set
of new WindowQuads is generated with normalized texture coordinates in
the safe area which contains real content.
For OpenGL2Window a PreviousContentLeaf is added which is only set up in
case the cross fading factor is set.
REVIEW: 110578
With the removal of BoxSwitch all effects which want mouse events use the
fullscreen input window. The available functionality is too complex both
in EffectsHandler and in the Effects.
With this change only fullscreen input windows are supported and all
effects share the input window. This means there is at most one input
window. This simplifies the code in the Effects as they don't have to
keep track of the window they created any more. In EffectsHandler it
means that only one window needs to be created, destroyed and raised.
It also means that we can properly react to screen size changes, which
had been ignored in the past. Furthermore, quite a few round trips to X
are no longer needed as we do not need to query the window geometry when
creating the input window.
REVIEW: 110156
The non-composited part handles the showWithX case with the four small
windows. The composited part shows a translucent QWidget with the
FrameSvg as done by the selection effect frame.
Outline connects to the Compositor's toggled signal to switch the mode if
compositing gets suspended/resumed. This also works in the case that
the switch happens while the outline is shown. To support this Outline
is now a QObject and created with Workspace as a parent.
Given that the Outline handles both cases by itself, the outline effect
is no longer needed and is dropped together with all the hooks into the
effect system.
A specialised paintScreen method to render all windows of one
desktop. It's intended to be called during an already started
paintScreen process to get e.g. a thumbnail of a desktop.
Currently not yet exported to the Effects.
Everything that has nothing to do with rendering the window thumbnail
goes into an AbstractThumbnailItem.
This is a preparation step for adding a DesktopThumbnailItem.
With Qt5 QCursor no longer provides ::handle() which was used to
set a cursor on a native XWindow for which we do not have a QWidget.
Also KWin has had for quite some time an optimized version to get the
cursor position without doing XQueryPointer each time ::pos() is called.
These two features are merged into a new class Cursor providing more or
less the same API as QCursor.
In addition the new class provides a facility to perform mouse polling
replacing the implementations in Compositor and ScreenEdges.
For more information about the new class see the documentation for the
new class in cursor.h.
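Typical usage as described above (treat the exact accessor names as
assumptions):

    const QPoint p = Cursor::pos();        // cached, no XQueryPointer per call
    Cursor::setPos(QPoint(0, 0));          // mirrors the QCursor API
    Cursor::self()->startMousePolling();   // replaces the Compositor/ScreenEdges pollers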
EffectsHandlerImpl starts to monitor DBus for the screen being locked and
provides this information to the Effect system by allowing effects to ask
whether the screen is currently locked and by emitting a signal when the
screen gets locked/unlocked.
This information is needed to ensure that no private data is shown on the
screen. The following effects are adjusted (a usage sketch follows the
list):
* taskbar thumbnails
* thumbnail aside
* mouse mark
* screen shot
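A sketch of how an adjusted effect can use this (signal and method names
follow the description; the exact signatures are assumptions):

    // Skip grabbing new thumbnails while the screen is locked and react to
    // lock/unlock changes.
    if (effects->isScreenLocked()) {
        return;
    }
    connect(effects, SIGNAL(screenLockingChanged(bool)),
            this, SLOT(slotScreenLockingChanged(bool)));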
BUG: 255712
FIXED-IN: 4.11
REVIEW: 108670
The main difference is that the activation of an edge is no longer
broadcast to all effects and scripts; instead a slot passed in by the
Effect/Script is invoked.
For this the EffectsHandler API is changed to take the Effect as an
argument to (un)reserveElectricBorder. As callback slot the existing
borderActivated is used.
In addition the ScreenEdge monitors the object for being destroyed and
unregisters the edge automatically. This removes the need for the
Effect to call unregister in the dtor.
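From an effect's perspective the new pattern looks roughly like this
(simplified signatures):

    MyEffect::MyEffect()
    {
        // Reserve the edge and hand ourselves in as the callback object; the
        // existing virtual borderActivated() is invoked on activation. No
        // unreserve in the dtor is needed any more.
        effects->reserveElectricBorder(ElectricTop, this);
    }

    bool MyEffect::borderActivated(ElectricBorder border)
    {
        if (border != ElectricTop) {
            return false;
        }
        toggle();   // hypothetical effect-specific reaction
        return true;
    }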
BUG: 309695
FIXED-IN: 4.11
No effect has ever used these methods and there is no reason why an
effect should use them. Reserve/unreserve is sufficient as the effect
will be notified anyway.
Instead of each effect that needs to announce support having custom
code to create a property and set it on the root window, there is now a
common API in EffectsHandler to take care of this.
The method takes care of creating the atom if that has not already been
done and sets the property on the root window. Furthermore it allows
multiple effects to announce the same property without getting into
conflict with each other.
As a further convenience the property is automatically removed when the
effect is unloaded, so there is one thing less an effect author has to
care about.
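A usage sketch (the property is the blur effect's, used here only for
illustration; treat the exact signature as an assumption):

    // The root window announcement and its removal on unload are handled
    // by EffectsHandler; the returned atom is used for reading the property
    // from client windows.
    m_atom = effects->announceSupportProperty("_KDE_NET_WM_BLUR_BEHIND_REGION", this);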
REVIEW: 107815
Two new interfaces are introduced:
* org.kde.kwin.Compositing
* org.kde.kwin.Effects
The Compositing interface is generated from scriptable elements on the
KWin::Compositor class and the Compositor is exported as /Compositor.
It provides the general Compositing related D-Bus methods like whether
the compositor is active and toggling and so on.
The Effects interface is generated from scriptable elements on the
KWin::EffectsHandlerImpl class and the instance is exported as /Effects.
It provides all the effects related D-Bus methods like loading an effect
or the list of all effects.
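For example, loading an effect could then look like:
qdbus org.kde.kwin /Effects loadEffect kwin4_effect_blur
(the method name is illustrative; the exact methods on the new interfaces
are not spelled out in this change).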
This removes the need to have all these methods provided on the global
org.kde.KWin interface. For backwards compatibility they are kept, but
no longer provided by the Workspace class. Instead a new DBusInterface
is generated which wraps the calls and delegates them to one of our three
related Singleton objects:
* Workspace
* Compositor
* EffectsHandlerImpl
Obsoletes the need to go through the Workspace object to get to
the Compositor.
TODO for future: make the Compositor the parent object of
the EffectsHandlerImpl.
Closing Review and bug from this commit, which is the topmost
of the patch series.
REVIEW: 106060
BUG: 299277
FIXED-IN: 4.10
The Scene has always been created and destroyed inside what is
now the split-out Compositor, which means it is actually owned
by the Compositor. The static pointer has never been needed
inside KWin core. Access to the Scene is not required for the
Window Manager. The only real usage is in the EffectsHandlerImpl
and in utils.h to provide a convenient way to figure out whether
compositing is currently active (scene != NULL).
The EffectsHandlerImpl is also created by the Compositor after
the Scene is created and is deleted just before the Scene gets
deleted. This allows injecting the Scene into the EffectsHandlerImpl
to resolve the static access in this class.
The convenient compositing() access in utils.h had to go. To provide
the same feature the Compositor provides a hasScene() accessor which
has the same behavior as the old method.
In order to keep the code changes small in Workspace and Toplevel
a new method compositing() is defined which properly resolves
the state. A disadvantage is that this can no longer be inlined
and consists of several method calls and pointer checks.
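A sketch of the replacement accessors (member names are assumptions):

    bool Compositor::hasScene() const
    {
        return m_scene != NULL;   // same semantics as the old utils.h check
    }

    bool Workspace::compositing() const
    {
        // no longer inlinable: a method call plus pointer checks
        return m_compositor && m_compositor->hasScene();
    }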
The supportInformation is extended to also read the properties
on all effects. In addition each effect can be queried just for
itself through D-Bus, e.g.:
qdbus org.kde.kwin /KWin supportInformationForEffect kwin4_effect_blur
All effects are extended to provide their configured and read
settings through properties. In some cases important runtime
information is exposed as well.
REVIEW: 105977
BUG: 305338
FIXED-IN: 4.9.1