This method replaces the X-KDE-ORDERING property in the Effect's desktop
files. This change is a preparation step for integrating the new Effect
Loader which doesn't read the ordering information. Thus it needs to be
provided by the Effect itself so that the EffectsHandler can properly
insert it into the chain.
Also, for the built-in Effects it doesn't make much sense in the long run
to install the desktop files. And binary plugin effects will migrate to
JSON metadata which also doesn't have the KService::Ptr. Thus overall it
is simpler to read this information directly from the Effect.
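As a rough sketch of what this could look like, assuming the new virtual is
called requestedEffectChainPosition() (the name is an assumption for
illustration, not confirmed by this change):

    // Hypothetical sketch: a built-in effect reporting where it wants to be
    // inserted into the effect chain, replacing the X-KDE-ORDERING entry of
    // its desktop file. The method name is an assumption.
    #include <kwineffects.h>

    class MyEffect : public KWin::Effect
    {
    public:
        int requestedEffectChainPosition() const override {
            return 75; // value that used to live in X-KDE-ORDERING
        }
    };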
Instead of using EffectsHandler::sendReloadMessage we generate the dbus
interface in each plugin and call the reconfigure slot directly. That way
it's more type-safe and the configs don't need to link against
kwineffects.
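As a rough illustration, assuming a qdbusxml2cpp generated proxy class (all
names below, i.e. generated class, service, path and method, are placeholders
and not taken from the actual change):

    // Hypothetical sketch: the generated proxy gives a compile-time checked
    // call instead of an untyped QDBusMessage.
    #include "effects_interface.h" // placeholder name for the generated header

    void reloadEffectInKWin()
    {
        OrgKdeKwinEffectsInterface dbus(QStringLiteral("org.kde.kwin"),
                                        QStringLiteral("/Effects"),
                                        QDBusConnection::sessionBus());
        dbus.reconfigureEffect(QStringLiteral("kwin4_effect_blur"));
    }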
REVIEW: 116875
There is no advantage for the effects KCM in having all the effect
config modules in one plugin.
By having a plugin per effect we can use KPluginTrader to easily
find the configuration plugin for a given effect and load it.
To make this possible the following changes are done:
* config_builtins.cpp is deleted
* add_subdirectory is used for all effects which have a config module
* toplevel CMakeLists.txt again contains the sources for the effects
which have a config module; effects which don't have a config module
are still included and thus the macro is still used
* plugin created for the config module, name pattern is:
kwin_effectname_config
* plugin installed to ${PLUGIN_INSTALL_DIR}/kwin/effects/configs
* desktop file adjusted to new plugin name and keyword removed
* desktop file converted to JSON metadata and no longer installed
* Uses K_PLUGIN_FACTORY_WITH_JSON (see the sketch below)
* Macros for config are dropped from kwineffects.h
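A per-effect config plugin then boils down to roughly the following; the
effect name, class name and JSON file name are examples only:

    // Example shape of a config plugin for a single effect. The JSON file is
    // the converted desktop file, compiled in as plugin meta data.
    #include <KCModule>
    #include <KPluginFactory>

    namespace KWin
    {
    class ExampleEffectConfig : public KCModule
    {
        Q_OBJECT
    public:
        explicit ExampleEffectConfig(QWidget *parent = nullptr,
                                     const QVariantList &args = QVariantList())
            : KCModule(parent, args)
        {
        }
    };
    } // namespace KWin

    K_PLUGIN_FACTORY_WITH_JSON(ExampleEffectConfigFactory,
                               "example_config.json",
                               registerPlugin<KWin::ExampleEffectConfig>();)

    #include "example_config.moc"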
REVIEW: 116854
Implemented in KWin core to forward to the new global shortcut system. This
method should be extended/changed once we go to Qt5/KF5 to make the usage
easier (no more KAction).
Each global shortcut in the effects makes use of this new method.
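A typical call site in an effect's ctor looks roughly like the following;
the exact name and signature of the EffectsHandler method are assumptions
based on the description above:

    // Sketch: registering a global shortcut from an effect (name/signature
    // of registerGlobalShortcut() are assumptions).
    KAction *a = new KAction(this);
    a->setObjectName(QStringLiteral("ToggleMyEffect"));
    effects->registerGlobalShortcut(KShortcut(Qt::CTRL + Qt::META + Qt::Key_T), a);
    connect(a, SIGNAL(triggered(bool)), this, SLOT(toggle()));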
The rationale behind this change is that displayWidth and displayHeight
are X specific API calls in kwinglobals. For the future it's easier to
rely only on functionality which goes through the EffectsHandler API,
which allows easier adjustments in KWin core.
displayWidth() and displayHeight() are only used to get the size or the
complete rect of all screens. This is also provided by:
effects->virtualScreenGeometry() or
effects->virtualScreenSize()
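The typical replacement in an effect looks like this:

    // Before: X specific helpers from kwinglobals
    //   const QRect fullArea(0, 0, displayWidth(), displayHeight());
    // After: go through the EffectsHandler API
    const QRect fullArea = effects->virtualScreenGeometry();
    const QSize fullSize = effects->virtualScreenSize();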
REVIEW: 116021
Screens provides a size which is constructed from the bounding geometry
of all screens, and provides an overload taking an int to return the size
of a specified screen. For geometry() a new overload is added without an
argument, which is just a convenience wrapper for
QRect(QPoint(0, 0), size()).
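A sketch of the new overload, following the description above:

    // Convenience wrapper: the bounding rect of all screens starts at (0, 0)
    // and spans the combined size.
    QRect Screens::geometry() const
    {
        return QRect(QPoint(0, 0), size());
    }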
Both new methods are exported to effects and scripting as new
properties there called virtualScreenSize and virtualScreenGeometry.
The (virtual) size gets cached in Screens and is updated whenever the
count or geometry changes.
Construction of Screens is slightly changed by moving the init code
from the ctor into a virtual method init(). The reason is that we ended
up in a loop, accessing the singleton pointer before it was set.
REVIEW: 116114
By setting the X property _KDE_NET_WM_SKIP_CLOSE_ANIMATION to 1 a window
can request to be excluded from any close animation. This property is
read in Toplevel, so that it is available to both Client and Unmanaged.
If the window has this property set the Scene suppresses the paintWindow
loop of the Deleted. Thus no effect needs to be adjusted. But an effect
using drawWindow directly would still be able to render the Deleted as
there is no suppression.
Furthermore the property is passed to the EffectWindow so that an
Effect can make use of this functionality and not start the animation
in the first place.
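On the client side setting the property is plain xcb usage; the atom name
comes from this change, the rest is a generic sketch:

    #include <xcb/xcb.h>
    #include <cstdlib>

    // Ask the window manager to skip the close animation for a window.
    void skipCloseAnimation(xcb_connection_t *c, xcb_window_t window)
    {
        static const char name[] = "_KDE_NET_WM_SKIP_CLOSE_ANIMATION";
        xcb_intern_atom_cookie_t cookie = xcb_intern_atom(c, 0, sizeof(name) - 1, name);
        xcb_intern_atom_reply_t *reply = xcb_intern_atom_reply(c, cookie, nullptr);
        if (!reply) {
            return;
        }
        const uint32_t value = 1;
        xcb_change_property(c, XCB_PROP_MODE_REPLACE, window, reply->atom,
                            XCB_ATOM_CARDINAL, 32, 1, &value);
        free(reply);
    }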
REVIEW: 115288
* this effect is way cheaper than blur, don't cache it
* use its own atom
* also pass the matrix in the x property
* remove remnants of the cache
* do just a single pass
* get rid of config ui remnants
Effects can access the QPainter used by SceneQPainter to directly render
into the back buffer.
This is obviously only available with compositing type QPainterCompositing.
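In an effect this could look roughly like the following; the scenePainter()
accessor name is an assumption based on the description:

    // Sketch: paint directly into the back buffer when the QPainter
    // compositor is active.
    void MyEffect::paintScreen(int mask, QRegion region, ScreenPaintData &data)
    {
        effects->paintScreen(mask, region, data);
        if (effects->compositingType() == QPainterCompositing) {
            QPainter *painter = effects->scenePainter(); // accessor name assumed
            painter->setPen(Qt::red);
            painter->drawRect(effects->virtualScreenGeometry().adjusted(0, 0, -1, -1));
        }
    }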
Completing the task of replacing all NULL with nullptr in all the files
in the libkwineffects folder.
(also substituting some "0" used as nullptr with nullptr)
REVIEW: 114823
Client used to have dedicated methods for different icon sizes instead
of combining all pixmaps into one QIcon. This resulted in various parts
of KWin having different access to the icons:
* effects only got one pixmap of size 32x32
* decorations only got the 16x16 and 32x32 pixmaps combined into a QIcon
* tabbox could request all icon sizes, but only as pixmap
Now all sizes are available in one QIcon, making it easy to access the
best fitting icon in a given UI.
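For example, a consumer can now simply request the size it needs:

    // The QIcon contains all available sizes; w is an EffectWindow*.
    const QIcon icon = w->icon();
    const QPixmap small = icon.pixmap(QSize(16, 16)); // e.g. for a list entry
    const QPixmap large = icon.pixmap(QSize(48, 48)); // e.g. for an overlay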
With QtQuick2 it's possible that the scene graph rendering context either
lives in its own thread or uses the main GUI thread. In the latter case
it's the same thread our compositing OpenGL context lives in. This
means our basic assumption that between two rendering passes the context
stays current does not hold.
The code already ensured that before we start a rendering pass the
context is made current, but there are many more possible cases. If we
use OpenGL in areas not triggered by the rendering loop but in response
to other events the context needs to be made current. This includes the
loading and unloading of effects (some effects use OpenGL in the static
effect check, in the ctor and dtor), background loading of texture data,
lazy loading after first usage invoked by shortcut, etc. etc.
To properly handle these cases new methods are added to EffectsHandler
to make the compositing OpenGL context current. These calls delegate down
into the scene. On non-OpenGL scenes they are no-ops, but on OpenGL they
go into the backend and make the context current. In addition they ensure
that Qt doesn't think that its QOpenGLContext is current by calling
doneCurrent() on the QOpenGLContext::currentContext(). This unfortunately
causes an additional call to makeCurrent with a null context, but there
is no other way to tell Qt - it doesn't notice when a different context
is made current with low level API calls. In the multi-threaded
architecture this doesn't matter as ::currentContext() returns null.
A short evaluation showed that a transition to QOpenGLContext doesn't
seem feasible. Qt only supports either GLX or EGL while KWin supports
both and when entering the transition phase for Wayland, it would become
extremely tricky if our native platform is X11, but we want a Wayland
EGL context. A future solution might be to have a "KWin-QPA plugin" which
uses either xcb or Wayland and hides everything from Qt.
The API documentation is extended to describe when the effects-framework
ensures that an OpenGL context is current. The effects are changed to
make the context current in cases where it's not guaranteed. This has
been done by looking for creation or deletion of GLTextures and Shaders.
If there are other OpenGL usages outside the rendering loop or the
ctor/dtor, this needs to be changed, too.
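The resulting pattern in an effect looks roughly like this, using the method
names as described above:

    // Sketch: OpenGL work outside the rendering loop, e.g. background loading
    // of a texture. m_texture is assumed to be a QScopedPointer<GLTexture>.
    void MyEffect::loadTexture(const QImage &image)
    {
        effects->makeOpenGLContextCurrent();
        m_texture.reset(new GLTexture(image));
        effects->doneOpenGLContextCurrent();
    }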
* "" needs to be wrapped in QStringLiteral
* QString::fromUtf8 needed for const char* and QByteArray
* QByteArray::constData() needed to get to the const char*
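Typical conversions after the change:

    const QString literal = QStringLiteral("kwin");      // wrap "" literals
    const QByteArray bytes = QByteArrayLiteral("data");
    const QString text = QString::fromUtf8(bytes);       // bytes -> QString
    const char *raw = bytes.constData();                 // back to const char*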
This reverts commit 23dff966437bb664a2ffdb3f7957ef39978f5fad.
Using QVector is not a win when effects such as wobbly windows are
active, due to the realloc overhead. So revert this change for now.
E.g. gtk+ alters the modality after mapping and
before unmapping the window.
Therefore the former implementation had a wrong idea
about the modality until the window was activated and
again had a wrong idea when the dialog closed, keeping
the main client dimmed.
Modality changes at runtime are uncommon but legal and can
happen anytime.
BUG: 321340
FIXED-IN: 4.11
REVIEW: 111154
Just like Opacity, but for transforming only the decoration opacity.
It's an ABI break as the new enum value had to be included as another
non-float base value.
Cross fading with the previous pixmap is achieved by referencing the old
window pixmap. WindowPaintData has a cross-fade factor which interpolates
between 0.0 (completely old pixmap) and 1.0 (completely new pixmap).
If a cross-fade factor is set and a previous pixmap is valid, the old
pixmap is rendered on top of the current pixmap with adjusted opacity.
This results in smoother fading.
To simplify the setup the AnimationEffect is extended and also takes care
of correctly (un)referencing the previous window pixmap. The maximize
effect is adjusted to make use of this new capability.
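In an AnimationEffect based effect this boils down to something like the
following; the CrossFadePrevious attribute name is an assumption derived
from the description:

    // Sketch (inside a subclass of AnimationEffect): drive the cross-fade
    // factor from 0.0 to 1.0; (un)referencing the previous window pixmap is
    // handled by AnimationEffect.
    void MyMaximizeEffect::slotMaximizeChanged(EffectWindow *w)
    {
        animate(w, CrossFadePrevious, 0, animationTime(250), FPx2(1.0));
    }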
Unfortunately this setup has a huge problem with the case that the window
decoration gets smaller (e.g. from normal to maximized state). In this
situation it can happen that the old window is rendered with parts outside
the content resulting in video garbage being shown. To prevent this a set
of new WindowQuads is generated with normalized texture coordinates in
the safe area which contains real content.
For OpenGL2Window a PreviousContentLeaf is added which is only set up in
case the cross fading factor is set.
REVIEW: 110578
This avoids the overhead of allocating each WindowQuad on the heap
when appending items to the list, and also ensures that the quads
are contiguous in memory.
Split WindowQuadDecoration into WindowQuadDecorationLeftRight
and WindowQuadDecorationTopBottom.
This simplifies the code in SceneOpenGL::Window::paintDecoration().
Unlike makeArrays() this function writes into a pre-allocated array,
and takes a matrix that's used to transform the texture coordinates.
This allows it to handle coordinates for rectangular
textures correctly.
With the removal of BoxSwitch all effects which want mouse events use the
fullscreen input window. The available functionality is too complex both
in EffectsHandler and in the Effects.
With this change only fullscreen input windows are supported and all
effects share the input window. This means there is at most one input
window. This simplifies the code in the Effects as they don't have to
keep track of the window they created any more. In EffectsHandler it
means that only one window needs to be created, destroyed and raised.
It also means that we can properly react to screen size changes, which
had been ignored in the past. Quite a few roundtrips to X are also no
longer needed as we do not need to query the window geometry when
creating the input window.
REVIEW: 110156
The non-composited part handles the showWithX case with the four small
windows. The composited part shows a translucent QWidget with the
FrameSvg as done by the selection effect frame.
Outline connects to the Compositor toggled signal to switch the mode if
compositing gets suspended/resumed. This works fine also in the case that
the switch happens while the outline is shown. To support this Outline
is now a QObject and created with Workspace as a parent.
Given that the Outline handles both cases by itself, the outline effect
is no longer needed and is dropped together with all the hooks into the
effect system.
EffectsHandlerImpl starts to monitor DBus for the screen being locked and
provides this information to the effects by allowing them to ask
whether the screen is currently locked and by emitting a signal when the
screen gets locked/unlocked.
This information is needed to ensure that no private data is shown on the
screen. The following effects are adjusted:
* taskbar thumbnails
* thumbnail aside
* mouse mark
* screen shot
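In an adjusted effect this roughly takes the following shape; the accessor
and signal names follow the description and are assumptions:

    // Sketch: drop privacy sensitive content as soon as the screen locks.
    // discardThumbnails() is a hypothetical helper of the effect.
    MyThumbnailEffect::MyThumbnailEffect()
    {
        connect(effects, SIGNAL(screenLockingChanged(bool)),
                this, SLOT(screenLockingChanged(bool)));
    }

    void MyThumbnailEffect::screenLockingChanged(bool locked)
    {
        if (locked) {
            discardThumbnails(); // never show window contents on the lock screen
        }
    }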
BUG: 255712
FIXED-IN: 4.11
REVIEW: 108670
For each edge an additional "approach" area window is created. When the
mouse enters this approach window, the window gets unmapped and a mouse
polling interval is started. If the mouse leaves the approach area again,
the window gets mapped again and the mouse polling is stopped.
During the approach a signal is emitted with a factor in [0.0, 1.0]
describing how close the mouse is to the edge. 0.0 means far away, 1.0
means triggering the edge. This signal is passed to the effects to allow
using this information, e.g. to provide a glow corner effect or to make
use of it in the cube animation effect to start the animation on desktop
switch.
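An effect can consume this roughly as follows; the signal name and signature
are assumptions based on the description:

    // Sketch: use the approach factor (0.0 = far away, 1.0 = would trigger)
    // to drive a glow. m_glowStrength is a hypothetical member.
    MyGlowEffect::MyGlowEffect()
    {
        connect(effects, SIGNAL(screenEdgeApproaching(ElectricBorder,qreal,QRect)),
                this, SLOT(edgeApproaching(ElectricBorder,qreal,QRect)));
    }

    void MyGlowEffect::edgeApproaching(ElectricBorder border, qreal factor, const QRect &geometry)
    {
        m_glowStrength[border] = factor;
        effects->addRepaint(geometry);
    }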
The main difference is that the activation of an edge is no longer
broadcast to all effects and scripts; instead a passed-in slot of
the Effect/Script is invoked.
For this the EffectsHandler API is changed to take the Effect as an
argument to (un)reserveElectricBorder. As callback slot the existing
borderActivated is used.
In addition the ScreenEdge monitors the object for being destroyed and
unregisters the edge automatically. This removes the need for the
Effect to call unregister in the dtor.
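Usage in an effect then looks like this, following the API described above:

    // Sketch: reserve an edge with the effect itself as callback object;
    // unregistration happens automatically when the effect is destroyed.
    MyEffect::MyEffect()
    {
        effects->reserveElectricBorder(ElectricTop, this);
    }

    bool MyEffect::borderActivated(ElectricBorder border)
    {
        if (border != ElectricTop) {
            return false;
        }
        toggle(); // hypothetical slot toggling the effect
        return true;
    }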
BUG: 309695
FIXED-IN: 4.11
No effect has ever used these methods and there is no reason why an
effect should use them. Reserve/unreserve is sufficient as the effect
will be notified anyway.
In effects it's obvious that compositing is enabled, so specifying the
translucent element is no problem.
In tabbox a context property "compositing" is injected which decides
whether "translucent" or "opaque" elements should be used. Here the
translucent elements are only used if the Blur effect is available - for
this a new Effect::Feature Blur is introduced and in addition it is
tested whether the theme provides the translucent element.
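The decision can be sketched like this; the provides() call is an assumption
and themeHasTranslucentElement stands in for the theme check mentioned above:

    // Sketch: use translucent elements only if compositing is active, the
    // Blur feature is provided and the theme ships the element.
    const bool useTranslucency = effects
        && effects->provides(Effect::Blur)
        && themeHasTranslucentElement; // hypothetical placeholder
    const QString element = useTranslucency ? QStringLiteral("translucent")
                                            : QStringLiteral("opaque");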
Also the masking is adjusted to ensure that only the shadow is not
blurred.
The reason for this change is that the Plasma theme does not always seem
to pick up whether compositing is used when used from inside KWin. It does
not cover the Desktop Change OSD, which uses PlasmaCore.Dialog and where
we cannot (yet) inject that we use compositing.
Overall I'm quite unhappy with this patch and I do hope we can fix the
issue in the proper place during the lifetime of 4.10 and revert this
patch.
CCBUG: 311995
REVIEW: 108438