With this, the WindowItem will know whether it's actually visible. As
a result, if a native wayland window has been minimized, kwin won't
try to schedule a new frame if just a frame callback has been committed.
EffectWindow::enablePainting() and EffectWindow::disablePainting() get
in the way here. They have the final say on whether a given window is
visible, and they are invoked too late in the rendering process.
WindowItem needs to know whether the window is visible in advance,
before compositing starts.
This change replaces EffectWindow::enablePainting() and
EffectWindow::disablePainting() with EffectWindow::refVisible() and
EffectWindow::unrefVisible(). If an effect calls the refVisible()
function, the window will be kept visible regardless of its state. It
should be called when a window is minimized or closed, etc. If an effect
doesn't want to paint a window, it should not call effects->paintWindow().
EffectWindow::refVisible() doesn't replace EffectWindow::refWindow() but
supplements it. refVisible() only ensures that a window will be kept
visible, while refWindow() ensures that the window won't be destroyed
until the effect is done with it.
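As an illustration, here is a minimal sketch of how an effect might use the new calls. The effect class and member names are hypothetical, and the exact refVisible()/unrefVisible() signatures are assumed from the description above:

#include <kwineffects.h>
#include <QSet>

// Hypothetical minimize-animation effect; class and member names are
// illustrative, not taken from KWin.
class MyMinimizeEffect : public KWin::Effect
{
public:
    void slotWindowMinimized(KWin::EffectWindow *w)
    {
        w->refVisible();       // keep the window in the scene although it is minimized
        m_animating.insert(w);
    }

    void animationEnded(KWin::EffectWindow *w)
    {
        if (m_animating.remove(w)) {
            w->unrefVisible(); // the scene may hide the window again
        }
    }

private:
    QSet<KWin::EffectWindow *> m_animating;
};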
Item::isVisible() is false if either the item itself or one of its
ancestors has been marked as hidden.
In some cases, kwin may render invisible windows, for example for window
thumbnails.
This change makes the rendering code use an explicit visibility status
when rendering, to ensure that it's still possible to render invisible
windows.
If an effect is reloaded while it holds deleted references, it's
possible that the closed windows will get stuck in the "zombie" state.
This change introduces EffectWindowDeletedRef helper that can be
used to keep the closed window alive as long as the reference is valid.
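As a rough illustration only (the constructor and copy semantics are assumptions based on the description above), the helper can be used like a RAII-style guard:

#include <kwineffects.h>
#include <QHash>

// Hypothetical close-animation effect; names are illustrative, and the
// EffectWindowDeletedRef constructor/copy semantics are assumed.
class MyCloseEffect : public KWin::Effect
{
public:
    void slotWindowClosed(KWin::EffectWindow *w)
    {
        // Holding the ref keeps the closed window alive until the animation
        // finishes, and releases it automatically if the effect is unloaded.
        m_closing.insert(w, KWin::EffectWindowDeletedRef(w));
    }

    void animationEnded(KWin::EffectWindow *w)
    {
        m_closing.remove(w); // dropping the ref allows the window to be freed
    }

private:
    QHash<KWin::EffectWindow *, KWin::EffectWindowDeletedRef> m_closing;
};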
If you lift your fingers without swiping far enough to switch to another
virtual desktop, the slide effect will play an animation to move from
the current position in the virtual desktop grid back to the current
desktop. However, that animation doesn't feel right; something is
missing.
The slide effect uses a TimeLine to animate switching between virtual
desktops, which works well when the amount of sliding is constant.
This change makes the slide effect use the mass-spring-damper model to
simulate the motion of a spring in order to animate switching between
virtual desktops.
The mass-spring-damper equation is integrated using RK4. If the delta
interval is not an exact multiple of the integration step, the
SpringMotion will perform the integration as many times as the
integration step fits into the delta. The leftover is used to LERP
between the previous and the next integration results.
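For concreteness, here is a small self-contained sketch of that integration scheme. It is not KWin's actual SpringMotion class; the names and constants are made up:

#include <cmath>

// State of the spring: position along the slide axis and velocity.
struct State {
    double position = 0.0;
    double velocity = 0.0;
};

// Derivative of the state: dx/dt (velocity) and dv/dt (acceleration).
struct Slope {
    double dPosition = 0.0;
    double dVelocity = 0.0;
};

class Spring
{
public:
    // Damped spring: a = (-k * (x - anchor) - c * v) / m
    double acceleration(const State &s) const
    {
        return (-m_stiffness * (s.position - m_anchor) - m_damping * s.velocity) / m_mass;
    }

    // One classic RK4 step of size h.
    State integrateRk4(const State &s, double h) const
    {
        const auto evaluate = [this](const State &base, double dt, const Slope &d) {
            const State st{base.position + d.dPosition * dt, base.velocity + d.dVelocity * dt};
            return Slope{st.velocity, acceleration(st)};
        };
        const Slope k1 = evaluate(s, 0.0, Slope{});
        const Slope k2 = evaluate(s, h * 0.5, k1);
        const Slope k3 = evaluate(s, h * 0.5, k2);
        const Slope k4 = evaluate(s, h, k3);
        return State{
            s.position + (h / 6.0) * (k1.dPosition + 2 * k2.dPosition + 2 * k3.dPosition + k4.dPosition),
            s.velocity + (h / 6.0) * (k1.dVelocity + 2 * k2.dVelocity + 2 * k3.dVelocity + k4.dVelocity),
        };
    }

    // Advance by an arbitrary frame delta: integrate as many fixed steps as
    // fit into the delta, then LERP between the previous and the next results
    // using the leftover fraction.
    State advance(State s, double delta) const
    {
        while (delta >= m_step) {
            s = integrateRk4(s, m_step);
            delta -= m_step;
        }
        const State next = integrateRk4(s, m_step);
        const double t = delta / m_step; // leftover fraction in [0, 1)
        return State{
            s.position + (next.position - s.position) * t,
            s.velocity + (next.velocity - s.velocity) * t,
        };
    }

private:
    double m_mass = 1.0;
    double m_stiffness = 200.0;
    double m_damping = 2.0 * std::sqrt(200.0); // critically damped
    double m_anchor = 0.0;                     // rest position (target desktop)
    double m_step = 0.001;                     // fixed RK4 integration step, in seconds
};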
With the spring animation, the slide animation feels more natural when
you lift your fingers. If you switch between virtual desktops without
using a gesture, the slide animation should look almost the same as if
it were implemented with the TimeLine.
It helps to contextualise the method, as it uses several x11-isms, some
of which could be abstracted away.
In any case, the method is only called with X11Window, and it's the only
case where doing so makes sense.
The scripting API is extended to support custom fragment shaders. To
support this, a new method addFragmentShader is added, taking
ShaderTraits and the fragment shader file name. The GLShader is not
exposed; instead, a uint id is provided which maps to the GLShader.
This shader id can be used in the animate and set calls to specify the
shader. The animation object is extended by the "fragmentShader" property.
The shader sources are located in the "shaders" directory of the package
contents.
E.g. the scale effect extended by the shader of the invert effect will
have the following layout:
package/contents/
-> code/
   -> main.js
-> shaders/
   -> invert_core.frag
   -> invert.frag
The adjustments in code are:
* in the constructor, to load the shader:
  this.shader = effect.addFragmentShader(Effect.MapTexture, "invert.frag");
* in the animation objects of the slots, the addition of
  fragmentShader: this.shader
* using the type Effect.Shader

or in full:
window.scaleInAnimation = animate({
    window: window,
    curve: QEasingCurve.OutCubic,
    duration: this.duration,
    animations: [
        {
            type: Effect.Scale,
            from: this.inScale
        },
        {
            type: Effect.Opacity,
            from: 0
        },
        {
            type: Effect.Shader,
            fragmentShader: this.shader
        }
    ]
});
The animation settings object supports a "uniform" value which takes the
string name of the uniform. For this uniform the location is resolved
and stored in the meta data of the AnimationEffect. This requires the
type Effect.ShaderUniform.
An example animation:
window.scaleInAnimation = animate({
    window: window,
    curve: QEasingCurve.Linear,
    duration: this.duration,
    animations: [
        {
            type: Effect.ShaderUniform,
            fragmentShader: this.shader,
            uniform: "uForOpening",
            from: 1.0,
            to: 1.0
        }
    ]
});
Furthermore, a new setUniform scriptable method is added to the
ScriptedEffect. This allows updating uniforms when the configuration
changes.
The call takes a generic QJSValue which supports:
* float
* array of 2, 3 or 4 components
* string as color
* variant as color
An example usage to read a color from the configuration and set it as a
uniform:
effect.setUniform(this.shaderId,
                  "uEffectColor",
                  effect.readConfig("Color", "white"))
The animate and set calls are extended with an optional GLShader* to
allow specifying a custom shader to use during the animation.
To properly support rendering a complete window in the effect, the
AnimationEffect is now based on DeformEffect. If a shader is used
during the animation, the window gets redirected.
For the animation with shaders two new enum values are added to the
AnimationType enum:
* Shader
* ShaderUniform
The Shader animation type is for specifying that the animation uses a
shader. During the animation a uniform "animationProgress" is set on the
shader.
The ShaderUniform animation type behaves exactly like the Shader type,
but also animates a user provided uniform. The meta data of the
animation is interpreted as a uniform location for a float uniform and
during the animation this uniform is updated with the interpolated
animation data.
For a redirected window, a custom shader can be specified to draw the
redirected texture to the screen. This is useful for inheriting effects
that want to customize the rendering.
Two new int uniforms, TextureWidth and TextureHeight, are added and set
by DeformEffect when rendering the texture.
Currently, fullscreen geometry restore is computed from maximized
geometry restore. However, the latter is set only when the window is
maximized.
Also, updateGeometryRestoresForFullscreen() can be called when the
window has not been moved. Avoid updating the geometry restore if the
output has not changed.
If the animation reaches the end, the desktop grid may render the screen
incorrectly. Make sure that PAINT_SCREEN_BACKGROUND_FIRST and similar
flags are set even if the animation has reached the end.
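A hedged sketch of what that looks like in an effect's prePaintScreen(); the member names here are hypothetical and the exact virtual signature may differ:

void DesktopGridLikeEffect::prePaintScreen(KWin::ScreenPrePaintData &data,
                                           std::chrono::milliseconds presentTime)
{
    // Keep the flags set while the effect is active, not only while the
    // timeline is still running, so the last frame is painted correctly too.
    if (m_activated || m_timeline.currentValue() != 0) { // hypothetical members
        data.mask |= KWin::Effect::PAINT_SCREEN_TRANSFORMED
                  |  KWin::Effect::PAINT_SCREEN_BACKGROUND_FIRST;
    }
    KWin::effects->prePaintScreen(data, presentTime);
}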
Also, while at it, simplify the paintWindow() method by removing
redundant effect status checks.
Effects can specify "X-KWin-Border-Activate": true in their JSON
metadata file and will then be listed in the screen edge menus.
Don't hardcode the desktop grid and overview effects in the KCMs.
At the moment, if the user switches between virtual desktops using a
gesture, panels will lose their blurred background because
WindowForceBlurRole is not set.
This change refactors the setup code so the slide effect always forces
blur and background contrast when sliding between virtual desktops
using a gesture or an animation.
Add the possibility to implement realtime screen edge gestures in
scripted effects, and implement it in the windowsaperture show desktop
effect.
* Expose registerRealtimeScreenEdge to JavaScript; the callback will be
a JS function.
* Add the concept of freezeInTime() in the animation JS bindings; it
will either create an animation frozen at a given time or freeze a
running animation that can be restored and run to completion at any time.
* Add an edges property only for showdesktop, as it's not directly on
the effect configuration.
If desktop wrapping is disabled and the user swipes to the left but
there's no desktop to the left, the slide effect can get stuck active
because neither the desktopChanged() nor the desktopChangingCancelled()
signal is emitted.
This change makes the VirtualDesktopManager explicitly cancel the
interactive desktop switching session if the current desktop has not
changed.
Currently, there's a separate pass to filter out windows that are not
ready for compositing or that must be invisible. That has two issues:
the pass could be merged with the pass that populates stacking_order,
and the "windows" container can detach.
libinput_device_get_user_data() can be used to get the Device object
associated with a libinput_device. That way, we won't need to maintain
a private list of all input devices.
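For context, the relevant libinput calls look roughly like this; the Device wrapper shown here is a stripped-down stand-in for KWin's class:

#include <libinput.h>

// Stripped-down stand-in for KWin's Device wrapper.
class Device
{
public:
    explicit Device(libinput_device *native)
        : m_native(native)
    {
        // Attach the wrapper to the native device so it can be looked up later
        // without keeping a separate list of all devices.
        libinput_device_set_user_data(native, this);
    }

    static Device *find(libinput_device *native)
    {
        return static_cast<Device *>(libinput_device_get_user_data(native));
    }

private:
    libinput_device *m_native;
};

// In an event handler:
//   Device *device = Device::find(libinput_event_get_device(event));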
If the window filter rejects a window, that window won't be in the
stacking_order and therefore won't be painted, so finalDrawWindow()
does redundant work by checking again whether the window is accepted.
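A rough sketch of the merged filter/stacking_order pass described above (the type and predicate names are placeholders, not KWin's actual ones):

#include <QList>

struct Window;                                  // placeholder for KWin's window type
bool readyForCompositing(const Window *window); // placeholder for the filter predicate

// Filter while populating stacking_order instead of doing a separate pass;
// taking the source list by const reference avoids detaching the implicitly
// shared container.
QList<Window *> buildStackingOrder(const QList<Window *> &windows)
{
    QList<Window *> stacking_order;
    stacking_order.reserve(windows.count());
    for (Window *window : windows) {
        if (readyForCompositing(window)) {
            stacking_order.append(window);
        }
    }
    return stacking_order;
}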
Adds a test that has two windows, one of which activates the other.
There are two versions: Qt 5 and Qt 6.
The Qt 6 code is vastly different since Qt 6 knows about
xdg_activation_v1, so it makes sense to have both for now.