kwin/effects/fallapart/fallapart.cpp

/*
KWin - the KDE window manager
This file is part of the KDE project.
SPDX-FileCopyrightText: 2007 Lubos Lunak <l.lunak@kde.org>
SPDX-License-Identifier: GPL-2.0-or-later
*/
#include "fallapart.h"
// KConfigSkeleton
#include "fallapartconfig.h"
#include <cmath>
#include <cstdlib> // rand(), srandom()
namespace KWin
{
bool FallApartEffect::supported()
{
return effects->isOpenGLCompositing() && effects->animationsSupported();
}
FallApartEffect::FallApartEffect()
{
initConfig<FallApartConfig>();
reconfigure(ReconfigureAll);
connect(effects, &EffectsHandler::windowClosed, this, &FallApartEffect::slotWindowClosed);
connect(effects, &EffectsHandler::windowDeleted, this, &FallApartEffect::slotWindowDeleted);
connect(effects, &EffectsHandler::windowDataChanged, this, &FallApartEffect::slotWindowDataChanged);
}
void FallApartEffect::reconfigure(ReconfigureFlags)
{
FallApartConfig::self()->read();
blockSize = FallApartConfig::blockSize();
}
void FallApartEffect::prePaintScreen(ScreenPrePaintData& data, std::chrono::milliseconds presentTime)
{
if (!windows.isEmpty())
data.mask |= PAINT_SCREEN_WITH_TRANSFORMED_WINDOWS;
effects->prePaintScreen(data, presentTime);
}
void FallApartEffect::prePaintWindow(EffectWindow* w, WindowPrePaintData& data, std::chrono::milliseconds presentTime)
{
auto animationIt = windows.find(w);
if (animationIt != windows.end() && isRealWindow(w)) {
if (animationIt->progress < 1) {
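// Advance the animation by the time elapsed since the previously presented
// frame; on the very first frame there is no stored timestamp, so the delta is zero.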
int time = 0;
if (animationIt->lastPresentTime.count()) {
time = (presentTime - animationIt->lastPresentTime).count();
}
animationIt->lastPresentTime = presentTime;
animationIt->progress += time / animationTime(1000.);
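// The window is transformed and, although already closed, must keep being painted.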
data.setTransformed();
w->enablePainting(EffectWindow::PAINT_DISABLED_BY_DELETE);
// Request the window to be divided into cells
data.quads = data.quads.makeGrid(blockSize);
} else {
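// The animation has finished: forget the window and drop the reference
// that kept the deleted window alive.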
windows.remove(w);
w->unrefWindow();
}
}
effects->prePaintWindow(w, data, presentTime);
}
void FallApartEffect::paintWindow(EffectWindow* w, int mask, QRegion region, WindowPaintData& data)
{
auto animationIt = windows.constFind(w);
if (animationIt != windows.constEnd() && isRealWindow(w)) {
const qreal t = animationIt->progress;
WindowQuadList new_quads;
int cnt = 0;
foreach (WindowQuad quad, data.quads) { // krazy:exclude=foreach
// make fragments move in various directions, based on where
// they are (left pieces generally move to the left, etc.)
QPointF p1(quad[ 0 ].x(), quad[ 0 ].y());
double xdiff = 0;
if (p1.x() < w->width() / 2)
xdiff = -(w->width() / 2 - p1.x()) / w->width() * 100;
if (p1.x() > w->width() / 2)
xdiff = (p1.x() - w->width() / 2) / w->width() * 100;
double ydiff = 0;
if (p1.y() < w->height() / 2)
ydiff = -(w->height() / 2 - p1.y()) / w->height() * 100;
if (p1.y() > w->height() / 2)
ydiff = (p1.y() - w->height() / 2) / w->height() * 100;
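// The displacement grows quadratically as the animation progresses.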
double modif = t * t * 64;
srandom(cnt); // change direction randomly but consistently
xdiff += (rand() % 21 - 10);
ydiff += (rand() % 21 - 10);
for (int j = 0;
j < 4;
++j) {
quad[ j ].move(quad[ j ].x() + xdiff * modif, quad[ j ].y() + ydiff * modif);
}
// also make the fragments rotate around their center
QPointF center((quad[ 0 ].x() + quad[ 1 ].x() + quad[ 2 ].x() + quad[ 3 ].x()) / 4,
(quad[ 0 ].y() + quad[ 1 ].y() + quad[ 2 ].y() + quad[ 3 ].y()) / 4);
double adiff = (rand() % 720 - 360) / 360. * 2 * M_PI; // spin randomly
for (int j = 0;
j < 4;
++j) {
double x = quad[ j ].x() - center.x();
double y = quad[ j ].y() - center.y();
double angle = atan2(y, x);
angle += animationIt->progress * adiff;
double dist = sqrt(x * x + y * y);
x = dist * cos(angle);
y = dist * sin(angle);
quad[ j ].move(center.x() + x, center.y() + y);
}
new_quads.append(quad);
++cnt;
}
data.quads = new_quads;
data.multiplyOpacity(interpolate(1.0, 0.0, t));
}
effects->paintWindow(w, mask, region, data);
}
void FallApartEffect::postPaintScreen()
{
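// Keep scheduling repaints as long as any window is still animating.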
if (!windows.isEmpty())
effects->addRepaintFull();
effects->postPaintScreen();
}
bool FallApartEffect::isRealWindow(EffectWindow* w)
{
// TODO: isSpecialWindow() is rather generic, maybe check the window types separately?
/*
qCDebug(KWINEFFECTS) << "--" << w->caption() << "--------------------------------";
qCDebug(KWINEFFECTS) << "Tooltip:" << w->isTooltip();
qCDebug(KWINEFFECTS) << "Toolbar:" << w->isToolbar();
qCDebug(KWINEFFECTS) << "Desktop:" << w->isDesktop();
qCDebug(KWINEFFECTS) << "Special:" << w->isSpecialWindow();
qCDebug(KWINEFFECTS) << "TopMenu:" << w->isTopMenu();
qCDebug(KWINEFFECTS) << "Notific:" << w->isNotification();
qCDebug(KWINEFFECTS) << "Splash:" << w->isSplash();
qCDebug(KWINEFFECTS) << "Normal:" << w->isNormalWindow();
*/
if (w->isPopupWindow()) {
return false;
}
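// Unmanaged X11 windows (e.g. override-redirect ones) are not animated.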
if (w->isX11Client() && !w->isManaged()) {
return false;
}
if (!w->isNormalWindow())
return false;
return true;
}
void FallApartEffect::slotWindowClosed(EffectWindow* c)
{
if (!isRealWindow(c))
return;
if (!c->isVisible())
return;
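// Respect a close-animation grab taken by another effect; otherwise claim the window for this effect.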
const void* e = c->data(WindowClosedGrabRole).value<void*>();
if (e && e != this)
return;
c->setData(WindowClosedGrabRole, QVariant::fromValue(static_cast<void*>(this)));
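// Start the animation and keep the closed window alive until it has fully fallen apart.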
windows[ c ].progress = 0;
c->refWindow();
}
void FallApartEffect::slotWindowDeleted(EffectWindow* c)
{
windows.remove(c);
}
void FallApartEffect::slotWindowDataChanged(EffectWindow* w, int role)
{
if (role != WindowClosedGrabRole) {
return;
}
if (w->data(role).value<void*>() == this) {
return;
}
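// The grab no longer belongs to this effect: stop animating the window and release our reference.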
auto it = windows.find(w);
if (it == windows.end()) {
return;
}
it.key()->unrefWindow();
windows.erase(it);
}
bool FallApartEffect::isActive() const
{
return !windows.isEmpty();
}
} // namespace