At the moment, the SurfaceItem needs to track the individual properties
that may contribute to the buffer source box. That's error-prone.
To fix that, this change makes the SurfaceInterface indicate when the
source box has changed so that the SurfaceItem can sync its source box,
discard quads, etc.
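A minimal sketch of the consumer side, assuming hypothetical signal and
getter names (bufferSourceBoxChanged(), bufferSourceBox()); the actual
KWin API may differ:

    // SurfaceItem subscribes to a single notification instead of tracking
    // every individual property that feeds into the source box.
    connect(surface, &SurfaceInterface::bufferSourceBoxChanged, this, [this]() {
        m_bufferSourceBox = m_surface->bufferSourceBox(); // re-sync the cached box
        discardQuads();                                   // cached geometry is stale now
    });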
The buffer source box is synchronized when the surface-to-buffer matrix
changes. However, with 100% scaling, the surface-to-buffer matrix is
likely to be the identity matrix, and therefore no signal indicating the
change will be emitted.
To fix that, we need to update the buffer source box also when the
buffer size changes.
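A sketch of the producer side under the same assumptions (member and
signal names are hypothetical); the notification is emitted when either
the matrix or the buffer size changes, so the identity-matrix case at
100% scaling is still covered:

    // Inside SurfaceInterface's state application (hypothetical names):
    const QMatrix4x4 oldMatrix = m_surfaceToBufferMatrix;
    const QSize oldBufferSize = m_bufferSize;

    applyPendingState(); // updates m_surfaceToBufferMatrix and m_bufferSize

    if (m_surfaceToBufferMatrix != oldMatrix || m_bufferSize != oldBufferSize) {
        Q_EMIT bufferSourceBoxChanged();
    }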
If the surface item's contents are scaled, i.e. their scale factor
doesn't match the output's scale, GL_LINEAR filtering will be applied to
smooth the contents. Unfortunately, some of the changed pixels may then
bleed into neighboring ones.
In order to handle that scenario better, this change makes the
SurfaceItem expand the damage if there's a scale factor mismatch.
The bufferSourceBox and bufferTransform properties were introduced to
detect whether the surface contents are going to be scaled.
bufferSourceBox covers both the crop transform from wp_viewport and the
scale factor from wl_surface. bufferTransform is the same as
wl_surface's buffer transform property.
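A rough sketch of the damage expansion, with hypothetical member names
and bufferTransform handling omitted for brevity; the damage rect is
only grown when the source box and the destination size don't match 1:1:

    QRegion mapDamage(const QRegion &bufferDamage) const // hypothetical helper
    {
        const QRectF sourceBox = m_bufferSourceBox;   // crop + scale, in buffer coordinates
        const QSizeF destSize = m_destinationSize;    // item size in device pixels
        const qreal xScale = destSize.width() / sourceBox.width();
        const qreal yScale = destSize.height() / sourceBox.height();
        const bool scaled = !qFuzzyCompare(xScale, 1.0) || !qFuzzyCompare(yScale, 1.0);

        QRegion result;
        for (const QRect &rect : bufferDamage) {
            // map from buffer coordinates into destination coordinates
            QRectF mapped((rect.x() - sourceBox.x()) * xScale,
                          (rect.y() - sourceBox.y()) * yScale,
                          rect.width() * xScale,
                          rect.height() * yScale);
            if (scaled) {
                mapped.adjust(-1, -1, 1, 1); // GL_LINEAR can bleed into neighboring pixels
            }
            result += mapped.toAlignedRect();
        }
        return result;
    }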
* speeds up incremental builds, as a change to a header no longer always
requires rebuilding the full mocs_compilation.cpp for all of the target's
headers, while sourcing a moc file into a source file only adds minor
extra cost, due to the moc code itself being small and its used headers
usually already being covered by the source file, being for the same
class/struct
* seems to not slow down clean builds: the empty mocs_compilation.cpp
files are processed quickly, and the minor extra cost of the sourced moc
files does not outweigh that overall.
Measured times actually improved by a few percentage points.
(ideally CMake would just skip empty mocs_compilation.cpp & its object
file one day)
* enables the compiler to see all methods of a class in the same
compilation unit and do some sanity checks
* potentially more inlining in general, due to more code being in the
same compilation unit
* allows keeping more forward declarations in the header: with the moc
code sourced into the cpp file, the needed definitions can be ensured
there, and often already are for the needs of the normal class methods
(see the sketch after this list)
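For illustration, with a hypothetical myclass example; under CMake's
AUTOMOC the generated file name follows the header name, and including
it keeps the moc code out of mocs_compilation.cpp:

    // myclass.cpp
    #include "myclass.h"

    // ... normal method definitions, which typically already pull in
    // everything the moc code needs ...

    #include "moc_myclass.cpp" // source the generated moc code into this unit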
The ClientBuffer type is empty now; most of its contents have been moved
up to the GraphicsBuffer type. So let's drop it to simplify the type
hierarchy.
Currently, the normal window lifecycle looks as follows: create a
Window, wait until it's shown, add it to the Workspace, wait until it's
closed, create a Deleted, copy properties from the original window to
the deleted one, destroy the original window, and wait until the last
reference to the deleted window is dropped.
There are a couple of issues with this design: we can't nicely
encapsulate X11- or Wayland-specific implementation details if they need
to be accessed for closed windows, and the manual copying of properties
is cumbersome and error-prone; we've had a dozen cases where effects
worked incorrectly because some properties had not been copied.
The goal of this patch is to drop Deleted and instead extend the
lifetime of the original window, but with a special state set:
Window::isDeleted(). The main danger is that somebody could try to do
something with closed windows that they should not, but on the other
hand, such code needs to be guarded with the relevant checks anyway.
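As a rough sketch of what such a guard looks like (the function below is
a hypothetical example; only Window::isDeleted() is from this change):

    void Workspace::activateWindow(Window *window)
    {
        if (window->isDeleted()) {
            return; // a closed window can still be painted, but not activated
        }
        // ...
    }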
The main motivation behind this change is to share rendering code
between windows and the cursor, specifically the Item class, which
requires a Scene.
Note that Scene subclasses are responsible for issuing
ItemRenderer::renderItem() calls. The main reason for that is the
current architecture of the effects system; specifically, we need to
call some effect hooks before and after painting a window.
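Schematically, with hypothetical hook names and an illustrative
renderItem() signature, the Scene subclass brackets the rendering call
with the effect hooks:

    void SceneExample::paintWindow(WindowItem *item, int mask, const QRegion &region)
    {
        // the effect hooks must run around the actual rendering, which is why
        // the Scene subclass, not the item itself, drives the renderItem() call
        callPrePaintWindowHooks(item, mask, region);  // hypothetical
        m_renderer->renderItem(item, mask, region);   // signature is illustrative
        callPostPaintWindowHooks(item);               // hypothetical
    }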
This is needed to establish an explicit connection between an item and
the scene it belongs to. For now, the scene must be known at item
construction time. Perhaps this can be improved in the future by having
items inherit their scene from the parent item, but the scene would need
to be refactored more so that there's a root item or something like that.
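A minimal sketch of what that explicit connection could look like (names
are illustrative, not the actual class layout):

    class Item : public QObject
    {
        Q_OBJECT
    public:
        explicit Item(Scene *scene, Item *parent = nullptr)
            : QObject(parent)
            , m_scene(scene)
        {
        }

        Scene *scene() const { return m_scene; }

    private:
        Scene *const m_scene; // known at construction time, never changes
    };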
The goal is to create surface items for things that are not in the
workspace scene. RenderBackend is perhaps not a great place for these
factory functions; on the other hand, this change merely rewires code
from Scene to RenderBackend. I think that in the distant future we could
make surface items pick their surface texture type on their own; for
what it's worth, that's what we would do in QtQuick.