On 04/03/2024 14:14, Pekka Paalanen wrote:
On Mon, 4 Mar 2024 13:24:56 +0000
Terry Barnaby <ter...@beam.ltd.uk> wrote:

On 04/03/2024 09:41, Pekka Paalanen wrote:
On Mon, 4 Mar 2024 08:12:10 +0000
Terry Barnaby <ter...@beam.ltd.uk> wrote:
While I am trying to investigate my issue in the QtWayland arena via the
Qt Jira bug system, I thought I would try taking Qt out of the equation
and simplify the application a bit more, to try to gain some
understanding of what is going on and how this should all work.

So I have created a pure GStreamer/Wayland/Weston application to test
out how this should work. This is at:
https://portal.beam.ltd.uk/public//test022-wayland-video-example.tar.gz

This tries to implement a C++ widget-style application using native
Wayland. It is rough and could easily be doing things wrong wrt Wayland.
However, it does work to a reasonable degree.

However, I appear to see the same sort of issue I see with my Qt-based
system, in that when a subsurface of a subsurface is used, the GStreamer
video is not seen.

This example normally (UseWidgetTop=0) has a top-level xdg_toplevel
desktop surface (Gui), a subsurface of that (Video), and then waylandsink
creates a subsurface of that, which it sets to de-sync mode.

When this example is run with UseWidgetTop=0, the video frames from
GStreamer are only shown when the top subsurface is manually
committed with gvideo->update() every second; otherwise the video
pipeline is stalled.
This is intentional. From the wl_subsurface specification:

        Even if a sub-surface is in desynchronized mode, it will behave as
        in synchronized mode, if its parent surface behaves as in
        synchronized mode. This rule is applied recursively throughout the
        tree of surfaces. This means, that one can set a sub-surface into
        synchronized mode, and then assume that all its child and grand-child
        sub-surfaces are synchronized, too, without explicitly setting them.

This is derived from the design decision that a wl_surface and its
immediate sub-surfaces form a seamlessly integrated unit that works
like a single wl_surface without sub-surfaces would. wl_subsurface
state is state in the sub-surface's parent, so that the parent controls
everything as if there was just a single wl_surface. If the parent sets
its sub-surface as desynchronized, it explicitly gives the sub-surface
the permission to update on screen regardless of the parent's updates.
When the sub-surface is in synchronized mode, the parent surface wants
to be updated in sync with the sub-surface in an atomic fashion.

When your surface stack looks like:

- main surface A, top-level, root surface (implicitly desynchronized)
  - sub-surface B, synchronized
    - sub-surface C, desynchronized

Updates to surface C are immediately made part of surface B, because
surface C is in desynchronized mode. If B was the root surface, all C
updates would simply go through.

However, surface B is a part of surface A, and surface B is in
synchronized mode. This means that the client wants surface A updates to
be explicit and atomic. Nothing must change on screen until A is
explicitly committed itself. So any update to surface B requires a
commit on surface A to become visible. Surface C does not get to
override the atomicity requirement of surface A updates.
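
For illustration, a minimal sketch of how such a stack might be set up
with the core protocol (this is not from your test application; the
"compositor" and "subcompositor" globals are assumed to be bound from
the registry already, and error handling is omitted):

    #include <wayland-client.h>

    /* Build the A/B/C stack described above. */
    static void
    build_stack(struct wl_compositor *compositor,
                struct wl_subcompositor *subcompositor,
                struct wl_surface *surface_a /* the xdg_toplevel surface */)
    {
        struct wl_surface *surface_b = wl_compositor_create_surface(compositor);
        struct wl_surface *surface_c = wl_compositor_create_surface(compositor);

        struct wl_subsurface *sub_b =
            wl_subcompositor_get_subsurface(subcompositor, surface_b, surface_a);
        struct wl_subsurface *sub_c =
            wl_subcompositor_get_subsurface(subcompositor, surface_c, surface_b);

        wl_subsurface_set_sync(sub_b);   /* the default: B waits for commits on A */
        wl_subsurface_set_desync(sub_c); /* C is free-running, but only within B  */
    }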

This has been designed so that software component A can control surface
A, and delegate a part of surface A to component B, which happens to be
using a sub-surface: surface B. If surface B parts are further
delegated to another component C, then component A can still be sure
that nothing updates on surface A until it says so. Component A sets
surface B to synchronized to ensure that.

That's the rationale behind the Wayland design.


Thanks,
pq
Ah, thanks for the info, that may be why this is not working even in Qt
then.

This seems like a shortcoming in Wayland to me. If a software module wants to
display video into an area of the screen at its own rate, setting that
surface to de-synced mode is of no use in the general case with this
policy.
It is of use, if you don't have unnecessary sub-surfaces in synchronized
mode in between, or you set all those extra sub-surfaces to
desynchronized as well.
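
In terms of the earlier sketch, that would just mean also calling
(reusing the assumed sub_b handle):

    wl_subsurface_set_desync(sub_b); /* C's commits no longer wait for a commit on A */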

Well, they may not be necessary from the Wayland perspective, but from the higher-level software's perspective they are useful to modularise and separate the software modules, and to provide a join between them, especially when those modules are developed separately, like Qt and GStreamer.



I would have thought that if a subsurface was explicitly set to
de-synced mode then that would be honoured. I can't see a use case for
it to be ignored and its commits synchronised up the tree?
Resizing the window is the main use case.

In order to resize surface A, you also need to resize and paint surface
B, and for surface B you also need to resize and paint surface C. Then
you need to guarantee that all the updates from surface C, B and A are
applied atomically on screen.

Either you have component APIs good enough to negotiate the
stop-resize-paint-resume on your own, or if the sub-components are
free-running regardless of frame callbacks, component A can just
temporarily set surface B to synchronized, resize and reposition it,
and resume.
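
A sketch of that temporary re-sync, again reusing the assumed names from
the earlier example (new_x/new_y and the repaint of B itself are
placeholders):

    /* Freeze B (and everything below it) while A is being resized. */
    wl_subsurface_set_sync(sub_b);

    /* Reposition B within A; the new position and any new B content are
     * cached until A commits. */
    wl_subsurface_set_position(sub_b, new_x, new_y);
    /* ... wl_surface_attach()/wl_surface_damage()/wl_surface_commit() on B ... */

    wl_surface_commit(surface_a);    /* A, B (and C) update atomically */

    /* Let the video run freely again. */
    wl_subsurface_set_desync(sub_b);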

I would have thought that the Wayland server could/would itself synchronise screen updates when a higher-level surface is resized/moved.

As the software components are separately developed systems, it is difficult to synchronise between them without changing them, but it may be possible.



So is there a way to actually display video on a subsurface many levels
deep in a surface hierarchy? Would setting all of the surfaces up to the
subsurface just below the desktop top-level one work (although not ideal,
as it would mean overriding other software modules' surfaces at the
Wayland level)?
Setting to what?

I meant setting all the subsurfaces in the tree to be desynced. You have answered this one above. I can try that by setting the Video widget to desynced and see how that works with my simple application and a QtWidget, although Qt might object.



Or can desynced subsurfaces really only work to one level deep ?
You can set your middle sub-surface to desynchronized, too, at least
from the Wayland perspective. I don't know if Qt lets you.

There are implementation issues with nested sub-surfaces in Weston, but
this does not seem to be one of them.

As above, I will try this. Thanks for the info.



If it is just one subsurface level deep that video can be displayed, I
guess I will have to get GStreamer's waylandsink to create its subsurface
off the top-most surface and add calls to manage its surface from my
app.
That should have been the first idea to try.

Is the Gst waylandsink API the kind that internally creates a new
wl_surface for itself and makes it a sub-surface of the given surface,
or is there an option to tell Gst to just push frames into a given
wl_surface?

If the former, then waylandsink is supposed to somehow give you an API
to set the sub-surface position and z-order wrt. its parent and
siblings. If the latter, you would create wl_subsurface yourself and
keep control of it to set the sub-surface position and z-order.

Either way, the optimal result is one top-level wl_surface, with one
sub wl_surface drawn by Gst, and no surfaces in between in the
hierarchy.

Yes, the Gst waylandsink API creates a new subsurface for itself from the GUI's managed surface, to separate itself from the GUI's (Qt's/GNOME's) surfaces. It doesn't allow you to provide a surface for it to use directly. I don't think it allows the surface to be moved/resized, although it can display video at an offset and size as far as I know (it may actually change the surface to do this; I will have a look). It doesn't allow the z-order to be changed, I think. It expects the GUI to change its own surface, and I guess it assumes its subsurface would effectively move in z and xy position as the GUI moves/raises/lowers its surface (the parent), in a similar manner to how X11 would have done this.

I will try the middle desync and/or this method of managing the waylandsink surface outside of waylandsink, if I can, and if it doesn't mess up either Qt's or waylandsink's operation.
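
As far as I understand it, the parent surface is handed to waylandsink
through the GstVideoOverlay interface; a rough sketch of what I would
try (names such as "vsink" and "video_surface" are my own, and the
application also has to share its wl_display with the sink via a
GstContext on the bus, which I have left out here):

    #include <wayland-client.h>
    #include <gst/gst.h>
    #include <gst/video/videooverlay.h>

    /* Hand waylandsink the wl_surface it should use as the parent for the
     * subsurface it creates, and constrain where it renders within it. */
    static void
    embed_sink(GstElement *vsink, struct wl_surface *video_surface,
               int width, int height)
    {
        gst_video_overlay_set_window_handle(GST_VIDEO_OVERLAY(vsink),
                                            (guintptr)video_surface);
        gst_video_overlay_set_render_rectangle(GST_VIDEO_OVERLAY(vsink),
                                               0, 0, width, height);
        gst_video_overlay_expose(GST_VIDEO_OVERLAY(vsink));
    }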

Thanks for the input.



Or maybe get waylandsink's subsurface and manipulate it behind
waylandsink's back. Not sure what this would do to the Qt level though.
Using the QWidget's subsurface as its base should have allowed isolation
to a degree between the Qt and waylandsink sub-systems; it's a much more
modular approach.
You cannot punt between-components integration to Wayland. The
sub-surfaces design tried and failed, and as a result the sub-surface
protocol is both complex and insufficient. You still need explicit
communication between your client-side components when you resize.

There is no documentation on this Wayland restriction in waylandsink or
other places; I can try and feed this back.

Oh, for X11!
Yeah, the blink of garbage when windows are resized, or the blue
colorkey following far behind when a window is moved.

Never had that and it did work :)




Thanks,
pq

