While I continue to investigate my issue in the QtWayland arena via the Qt Jira bug system, I thought I would try taking Qt out of the equation to simplify the application further and gain some understanding of what is going on and how this should all work.

So I have created a pure GStreamer/Wayland/Weston application to test out how this should work. This is at: https://portal.beam.ltd.uk/public//test022-wayland-video-example.tar.gz

This tries to implement a C++ widget-style application using native Wayland. It is rough and could easily be doing things wrong with respect to Wayland; however, it does work to a reasonable degree.

However, I appear to see the same sort of issue I see with my Qt-based system: when a subsurface of a subsurface is used, the GStreamer video is not shown.

This example normally (UseWidgetTop=0) has a top-level xdg_toplevel desktop surface (Gui), a subsurface of that (Video), and then waylandsink creates a subsurface of that, which it sets to de-sync mode.
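
For reference, the surface tree in the example is built roughly as sketched below. Names like create_video_widget, gui_surface and video_surface are mine rather than the actual code, and registry binding, xdg_surface configure handling and error checking are omitted:

  #include <wayland-client.h>
  #include "xdg-shell-client-protocol.h"   // generated by wayland-scanner
  #include <gst/video/videooverlay.h>

  // Hypothetical helper: builds the surface tree described above.
  // 'compositor', 'subcompositor' and 'wm_base' are the usual globals
  // bound from the wl_registry; 'sink' is the waylandsink element.
  static struct wl_surface *
  create_video_widget(struct wl_compositor *compositor,
                      struct wl_subcompositor *subcompositor,
                      struct xdg_wm_base *wm_base,
                      GstElement *sink)
  {
      // Top-level GUI surface (xdg_toplevel).
      struct wl_surface *gui_surface = wl_compositor_create_surface(compositor);
      struct xdg_surface *gui_xdg =
          xdg_wm_base_get_xdg_surface(wm_base, gui_surface);
      struct xdg_toplevel *gui_top = xdg_surface_get_toplevel(gui_xdg);
      (void) gui_top;
      wl_surface_commit(gui_surface);      // initial commit, configure handling omitted

      // Video widget: a subsurface of the GUI surface.
      struct wl_surface *video_surface = wl_compositor_create_surface(compositor);
      struct wl_subsurface *video_sub =
          wl_subcompositor_get_subsurface(subcompositor, video_surface, gui_surface);
      wl_subsurface_set_position(video_sub, 0, 0);
      // video_sub is left in the default synchronized mode here.

      // Hand video_surface to waylandsink; the sink then creates its own
      // subsurface of it and, as I understand it, sets that one to
      // de-sync mode.  The wl_display also has to be shared with the
      // sink via a GstContext; that part is omitted here.
      gst_video_overlay_set_window_handle(GST_VIDEO_OVERLAY(sink),
                                          (guintptr) video_surface);
      return video_surface;
  }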

When this example is run with UseWidgetTop=0, the video frames from GStreamer are only shown when the top subsurface is manually committed with gvideo->update() every second; otherwise the video pipeline stalls, with waylandsink stuck in a loop awaiting a Wayland callback after committing its subsurface. If the Video widget is a top-level widget (UseWidgetTop=1) this works fine (only one subsurface deep?).
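
The once-a-second workaround (gvideo->update() in the example) amounts to re-committing the parent surfaces, something like the sketch below. The 'app' struct and the use of a GLib timeout are just for illustration; the real example uses its own C++ timers:

  #include <wayland-client.h>
  #include <glib.h>

  // Hypothetical container for the surfaces created earlier.
  struct app {
      struct wl_display *display;
      struct wl_surface *gui_surface;    // top-level xdg_toplevel surface
      struct wl_surface *video_surface;  // parent of waylandsink's subsurface
  };

  // Roughly what gvideo->update() does: re-commit the video subsurface
  // and its top-level parent.  In my testing this is what appears to
  // unblock waylandsink and let a frame become visible.
  static gboolean commit_parents(gpointer data)
  {
      struct app *a = (struct app *) data;
      wl_surface_commit(a->video_surface);
      wl_surface_commit(a->gui_surface);
      wl_display_flush(a->display);
      return G_SOURCE_CONTINUE;          // fire again in a second
  }

  // Installed with: g_timeout_add_seconds(1, commit_parents, &app);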

I have tried using both GStreamer's waylandsink and glimagesink elements; both show the same issue.

This seems to suggest that either the de-synced subsurface system is not working properly with Weston, I misunderstand how this should work, or I have a programming error.
This has been tested on Fedora 37 running Weston 10.0.1 under KDE/Plasma/X11.

1. Should de-synced subsurfaces under other subsurfaces work under Weston 10.0.1?

2. Do I misunderstand how this should work?

3. Do I have some coding issue (sorry, the code is a bit complex with Wayland callbacks, C++ timers, etc.)?
