Re: Wayland debugging with Qtwayland, gstreamer waylandsink, wayland-lib and Weston

2024-03-19 Thread Terry Barnaby

On 05/03/2024 12:26, Pekka Paalanen wrote:

On Mon, 4 Mar 2024 17:59:25 +
Terry Barnaby  wrote:


On 04/03/2024 15:50, Pekka Paalanen wrote:

On Mon, 4 Mar 2024 14:51:52 +
Terry Barnaby  wrote:
  

On 04/03/2024 14:14, Pekka Paalanen wrote:

On Mon, 4 Mar 2024 13:24:56 +
Terry Barnaby  wrote:
 

On 04/03/2024 09:41, Pekka Paalanen wrote:

On Mon, 4 Mar 2024 08:12:10 +
Terry Barnaby  wrote:


While I am trying to investigate my issue in the QtWayland arena via the
Qt Jira Bug system, I thought I would try taking Qt out of the equation
to simplify the application a bit more to try and gain some
understanding of what is going on and how this should all work.

So I have created a pure GStreamer/Wayland/Weston application to test
out how this should work. This is at:
https://portal.beam.ltd.uk/public//test022-wayland-video-example.tar.gz

This tries to implement a C++ Widget style application using native
Wayland. It is rough and could easily be doing things wrong wrt Wayland.
However it does work to a reasonable degree.

However, I appear to see the same sort of issue I see with my Qt-based
system in that when a subsurface of a subsurface is used, the GStreamer
video is not seen.

This example normally (UseWidgetTop=0) has a top level xdg_toplevel
desktop surface (Gui), a subsurface to that (Video) and then waylandsink
creates a subsurface to that which it sets to de-sync mode.

When this example is run with UseWidgetTop=0 the video frames from
GStreamer are only shown when the top subsurface is manually
committed with gvideo->update() every second; otherwise the video
pipeline is stalled.

This is intentional. From wl_subsurface specification:

  Even if a sub-surface is in desynchronized mode, it will behave as
  in synchronized mode, if its parent surface behaves as in
  synchronized mode. This rule is applied recursively throughout the
  tree of surfaces. This means, that one can set a sub-surface into
  synchronized mode, and then assume that all its child and grand-child
  sub-surfaces are synchronized, too, without explicitly setting them.

This is derived from the design decision that a wl_surface and its
immediate sub-surfaces form a seamlessly integrated unit that works
like a single wl_surface without sub-surfaces would. wl_subsurface
state is state in the sub-surface's parent, so that the parent controls
everything as if there was just a single wl_surface. If the parent sets
its sub-surface as desynchronized, it explicitly gives the sub-surface
the permission to update on screen regardless of the parent's updates.
When the sub-surface is in synchronized mode, the parent surface wants
to be updated in sync with the sub-surface in an atomic fashion.

When your surface stack looks like:

- main surface A, top-level, root surface (implicitly desynchronized)
  - sub-surface B, synchronized
    - sub-surface C, desynchronized

Updates to surface C are immediately made part of surface B, because
surface C is in desynchronized mode. If B was the root surface, all C
updates would simply go through.

However, surface B is a part of surface A, and surface B is in
synchronized mode. This means that the client wants surface A updates to
be explicit and atomic. Nothing must change on screen until A is
explicitly committed itself. So any update to surface B requires a
commit on surface A to become visible. Surface C does not get to
override the atomicity requirement of surface A updates.

This has been designed so that software component A can control surface
A, and delegate a part of surface A to component B, which happens to be
using a sub-surface: surface B. If surface B parts are further
delegated to another component C, then component A can still be sure
that nothing updates on surface A until it says so. Component A sets
surface B to synchronized to ensure that.

That's the rationale behind the Wayland design.


Thanks,
pq
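To make the stack above concrete, here is a minimal C++ sketch against the
wayland-client API. It assumes compositor (wl_compositor*) and subcompositor
(wl_subcompositor*) have already been bound from the registry, and that
surface_a is the xdg_toplevel root surface; all names are illustrative and
not taken from the example tarball.

  #include <wayland-client.h>

  static void build_stack(wl_compositor *compositor,
                          wl_subcompositor *subcompositor,
                          wl_surface *surface_a)
  {
      // Surface B: sub-surface of A. Sub-surfaces start out synchronized.
      wl_surface *surface_b = wl_compositor_create_surface(compositor);
      wl_subsurface *sub_b =
          wl_subcompositor_get_subsurface(subcompositor, surface_b, surface_a);

      // Surface C: sub-surface of B, explicitly desynchronized.
      wl_surface *surface_c = wl_compositor_create_surface(compositor);
      wl_subsurface *sub_c =
          wl_subcompositor_get_subsurface(subcompositor, surface_c, surface_b);
      wl_subsurface_set_desync(sub_c);

      // Because B is still synchronized, commits on C are cached and only
      // reach the screen when B's state is applied, which in turn happens
      // on the next wl_surface_commit(surface_a).
      (void)sub_b;
  }

The key point of the rationale above is that C's own mode cannot override
the synchronized mode of the middle surface B.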

Ah, thanks for the info, that may be why this is not working even in Qt
then.

This seems a shortcoming in Wayland to me. If a software module wants to
display video into an area on the screen at its own rate, setting that
surface to de-synced mode is no use in the general case with this
policy.

It is of use, if you don't have unnecessary sub-surfaces in synchronized
mode in between, or you set all those extra sub-surfaces to
desynchronized as well.

Well, they may not be necessary from the Wayland perspective, but from
the higher-level software's perspective they are useful to modularise and
separate the software modules and to provide a join between them,
especially when the modules are as separate as Qt and GStreamer.

Sorry to hear that.
  

I would have thought that if a subsurface was explicitly set to
de-synced mode then that would be honoured. I can't see a use case for
it to be ignored and its commits synchronised up the tree?

Resizing the window is the main use case.

In order to resize surface A, you also need to resize and paint surface
B, and for surface B you also need to resize and paint surface C. Then
you need to guarantee that all the updates from surface C, B and A are
applied atomically on screen.

Re: Wayland debugging with Qtwayland, gstreamer waylandsink, wayland-lib and Weston

2024-03-08 Thread Terry Barnaby

On 08/03/2024 15:23, Pekka Paalanen wrote:

On Fri, 8 Mar 2024 14:50:30 +
Terry Barnaby  wrote:


On 05/03/2024 12:26, Pekka Paalanen wrote:

On Mon, 4 Mar 2024 17:59:25 +
Terry Barnaby  wrote:
  

...


I would have thought it better/more useful to have a Wayland API call
like "stopCommiting" so that an application can sort things out for this
and other things, providing more application control. But I really have
only very limited knowledge of the Wayland system. I just keep hitting
its restrictions.
  

Right, Wayland does not work that way. Wayland sees any client as a
single entity, regardless of its internal composition of libraries and
others.

When Wayland delivers any event, whether it is an explicit resize event
or an input event (or maybe the client just spontaneously decides to),
that causes the client to want to resize a window, it is then up to the
client itself to make sure it resizes everything it needs to, and keeps
everything atomic so that the end user does not see glitches on screen.

Sub-surfaces' synchronous mode was needed to let clients batch the
updates of multiple surfaces into a single atomic commit. It is the
desync mode that was a non-mandatory add-on. The synchronous mode was
needed because there was no other way to guarantee that multiple
wl_surface.commit requests apply simultaneously. Without it, if you
commit surface A and then surface B, nothing will guarantee that the
compositor would not show A updated and B not on screen for a moment.

Wayland intentionally did not include any mechanism in its design
intended for communication between a single client's internal
components. Why use a display server as an IPC middle-man for something
that should be process-internal communication? After all, Wayland is
primarily a protocol - inter-process communication.
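The batching pattern this describes looks roughly like the following
hypothetical fragment, reusing the surface_a/b/c names from the sketch
above and assuming B and C are both left in the default synchronized mode
(buffer creation is elided; buf_a/b/c are wl_buffers):

  #include <wayland-client.h>

  static void commit_tree_atomically(wl_surface *surface_a, wl_buffer *buf_a,
                                     wl_surface *surface_b, wl_buffer *buf_b,
                                     wl_surface *surface_c, wl_buffer *buf_c)
  {
      wl_surface_attach(surface_c, buf_c, 0, 0);
      wl_surface_commit(surface_c);   // cached in B's pending state

      wl_surface_attach(surface_b, buf_b, 0, 0);
      wl_surface_commit(surface_b);   // cached in A's pending state

      wl_surface_attach(surface_a, buf_a, 0, 0);
      wl_surface_commit(surface_a);   // A, B and C now update on screen together
  }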

Well, as you say, it is up to the client to perform all of the surface
resize work. So it seems to me, if the client had an issue with
pixel-perfect resizing, it could always set any of its desynced surfaces
to sync mode, or just stop the updates to them, while it resizes. I don't
see why Wayland needs to ignore the client's request to set a subsurface
desynced down the tree.

You're right, but it's in the spec now. I've gained a bit more
experience in the decade after writing the sub-surface spec.

You can still work around it by setting all sub-surfaces always desync.
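In code the workaround itself is tiny; the hard part is getting hold of the
handles. A sketch, assuming (hypothetically) that the client can reach every
wl_subsurface between the video surface and the root, which is exactly what
toolkits like Qt do not expose:

  #include <vector>
  #include <wayland-client.h>

  // Desynchronize every sub-surface on the path from the video surface up
  // to the root, so no synchronized ancestor holds the video's commits back.
  static void desync_chain(const std::vector<wl_subsurface *> &path_to_root)
  {
      for (wl_subsurface *sub : path_to_root)
          wl_subsurface_set_desync(sub);
  }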


Oh you wrote it, thanks for the work!

So maybe time for version n+1 then :)

Actually, allowing sub-subsurfaces to work in desync should not break any
existing clients, as they cannot use it yet. Obviously new clients
written for it would not work on older Wayland servers though.


It's difficult to desync all the higher surfaces in a Qt (or probably any
other widget set) application: they are controlled by Qt, and Qt does not
give you access to the subsurfaces it has created. It would be better to
have had a wl_surface_set_desync(wl_surface*) rather than a
wl_subsurface_set_desync(wl_subsurface*).


With clients using lots of libraries/subsystems it is better not to touch
their internal workings unless you have to. Normally you try to work at
the lowest common denominator, in this case the Wayland display system, as
that is the shared module they both use (at least when driving a Wayland
display server). This is why it is nice to have a surface that is almost
totally independent of the others and is just shown/not shown, placed
over/below etc. other surfaces, like an X window. Wayland surfaces are
mainly this as far as I can see, apart from this desync mode, although
maybe there are other exceptions.


I have asked in the Qt forums if they could provide some sort of API to
allow the setting of desync up the tree, but this may not happen, and it
might be difficult for them as it could mess up their applications'
rendering. It also does not match the other display system APIs that they
support. The higher-level QWidgets ideally need synced surfaces; it's
just the video surfaces that need desync. Really I think this is the
Wayland server's job.

In fact, does it return an error to the client when the Wayland server
ignores this command?

There is no "return error" in Wayland. Either a request succeeds, or
the client is disconnected with an error. It's all asynchronous, too.

Any possibility for graceful failure must be designed into protocol
extensions at one step higher level. If there is room for a graceful
failure, it will be mentioned in the XML spec with explicit messages to
communicate it.

Which command do you mean?


I meant the wl_subsurface_set_desync() API call on a sub-subsurface,
which doesn't work. As no errors were returned, it took a long time to
find out why things weren't working; just some lower-level threads locked
up.
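One small mitigation, for what it is worth: although there is no
per-request error reply, a client can detect after the fact that the
connection has raised a fatal protocol error. A sketch using standard
wayland-client calls (note this would not have caught the cached-commit
stall above, which is not an error at all):

  #include <cstdio>
  #include <cstring>
  #include <wayland-client.h>

  // Once a protocol error has been raised the connection is dead and
  // roundtrips start failing, which at least turns a silent lock-up
  // into a diagnosable error.
  static bool connection_ok(wl_display *display)
  {
      if (wl_display_roundtrip(display) < 0) {
          int err = wl_display_get_error(display); // errno-style, e.g. EPROTO
          std::fprintf(stderr, "wayland error: %s\n", std::strerror(err));
          return false;
      }
      return true;
  }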


Personally I think these sorts of occasional, performance-irrelevant
methods/requests/commands should be synchronous (maybe under an
asynchronous comms system) and return an error. Makes developing clients

Re: Wayland debugging with Qtwayland, gstreamer waylandsink, wayland-lib and Weston

2024-03-08 Thread Pekka Paalanen
On Fri, 8 Mar 2024 14:50:30 +
Terry Barnaby  wrote:

> ...

You're right, but it's in the spec now. I've gained a bit more
experience in the decade after writing the sub-surface spec.

You can still work around it by setting all sub-surfaces always desync.

> In fact does it return an error to the client 
> when the Wayland server ignores this command ?

There is no "return error" in Wayland. Either a request succeeds, or
the client is disconnected with an error. It's all asynchronous, too.

Any possibility for graceful failure must be designed into protocol
extensions at one step higher level. If there is room for a graceful
failure, it will be mentioned in the XML spec with explicit messages to
communicate it.

Which command do you mean?

There is no "ignore" with wl_surface nor wl_subsurface.
wl_surface.commit is always acted upon, but the sub-surface sync mode
determines whether the state update goes to the screen or to a cache.
No state update is ignored unless you destroy your objects. The frame
callbacks that seem to go unanswered are not ignored, they are just
sitting in the cache waiting to apply when the parent surface actually
updates on screen.


Thanks,
pq
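That waiting frame callback is the stall mechanism. The usual render-loop
pattern, roughly what waylandsink does (a sketch, not waylandsink's actual
code):

  #include <cstdint>
  #include <wayland-client.h>

  static void frame_done(void *data, wl_callback *cb, uint32_t time_ms);
  static const wl_callback_listener frame_listener = { frame_done };

  static void draw_frame(wl_surface *surface)
  {
      // ... attach the next video buffer to surface here ...
      wl_callback *cb = wl_surface_frame(surface); // "tell me when to draw again"
      wl_callback_add_listener(cb, &frame_listener, surface);
      wl_surface_commit(surface);
      // If an ancestor is synchronized, this commit sits in the cache,
      // the callback never fires, and a loop waiting on it stalls; that
      // is the exact symptom reported at the start of this thread.
  }

  static void frame_done(void *data, wl_callback *cb, uint32_t time_ms)
  {
      (void)time_ms;
      wl_callback_destroy(cb);
      draw_frame(static_cast<wl_surface *>(data)); // schedule the next frame
  }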




Re: Wayland debugging with Qtwayland, gstreamer waylandsink, wayland-lib and Weston

2024-03-08 Thread Terry Barnaby

On 05/03/2024 12:26, Pekka Paalanen wrote:

...

Re: Wayland debugging with Qtwayland, gstreamer waylandsink, wayland-lib and Weston

2024-03-05 Thread Pekka Paalanen
On Mon, 4 Mar 2024 17:59:25 +
Terry Barnaby  wrote:

> ...

Re: Wayland debugging with Qtwayland, gstreamer waylandsink, wayland-lib and Weston

2024-03-04 Thread Terry Barnaby

On 04/03/2024 15:50, Pekka Paalanen wrote:

...

Re: Wayland debugging with Qtwayland, gstreamer waylandsink, wayland-lib and Weston

2024-03-04 Thread Pekka Paalanen
On Mon, 4 Mar 2024 14:51:52 +
Terry Barnaby  wrote:

> ...

Re: Wayland debugging with Qtwayland, gstreamer waylandsink, wayland-lib and Weston

2024-03-04 Thread Terry Barnaby

On 04/03/2024 14:14, Pekka Paalanen wrote:

...

Re: Wayland debugging with Qtwayland, gstreamer waylandsink, wayland-lib and Weston

2024-03-04 Thread Pekka Paalanen
On Mon, 4 Mar 2024 13:24:56 +
Terry Barnaby  wrote:

> ...

Re: Wayland debugging with Qtwayland, gstreamer waylandsink, wayland-lib and Weston

2024-03-04 Thread Terry Barnaby

On 04/03/2024 09:41, Pekka Paalanen wrote:

...


Ah, thanks for the info, that may be why this is not working even in Qt 
then.


This seems a shortcoming in Wayland to me. If a software module wants to
display video into an area on the screen at its own rate, setting that
surface to de-synced mode is no use in the general case with this
policy. I would have thought that if a subsurface was explicitly set to
de-synced mode then that would be honoured. I can't see a use case for
it to be ignored and its commits synchronised up the tree?


So is there a way to actually display video on a subsurface many levels
deep in a surface hierarchy? Would setting to de-sync all of the surfaces
up to the subsurface just below the desktop top-level one work (although
not ideal, as it would mean overriding other software modules' surfaces at
the Wayland level)?


Or can desynced subsurfaces really only work one level deep?

If it is just one subsurface level deep that video can be displayed, I
guess I will have to get GStreamer's waylandsink to create its subsurface
off the top-most surface and add calls to manage its surface from my
app. Or maybe get waylandsink's subsurface and manipulate it behind
waylandsink's back. Not sure what this will do to the Qt level though.
Using the QWidget's subsurface as its base should have allowed isolation
to a degree between the Qt and waylandsink sub

Re: Wayland debugging with Qtwayland, gstreamer waylandsink, wayland-lib and Weston

2024-03-04 Thread Pekka Paalanen
On Mon, 4 Mar 2024 08:12:10 +
Terry Barnaby  wrote:

> ...

...


Re: Wayland debugging with Qtwayland, gstreamer waylandsink, wayland-lib and Weston

2024-03-04 Thread Terry Barnaby
While I am trying to investigate my issue in the QtWayland arena via the 
Qt Jira Bug system, I thought I would try taking Qt out of the equation 
to simplify the application a bit more to try and gain some 
understanding of what is going on and how this should all work.


So I have created a pure GStreamer/Wayland/Weston application to test 
out how this should work. This is at: 
https://portal.beam.ltd.uk/public//test022-wayland-video-example.tar.gz


This tries to implement a C++ Widget style application using native 
Wayland. It is rough and could easily be doing things wrong wrt Wayland. 
However it does work to a reasonable degree.


However, I appear to see the same sort of issue I see with my Qt-based
system in that when a subsurface of a subsurface is used, the GStreamer
video is not seen.


This example normally (UseWidgetTop=0) has a top level xdg_toplevel 
desktop surface (Gui), a subsurface to that (Video) and then waylandsink 
creates a subsurface to that which it sets to de-sync mode.


When this example is run with UseWidgetTop=0 the video frames from
GStreamer are only shown when the top subsurface is manually
committed with gvideo->update() every second; otherwise the video
pipeline is stalled. Waylandsink is stuck in a loop awaiting a Wayland
callback after committing its subsurface.
If the Video Widget is a top-level widget (UseWidgetTop=1) this works
fine (only one subsurface deep?).


I have tried using both GStreamer's waylandsink and glimagesink
elements; both show the same issue.


This seems to suggest that either the de-synced subsurface system is not
working properly with Weston, that I misunderstand how this should work,
or that I have a program error.

This has been tested on Fedora 37 running Weston 10.0.1 under KDE/Plasma/X11.

1. Should de-synced subsurfaces under other subsurfaces work under
Weston 10.0.1?

2. Do I misunderstand how this should work?

3. Do I have some coding issue (sorry, the code is a bit complex with
Wayland callbacks and C++ timers etc.)?





Re: Wayland debugging with Qtwayland, gstreamer waylandsink, wayland-lib and Weston

2024-03-02 Thread Terry Barnaby

Hi Pekka,


Did you try making the "middle" QWidget *not* have a wl_surface of its
own?
Hack on Qt, then? Sorry, but I don't understand this insistence that
what sounds like a Qt bug must be workaround-able via Wayland.
Hmm, that does not sound right to me, but then again, I don't know Qt.

Wayland certainly does not impose such demand.


Well the way this is supposed to work, I believe, is:

1. There are two separate systems in operation here: Qt doing the
   general GUI and GStreamer waylandsink displaying the video. These
   systems know nothing of one another.
2. The link between these two systems is a Wayland surface in the
   Wayland server. QWidget will manage this surface (raise, lower,
   position etc.) and can draw into it if it wants.
3. Waylandsink creates a subsurface of that QWidget Wayland surface,
   sets it to be de-synced, and then proceeds to draw into this at
   the video frame rate (see the sketch after this list).
4. There's quite a lot of hardware engine work going on in the
   background. For example video buffers may be in special memory like
   in a video or 2D hardware engine pipeline etc. Qt may be using
   separate 3D engine hardware etc.
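A hypothetical sketch of how steps 2 and 3 can be wired together. The Qt
call relies on private QPA headers, and waylandsink additionally needs the
wl_display shared via a GstContext (of type
"GstWaylandDisplayHandleContextType"), which is elided here and may differ
between GStreamer versions:

  #include <QGuiApplication>
  #include <QWidget>
  #include <qpa/qplatformnativeinterface.h>
  #include <gst/video/videooverlay.h>
  #include <wayland-client.h>

  static void attach_video_sink(QWidget *videoWidget, GstElement *waylandsink)
  {
      videoWidget->setAttribute(Qt::WA_NativeWindow); // force a real QWindow
      QPlatformNativeInterface *native =
          QGuiApplication::platformNativeInterface();
      auto *surface = static_cast<wl_surface *>(
          native->nativeResourceForWindow("surface",
                                          videoWidget->windowHandle()));
      // waylandsink then creates its own de-synced sub-surface of this one.
      gst_video_overlay_set_window_handle(GST_VIDEO_OVERLAY(waylandsink),
                                          reinterpret_cast<guintptr>(surface));
  }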

I am not experienced with Wayland, but I think a "middle" surface is
needed so this can be moved, raised/lowered etc. relative to the
application's main QWidgets, and so that waylandsink does not need to know
about this (apart from resizes). Another option would be to modify
waylandsink to do the necessary things with its subsurface. But having a
surface separate from the Qt application's main drawing surface
seems safer, and I am trying to keep to what I think is the accepted
method, with minimal changes to upstream code.


This Gstreamer video display method came from the older X11 way of doing 
this with XWindows.


As stated, the reason this is not working with Qt6/Wayland/Weston is 
probably a Qt6 bug/issue/feature. However, a way to understand what is 
happening is to look at the shared Wayland level, and maybe there is a 
way, with Wayland protocol commands, of overcoming the issue so I can 
work around the problem I am having in a short time (timescales!) before 
a more proper fix is worked out. For example in X11 an XMapWindow() or 
XRaiseWindow() request or positioning/size requests may have worked, and 
I wondered if I could do the same sort of thing in Wayland.
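
For reference, the nearest wl_subsurface equivalents to those X11 calls 
appear to be the following (a sketch; both requests are double-buffered 
and only take effect on the next commit of the parent surface, variable 
names illustrative):

    // move the sub-surface within its parent (cf. XMoveWindow())
    wl_subsurface_set_position(videoSubsurface, x, y);

    // restack it above a sibling, or the parent itself (cf. XRaiseWindow())
    wl_subsurface_place_above(videoSubsurface, parentSurface);

    // both are applied when the parent surface commits
    wl_surface_commit(parentSurface);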


Even if the QtWayland issue is fixed, I may have to do something at the 
Wayland level as I'm not sure if subsurfaces are effectively moved, 
raised/lowered etc. when their parent surface is changed in Wayland.


Anyway as David has suggested, I have raised an issue on the Qt Jira 
bugs list at: https://bugreports.qt.io/browse/QTBUG-122941.


Terry


On 29/02/2024 13:39, Pekka Paalanen wrote:

On Wed, 28 Feb 2024 18:04:28 +
Terry Barnaby  wrote:


Hi Pekka,

Some questions below:

Thanks

Terry
On 26/02/2024 15:56, Pekka Paalanen wrote:

Ok. What Wayland API requests cause a surface to actually be mapped

(Sorry don't really know Wayland protocol) ?

Hi Terry,

the basic protocol object is wl_surface. The wl_surface needs to be
given a "role". This is a specific protocol term. xdg_surface and
xdg_toplevel can give a wl_surface the role of a top-level window,
which means it can get mapped when you play by the rules set in the
xdg_toplevel specification.

Sub-surface is another role.

So the rules are always role specific, but at some point they all
require content on the wl_surface, which is given with the attach
request followed by a commit request. Role rules specify when and how
that can be done.

Yes, I have heard that. But what I don't know is, from the client:

  1. How do I find out the surface's role ?

It is what you (or Qt, or Gst) set it to. There is no way to query it
(or any other thing) back by Wayland.

If you look at a protocol dump (e.g. WAYLAND_DEBUG=client in
environment), you could follow the protocol messages and trace back
what the role was set to.


  2. How would I set the surface to have a role such that it would be
 mapped and thus visible ? Just wondering if I can work around what I
 think is a QtWayland bug/issue/feature to make sure my second
 Widget's surface is mapped/visible so that the waylandsink subsurface
 can work. With X11 there were API calls to change the Window's state
 and I was looking for something similar with Wayland.

There is no simple answer to this. You pick a role you need, and then
play by the protocol spec.

You do not have any surfaces without roles, though, so this would not
help you anyway. Roles cannot be changed, only set once per wl_surface
life time. Sub-surface is a role.


I need to find some way to actually display video, simply and
efficiently on an embedded platform, in a Qt application in the year 2024 :)

I have tried lots of work arounds but none have worked due to either Qt
issues, Wayland restrictions, Gstreamer restrictions, Weston
issues/restrictions, NXP hardware engine issues/restrictions etc. Any

Re: Wayland debugging with Qtwayland, gstreamer waylandsink, wayland-lib and Weston

2024-02-29 Thread Pekka Paalanen
On Wed, 28 Feb 2024 18:04:28 +
Terry Barnaby  wrote:

> Hi Pekka,
> 
> Some questions below:
> 
> Thanks
> 
> Terry
> On 26/02/2024 15:56, Pekka Paalanen wrote:
> > Ok. What Wayland API requests cause a surface to actually be mapped  
> >> (Sorry don't really know Wayland protocol) ?  
> > Hi Terry,
> >
> > the basic protocol object is wl_surface. The wl_surface needs to be
> > given a "role". This is a specific protocol term. xdg_surface and
> > xdg_toplevel can give a wl_surface the role of a top-level window,
> > which means it can get mapped when you play by the rules set in the
> > xdg_toplevel specification.
> >
> > Sub-surface is another role.
> >
> > So the rules are always role specific, but at some point they all
> > require content on the wl_surface, which is given with the attach
> > request followed by a commit request. Role rules specify when and how
> > that can be done.  
> 
> Yes, I have heard that. But what I don't know is, from the client:
> 
>  1. How do I find out the surface's role ?

It is what you (or Qt, or Gst) set it to. There is no way to query it
(or any other thing) back by Wayland.

If you look at a protocol dump (e.g. WAYLAND_DEBUG=client in
environment), you could follow the protocol messages and trace back
what the role was set to.
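
For example (assuming the test binary's name):

    WAYLAND_DEBUG=client ./test016-qt6-video-example 2>&1 | \
        grep -E 'wl_surface|wl_subsurface'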

>  2. How would I set the surface to have a role such that it would be
> mapped and thus visible ? Just wondering if I can work around what I
> think is a QtWayland bug/issue/feature to make sure my second
> Widget's surface is mapped/visible so that the waylandsink subsurface
> can work. With X11 there were API calls to change the Window's state
> and I was looking for something similar with Wayland.

There is no simple answer to this. You pick a role you need, and then
play by the protocol spec.

You do not have any surfaces without roles, though, so this would not
help you anyway. Roles cannot be changed, only set once per wl_surface
life time. Sub-surface is a role.

> 
> I need to find some way to actually display video, simply and 
> efficiently on an embedded platform, in a Qt application in the year 2024 :)
> 
> I have tried lots of work arounds but none have worked due to either Qt 
> issues, Wayland restrictions, Gstreamer restrictions, Weston 
> issues/restrictions, NXP hardware engine issues/restrictions etc. Any 
> ideas gratefully received!
> 

Did you try making the "middle" QWidget *not* have a wl_surface of its
own?

> >  
> >> The higher level surfaces are created/managed by QtWayland, but maybe I
> >> can override somehow.
> >>  
> > That does not feel like a good idea to me. But I also cannot really
> > help you, because this all seems to be pointing at Qt which I know
> > nothing about.  
> 
> Yes, that's probably true. But I need to get this to work somehow, even 
> if a kludge for now.

Hack on Qt, then? Sorry, but I don't understand this insistence that
what sounds like a Qt bug must be workaround-able via Wayland.


> >  
> >>> 
>  As mentioned before, if I use QPainter to draw into the video QWidget it
>  actually draws into the top QWidget's surface 16 using Wayland protocol.
>  I would have thought it would draw into its own QWidget surface 18 as it
>  has Qt::WA_NativeWindow set ?  
> > This question seems to be the essence. If Qt worked like you expected,
> > then I think the whole program would work.
> >
> > However, there is no need (from Wayland perspective) to have a
> > wl_surface as "surface 18" in the middle. What would be best is if you
> > could somehow have that "middle" QWidget but without its own
> > wl_surface. Have the QWidget control the GStreamer wl_surface position
> > through wl_subsurface interface, while GStreamer plays the video
> > through wl_surface interface.
> >
> > Wayland does not relay sub-surface resizing or positioning between two
> > client-side components at all. There is not even a way to query a
> > sub-surface position. So the positioning and sizing is all done in
> > Qt<->GStreamer somehow without Wayland in between. Only the end result
> > gets sent over Wayland to display: Qt sets up the position and
> > GStreamer sets up the size and content.  
> 
> I think this middle surface is needed so that Qt can manage the 
> "Windows" at this level, like raise, lower, resize et. and Wayland 

Hmm, that does not sound right to me, but then again, I don't know Qt.

Wayland certainly does not impose such demand.

> sink's subsurface that is below this is separate and can be de-synced 
> for the video display etc. I normally (on X11 and with Qt5/Wayland), 
> respond to QWidget resizes and use the Gstreamer API calls to 
> position/resize the waylandsink's sub-surface. This all works quite 
> nicely under X11 and worked (although not nicely) under Qt5/Wayland.
> 
> >  
>  I assume that Qtwayland is not "activating" the video QWidget's surface
>  or using it for some reason (send subsurface expose events ?) ?
>  
> >>> If that's true, 

Re: Wayland debugging with Qtwayland, gstreamer waylandsink, wayland-lib and Weston

2024-02-28 Thread Terry Barnaby

Hi Pekka,

Some questions below:

Thanks

Terry
On 26/02/2024 15:56, Pekka Paalanen wrote:

Ok. What Wayland API requests cause a surface to actually be mapped

(Sorry don't really know Wayland protocol) ?

Hi Terry,

the basic protocol object is wl_surface. The wl_surface needs to be
given a "role". This is a specific protocol term. xdg_surface and
xdg_toplevel can give a wl_surface the role of a top-level window,
which means it can get mapped when you play by the rules set in the
xdg_toplevel specification.

Sub-surface is another role.

So the rules are always role specific, but at some point they all
require content on the wl_surface, which is given with the attach
request followed by a commit request. Role rules specify when and how
that can be done.


Yes, I have heard that. But what I don't know is, from the client:

1. How do I find out the surface's role ?
2. How would I set the surface to have a role such that it would be
   mapped and thus visible ? Just wondering if I can work around what I
   think is a QtWayland bug/issue/feature to make sure my second
   Widget's surface is mapped/visible so that the waylandsink subsurface
   can work. With X11 there were API calls to change the Window's state
   and I was looking for something similar with Wayland.

I need to find some way to actually display video, simply and 
efficiently on an embedded platform, in a Qt application in the year 2024 :)


I have tried lots of work arounds but none have worked due to either Qt 
issues, Wayland restrictions, Gstreamer restrictions, Weston 
issues/restrictions, NXP hardware engine issues/restrictions etc. Any 
ideas gratefully received!






The higher level surfaces are created/managed by QtWayland, but maybe I
can override somehow.


That does not feel like a good idea to me. But I also cannot really
help you, because this all seems to be pointing at Qt which I know
nothing about.


Yes, that's probably true. But I need to get this to work somehow, even 
if a kludge for now.





  

As mentioned before, if I use QPainter to draw into the video QWidget it
actually draws into the top QWidget's surface 16 using Wayland protocol.
I would have thought it would draw into its own QWidget surface 18 as it
has Qt::WA_NativeWindow set ?

This question seems to be the essence. If Qt worked like you expected,
then I think the whole program would work.

However, there is no need (from Wayland perspective) to have a
wl_surface as "surface 18" in the middle. What would be best is if you
could somehow have that "middle" QWidget but without its own
wl_surface. Have the QWidget control the GStreamer wl_surface position
through wl_subsurface interface, while GStreamer plays the video
through wl_surface interface.

Wayland does not relay sub-surface resizing or positioning between two
client-side components at all. There is not even a way to query a
sub-surface position. So the positioning and sizing is all done in
Qt<->GStreamer somehow without Wayland in between. Only the end result
gets sent over Wayland to display: Qt sets up the position and
GStreamer sets up the size and content.


I think this middle surface is needed so that Qt can manage the 
"Windows" at this level, like raise, lower, resize et. and Wayland 
sink's subsurface that is below this is separate and can be de-synced 
for the video display etc. I normally (on X11 and with Qt5/Wayland), 
respond to QWidget resizes and use the Gstreamer API calls to 
position/resize the waylandsink's sub-surface. This all works quite 
nicely under X11 and worked (although not nicely) under Qt5/Wayland.






I assume that Qtwayland is not "activating" the video QWidget's surface
or using it for some reason (send subsurface expose events ?) ?
  

If that's true, then it is very relevant. A sub-surface becomes mapped
and visible when:

- its parent surface is mapped and visible, and

- the parent surface is committed after the sub-surface has been
created and associated, and

- if the sub-surface is in synchronized mode, there also needs to be a
parent surface commit after every sub-surface commit you want to
become visible. So if you do the first sub-surface sync-mode commit
with a buffer after the parent has already committed the
sub-surface's creation, the parent surface needs to commit again.

This applies recursively, too, and you have a sub-sub-surface there.

Do you actually need the sub-surface in the middle? Have you tried
without it?

I am not doing anything with Wayland directly. Qt is managing the higher
level surfaces/subsurfaces and then GStreamer's waylandsink is passed one
of these Qt managed surfaces and it creates the subsurface 44. Looking
at waylandsink it should set this subsurface to be desynced so it can
draw into this surface without synchronising to the parent's surface
managed by Qt.

Right, and desync is not enough if its parent is not mapped.


All I am trying to do is use the technique as mentioned in various
forums/lists to get 

Re: Wayland debugging with Qtwayland, gstreamer waylandsink, wayland-lib and Weston

2024-02-26 Thread Pekka Paalanen
On Mon, 26 Feb 2024 15:18:27 +
Terry Barnaby  wrote:

> Hi Pekka,
> 
> Thanks for the response. Notes below:
> 
> Terry
> 
> On 26/02/2024 13:28, Pekka Paalanen wrote:
> > On Sun, 25 Feb 2024 08:04:30 +
> > Terry Barnaby  wrote:
> >  
> >> Hi,
> >>
> >> I have investigated a bit further. I have built my own Weston server to
> >> run under X11 on Fedora37 so I can add printf's and debug more easily
> >> than using a cross compiled iMX8mp target system etc. I added a new
> >> dsmc-shell to Weston which is identical to kiosk-shell (apart from
> >> names) so I could play with that.
> >>
> >> When I run my simple QWidget test program (test016-qt6-video-example)
> >> which creates one top level QWidget with a child QWidget to display the
> >> Gstreamer video in it, I see the following surfaces/subsurfaces when
> >> desktop_surface_committed() is called in the dsmc-shell.
> >>
> >> Surface: 16 1044,620 mapped: 1 opaque: 0
> >>    View: 0x29182b0
> >>    Surface: 18 0,0 mapped: 0 opaque: 0
> >>     Surface: 44 640,480 mapped: 1 opaque: 0
> >>     Surface: 18 0,0 mapped: 0 opaque: 0
> >>    Surface: 17 0,0 mapped: 0 opaque: 0
> >>    Surface: 16 1044,620 mapped: 1 opaque: 0  
> > Btw. the sub-surface list also contains the parent weston_surface in
> > it, that's why surface 18 and 16 show up twice, I guess. It's used for
> > z-ordering.  
> 
> Yes, that was my view.
> 
> 
> >  
> >> Surface 16 is used by the top level QWidget, surface 18 is used by the
> >> Video QWidget and surface 44 is the GStreamer video display surface (I
> >> think!). This list is generated traversing the weston_surface's views
> >> and subsurface_list lists. The mapped is the "is_mapped" field of the
> >> surface.
> >> Only the top level weston_surface has a weston_view in the views list
> >> that is of any relevance. "weston-debug scene-graph" only shows the
> >> topmost surface, no subsurfaces.
> > Right.
> >
> > A sub-surface requires its parent surface to be mapped in order to show
> > up on screen. This applies recursively, so surface 18 not being mapped
> > prevents surface 44 from showing up.
> >
> > IIRC, more or less only "fully mapped" weston_surfaces (as in, if it's
> > a sub-surface, the whole sub-surface ancestry path up to a top-level is
> > mapped) have a weston_view. weston_view defines where on screen a
> > weston_surface will be presented, so without a view it cannot show up.  
> 
> Ok. What Wayland API requests cause a surface to actually be mapped 
> (Sorry don't really know Wayland protocol) ?

Hi Terry,

the basic protocol object is wl_surface. The wl_surface needs to be
given a "role". This is a specific protocol term. xdg_surface and
xdg_toplevel can give a wl_surface the role of a top-level window,
which means it can get mapped when you play by the rules set in the
xdg_toplevel specification.

Sub-surface is another role.

So the rules are always role specific, but at some point they all
require content on the wl_surface, which is given with the attach
request followed by a commit request. Role rules specify when and how
that can be done.
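
As a minimal sketch of that sequence for the xdg_toplevel role (registry
binding, buffer creation and the configure listener are omitted; the
variable names are illustrative):

    struct wl_surface *surface = wl_compositor_create_surface(compositor);
    struct xdg_surface *xdg_surf =
        xdg_wm_base_get_xdg_surface(wm_base, surface);
    struct xdg_toplevel *toplevel = xdg_surface_get_toplevel(xdg_surf);
    wl_surface_commit(surface);           /* commit the initial role state */

    /* ...wait for the first xdg_surface.configure event, then: */
    xdg_surface_ack_configure(xdg_surf, configure_serial);
    wl_surface_attach(surface, buffer, 0, 0); /* content makes it mappable */
    wl_surface_commit(surface);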

> The higher level surfaces are created/managed by QtWayland, but maybe I 
> can override somehow.
> 

That does not feel like a good idea to me. But I also cannot really
help you, because this all seems to be pointing at Qt which I know
nothing about.

> >  
> >> As mentioned before, if I use QPainter to draw into the video QWidget it
> >> actually draws into the top QWidget's surface 16 using Wayland protocol.
> >> I would have thought it would draw into its own QWidget surface 18 as it
> >> has Qt::WA_NativeWindow set ?

This question seems to be the essence. If Qt worked like you expected,
then I think the whole program would work.

However, there is no need (from Wayland perspective) to have a
wl_surface as "surface 18" in the middle. What would be best is if you
could somehow have that "middle" QWidget but without its own
wl_surface. Have the QWidget control the GStreamer wl_surface position
through wl_subsurface interface, while GStreamer plays the video
through wl_surface interface.

Wayland does not relay sub-surface resizing or positioning between two
client-side components at all. There is not even a way to query a
sub-surface position. So the positioning and sizing is all done in
Qt<->GStreamer somehow without Wayland in between. Only the end result
gets sent over Wayland to display: Qt sets up the position and
GStreamer sets up the size and content.

> >>
> >> I assume that Qtwayland is not "activating" the video QWidget's surface
> >> or using it for some reason (send subsurface expose events ?) ?
> >>  
> > If that's true, then it is very relevant. A sub-surface becomes mapped
> > and visible when:
> >
> > - its parent surface is mapped and visible, and
> >
> > - the parent surface is committed after the sub-surface has been
> >created and associated, and
> >
> > - if the sub-surface is in synchronized mode, there also needs to be a
> >parent 

Re: Wayland debugging with Qtwayland, gstreamer waylandsink, wayland-lib and Weston

2024-02-26 Thread Terry Barnaby

Hi Pekka,

Thanks for the response. Notes below:

Terry

On 26/02/2024 13:28, Pekka Paalanen wrote:

On Sun, 25 Feb 2024 08:04:30 +
Terry Barnaby  wrote:


Hi,

I have investigated a bit further. I have built my own Weston server to
run under X11 on Fedora37 so I can add printf's and debug more easily
than using a cross compiled iMX8mp target system etc. I added a new
dsmc-shell to Weston which is identical to kiosk-shell (apart from
names) so I could play with that.

When I run my simple QWidget test program (test016-qt6-video-example)
which creates one top level QWidget with a child QWidget to display the
Gstreamer video in it, I see the following surfaces/subsurfaces when
desktop_surface_committed() is called in the dsmc-shell.

Surface: 16 1044,620 mapped: 1 opaque: 0
   View: 0x29182b0
   Surface: 18 0,0 mapped: 0 opaque: 0
    Surface: 44 640,480 mapped: 1 opaque: 0
    Surface: 18 0,0 mapped: 0 opaque: 0
   Surface: 17 0,0 mapped: 0 opaque: 0
   Surface: 16 1044,620 mapped: 1 opaque: 0

Btw. the sub-surface list also contains the parent weston_surface in
it, that's why surface 18 and 16 show up twice, I guess. It's used for
z-ordering.


Yes, that was my view.





Surface 16 is used by the top level QWidget, surface 18 is used by the
Video QWidget and surface 44 is the GStreamer video display surface (I
think!). This list is generated traversing the weston_surface's views
and subsurface_list lists. The mapped is the "is_mapped" field of the
surface.
Only the top level weston_surface has a weston_view in the views list
that is of any relevance. "weston-debug scene-graph" only shows the
topmost surface, no subsurfaces.

Right.

A sub-surface requires its parent surface to be mapped in order to show
up on screen. This applies recursively, so surface 18 not being mapped
prevents surface 44 from showing up.

IIRC, more or less only "fully mapped" weston_surfaces (as in, if it's
a sub-surface, the whole sub-surface ancestry path up to a top-level is
mapped) have a weston_view. weston_view defines where on screen a
weston_surface will be presented, so without a view it cannot show up.


Ok. What Wayland API requests cause a surface to actually be mapped 
(Sorry don't really know Wayland protocol) ?


The higher level surfaces are created/managed by QtWayland, but maybe I 
can override somehow.






As mentioned before, if I use QPainter to draw into the video QWidget it
actually draws into the top QWidget's surface 16 using Wayland protocol.
I would have thought it would draw into its own QWidget surface 18 as it
has Qt::WA_NativeWindow set ?

I assume that Qtwayland is not "activating" the video QWidget's surface
or using it for some reason (send subsurface expose events ?) ?


If that's true, then it is very relevant. A sub-surface becomes mapped
and visible when:

- its parent surface is mapped and visible, and

- the parent surface is committed after the sub-surface has been
   created and associated, and

- if the sub-surface is in synchronized mode, there also needs to be a
   parent surface commit after every sub-surface commit you want to
   become visible. So if you do the first sub-surface sync-mode commit
   with a buffer after the parent has already committed the
   sub-surface's creation, the parent surface needs to commit again.

This applies recursively, too, and you have a sub-sub-surface there.

Do you actually need the sub-surface in the middle? Have you tried
without it?


I am not doing anything with Wayland directly. Qt is managing the higher 
level surfaces/subsurfaces and then GStreamer's waylandsink is passed one 
of these Qt managed surfaces and it creates the subsurface 44. Looking 
at waylandsink it should set this subsurface to be desynced so it can 
draw into this surface without synchronising to the parent's surface 
managed by Qt.


All I am trying to do is use the technique as mentioned in various 
forums/lists to get GStreamer to display a video such that it "appears" 
where a QWidget is on the screen. Mind you, all of this info is very 
rough and ready. For X11 it appears stable, but for Qt/Wayland the info, 
and it appears the functionality, is not quite all there.


When you say a sub-surface in the middle I presume you mean the surface 
18 of the lower QWidget ? Well I have tried using the top most QWidget's 
surface 16 and the video is displayed, although all over the 
application. I really need to manage this surface so it can be raised, 
lowered and resized amongst the other QWidgets somehow. I have tried 
using direct Wayland API calls to create a subsurface manually from the 
top surface but so far I have just got protocol errors while trying 
this. It may be my bad Wayland client code or how it is interfering with 
Qt's Wayland interface.


I have even tried using a separate top level surface. Unfortunately as 
the standard Wayland protocols do not allow an application to move or 
manage top level Windows this is not useful. I guess I could create an 
extra 

Re: Wayland debugging with Qtwayland, gstreamer waylandsink, wayland-lib and Weston

2024-02-26 Thread Pekka Paalanen
On Sun, 25 Feb 2024 08:04:30 +
Terry Barnaby  wrote:

> Hi,
> 
> I have investigated a bit further. I have built my own Weston server to 
> run under X11 on Fedora37 so I can add printf's and debug more easily 
> than using a cross compiled iMX8mp target system etc. I added a new 
> dsmc-shell to Weston which is identical to kiosk-shell (apart from 
> names) so I could play with that.
> 
> When I run my simple QWidget test program (test016-qt6-video-example) 
> which creates one top level QWidget with a child QWidget to display the 
> Gstreamer video in it, I see the following surfaces/subsurfaces when 
> desktop_surface_committed() is called in the dsmc-shell.
> 
> Surface: 16 1044,620 mapped: 1 opaque: 0
>   View: 0x29182b0
>   Surface: 18 0,0 mapped: 0 opaque: 0
>    Surface: 44 640,480 mapped: 1 opaque: 0
>    Surface: 18 0,0 mapped: 0 opaque: 0
>   Surface: 17 0,0 mapped: 0 opaque: 0
>   Surface: 16 1044,620 mapped: 1 opaque: 0

Btw. the sub-surface list also contains the parent weston_surface in
it, that's why surface 18 and 16 show up twice, I guess. It's used for
z-ordering.

> 
> Surface 16 is used by the top level QWidget, surface 18 is used by the 
> Video QWidget and surface 44 is the GStreamer video display surface (I 
> think!). This list is generated traversing the weston_surface's views 
> and subsurface_list lists. The mapped is the "is_mapped" field of the 
> surface.
> Only the top level weston_surface has a weston_view in the views list 
> that is of any relevance. "weston-debug scene-graph" only shows the 
> topmost surface, no subsurfaces.

Right.

A sub-surface requires its parent surface to be mapped in order to show
up on screen. This applies recursively, so surface 18 not being mapped
prevents surface 44 from showing up.

IIRC, more or less only "fully mapped" weston_surfaces (as in, if it's
a sub-surface, the whole sub-surface ancestry path up to a top-level is
mapped) have a weston_view. weston_view defines where on screen a
weston_surface will be presented, so without a view it cannot show up.

> 
> As mentioned before, if I use QPainter to draw into the video QWidget it 
> actually draws into the top QWidget's surface 16 using Wayland protocol. 
> I would have thought it would draw into its own QWidget surface 18 as it 
> has Qt::WA_NativeWindow set ?
> 
> I assume that Qtwayland is not "activating" the video QWidget's surface 
> or using it for some reason (send subsurface expose events ?) ?
> 

If that's true, then it is very relevant. A sub-surface becomes mapped
and visible when:

- its parent surface is mapped and visible, and

- the parent surface is committed after the sub-surface has been
  created and associated, and

- if the sub-surface is in synchronized mode, there also needs to be a
  parent surface commit after every sub-surface commit you want to
  become visible. So if you do the first sub-surface sync-mode commit
  with a buffer after the parent has already committed the
  sub-surface's creation, the parent surface needs to commit again.

This applies recursively, too, and you have a sub-sub-surface there.
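
To make the synchronized-mode rule concrete, a minimal sketch
(illustrative names; newly created sub-surfaces start in sync mode):

    struct wl_subsurface *sub =
        wl_subcompositor_get_subsurface(subcompositor, child, parent);
    wl_surface_attach(child, child_buffer, 0, 0);
    wl_surface_commit(child);   /* sync mode: state is cached, not shown  */
    wl_surface_commit(parent);  /* parent commit applies the cached state */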

Do you actually need the sub-surface in the middle? Have you tried
without it?


Thanks,
pq

> 
> I note in the qtwayland change logs, for the earlier QT5 for subsurface 
> changes:
> dist/changes-5.6.3: - [QTBUG-52118] Fixed subsurface positions sometimes 
> not being committed.
> dist/changes-5.11.2: - [QTBUG-69643] Fixed a bug where subsurfaces would 
> not be rendered if clients added them before a WaylandQuickItem was 
> created for the parent surface
> dist/changes-5.12.0: - [QTBUG-49809] Added support for 
> wl_subsurface.place_above and place_below in WaylandQuickItem.
> dist/changes-5.15.2: - [QTBUG-86176] We now send subsurface expose 
> events when a different toplevel (such as a dialog) is configured.
> 
> Could any of these be related ?
> 
> Terry




Re: Wayland debugging with Qtwayland, gstreamer waylandsink, wayland-lib and Weston

2024-02-25 Thread Terry Barnaby

Hi,

I have investigated a bit further. I have built my own Weston server to 
run under X11 on Fedora37 so I can add printf's and debug more easily 
than using a cross compiled iMX8mp target system etc. I added a new 
dsmc-shell to Weston which is identical to kiosk-shell (apart from 
names) so I could play with that.


When I run my simple QWidget test program (test016-qt6-video-example) 
which creates one top level QWidget with a child QWidget to display the 
Gstreamer video in it, I see the following surfaces/subsurfaces when 
desktop_surface_committed() is called in the dsmc-shell.


Surface: 16 1044,620 mapped: 1 opaque: 0
 View: 0x29182b0
 Surface: 18 0,0 mapped: 0 opaque: 0
  Surface: 44 640,480 mapped: 1 opaque: 0
  Surface: 18 0,0 mapped: 0 opaque: 0
 Surface: 17 0,0 mapped: 0 opaque: 0
 Surface: 16 1044,620 mapped: 1 opaque: 0

Surface 16 is used by the top level QWidget, surface 18 is used by the 
Video QWidget and surface 44 is the GStreamer video display surface (I 
think!). This list is generated traversing the weston_surface's views 
and subsurface_list lists. The mapped is the "is_mapped" field of the 
surface.
Only the top level weston_surface has a weston_view in the views list 
that is of any relevance. "weston-debug scene-graph" only shows the 
topmost surface, no subsurfaces.


As mentioned before, if I use QPainter to draw into the video QWidget it 
actually draws into the top QWidget's surface 16 using Wayland protocol. 
I would have thought it would draw into its own QWidget surface 18 as it 
has Qt::WA_NativeWindow set ?


I assume that Qtwayland is not "activating" the video QWidget's surface 
or using it for some reason (send subsurface expose events ?) ?



I note in the qtwayland change logs, for the earlier QT5 for subsurface 
changes:
dist/changes-5.6.3: - [QTBUG-52118] Fixed subsurface positions sometimes 
not being committed.
dist/changes-5.11.2: - [QTBUG-69643] Fixed a bug where subsurfaces would 
not be rendered if clients added them before a WaylandQuickItem was 
created for the parent surface
dist/changes-5.12.0: - [QTBUG-49809] Added support for 
wl_subsurface.place_above and place_below in WaylandQuickItem.
dist/changes-5.15.2: - [QTBUG-86176] We now send subsurface expose 
events when a different toplevel (such as a dialog) is configured.


Could any of these be related ?

Terry
On 23/02/2024 09:29, Terry Barnaby wrote:

Hi David,

Many thanks for the reply and the info on how to get the ID.

I have added a basic example with some debug output at: 
https://portal.beam.ltd.uk/public//test016-qt6-video-example.tar.gz


If there are any ideas of things I could look at/investigate I am all 
ears!


In a previous email I stated:
I have tried using "weston-debug scene-graph" and I am coming to the 
conclusion that qtwayland 6.5.0 is not really using native Wayland 
surfaces when Qt::WA_NativeWindow is used. From what I can see (and I 
could easily be wrong) the Wayland protocol shows wl_surfaces being 
created and two QWidgets' QPlatformNativeInterface 
nativeResourceForWindow("surface", windowHandle()) function does 
return different wl_surface pointers but even at the QWidget level 
(ignoring gstreamer), a QPainter paint into each of these QWidgets 
actually uses Wayland to draw into just the one top level surface and 
"weston-debug scene-graph" shows only one application xdg_toplevel 
surface and no subsurfaces. I don't know how to determine the Wayland 
surface ID from a wl_surface pointer unfortunately to really check this.


If my Video QWidget(0) is a top level QWidget, then video is shown 
and "weston-debug scene-graph" shows the application xdg_toplevel and 
two wl_subsurfaces as children.


Unfortunately I think "weston-debug scene-graph" only shows surfaces 
that are actually "active" so I can't see all of the surfaces that 
Weston actually knows about (is there a method of doing this ?).


My feeling is that although Qtwayland is creating native surfaces, it 
actually only uses the one top level one and presumably doesn't 
"activate" (set a role, do something ?) with the other surfaces.


Does anyone know a good list/place where I can ask such detailed 
qtwayland questions ?


I guess I can work around this by manually creating a Wayland 
subsurface from the Qt top level surface and handing that to 
waylandsink and then manage this subsurface, like hiding, showing and 
resizing, when the QWidget is hidden/shown/resized.


Or could there be a way of "activating" the child QWidget's Wayland 
surface ? 



Terry

On 23/02/2024 08:35, David Edmundson wrote:
On Fri, Feb 23, 2024 at 6:15 AM Terry Barnaby  
wrote:

I don't know how to determine the Wayland surface ID from a
wl_surface pointer unfortunately to really check this.

wl_proxy_get_id(static_cast<struct wl_proxy *>(myWlSurface));


Possibly when QWidget is below in hierarchy to be a child of a 
parent,

as described in

That's fine.

A QWidget with WA_NativeWindow will create a QWindow with a parent. A
QWindow with a 

Re: Wayland debugging with Qtwayland, gstreamer waylandsink, wayland-lib and Weston

2024-02-23 Thread Terry Barnaby

Hi David,

Many thanks for the reply and the info on how to get the ID.

I have added a basic example with some debug output at: 
https://portal.beam.ltd.uk/public//test016-qt6-video-example.tar.gz


If there are any ideas of things I could look at/investigate I am all ears!

In a previous email I stated:
I have tried using "weston-debug scene-graph" and I am coming to the 
conclusion that qtwayland 6.5.0 is not really using native Wayland 
surfaces when Qt::WA_NativeWindow is used. From what I can see (and I 
could easily be wrong) the Wayland protocol shows wl_surfaces being 
created and two QWidgets' QPlatformNativeInterface 
nativeResourceForWindow("surface", windowHandle()) function does 
return different wl_surface pointers but even at the QWidget level 
(ignoring gstreamer), a QPainter paint into each of these QWidgets 
actually uses Wayland to draw into just the one top level surface and 
"weston-debug scene-graph" shows only one application xdg_toplevel 
surface and no subsurfaces. I don't know how to determine the Wayland 
surface ID from a wl_surface pointer unfortunately to really check this.


If my Video QWidget(0) is a top level QWidget, then video is shown and 
"weston-debug scene-graph" shows the application xdg_toplevel and two 
wl_subsurfaces as children.


Unfortunately I think "weston-debug scene-graph" only shows surfaces 
that are actually "active" so I can't see all of the surfaces that 
Weston actually knows about (is there a method of doing this ?).


My feeling is that although Qtwayland is creating native surfaces, it 
actually only uses the one top level one and presumably doesn't 
"activate" (set a role, do something ?) with the other surfaces.


Does anyone know a good list/place where I can ask such detailed 
qtwayland questions ?


I guess I can work around this by manually creating a Wayland 
subsurface from the Qt top level surface and handing that to 
waylandsink and then manage this subsurface, like hiding, showing and 
resizing, when the QWidget is hidden/shown/resized.


Or could there be a way of "activating" the child QWidget's Wayland 
surface ? 



Terry

On 23/02/2024 08:35, David Edmundson wrote:

On Fri, Feb 23, 2024 at 6:15 AM Terry Barnaby  wrote:

I don't know how to determine the Wayland surface ID from a
wl_surface pointer unfortunately to really check this.

wl_proxy_get_id(static_cast<struct wl_proxy *>(myWlSurface));



Possibly when QWidget is below in hierarchy to be a child of a parent,
as described in

That's fine.

A QWidget with WA_NativeWindow will create a QWindow with a parent. A
QWindow with a parent will create a subsurface in wayland terms.
But it is a subsurface where Qt is managing it and you're also
committing on it, which can be a bit confusing and going through
widgets to create a subsurface isn't really needed.
There's a bunch of other options there.


---

Can you link your test app. You can send me a private email and I'll
take a look.  It doesn't seem like a core wayland problem more a
Qt/application setup issue so far. Then we can follow it up on Qt's
Jira if there is a Qt issue.

David Edmundson  - QtWayland Maintainer





Re: Wayland debugging with Qtwayland, gstreamer waylandsink, wayland-lib and Weston

2024-02-23 Thread Marius Vlad
Hi,
On Fri, Feb 23, 2024 at 06:14:11AM +, Terry Barnaby wrote:
> I have tried using "weston-debug scene-graph" and I am coming to the
> conclusion that qtwayland 6.5.0 is not really using native Wayland surfaces
> when Qt::WA_NativeWindow is used. From what I can see (and I could easily be
> wrong) the Wayland protocol shows wl_surfaces being created and two
> QWidgets' QPlatformNativeInterface nativeResourceForWindow("surface",
> windowHandle()) function does return different wl_surface pointers but even
> at the QWidget level (ignoring gstreamer), a QPainter paint into each of
> these QWidgets actually uses Wayland to draw into just the one top level
> surface and "weston-debug scene-graph" shows only one application
> xdg_toplevel surface and no subsurfaces. I don't know how to determine the
> Wayland surface ID from a wl_surface pointer unfortunately to really check
> this.
I suppose this is to be expected given that you don't actually see the video. 
> 
> If my Video QWidget(0) is a top level QWidget, then video is shown and
> "weston-debug scene-graph" shows the application xdg_toplevel and two
> wl_subsurfaces as children.
> 
> Unfortunately I think "weston-debug scene-graph" only shows surfaces that
> are actually "active" so I can't see all of the surfaces that Weston
> actually knows about (is there a method of doing this ?).
Mapped or not, Weston will print out views associated with a surface, if
those views are part of a layer. I don't know what active means in this
case, but you won't be activating wl_surfaces but rather the top-level
xdg-shell window.  Depending on the Weston version it would explicitly say
that or not (surface/view being not mapped).
> 
> My feeling is that although Qtwayland is creating native surfaces, it
> actually only uses the one top level one and presumably doesn't "activate"
> (set a role, do something ?) with the other surfaces.
WAYLAND_DEBUG=1 could tell whether or not it creates subsurfaces underneath.
> 
> Does anyone know a good list/place where I can ask such detailed qtwayland
> questions ?
https://bugreports.qt.io/projects/QTBUG/issues/QTBUG-122683?filter=allopenissues
> 
> I guess I can work around this by manually creating a Wayland subsurface
> from the Qt top level surface and handing that to waylandsink and then
> manage this subsurface, like hiding, showing and resizing, when the QWidget
> is hidden/shown/resized.
> 
> Or could there be a way of "activating" the child QWidget's Wayland surface
> ?
> 
> 
> 
> On 22/02/2024 18:44, Terry Barnaby wrote:
> > Hi Marius,
> > 
> > Many thanks for the info.
> > 
> > Some notes/questions below:
> > 
> > Terry
> > On 22/02/2024 17:49, Marius Vlad wrote:
> > > Hi,
> > > On Thu, Feb 22, 2024 at 03:21:01PM +, Terry Barnaby wrote:
> > > > Hi,
> > > > 
> > > > We are developing a video processing system that runs on an NXP imx8
> > > > processor using a Yocto embedded Linux system that has Qt6, GStreamer,
> > > > Wayland and Weston.
> > > > 
> > > > We are having a problem displaying the video stream from GStreamer on a
> > > > QWidget. In the past we had this working with Qt5 and older GStreamer,
> > > > Wayland and Weston.
> > > > 
> > > > A simple test program also shows the issue on Fedora37 with QT6 and
> > > > KDE/Plasma/Wayland.
> > > I'm tempted to say that if this happens on a desktop with the same Qt
> > > version and other compositors, it is an issue with Qt rather than
> > > waylandsink or the compositor. Note that on NXP they have their own
> > > modified Weston version.
> > 
> > That is my current feeling and is one reason why I tried it on Fedora
> > with whatever Wayland compositor KDE/Plasma is using.
> > 
> > 
> > > > The technique we are using is to get the Wayland surface that
> > > > the QWidget is using (it has been configured to use
> > > > Qt::WA_NativeWindow) and pass this to GStreamer's waylandsink,
> > > > which should then update this surface with video frames (via
> > > > hardware). This works when the QWidget is a top level Window
> > > > widget (QWidget(0)), but if this QWidget is below others in the
> > > > hierarchy no video is seen and the gstreamer pipeline is stalled.
> > > So the question is: aren't there other widgets obscuring this
> > > one when you move it below others?
> > 
> > My simple test example has two QWidgets with the one for video being
> > created as a child of the first so it should be above all others. I have
> > even tried drawing in it to make sure and it displays its Qt drawn
> > contents fine, just not the video stream.
> > 
> > 
> > > > It appears that waylandsink does:
> > > > 
> > > > Creates a surface callback:
> > > > 
> > > >    callback = wl_surface_frame (surface);
> > > > 
> > > >    wl_callback_add_listener (callback, &frame_callback_listener, self);
> > > > 
> > > > Then adds a buffer to a surface:
> > > > 
> > > >  gst_wl_buffer_attach (buffer, priv->video_surface_wrapper);
> > > >  

Re: Wayland debugging with Qtwayland, gstreamer waylandsink, wayland-lib and Weston

2024-02-23 Thread David Edmundson
On Fri, Feb 23, 2024 at 6:15 AM Terry Barnaby  wrote:
>I don't know how to determine the Wayland surface ID from a
> wl_surface pointer unfortunately to really check this.

wl_proxy_get_id(static_cast<struct wl_proxy *>(myWlSurface));
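
Since nativeResourceForWindow("surface", ...) returns a void *, that
static_cast is fine; a usage sketch (names illustrative):

    uint32_t id = wl_proxy_get_id(static_cast<struct wl_proxy *>(myWlSurface));
    fprintf(stderr, "wl_surface id: %u\n", id);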


> >> Possibly when QWidget is below in hierarchy to be a child of a parent,
> >> as described in

That's fine.

A QWidget with WA_NativeWindow will create a QWindow with a parent. A
QWindow with a parent will create a subsurface in wayland terms.
But it is a subsurface where Qt is managing it and you're also
committing on it, which can be a bit confusing and going through
widgets to create a subsurface isn't really needed.
There's a bunch of other options there.
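
For reference, the widget setup being discussed (Qt::WA_NativeWindow is
real Qt API, the rest is an illustrative sketch):

    QWidget *top = new QWidget(nullptr);      // backs the xdg_toplevel surface
    QWidget *video = new QWidget(top);        // child widget for the video
    video->setAttribute(Qt::WA_NativeWindow); // force a QWindow of its own
    video->winId();                           // ensure the platform window exists
    // on Wayland, video's QWindow becomes a wl_subsurface of top's wl_surface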


---

Can you link your test app. You can send me a private email and I'll
take a look.  It doesn't seem like a core wayland problem more a
Qt/application setup issue so far. Then we can follow it up on Qt's
Jira if there is a Qt issue.

David Edmundson  - QtWayland Maintainer


Re: Wayland debugging with Qtwayland, gstreamer waylandsink, wayland-lib and Weston

2024-02-22 Thread Terry Barnaby
I have tried using "weston-debug scene-graph" and I am coming to the 
conclusion that qtwayland 6.5.0 is not really using native Wayland 
surfaces when Qt::WA_NativeWindow is used. From what I can see (and I 
could easily be wrong) the Wayland protocol shows wl_surfaces being 
created and two QWidgets' QPlatformNativeInterface 
nativeResourceForWindow("surface", windowHandle()) function does return 
different wl_surface pointers but even at the QWidget level (ignoring 
gstreamer), a QPainter paint into each of these QWidgets actually uses 
Wayland to draw into just the one top level surface and "weston-debug 
scene-graph" shows only one application xdg_toplevel surface and no 
subsurfaces. I don't know how to determine the Wayland surface ID from a 
wl_surface pointer unfortunately to really check this.


If my Video QWidget(0) is a top level QWidget, then video is shown and 
"weston-debug scene-graph" shows the application xdg_toplevel and two 
wl_subsurfaces as children.


Unfortunately I think "weston-debug scene-graph" only shows surfaces 
that are actually "active" so I can't see all of the surfaces that 
Weston actually knows about (is there a method of doing this ?).


My feeling is that although Qtwayland is creating native surfaces, it 
actually only uses the one top level one and presumably doesn't 
"activate" (set a role, do something ?) with the other surfaces.


Does anyone know a good list/place where I can ask such detailed 
qtwayland questions ?


I guess I can work around this by manually creating a Wayland subsurface 
from the Qt top level surface, handing that to waylandsink, and then 
managing this subsurface, like hiding, showing and resizing, when the 
QWidget is hidden/shown/resized.
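
A sketch of that workaround (assuming the wl_compositor and 
wl_subcompositor globals have been bound from Qt's wl_display and that 
`native` is QGuiApplication::platformNativeInterface(); all names are 
illustrative, and the new surface would still need content committed to 
it before it can map):

    void *p = native->nativeResourceForWindow("surface",
                                              topWidget->windowHandle());
    struct wl_surface *parent = static_cast<struct wl_surface *>(p);

    struct wl_surface *videoArea = wl_compositor_create_surface(compositor);
    struct wl_subsurface *sub =
        wl_subcompositor_get_subsurface(subcompositor, videoArea, parent);
    wl_subsurface_set_desync(sub);
    wl_subsurface_set_position(sub, videoWidget->x(), videoWidget->y());
    wl_surface_commit(parent);  // parent commit applies the new sub-surface

    // then hand videoArea to waylandsink (e.g. via
    // gst_video_overlay_set_window_handle()) and hide/show/reposition
    // sub as the QWidget changes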


Or could there be a way of "activating" the child QWidget's Wayland 
surface ?




On 22/02/2024 18:44, Terry Barnaby wrote:

Hi Marius,

Many thanks for the info.

Some notes/questions below:

Terry
On 22/02/2024 17:49, Marius Vlad wrote:

Hi,
On Thu, Feb 22, 2024 at 03:21:01PM +, Terry Barnaby wrote:

Hi,

We are developing a video processing system that runs on an NXP imx8
processor using a Yocto embedded Linux system that has Qt6, GStreamer,
Wayland and Weston.

We are having a problem displaying the video stream from GStreamer on a
QWidget. In the past we had this working with Qt5 and older GStreamer,
Wayland and Weston.

A simple test program also shows the issue on Fedora37 with QT6 and
KDE/Plasma/Wayland.
I'm tempted to say that if this happens on a desktop with the same Qt 
version and other compositors, it is an issue with Qt rather than 
waylandsink or the compositor. Note that on NXP they have their own 
modified Weston version.


That is my current feeling and is one reason why I tried it on Fedora 
with whatever Wayland compositor KDE/Plasma is using.



The technique we are using is to get the Wayland surface that the 
QWidget is using (it has been configured to use Qt::WA_NativeWindow) and 
pass this to GStreamer's waylandsink, which should then update this 
surface with video frames (via hardware). This works when the QWidget is 
a top level Window widget (QWidget(0)), but if this QWidget is below 
others in the hierarchy no video is seen and the gstreamer pipeline is 
stalled.

So the question is: aren't there other widgets obscuring this
one when you move it below others?


My simple test example has two QWidgets with the one for video being 
created as a child of the first so it should be above all others. I 
have even tried drawing in it to make sure and it displays its Qt 
drawn contents fine, just not the video stream.




It appears that waylandsink does:

Creates a surface callback:

   callback = wl_surface_frame (surface);

   wl_callback_add_listener (callback, &frame_callback_listener, self);

Then adds a buffer to a surface:

 gst_wl_buffer_attach (buffer, priv->video_surface_wrapper);
 wl_surface_set_buffer_scale (priv->video_surface_wrapper, 
priv->scale);
 wl_surface_damage_buffer (priv->video_surface_wrapper, 0, 0, 
G_MAXINT32,

G_MAXINT32);
 wl_surface_commit (priv->video_surface_wrapper);

But never gets a callback and just sits in a loop awaiting that 
callback.


I assume that the surface waylandsink is using, which is created 
using the
original QWidget surface (sub-surface ? with window ?) is not 
"active" for

some reason.

Possibly when QWidget is below in hierarchy to be a child of a parent,
as described in 
https://wayland.app/protocols/xdg-shell#xdg_toplevel:request:set_parent,

so I assume it has a different surface than the parent one. This would
be easy to determine with WAYLAND_DEBUG. Seems unlikely to be itself a
sub-surface of a surface.


I haven't really got the gist of whats going on, but waylandsink 
certainly creates a subsurface from the QWidget surface, in fact it 
seems to create a few things.


I assume a subsurface is used so the video can be displayed in that 
subsurface separately from the parent (de synced 

Re: Wayland debugging with Qtwayland, gstreamer waylandsink, wayland-lib and Weston

2024-02-22 Thread Terry Barnaby

Hi Marius,

Many thanks for the info.

Some notes/questions below:

Terry
On 22/02/2024 17:49, Marius Vlad wrote:

Hi,
On Thu, Feb 22, 2024 at 03:21:01PM +, Terry Barnaby wrote:

Hi,

We are developing a video processing system that runs on an NXP imx8
processor using a Yocto embedded Linux system that has Qt6, GStreamer,
Wayland and Weston.

We are having a problem displaying the video stream from GStreamer on a
QWidget. In the past we had this working with Qt5 and older GStreamer,
Wayland and Weston.

A simple test program also shows the issue on Fedora37 with QT6 and
KDE/Plasma/Wayland.

I'm tempted to say that if this happens on a desktop with the same Qt version and
other compositors, it is an issue with Qt rather than waylandsink or
the compositor. Note that on NXP they have their own modified Weston version.


That is my current feeling and is one reason why I tried it on Fedora 
with whatever Wayland compositor KDE/Plasma is using.




The technique we are using is to get the Wayland surface that the QWidget is
using (it has been configured to use Qt::WA_NativeWindow) and pass this to
GStreamer's waylandsink, which should then update this surface with video
frames (via hardware). This works when the QWidget is a top level Window
widget (QWidget(0)), but if this QWidget is below others in the hierarchy no
video is seen and the gstreamer pipeline is stalled.

So the question is: aren't there other widgets obscuring this
one when you move it below others?


My simple test example has two QWidgets with the one for video being 
created as a child of the first so it should be above all others. I have 
even tried drawing in it to make sure and it displays its Qt drawn 
contents fine, just not the video stream.




It appears that waylandsink does:

Creates a surface callback:

   callback = wl_surface_frame (surface);

   wl_callback_add_listener (callback, &frame_callback_listener, self);

Then adds a buffer to a surface:

     gst_wl_buffer_attach (buffer, priv->video_surface_wrapper);
     wl_surface_set_buffer_scale (priv->video_surface_wrapper, priv->scale);
     wl_surface_damage_buffer (priv->video_surface_wrapper, 0, 0, G_MAXINT32,
G_MAXINT32);
     wl_surface_commit (priv->video_surface_wrapper);

But never gets a callback and just sits in a loop awaiting that callback.
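
For reference, the frame-callback pattern involved looks roughly like 
this (a sketch with illustrative names; the done event is only delivered 
once the compositor actually presents the surface, which would explain 
the stall if the parent surface is never mapped):

    static void frame_done(void *data, struct wl_callback *cb, uint32_t time)
    {
        wl_callback_destroy(cb);
        *static_cast<bool *>(data) = true;  // safe to queue the next frame
    }
    static const struct wl_callback_listener frame_listener = { frame_done };

    bool done = false;
    struct wl_callback *cb = wl_surface_frame(surface);
    wl_callback_add_listener(cb, &frame_listener, &done);
    wl_surface_commit(surface);
    while (!done)                      // never returns if the surface is
        wl_display_dispatch(display);  // never presented (unmapped parent)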

I assume that the surface waylandsink is using, which is created using the
original QWidget surface (sub-surface ? with window ?) is not "active" for
some reason.

Possibly when QWidget is below in hierarchy to be a child of a parent,
as described in 
https://wayland.app/protocols/xdg-shell#xdg_toplevel:request:set_parent,
so I assume it has a different surface than the parent one. This would
be easy to determine with WAYLAND_DEBUG. Seems unlikely to be itself a
sub-surface of a surface.


I haven't really got the gist of whats going on, but waylandsink 
certainly creates a subsurface from the QWidget surface, in fact it 
seems to create a few things.


I assume a subsurface is used so the video can be displayed in that 
subsurface separately from the parent (de synced from it).





I am trying to debug this, but this graphics stack is quite complicated with
waylandsink, qtwayland, wayland-lib and Weston not to mention the NXP
hardware levels. My thoughts are that it is something qtwayland is doing
with the surface stack or thread locking issues (gstreamer uses separate
threads). I also don't understand Wayland or Weston in detail. So some
questions:

1. Anyone seen something like this ?

Someone else reported something similar, but worked around it by causing
damage or moving the pointer to make the video sub-surface show up:
https://gitlab.freedesktop.org/wayland/weston/-/issues/843.


Thanks, I will have a look. Moving the mouse cursor in my case (at least 
with Weston) does not affect things.




2. Anyone any idea one where to look ?

3. Given the wl_surface in the Qt app or in waylandsink is there a way I can
print out its state and the surface hierarchy easily ?

In Weston there's something called scene-graph. You can grab it by
starting Weston with the --debug argument, then you can print it
with the `weston-debug scene-graph` command. A more recent Weston version
would indent sub-surfaces by their (main) surface parent.


Thanks, that could be useful.



4. Any idea on any debug methods to use ?

WAYLAND_DEBUG=1 as env variable.


Any idea on how to get a surface's ID from a C pointer so I can match up 
the QWidget/waylandsink surface with the Wayland debug output ?




Cheers

Terry






Re: Wayland debugging with Qtwayland, gstreamer waylandsink, wayland-lib and Weston

2024-02-22 Thread Marius Vlad
Hi,
On Thu, Feb 22, 2024 at 03:21:01PM +, Terry Barnaby wrote:
> Hi,
> 
> We are developing a video processing system that runs on an NXP imx8
> processor using a Yocto embedded Linux system that has Qt6, GStreamer,
> Wayland and Weston.
> 
> We are having a problem displaying the video stream from GStreamer on a
> QWidget. In the past we had this working with Qt5 and older GStreamer,
> Wayland and Weston.
> 
> A simple test program also shows the issue on Fedora37 with QT6 and
> KDE/Plasma/Wayland.
I'm tempted to say that if this happens on a desktop with the same Qt version and
other compositors, it is an issue with Qt rather than waylandsink or
the compositor. Note that on NXP they have their own modified Weston version.
> 
> The technique we are using is to get the Wayland surface that the QWidget is
> using (it has been configured to use Qt::WA_NativeWindow) and pass this to
> GStreamer's waylandsink, which should then update this surface with video
> frames (via hardware). This works when the QWidget is a top level Window
> widget (QWidget(0)), but if this QWidget is below others in the hierarchy no
> video is seen and the gstreamer pipeline is stalled.
So the question is: aren't there other widgets obscuring this
one when you move it below others?
> 
> It appears that waylandsink does:
> 
> Creates a surface callback:
> 
>   callback = wl_surface_frame (surface);
> 
>   wl_callback_add_listener (callback, &frame_callback_listener, self);
> 
> Then adds a buffer to a surface:
> 
>     gst_wl_buffer_attach (buffer, priv->video_surface_wrapper);
>     wl_surface_set_buffer_scale (priv->video_surface_wrapper, priv->scale);
>     wl_surface_damage_buffer (priv->video_surface_wrapper, 0, 0, G_MAXINT32,
> G_MAXINT32);
>     wl_surface_commit (priv->video_surface_wrapper);
> 
> But never gets a callback and just sits in a loop awaiting that callback.
> 
> I assume that the surface waylandsink is using, which is created using the
> original QWidget surface (sub-surface ? with window ?) is not "active" for
> some reason.
Possibly when QWidget is below in hierarchy to be a child of a parent, 
as described in 
https://wayland.app/protocols/xdg-shell#xdg_toplevel:request:set_parent,
so I assume it has a different surface than the parent one. This would
be easy to determine with WAYLAND_DEBUG. Seems unlikely to be itself a 
sub-surface of a surface.
> 
> 
> I am trying to debug this, but this graphics stack is quite complicated with
> waylandsink, qtwayland, wayland-lib and Weston not to mention the NXP
> hardware levels. My thoughts are that it is something qtwayland is doing
> with the surface stack or thread locking issues (gstreamer uses separate
> threads). I also don't understand Wayland or Weston in detail. So some
> questions:
> 
> 1. Anyone seen something like this ?
Someone else reported something similar, but worked around it by causing 
damage or moving the pointer to make the video sub-surface show up: 
https://gitlab.freedesktop.org/wayland/weston/-/issues/843.
> 
> 2. Anyone any idea one where to look ?
> 
> 3. Given the wl_surface in the Qt app or in waylandsink is there a way I can
> print out its state and the surface hierarchy easily ?
In Weston there's something called scene-graph. You can grab it by
starting Weston with the --debug argument, then you can print it 
with the `weston-debug scene-graph` command. A more recent Weston version
would indent sub-surfaces by their (main) surface parent.
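
For example:

    weston --debug &
    # then, from a terminal inside that Weston session:
    weston-debug scene-graph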
> 
> 4. Any idea on any debug methods to use ?
WAYLAND_DEBUG=1 as env variable.
> 
> Cheers
> 
> Terry
> 
> 

