Re: why not flow control in wl_connection_flush?

2024-03-02 Thread jleivent
On Fri, 1 Mar 2024 11:59:36 +0200
Pekka Paalanen  wrote:

> ...
> The real problem here is though, how do you architect your app or
> toolkit so that it can stop and postpone what it is doing with Wayland
> when you detect that the socket is not draining fast enough? You might
> be calling into Wayland using libraries that do not support this.
> 
> Returning to the main event loop is the natural place to check and
> postpone, but this whole issue stems from the reality that apps do not
> return to their main loop often enough or do not know how to wait with
> sending even in the main loop.

I am concluding from this discussion that clients are unlikely to be
constructed in a way that avoids problems if they attempt to send too
fast.

I think I may add an option to wl_connection_flush in my copy of
libwayland so that I can turn on client waiting on flush from an env
var.  It looks like the change would be pretty small.  Unless you
think it would be worth making this an MR on its own?

If the client is single threaded, this will cause the whole client to
wait, which probably won't be a problem, considering the type of
clients that might try to be that fast.

If the client isn't single threaded, then it may cause a thread to wait
that the client doesn't expect to wait, which could be a problem for
that client, admittedly.



Re: Wayland debugging with Qtwayland, gstreamer waylandsink, wayland-lib and Weston

2024-03-02 Thread Terry Barnaby

Hi Pekka,


> Did you try making the "middle" QWidget *not* have a wl_surface of its
> own?

> Hack on Qt, then? Sorry, but I don't understand this insistence that
> what sounds like a Qt bug must be workaround-able via Wayland.

> Hmm, that does not sound right to me, but then again, I don't know Qt.
>
> Wayland certainly does not impose such a demand.


Well the way this is supposed to work, I believe, is:

1. There are two separate systems in operation here: Qt doing the
   general GUI and GStreamer waylandsink displaying the video. These
   systems know nothing of one another.
2. The link between these two systems is a Wayland surface in the
   Wayland server. QWidget will manage this surface (raise, lower,
   position etc.) and can draw into it if it wants.
3. Waylandsink creates a subsurface of that QWidget Wayland surface,
   sets it to be de-synced and then proceeds to draw into this at the
   video frame rate.
4. There's quite a lot of hardware engine work going on in the
   background. For example, video buffers may be in special memory,
   like in a video or 2D hardware engine pipeline, etc. Qt may be
   using separate 3D engine hardware etc.
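For reference, the protocol-level sequence in step 3 looks roughly like the following C-style sketch. It uses the real libwayland-client entry points, but omits all registry setup and error handling, so treat it as pseudocode rather than something compilable on its own:

```
/* parent = the wl_surface owned by the "middle" QWidget */
struct wl_subsurface *sub =
	wl_subcompositor_get_subsurface(subcompositor, video_surface, parent);

/* De-synced: the sink can commit video frames at its own rate,
 * independent of the parent surface's commits. */
wl_subsurface_set_desync(sub);

/* Per frame: attach a video buffer and commit. */
wl_surface_attach(video_surface, buffer, 0, 0);
wl_surface_damage_buffer(video_surface, 0, 0, width, height);
wl_surface_commit(video_surface);
```

Note that the sub-surface only becomes visible once the parent wl_surface is itself mapped with content, which is exactly where the Qt issue described below bites.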

I am not experienced with Wayland, but I think a "middle" surface is 
needed so this can be moved, raised/lowered etc. relative to the 
application's main QWidgets and the waylandsink does not need to know 
about this (apart from resizes). Another option would be to modify 
waylandsink to do the necessary things with its subsurface. But having a 
separate shared surface, distinct from the Qt application's main drawing 
surface, seems safer, and I am trying to keep to what I think is the 
accepted method with minimal changes to upstream code.


This Gstreamer video display method came from the older X11 way of doing 
this with XWindows.


As stated, the reason this is not working with Qt6/Wayland/Weston is 
probably a Qt6 bug/issue/feature. However, a way to understand what is 
happening is to look at the shared Wayland level, and maybe there is a 
way, with Wayland protocol commands, of overcoming the issue so I can 
work around the problem I am having in a short time (timescales!) before 
a more proper fix is worked out. For example, in X11 an XMapWindow() or 
XRaiseWindow() request, or positioning/size requests, may have worked, 
and I wondered if I could do the same sort of thing in Wayland.


Even if the QtWayland issue is fixed, I may have to do something at the 
Wayland level, as I'm not sure if subsurfaces are effectively moved, 
raised/lowered etc. when their parent surface is changed in Wayland.
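On that point: sub-surface state is parent-relative, which the following sketch illustrates (illustrative pseudocode using the real wl_subsurface requests; sub and parent_surface are assumed handles):

```
/* Position is relative to the parent surface's top-left corner and is
 * double-buffered: it takes effect on the next commit of the parent. */
wl_subsurface_set_position(sub, x, y);

/* Restacking relative to a sibling surface or the parent itself. */
wl_subsurface_place_above(sub, parent_surface);

wl_surface_commit(parent_surface);
```

So when the parent surface moves, its sub-surfaces move with it automatically; only stacking order and the parent-relative offset need explicit requests.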


Anyway as David has suggested, I have raised an issue on the Qt Jira 
bugs list at: https://bugreports.qt.io/browse/QTBUG-122941.


Terry


On 29/02/2024 13:39, Pekka Paalanen wrote:

On Wed, 28 Feb 2024 18:04:28 +
Terry Barnaby  wrote:


Hi Pekka,

Some questions below:

Thanks

Terry
On 26/02/2024 15:56, Pekka Paalanen wrote:

> Ok. What Wayland API requests cause a surface to actually be mapped?
> (Sorry, I don't really know the Wayland protocol.)

Hi Terry,

the basic protocol object is wl_surface. The wl_surface needs to be
given a "role". This is a specific protocol term. xdg_surface and
xdg_toplevel can give a wl_surface the role of a top-level window,
which means it can get mapped when you play by the rules set in the
xdg_toplevel specification.

Sub-surface is another role.

So the rules are always role specific, but at some point they all
require content on the wl_surface, which is given with the attach
request followed by a commit request. Role rules specify when and how
that can be done.
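As a concrete illustration of the xdg_toplevel role rules, the usual mapping sequence looks roughly like this (pseudocode, omitting registry setup and the event loop):

```
struct xdg_surface *xsurf =
	xdg_wm_base_get_xdg_surface(wm_base, surface);
struct xdg_toplevel *top = xdg_surface_get_toplevel(xsurf);

/* Initial commit with no buffer attached: asks the compositor
 * for a configure event. */
wl_surface_commit(surface);

/* After receiving xdg_surface.configure with some serial: */
xdg_surface_ack_configure(xsurf, serial);
wl_surface_attach(surface, buffer, 0, 0);
wl_surface_commit(surface);   /* the surface is now mapped */
```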

> Yes, I have heard that. But what I don't know is, from the client:
>
>   1. How do I find out the surface's role ?

It is what you (or Qt, or Gst) set it to. There is no way to query it
(or anything else) back via Wayland.

If you look at a protocol dump (e.g. WAYLAND_DEBUG=client in the
environment), you could follow the protocol messages and trace back
what the role was set to.
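For example (./myapp is just a stand-in for the actual client binary), filtering the dump for the role-setting and mapping requests keeps the trace manageable:

```
WAYLAND_DEBUG=client ./myapp 2>&1 | \
	grep -E 'get_subsurface|get_toplevel|attach|commit'
```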


>   2. How would I set the surface to have a role such that it would be
>      mapped and thus visible ? Just wondering if I can work around what
>      I think is a QtWayland bug/issue/feature to make sure my second
>      Widget's surface is mapped/visible so that the waylandsink
>      subsurface can work. With X11 there were API calls to change the
>      Window's state and I was looking for something similar with
>      Wayland.

There is no simple answer to this. You pick a role you need, and then
play by the protocol spec.

You do not have any surfaces without roles, though, so this would not
help you anyway. Roles cannot be changed, only set once per wl_surface
life time. Sub-surface is a role.


I need to find some way to actually display video, simply and
efficiently on an embedded platform, in a Qt application in the year 2024 :)

I have tried lots of work arounds but none have worked due to either Qt
issues, Wayland restrictions, Gstreamer restrictions, Weston
issues/restrictions, NXP hardware engine issues/restrictions etc. Any