2014-08-11 16:12 GMT+03:00 Rutledge Shawn <shawn.rutle...@digia.com>:
>
> On 11 Aug 2014, at 12:57 PM, Giulio Camuffo wrote:
>
>> 2014-08-11 13:29 GMT+03:00 Rutledge Shawn <shawn.rutle...@digia.com>:
>>>
>>> On 11 Aug 2014, at 11:34 AM, Giulio Camuffo wrote:
>>>
>>>> 2014-08-11 12:20 GMT+03:00 Rutledge Shawn <shawn.rutle...@digia.com>:
>>>>>
>>>>> On 11 Aug 2014, at 9:10 AM, Pier Luigi wrote:
>>>>> (top-posting fixed)
>>>>>> 2014-08-11 8:13 GMT+02:00 Steve (YiLiang) Zhou <sz...@telecomsys.com>:
>>>>>>> Dear all,
>>>>>>>
>>>>>>> My app has a main window and a QDialog that is a child of the main
>>>>>>> window, and I want to place the app at position 0,0.
>>>>>>>
>>>>>>> I have tried both setGeometry() and move() to put it at 0,0, but
>>>>>>> neither works: the window's position is not fixed and it may appear
>>>>>>> anywhere on the screen.
>>>>>
>>>>> I was wondering about that too.  I understand that it's generally good 
>>>>> policy to leave positioning of generic windows up to the window manager, 
>>>>> but sometimes you want to write a dock or taskbar which anchors itself to 
>>>>> screen edges, and can animate in and out of view; or a splash screen 
>>>>> which is centered on one screen.  What is the right way to do that on 
>>>>> Wayland?
>>>>
>>>> The right way is to have a protocol designed for that. A taskbar
>>>> should use some taskbar_protocol with a request like
>>>> put_on_edge(edge), and the compositor will then move the surface to
>>>> that edge and do slide in/out or whatever effect it wants to.
>>>
>>> I understand the advantage of taking a higher-level approach.  But then 
>>> someone thinks of something for which the scenario-specific protocol 
>>> doesn't suffice.  If windows could move themselves, it might be more 
>>> flexible.  It may be too low-level, but it's hard to think of any other 
>>> protocol that is universal enough, which I suppose is why it's not 
>>> standardized.
>>
>> The problem is that windows don't always have a meaningful position.
>> If a window is shown on two outputs at the same time, perhaps one of
>> them a remote one, what is the window's position?
>
> On X11 (and other window systems) all outputs are mapped into the "virtual 
> desktop" space, side-by-side or overlapping or whatever, so that there is a 
> unified coordinate system.  Does Wayland not make this assumption?

I'm not sure I follow. How does that fix the problem of the same
window being shown at the same time at position 10,10 on one output
and at 0,0 on another?

>
>> And what is the
>> position of a window rotated 45 degrees?
>
> Something could be made up; perhaps the position should always be the 
> centroid instead of the upper-left? (although in other use cases that would 
> be less convenient)  Rotation doesn't make sense without a center of rotation 
> either.
>
>>> What about when a window provides its own "skinned" window decorations: 
>>> there will probably be some area in which you can drag to move the window, 
>>> as you normally can on the titlebar.  Is there another protocol for that?  
>>> How would that be different from a generic protocol which windows could use 
>>> to position themselves?
>>
>> wl_shell_surface/xdg_surface have a "move" request. The clients call
>> that and then the compositor actually does the moving.
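For reference, the interactive move request is defined roughly like this in the protocol XML (abbreviated here; see wayland.xml for the authoritative definition): the client passes the seat and the serial of the input event that triggered the move, and the compositor drives the drag from there.

```xml
<!-- abbreviated from the wl_shell_surface definition in wayland.xml -->
<request name="move">
  <arg name="seat" type="object" interface="wl_seat"/>
  <arg name="serial" type="uint"/>
</request>
```

From C, the generated binding is wl_shell_surface_move(shell_surface, seat, serial), typically invoked from a pointer-button handler over the client-drawn titlebar region.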
>
> So interactive moving only, but nothing to ask programmatically for a window 
> to be moved by some delta.

Actually, both. Clients can ask the compositor to move a surface by a
dx,dy when attaching a buffer.
_______________________________________________
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel
