Chase Douglas wrote:

> The client won't see the third finger if it touches outside its window.
> In the Wayland case, only the WM has all the info needed to determine if
> a touch is part of a global gesture. The WM needs to make the decision,
> not the client.

I'm pretty certain all touch events *MUST* go to the same surface until all touches are released. Otherwise it will be quite impossible to do gestures reliably; for example, the user could not perform them on objects near the edge of a window.
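To make that concrete, here is a rough sketch of the model I mean, in plain C with made-up names (not actual protocol bindings): the first touch-down latches the target surface, and it stays latched until the last finger goes up.

/* Sketch of the "all touches go to one surface" model.
 * The types and handlers are hypothetical, for illustration only. */
#include <stddef.h>

struct surface;                      /* opaque window surface */

static struct surface *touch_focus;  /* surface latched by the first touch */
static int active_touches;           /* fingers currently down */

/* Called for every touch-down with the surface under that finger. */
void handle_touch_down(struct surface *under_finger)
{
    if (active_touches == 0)
        touch_focus = under_finger;  /* first finger picks the surface */
    active_touches++;
    /* The down event is delivered to touch_focus even if under_finger
     * differs, so a multi-finger gesture that strays past the window
     * edge stays on the surface where it started. */
}

/* Called for every touch-up. */
void handle_touch_up(void)
{
    if (active_touches > 0 && --active_touches == 0)
        touch_focus = NULL;          /* last finger up releases the latch */
}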

>> If the clients can look at things first, this would allow the compositor to do things like "one finger can be used to change desktops if the underlying program does not use it".

> That would be bad UI design because then global gestures would fire only
> sometimes. Further, it would break global gestures if touches occur over
> a broken application.

I consider it bad design that global actions have to be "complex" (like needing three fingers), or that global shortcuts require lots of modifier keys held down, just to avoid collisions with applications.

I also think you are making up "user confusion" that does not exist in the real world as an excuse for this. Users will find it pretty obvious that the same action that scrolls a document also scrolls the entire screen when they don't touch a document, IMHO.
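That fallback could be sketched roughly like this (again, every name here is made up for illustration, and a real compositor would do the accept/reject round-trip asynchronously rather than call a function):

/* Sketch of the "client looks first, compositor falls back" idea.
 * All names are hypothetical; this is not a protocol proposal. */
#include <stdbool.h>
#include <stdio.h>

struct touch_sequence { int fingers; };

/* Stand-in for asking the client whether it wants this touch sequence.
 * Here we just pretend the application only handles 1-2 finger input. */
static bool client_claims(const struct touch_sequence *seq)
{
    return seq->fingers <= 2;
}

static void handle_sequence(const struct touch_sequence *seq)
{
    if (client_claims(seq)) {
        printf("app handles the %d-finger gesture (scroll the document)\n",
               seq->fingers);
    } else {
        printf("compositor reuses the %d-finger gesture (scroll the screen, "
               "switch desktops)\n", seq->fingers);
    }
}

int main(void)
{
    struct touch_sequence one = { 1 }, three = { 3 };
    handle_sequence(&one);    /* the application uses it */
    handle_sequence(&three);  /* falls through to the compositor */
    return 0;
}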

> I think we can look at other OSes as case studies. iOS and OS X employ
> effective global gestures imo, and they take precedence over the
> application receiving touch or gesture events.

I think it is pretty clear that "what other OSes do" is not always what Wayland wants to do. Most of them copied ideas from X.
