Re: RFC: multitouch support v2

2011-12-23 Thread Alex Elsayed
Chase Douglas  writes:
> I don't think anyone is forgetting about indirect devices, at least I'm
> not :). However, their use scenarios are a bit easier to deal with
> because there's no pointer emulation.
> 
> We will also want the ability to have a touchscreen "attached" to a real
> keyboard. Imagine you have a tablet with a bluetooth keyboard. Where you
> touch should change the focus for the keyboard input.

(sorry for deleting history, posting from the Gmane web interface)

Perhaps the best solution is simply to punt grouping input devices to
udev (so that users can write custom rules if need be), and say that
all input sources share focus with fellow members of their group.
This might be complicated by the desire to move input devices
between groups, though, and before long we would end up with something
rather like XInput 2.
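
To make that concrete, here is a rough compositor-side sketch; the rule and
the property name WL_INPUT_GROUP below are made up for illustration, not
existing udev keys:

/* A user rule along the lines of
 *
 *   SUBSYSTEM=="input", ATTRS{name}=="Example Touchscreen", ENV{WL_INPUT_GROUP}="tablet0"
 *   SUBSYSTEM=="input", ATTRS{name}=="Example BT Keyboard",  ENV{WL_INPUT_GROUP}="tablet0"
 *
 * would tag both devices with the same (hypothetical) group property, and
 * the compositor would share keyboard focus among devices in one group. */
#include <libudev.h>
#include <stdlib.h>
#include <string.h>

static char *
input_device_group(struct udev *udev, const char *syspath)
{
	struct udev_device *dev;
	const char *value;
	char *group = NULL;

	dev = udev_device_new_from_syspath(udev, syspath);
	if (dev == NULL)
		return NULL;

	/* Read the grouping property set by the user's rule, if any. */
	value = udev_device_get_property_value(dev, "WL_INPUT_GROUP");
	if (value != NULL)
		group = strdup(value);

	udev_device_unref(dev);

	return group;
}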

I personally think that it would be a *really* good idea to bring Peter
Hutterer into this, seeing as he has literally done this for years, and
certainly has a good idea of where the pitfalls are.



Re: RFC: multitouch support v2

2011-12-23 Thread Chase Douglas
On 12/22/2011 06:54 PM, Alex Elsayed wrote:
> Kristian Høgsberg  writes:
>> Right... in the MPX sense, right?  So you could have a keyboard and
>> mouse combo controlling one pointer/kb focus and the touch screen
>> being its own master device.  Then maybe you could have one person
>> using the touch screen UI, and another person using the kb/mouse
>> combo.  That's kind of far fetched, of course, but I think the main
>> point is that there's no inherent association between a kb/mouse combo
>> and a touch screen.  On the other hand, what about a setup with two
>> mouse/kb combos (master devices) and a touch screen... you'd expect
>> tapping a window on the touch screen to set kb focus, but if you have
>> multiple master kbs, which kb focus do you set?  Maybe we're just
>> doomed for trying to make both pointer and direct touch interaction
>> work in the same UI.
> 
> One use case you seem to be forgetting is that there are mouse-type
> devices like recent Synaptics touchpads that *also* do multitouch.
> Multitouch != touchscreen. One way to solve this might be to make 
> touchscreens a pointer device *with no associated keyboard device*,
> or at least none attached to actual hardware. In XInput, you can create
> a new master pair with a real pointer, but only an XTest keyboard. A 
> dummy, if you will.

I don't think anyone is forgetting about indirect devices, at least I'm
not :). However, their use scenarios are a bit easier to deal with
because there's no pointer emulation.

We will also want the ability to have a touchscreen "attached" to a real
keyboard. Imagine you have a tablet with a bluetooth keyboard. Where you
touch should change the focus for the keyboard input.

-- Chase


Re: RFC: multitouch support v2

2011-12-22 Thread Alex Elsayed
Kristian Høgsberg  writes:
> Right... in the MPX sense, right?  So you could have a keyboard and
> mouse combo controlling one pointer/kb focus and the touch screen
> being its own master device.  Then maybe you could have one person
> using the touch screen UI, and another person using the kb/mouse
> combo.  That's kind of far fetched, of course, but I think the main
> point is that there's no inherent association between a kb/mouse combo
> and a touch screen.  On the other hand, what about a setup with two
> mouse/kb combos (master devices) and a touch screen... you'd expect
> tapping a window on the touch screen to set kb focus, but if you have
> multiple master kbs, which kb focus do you set?  Maybe we're just
> doomed for trying to make both pointer and direct touch interaction
> work in the same UI.

One use case you seem to be forgetting is that there are mouse-type
devices like recent Synaptics touchpads that *also* do multitouch.
Multitouch != touchscreen. One way to solve this might be to make
each touchscreen a pointer device *with no associated keyboard device*,
or at least none attached to actual hardware. In XInput, you can create
a new master pair with a real pointer, but only an XTest keyboard. A 
dummy, if you will.
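
For reference, creating such a pair with the XInput 2 hierarchy API looks
roughly like this (a sketch, not code from this thread); X always creates
the keyboard half of the pair, and it stays a dummy driven only by XTest
until a real keyboard is reattached to it:

#include <X11/Xlib.h>
#include <X11/extensions/XInput2.h>

/* Create a new master pointer/keyboard pair named "touch".  The server
 * creates "touch pointer" and "touch keyboard"; the keyboard half has no
 * physical slave attached, which is the dummy described above. */
static void
create_master_pair(Display *dpy)
{
	XIAddMasterInfo add;

	add.type = XIAddMaster;
	add.name = (char *) "touch";
	add.send_core = True;
	add.enable = True;

	XIChangeHierarchy(dpy, (XIAnyHierarchyChangeInfo *) &add, 1);
	XFlush(dpy);
}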



Re: RFC: multitouch support v2

2011-12-22 Thread Kristian Høgsberg
On Wed, Dec 21, 2011 at 12:34 PM, Tiago Vignatti
 wrote:
> From: Tiago Vignatti 
>
> Hi,
>
> Following Kristian's suggestions, I updated the patchset with the following:
> - driver now accumulates input coordinates to send along touch_down
> - updated the protocol touch_down event with surface field, meaning the focus
>  surface of a touch device
> - compositor now uses a touch_focus pointer (self-explanatory), where it's
>  picked when the first finger is down; all further events go there until it
>  gets released
> - not doing pointer emulation for now; that will come next.
>
> I'm still using the cairo image as a PoC. So the client surfaces are now picked
> correctly, giving a quite smooth effect for the pinch/zoom gesture introduced in
> the last patches. My tree, rebased on the latest master branch, is here:
>
>    http://cgit.freedesktop.org/~vignatti/wayland-demos/log/?h=multitouch-v2
>
> I'll be on vacation for two weeks, starting tomorrow. See y'all next year
> and enjoy the holidays. Peace!

There are still a lot of problems with these patches, but I want to
move forward on this.  I picked the first two patches and edited them
to get the basics into shape, then added a few fixes on top.   The
evdev part was mostly fine, but the event delivery had a number of
problems:

 - we can't overwrite device->x,y, that's the pointer position
 - we have to deliver surface-relative coordinates to clients, not the
   global coordinates
 - touch_up was never delivered, since you clear touch_focus_resource
   just before testing it for NULL (see the sketch below)
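
For the last point, the fix is just an ordering change, roughly (struct and
helper names simplified, not the actual patch code):

#include <stdint.h>

struct wl_surface;
struct wl_resource;

struct input_device {
	struct wl_surface *touch_focus;
	struct wl_resource *touch_focus_resource;
};

/* Stand-in for whatever posts the touch_up event to the client. */
static void send_touch_up(struct wl_resource *resource, uint32_t time, int id);

static void
notify_touch_up(struct input_device *device, uint32_t time, int touch_id)
{
	/* Use the focus resource before clearing it; clearing it first means
	 * the NULL test always fails and touch_up is silently dropped. */
	if (device->touch_focus_resource != NULL)
		send_touch_up(device->touch_focus_resource, time, touch_id);

	device->touch_focus = NULL;
	device->touch_focus_resource = NULL;
}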

Moreover, we need to track each touchpoint position in the evdev part
so that when we get only ABS_MT_POSITION_X (or Y), we have the other
coordinate.  Finally, we need to handle losing either the touch focus
surface or the input device client resource without crashing.
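
The per-touchpoint tracking in evdev amounts to something like this (only a
sketch; MAX_SLOTS and the names are illustrative):

#include <stdint.h>
#include <linux/input.h>

#define MAX_SLOTS 16	/* arbitrary for this sketch */

struct touch_point {
	int32_t x, y;	/* last known absolute position, accumulated */
};

static struct touch_point touch_points[MAX_SLOTS];

/* Accumulate coordinates per touch point so that an event carrying only
 * ABS_MT_POSITION_X (or only Y) still leaves a complete (x, y) pair to
 * hand to the compositor on the next sync. */
static void
evdev_accumulate_mt(int slot, const struct input_event *e)
{
	if (slot < 0 || slot >= MAX_SLOTS)
		return;

	switch (e->code) {
	case ABS_MT_POSITION_X:
		touch_points[slot].x = e->value;
		break;
	case ABS_MT_POSITION_Y:
		touch_points[slot].y = e->value;
		break;
	}
}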

Most of these issues are obvious if you look at the event stream (use
WAYLAND_DEBUG=1) and the missing transformation to surface coordinates
would have been clear with a simple multi-touch-paint test client,
like the one I added.

Kristian


Re: RFC: multitouch support v2

2011-12-22 Thread Michael Hasselmann
On Thu, 2011-12-22 at 11:40 -0800, Chase Douglas wrote:
> >> Off the top of my head, I would think Wayland should automatically
> >> create the equivalent of X master pointer devices for each touchscreen
> >> device. There shouldn't be a sprite for touchscreens, though the WM
> >> could do fancy effects like MS Surface if you wanted it to.
> > 
> > Right... in the MPX sense, right?  So you could have a keyboard and
> > mouse combo controlling one pointer/kb focus and the touch screen
> > being its own master device.  Then maybe you could have one person
> > using the touch screen UI, and another person using the kb/mouse
> > combo.  That's kind of far fetched, of course, but I think the main
> > point is that there's no inherent association between a kb/mouse combo
> > and a touch screen.  On the other hand, what about a setup with two
> > mouse/kb combos (master devices) and a touch screen... you'd expect
> > tapping a window on the touch screen to set kb focus, but if you have
> > multiple master kbs, which kb focus do you set?  Maybe we're just
> > doomed for trying to make both pointer and direct touch interaction
> > work in the same UI.
> 
> In the past I've tried to think of a good solution for this. I haven't
> had enough time to come up with one yet :). I wouldn't advocate holding
> up all of wayland to get this right, but I do think it needs to be
> rethought from the ground up long term.

My proposal: Forget briefly about focus. Instead, keyboards and mice get
registered to a surface, and if you touch that surface, it activates the
registered keyboards and mice. If you registered multiple keyboards/mice
to the same surface, then yes, all of them become active at the same
time. This should, however, be the exception (depending on the UX
design). Even in a multi-user setup, I would expect users to have their
own work areas (surfaces) to interact with.

The whole idea of single-focus widgets does not seem to work with
multiple inputs, so how about just giving up on this (artificial?) constraint?

regards,
Michael



Re: RFC: multitouch support v2

2011-12-22 Thread Chase Douglas
On 12/22/2011 08:59 AM, Kristian Høgsberg wrote:
> 2011/12/22 Chase Douglas :
>> On 12/22/2011 07:53 AM, Kristian Høgsberg wrote:
>>> 2011/12/22 Chase Douglas :
>>>> I don't know wayland's protocol yet, but shouldn't enter/leave events
>>>> have some kind of device identifier in them? I would think that should
>>>> alleviate any client-side confusion.
>>>
>>> I don't think so.  To be clear, the problem I'm thinking of is where
>>> the toolkit does select for touch events, but only to do client side
>>> pointer emulation in the toolkit.  What should a client do in case the
>>> pointer is hovering over a button in one window, when it then receives
>>> a touch down in another window?  The toolkit only maintains one
>>> pointer focus (which is current in that other window), and what
>>> happens when you receive touch events in a different window?  What
>>> kind of pointer events do you synthesize?  We can't move the system
>>> pointer to match the touch position.
>>
>> In X we move the cursor sprite to the first touch location, always. This
>> is because you have moved the master pointer, so the sprite needs to be
>> in sync with the master pointer location.
> 
> How do you move the sprite without doing pointer emulation?  If the
> sprite enters a window, you have to send enter/leave events, and
> motion events as it moves around.

I believe we do send enter/leave events for pointer emulated touch
events in all cases.

> When I say that I don't know if we
> need pointer emulation, I mean that there is no sprite associated with
> the touch events, there are no enter/leave events or button events.
> When you touch a surface, you only get a touch_down event, then
> touch_motion and then touch_up.

Sounds right for Wayland.

>> Off the top of my head, I would think Wayland should automatically
>> create the equivalent of X master pointer devices for each touchscreen
>> device. There shouldn't be a sprite for touchscreens, though the WM
>> could do fancy effects like MS Surface if you wanted it to.
> 
> Right... in the MPX sense, right?  So you could have a keyboard and
> mouse combo controlling one pointer/kb focus and the touch screen
> being its own master device.  Then maybe you could have one person
> using the touch screen UI, and another person using the kb/mouse
> combo.  That's kind of far fetched, of course, but I think the main
> point is that there's no inherent association between a kb/mouse combo
> and a touch screen.  On the other hand, what about a setup with two
> mouse/kb combos (master devices) and a touch screen... you'd expect
> tapping a window on the touch screen to set kb focus, but if you have
> multiple master kbs, which kb focus do you set?  Maybe we're just
> doomed for trying to make both pointer and direct touch interaction
> work in the same UI.

In the past I've tried to think of a good solution for this. I haven't
had enough time to come up with one yet :). I wouldn't advocate holding
up all of wayland to get this right, but I do think it needs to be
rethought from the ground up long term.

One possibility:

Have one "logical" pointing device for all relative input devices, and
one "logical" pointing device for each absolute input device. Then have
a configurable mapping of logical pointing devices to logical
keyboard devices. The main difference between X and this approach is
essentially the default policy. In X, by default there is only one
master pointer and everything is attached to it.
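
One way to picture that mapping (names hypothetical, just to sketch the
data model):

#include <wayland-util.h>

struct logical_keyboard;

/* All relative devices feed the single shared logical pointer; each
 * absolute device (touchscreen, tablet) gets a logical pointer of its
 * own.  Which logical keyboard follows a given pointer's focus is the
 * configurable part. */
struct logical_pointer {
	struct wl_list physical_devices;	/* devices routed here */
	struct logical_keyboard *keyboard;	/* focus follows this pointer */
	int is_shared_relative;			/* 1 for the shared relative pointer */
};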

BTW, when I think about solutions to this issue, the first question I
ask myself is whether it would work for a device like the MS Surface
table where there's a different person on each side. Assume the surface
is subdivided into four virtual input devices, so touching on your side
of the table is a different device than touching on someone else's side.
Then I imagine each person has their own keyboard. With one display
server, everyone should be able to control their own "plot" of screen
area on their side of the table.

The reason I think of this particular use case is that I think there
is a strong possibility of it happening in the next 10 years.

-- Chase


Re: RFC: multitouch support v2

2011-12-22 Thread Kristian Høgsberg
2011/12/22 Chase Douglas :
> On 12/22/2011 07:53 AM, Kristian Høgsberg wrote:
>> 2011/12/22 Chase Douglas :
>>> I don't know wayland's protocol yet, but shouldn't enter/leave events
>>> have some kind of device identifier in them? I would think that should
>>> alleviate any client-side confusion.
>>
>> I don't think so.  To be clear, the problem I'm thinking of is where
>> the toolkit does select for touch events, but only to do client side
>> pointer emulation in the toolkit.  What should a client do in case the
>> pointer is hovering over a button in one window, when it then receives
>> a touch down in another window?  The toolkit only maintains one
>> pointer focus (which is current in that other window), and what
>> happens when you receive touch events in a different window?  What
>> kind of pointer events do you synthesize?  We can't move the system
>> pointer to match the touch position.
>
> In X we move the cursor sprite to the first touch location, always. This
> is because you have moved the master pointer, so the sprite needs to be
> in sync with the master pointer location.

How do you move the sprite without doing pointer emulation?  If the
sprite enters a window, you have to send enter/leave events, and
motion events as it moves around.  When I say that I don't know if we
need pointer emulation, I mean that there is no sprite associated with
the touch events, there are no enter/leave events or button events.
When you touch a surface, you only get a touch_down event, then
touch_motion and then touch_up.

> Off the top of my head, I would think Wayland should automatically
> create the equivalent of X master pointer devices for each touchscreen
> device. There shouldn't be a sprite for touchscreens, though the WM
> could do fancy effects like MS Surface if you wanted it to.

Right... in the MPX sense, right?  So you could have a keyboard and
mouse combo controlling one pointer/kb focus and the touch screen
being its own master device.  Then maybe you could have one person
using the touch screen UI, and another person using the kb/mouse
combo.  That's kind of far fetched, of course, but I think the main
point is that there's no inherent association between a kb/mouse combo
and a touch screen.  On the other hand, what about a setup with two
mouse/kb combos (master devices) and a touch screen... you'd expect
tapping a window on the touch screen to set kb focus, but if you have
multiple master kbs, which kb focus do you set?  Maybe we're just
doomed for trying to make both pointer and direct touch interaction
work in the same UI.

>> I guess you could synthesize a leave event for the window the pointer
>> is in but remember the window and position.  Then synthesize an enter
>> event for the window with the touch event and send button down and
>> motion events etc.   Then when the touch session is over (all touch
>> points up), the toolkit synthesizes an enter event for the window and
>> position the pointer is actually in.
>
> That sounds hacky, and trying to fit multiple input devices into a
> single input device world.

Most toolkits (and users tbh) still live in a single input device
world.  But for toolkits that support multiple pointers entering and
leaving their windows and buttons etc, there's no need to play tricks
with the pointer focus like that, of course.

> I would suggest not sending or synthesizing enter/leave events for touch
> events. The toolkit can switch focus to the last window with a touch
> event or a pointer motion event.

Yeah, again, this is all about client-side (toolkit) pointer
emulation.  We don't send enter/leave events for touch, that wouldn't
make sense.  But if you have a toolkit where the higher-level logic
doesn't understand multiple pointers/devices, you have to do something
to avoid confusing it when it thinks the pointer is in one window and
then suddenly it gets motion events in another.

Kristian


Re: RFC: multitouch support v2

2011-12-22 Thread Chase Douglas
On 12/22/2011 07:53 AM, Kristian Høgsberg wrote:
> 2011/12/22 Chase Douglas :
>> I don't know wayland's protocol yet, but shouldn't enter/leave events
>> have some kind of device identifier in them? I would think that should
>> alleviate any client-side confusion.
> 
> I don't think so.  To be clear, the problem I'm thinking of is where
> the toolkit does select for touch events, but only to do client side
> pointer emulation in the toolkit.  What should a client do in case the
> pointer is hovering over a button in one window, when it then receives
> a touch down in another window?  The toolkit only maintains one
> pointer focus (which is current in that other window), and what
> happens when you receive touch events in a different window?  What
> kind of pointer events do you synthesize?  We can't move the system
> pointer to match the touch position.

In X we move the cursor sprite to the first touch location, always. This
is because you have moved the master pointer, so the sprite needs to be
in sync with the master pointer location.

Off the top of my head, I would think Wayland should automatically
create the equivalent of X master pointer devices for each touchscreen
device. There shouldn't be a sprite for touchscreens, though the WM
could do fancy effects like MS Surface if you wanted it to.

> I guess you could synthesize a leave event for the window the pointer
> is in but remember the window and position.  Then synthesize an enter
> event for the window with the touch event and send button down and
> motion events etc.   Then when the touch session is over (all touch
> points up), the toolkit synthesizes an enter event for the window and
> position the pointer is actually in.

That sounds hacky, like trying to fit multiple input devices into a
single-input-device world.

I would suggest not sending or synthesizing enter/leave events for touch
events. The toolkit can switch focus to the last window with a touch
event or a pointer motion event.

-- Chase


Re: RFC: multitouch support v2

2011-12-22 Thread Kristian Høgsberg
2011/12/22 Chase Douglas :
> On 12/22/2011 07:15 AM, Kristian Høgsberg wrote:
>> On Thu, Dec 22, 2011 at 1:45 AM, Chase Douglas
>>  wrote:
>>> On 12/21/2011 09:34 AM, Tiago Vignatti wrote:
>>>> From: Tiago Vignatti 
>>>>
>>>> Hi,
>>>>
>>>> Following Kristian's suggestions, I updated the patchset with the following:
>>>> - driver now accumulates input coordinates to send along touch_down
>>>> - updated the protocol touch_down event with surface field, meaning the
>>>>   focus surface of a touch device
>>>> - compositor now uses a touch_focus pointer (self-explanatory), where it's
>>>>   picked when the first finger is down; all further events go there until
>>>>   it gets released
>>>> - not doing pointer emulation for now; that will come next.
>>>
>>> Do we really want pointer emulation in the window server? I can tell you
>>> from first-hand experience it's a nightmare. Toolkits should be updated
>>> to handle touch events properly, with an option to receive touch events
>>> and emulate pointer events for applications that aren't ready for touch
>>> event handling.
>>
>> I don't think we do.  I'm not 100% sure yet, which is why I want to
>> focus on just the basic touch events for now.  I agree that since you
>> have to port a toolkit to Wayland anyway, you can just do pointer
>> emulation (if you must, real touch support is better, of course) in
>> the toolkit when you port it.
>>
>> The one thing that makes me not quite sure is that client-side pointer
>> emulation won't be able to move the pointer sprite in response to
>> touch point 0 moving.  And maybe we don't need that.  On the other
>> hand, if the toolkit synthesizes enter/leave events in response to
>> touch events, it's going to be confusing when the actual pointer
>> enters a different surface.  It's also possible to make server-side
>> pointer emulation a per-client thing, similar to what Peter did for X.
>>  If a client subscribes to touch events, we don't do pointer
>> emulation.
>
> There's a niggle there. If a client selects for touch and pointer
> events, it will only receive touch events. However, if a client grabs
> touch and pointer events through a passive grab, the touch grab is
> handled first, and then the pointer grab second if the touch grab is
> rejected.
>
> I don't know wayland's protocol yet, but shouldn't enter/leave events
> have some kind of device identifier in them? I would think that should
> alleviate any client-side confusion.

I don't think so.  To be clear, the problem I'm thinking of is where
the toolkit does select for touch events, but only to do client side
pointer emulation in the toolkit.  What should a client do in case the
pointer is hovering over a button in one window, when it then receives
a touch down in another window?  The toolkit only maintains one
pointer focus (which is current in that other window), and what
happens when you receive touch events in a different window?  What
kind of pointer events do you synthesize?  We can't move the system
pointer to match the touch position.

I guess you could synthesize a leave event for the window the pointer
is in but remember the window and position.  Then synthesize an enter
event for the window with the touch event and send button down and
motion events etc.   Then when the touch session is over (all touch
points up), the toolkit synthesizes an enter event for the window and
position the pointer is actually in.
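
Spelled out as toolkit pseudo-code (every name here is hypothetical, it just
restates the sequence above):

struct window;

static struct window *pointer_focus_window;	/* where the real pointer is */
static struct window *saved_window;
static int pointer_x, pointer_y, saved_x, saved_y;

/* Hypothetical toolkit internals for synthesizing core pointer events. */
static void synthesize_enter(struct window *w, int x, int y);
static void synthesize_leave(struct window *w);
static void synthesize_motion(struct window *w, int x, int y);
static void synthesize_button_press(struct window *w, int x, int y);
static void synthesize_button_release(struct window *w, int x, int y);

static void
on_touch_down(struct window *touch_window, int x, int y)
{
	/* Remember where the real pointer focus was, then pretend the
	 * pointer moved to the touched window. */
	saved_window = pointer_focus_window;
	saved_x = pointer_x;
	saved_y = pointer_y;

	synthesize_leave(saved_window);
	synthesize_enter(touch_window, x, y);
	synthesize_button_press(touch_window, x, y);
}

static void
on_touch_motion(struct window *touch_window, int x, int y)
{
	synthesize_motion(touch_window, x, y);
}

static void
on_last_touch_up(struct window *touch_window, int x, int y)
{
	synthesize_button_release(touch_window, x, y);
	synthesize_leave(touch_window);

	/* Touch session over: restore the focus the toolkit believed in. */
	synthesize_enter(saved_window, saved_x, saved_y);
}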

Kristian


Re: RFC: multitouch support v2

2011-12-22 Thread Chase Douglas
On 12/22/2011 07:15 AM, Kristian Høgsberg wrote:
> On Thu, Dec 22, 2011 at 1:45 AM, Chase Douglas
>  wrote:
>> On 12/21/2011 09:34 AM, Tiago Vignatti wrote:
>>> From: Tiago Vignatti 
>>>
>>> Hi,
>>>
>>> Following Kristian's suggestions, I updated the patchset with the following:
>>> - driver now accumulates input coordinates to send along touch_down
>>> - updated the protocol touch_down event with surface field, meaning the
>>>   focus surface of a touch device
>>> - compositor now uses a touch_focus pointer (self-explanatory), where it's
>>>   picked when the first finger is down; all further events go there until it
>>>   gets released
>>> - not doing pointer emulation for now; that will come next.
>>
>> Do we really want pointer emulation in the window server? I can tell you
>> from first-hand experience it's a nightmare. Toolkits should be updated
>> to handle touch events properly, with an option to receive touch events
>> and emulate pointer events for applications that aren't ready for touch
>> event handling.
> 
> I don't think we do.  I'm not 100% sure yet, which is why I want to
> focus on just the basic touch events for now.  I agree that since you
> have to port a toolkit to Wayland anyway, you can just do pointer
> emulation (if you must, real touch support is better, of course) in
> the toolkit when you port it.
> 
> The one thing that makes me not quite sure is that client-side pointer
> emulation won't be able to move the pointer sprite in response to
> touch point 0 moving.  And maybe we don't need that.  On the other
> hand, if the toolkit synthesizes enter/leave events in response to
> touch events, it's going to be confusing when the actual pointer
> enters a different surface.  It's also possible to make server-side
> pointer emulation a per-client thing, similar to what Peter did for X.
>  If a client subscribes to touch events, we don't do pointer
> emulation.

There's a niggle there. If a client selects for touch and pointer
events, it will only receive touch events. However, if a client grabs
touch and pointer events through a passive grab, the touch grab is
handled first, and then the pointer grab second if the touch grab is
rejected.

I don't know wayland's protocol yet, but shouldn't enter/leave events
have some kind of device identifier in them? I would think that should
alleviate any client-side confusion.

-- Chase


Re: RFC: multitouch support v2

2011-12-22 Thread Kristian Høgsberg
On Thu, Dec 22, 2011 at 1:45 AM, Chase Douglas
 wrote:
> On 12/21/2011 09:34 AM, Tiago Vignatti wrote:
>> From: Tiago Vignatti 
>>
>> Hi,
>>
>> Following Kristian's suggestions, I updated the patchset with the following:
>> - driver now accumulates input coordinates to send along touch_down
>> - updated the protocol touch_down event with surface field, meaning the focus
>>   surface of a touch device
>> - compositor now uses a touch_focus pointer (self-explanatory), where it's
>>   picked when the first finger is down; all further events go there until it
>>   gets released
>> - not doing pointer emulation for now; that will come next.
>
> Do we really want pointer emulation in the window server? I can tell you
> from first-hand experience it's a nightmare. Toolkits should be updated
> to handle touch events properly, with an option to receive touch events
> and emulate pointer events for applications that aren't ready for touch
> event handling.

I don't think we do.  I'm not 100% sure yet, which is why I want to
focus on just the basic touch events for now.  I agree that since you
have to port a toolkit to Wayland anyway, you can just do pointer
emulation (if you must, real touch support is better, of course) in
the toolkit when you port it.

The one thing that makes me not quite sure is that client-side pointer
emulation won't be able to move the pointer sprite in response to
touch point 0 moving.  And maybe we don't need that.  On the other
hand, if the toolkit synthesizes enter/leave events in response to
touch events, it's going to be confusing when the actual pointer
enters a different surface.  It's also possible to make server-side
pointer emulation a per-client thing, similar to what Peter did for X.
 If a client subscribes to touch events, we don't do pointer
emulation.
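
In other words, something like this on the server side (helper names
hypothetical, only a sketch of the policy):

#include <stdbool.h>
#include <stdint.h>

struct wl_resource;

/* Hypothetical stand-ins for compositor internals. */
static bool client_handles_touch(struct wl_resource *input_resource);
static void deliver_touch_down(struct wl_resource *r, uint32_t time,
			       int touch_id, int32_t sx, int32_t sy);
static void deliver_emulated_button_press(struct wl_resource *r, uint32_t time,
					  int32_t sx, int32_t sy);

/* Per-client policy: clients that subscribed to touch events get real
 * touch_down/motion/up; everyone else gets emulated pointer events. */
static void
dispatch_touch_down(struct wl_resource *focus, uint32_t time,
		    int touch_id, int32_t sx, int32_t sy)
{
	if (client_handles_touch(focus))
		deliver_touch_down(focus, time, touch_id, sx, sy);
	else
		deliver_emulated_button_press(focus, time, sx, sy);
}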

Anyway, for now the focus is on getting the basic touch events done
and getting all the details worked out there.

Kristian


Re: RFC: multitouch support v2

2011-12-21 Thread Chase Douglas
On 12/21/2011 09:34 AM, Tiago Vignatti wrote:
> From: Tiago Vignatti 
> 
> Hi,
> 
> Following Kristian's suggestions, I updated the patchset with the following:
> - driver now accumulates input coordinates to send along touch_down
> - updated the protocol touch_down event with surface field, meaning the focus
>   surface of a touch device
> - compositor now uses a touch_focus pointer (self-explanatory), where it's
>   picked when the first finger is down; all further events go there until it
>   gets released
> - not doing pointer emulation for now; that will come next.

Do we really want pointer emulation in the window server? I can tell you
from first-hand experience it's a nightmare. Toolkits should be updated
to handle touch events properly, with an option to receive touch events
and emulate pointer events for applications that aren't ready for touch
event handling.