Kristian Høgsberg <krh@...> writes:

> Right... in the MPX sense, right? So you could have a keyboard and
> mouse combo controlling one pointer/kb focus and the touch screen
> being its own master device. Then maybe you could have one person
> using the touch screen UI, and another person using the kb/mouse
> combo. That's kind of far fetched, of course, but I think the main
> point is that there's no inherent association between a kb/mouse combo
> and a touch screen. On the other hand, what about a setup with two
> mouse/kb combos (master devices) and a touch screen... you'd expect
> tapping a window on the touch screen to set kb focus, but if you have
> multiple master kbs, which kb focus do you set? Maybe we're just
> doomed for trying to make both pointer and direct touch interaction
> work in the same UI.
One use case you seem to be forgetting is that there are mouse-type
devices, like recent Synaptics touchpads, that *also* do multitouch.
Multitouch != touchscreen.

One way to solve this might be to make touchscreens a pointer device
*with no associated keyboard device*, or at least none attached to
actual hardware. In XInput, you can create a new master pair with a
real pointer but only an XTest keyboard. A dummy, if you will.

_______________________________________________
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel
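For what it's worth, the master-pair setup described above can be tried
today with the xinput(1) tool against a running X server (so this is a
sketch for an interactive session, not something you can script blindly;
the master name "touch" and the <id> placeholder are just examples):

```shell
# Create a new master pair. The server creates "touch pointer" and
# "touch keyboard", each initially backed only by an XTest device.
xinput create-master touch

# List devices to find the touchscreen's slave device id.
xinput list

# Attach the real touchscreen to the new master pointer. Its paired
# "touch keyboard" keeps only the XTest device -- the "dummy"
# keyboard described above. (Replace <id> with the touchscreen's
# id from the listing.)
xinput reattach <id> "touch pointer"
```

The result is a master pointer driven by real touch hardware whose
keyboard half exists only as an XTest stub, i.e. a touchscreen with no
hardware keyboard focus attached.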