On Tue, Jan 27, 2015 at 03:30:03PM -0500, Lyude wrote:
> Hello,
> As some of you may be aware, I've been working with Carlos Garnacho in
> order to implement support in GTK+ for the current draft protocol for
> using tablets on Wayland. You can find the latest preview branch of
> weston, modified to work with this protocol, here:
> 
>       https://github.com/Lyude/weston
> 
> And you can find the WIP branch for GTK+ support here:
> 
>       https://github.com/Lyude/gtk-
> 
> So, right now we've run into something that we believe needs some more
> discussion in terms of how it will be implemented: handling tablet
> buttons in GTK+ with the Wayland backend. Right now, the tablet-support
> branch of libinput only supports the buttons on the styluses
> themselves, not the buttons on the tablet pad, so we only need to worry
> about the stylus buttons for the time being.

just to give you the libinput plan for this part: we're currently writing a
"buttonset" interface that is somewhat similar to the tablet interface but
will handle the tablet buttons and axes present on the pad.

A device with multiple capabilities will send events through the
respective interfaces and have those exposed as such. The struct
libinput_device will merge event nodes into a single logical device. On
an Intuos 5 touch, for example, you'd get a single struct
libinput_device with the capabilities TABLET, TOUCH and BUTTONSET.
Wheel events are sent as buttonset events with BUTTONSET_AXIS_WHEEL,
etc. Button handling will be similar to the tablet button handling.
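
To give a rough idea of what this could look like from the client side,
here's a sketch (the buttonset interface doesn't exist yet, so every
*BUTTONSET* name below is a placeholder, not a real libinput symbol):

    #include <libinput.h>

    /* Illustrative only: nothing named *BUTTONSET* is merged yet. */
    static void
    handle_event(struct libinput_event *event)
    {
        struct libinput_device *device = libinput_event_get_device(event);

        /* An Intuos 5 touch would show up as one struct libinput_device
         * with TABLET, TOUCH and BUTTONSET all set. */
        if (!libinput_device_has_capability(device,
                                            LIBINPUT_DEVICE_CAP_BUTTONSET))
            return;

        switch (libinput_event_get_type(event)) {
        case LIBINPUT_EVENT_BUTTONSET_BUTTON:
            /* pad button, delivered as a plain button index,
             * no mouse button emulation */
            break;
        case LIBINPUT_EVENT_BUTTONSET_AXIS:
            /* ring/strip/wheel, e.g. BUTTONSET_AXIS_WHEEL */
            break;
        default:
            break;
        }
    }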

This is still WIP, and the wayland protocol requirements haven't been scoped
yet either.

Cheers,
   Peter
 
> The problem right now with the X11 backend is that these buttons
> aren't exactly handled in an ideal manner. On the average GNOME setup,
> gnome-settings-daemon tells xf86-input-wacom which mouse buttons each
> button on the tablet tool should be mapped to (for example: button #1
> on the stylus may be mapped to a right click, and button #2 may be
> mapped to a left click), and X forwards them to the clients as such.
> This means that GDK only knows which emulated mouse buttons are being
> pressed, as opposed to the actual tablet tool buttons, and only
> forwards what it gets from xf86-input-wacom. So there isn't actually
> any infrastructure in GDK right now to handle tablet tool button
> presses, just mouse button presses. This leaves us with an issue,
> since the libinput API for tablets doesn't do any such emulation, and
> there's no emulation for this in the Wayland protocol (and I think it
> would be a good idea not to add any; IMO any emulation of this sort
> should be handled by the clients, not the compositor). Since there are
> quite a few ways to go about it, Carlos and I thought it would be
> appropriate to bring this up on the mailing list.
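
Fwiw, once the unmapped tool buttons reach the client, the mapping that
gnome-settings-daemon currently pushes into xf86-input-wacom could live
in the toolkit instead. A purely illustrative sketch of where that
mapping would sit (none of this is a real GDK/GTK+ API):

    #include <stdint.h>

    /* Hypothetical client-side table: physical stylus button index ->
     * logical mouse button, mirroring what g-s-d configures on X11. */
    enum logical_button {
        LOGICAL_BUTTON_NONE   = 0,
        LOGICAL_BUTTON_LEFT   = 1,
        LOGICAL_BUTTON_MIDDLE = 2,
        LOGICAL_BUTTON_RIGHT  = 3,
    };

    static const enum logical_button stylus_button_map[] = {
        [1] = LOGICAL_BUTTON_RIGHT,  /* stylus button #1 -> right click */
        [2] = LOGICAL_BUTTON_LEFT,   /* stylus button #2 -> left click */
    };

    static enum logical_button
    map_stylus_button(uint32_t physical)
    {
        if (physical >= sizeof(stylus_button_map) / sizeof(stylus_button_map[0]))
            return LOGICAL_BUTTON_NONE;
        return stylus_button_map[physical];
    }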
