On Tue, 2014-03-04 at 00:55 +0100, Carlos Garnacho wrote:
> Hey everyone,
> 
> In the past days I've been hacking again on the gestures branch, and
> it's reaching a state where I feel it's getting quite solid, so I would
> like to get discussion started, tentatively aiming to get this included
> early in 3.13.
> 
> Overview
> ========
> 
> The two object types this relies on are GtkEventController and
> GtkGesture. GtkEventController is a very low-level abstraction for
> something that just "handles events". GtkGesture is a subclass centered
> around handling single or multiple sequences of press/update.../release
> events. By default it is restricted to handling touch events, although
> it can be made to listen to mouse events too, either through API or
> through the GTK_TEST_TOUCHSCREEN envvar (a NULL GdkEventSequence is
> used in those cases).
> 
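(To check my understanding of the controller/gesture split, I picture a
widget wiring one of these up roughly as below, using the Drag gesture
from your list further down as the concrete type. The constructor,
signal and GtkGestureSingle setter names are my guesses from skimming
the branch, so they may well differ from what actually lands:)

#include <gtk/gtk.h>

static void
drag_update_cb (GtkGestureDrag *gesture,
                gdouble         offset_x,
                gdouble         offset_y,
                gpointer        user_data)
{
  /* offsets are relative to the drag start point */
  g_print ("drag offset: %f, %f\n", offset_x, offset_y);
}

static void
attach_drag_gesture (GtkWidget *widget)
{
  GtkGesture *drag;

  /* the gesture is created for a specific widget; keeping the
   * reference around for later is left out of this sketch */
  drag = gtk_gesture_drag_new (widget);

  /* the branch defaults to touch events only; this would opt in
   * to mouse events as well (assumed setter name) */
  gtk_gesture_single_set_touch_only (GTK_GESTURE_SINGLE (drag), FALSE);

  g_signal_connect (drag, "drag-update",
                    G_CALLBACK (drag_update_cb), NULL);
}

Is that about the intended usage?
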
> Multiple GtkGesture implementations are offered in the branch:
> 
>       * Drag: keeps track of drags, reporting the offset from the drag
>         start point.
>       * Swipe: reports x/y velocity at the end of a begin/update/end
>         sequence.
>       * LongPress: reports long presses, or presses canceled after
>         exceeding the threshold/timeout.
>       * MultiPress: reports multiple presses, as long as they're within
>         the double-click threshold/timeout.
>       * Rotate: reports angle changes from two touch sequences.
>       * Zoom: reports distance changes from two touch sequences, as a
>         factor of the initial distance.
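
(For the two-sequence ones I'm assuming the signals hand you the
computed value directly, something along these lines; again, the
constructor and signal names are my guesses and may differ in the
branch:)

static void
scale_changed_cb (GtkGestureZoom *gesture,
                  gdouble         scale,
                  gpointer        user_data)
{
  /* 'scale' being the current distance between the two sequences
   * as a factor of the initial distance */
  g_print ("zoom factor: %f\n", scale);
}

static void
attach_zoom_gesture (GtkWidget *widget)
{
  GtkGesture *zoom = gtk_gesture_zoom_new (widget);

  g_signal_connect (zoom, "scale-changed",
                    G_CALLBACK (scale_changed_cb), NULL);
}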

What about the single tap/press? Do the gestures for which it makes
sense also give back the center of the operation? Do the gestures know
about each other? (e.g., if there's no long-press in my widget, will it
be able to tell whether I'm starting a drag or just tapping slowly?)

Cheers
