Hi everyone,

I've just added a basic gesture recognition engine to Corner.app.  If  
you hold down control and shift[1] and move the mouse, it will treat  
the subsequent movements as a gesture.  Gestures are translated into  
'words', in which each segment of the movement becomes one digit:

1 - north
2 - north-east
3 - east
4 - south-east
5 - south
6 - south-west
7 - west
8 - north-west
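
For illustration, the digit classification above could be sketched  
roughly like this (a minimal Python sketch of the idea, not the  
actual Corner.app code; it assumes mathematical coordinates where y  
grows upward, so flip dy first if your toolkit uses screen  
coordinates):

```python
import math

def direction_digit(dx, dy):
    """Classify a mouse delta into one of the eight compass digits
    listed above: 1 = north, 2 = north-east, ..., 8 = north-west."""
    deg = math.degrees(math.atan2(dy, dx))   # east = 0, north = 90
    # North sits at 90 degrees; each digit covers a 45-degree sector.
    return int(round((90 - deg) / 45)) % 8 + 1

def gesture_word(deltas):
    """Turn a sequence of (dx, dy) deltas into a gesture word,
    collapsing consecutive repeats of the same direction."""
    word = ""
    for dx, dy in deltas:
        d = str(direction_digit(dx, dy))
        if not word or word[-1] != d:
            word += d
    return word
```

So a stroke left and then right would produce the word "73".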

At the moment, all it does with these is log the word to the  
terminal.  Can everybody test this, and see if they can make it log  
the words they expect from the gestures they give?

I plan on exposing this through the scripting interface later, with a  
dictionary mapping gesture words to selectors.  I intend to keep  
mouse gestures global, and use the scripting interface to communicate  
them to the active application to avoid undue modality.  For example,  
you could define a gesture for undo or redo (maybe left and right  
gestures) and these would send undo and redo messages to the active  
window.
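
A rough sketch of how that dictionary might behave (Python for  
illustration only; the names `bindings` and `dispatch` are mine, not  
part of the real scripting interface):

```python
# Hypothetical gesture-word -> action table: west (7) for undo,
# east (3) for redo, as in the example above.
bindings = {"7": "undo", "3": "redo"}

def dispatch(word, target):
    """Look up a gesture word and send the corresponding message
    to whatever object represents the active window."""
    action = bindings.get(word)
    if action is None:
        return False        # unrecognised gesture: ignore it
    getattr(target, action)()
    return True
```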

David

[1] On my test setup, there is a weird bug in X11 where the modifiers  
are only detected by the X server if shift is held down before  
control.  Can anyone reproduce this?

_______________________________________________
Etoile-discuss mailing list
[email protected]
https://mail.gna.org/listinfo/etoile-discuss