I have wanted something like this too, except my solution was to take a
modular approach: a window manager that reads commands from a named pipe,
with multiple input programs (gesture recognizers, voice recognizers, and
whatnot) writing into that pipe.
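
A very rough sketch of the window-manager side, assuming a FIFO at
/tmp/wmctl and made-up one-line commands like "focus-next" (a real WM
would dispatch these to its own functions instead of printing them):

/*
 * Sketch: read one-line commands from a named pipe.
 * The path /tmp/wmctl and the command names are invented for
 * illustration only.
 */
#include <stdio.h>
#include <string.h>
#include <sys/stat.h>

int main(void) {
	const char *path = "/tmp/wmctl";

	mkfifo(path, 0600);	/* harmless if the pipe already exists */

	for (;;) {
		/* fopen blocks until some input program (gesture recognizer,
		 * voice recognizer, ...) opens the pipe for writing. */
		FILE *fp = fopen(path, "r");
		if (!fp)
			return 1;

		char line[256];
		while (fgets(line, sizeof line, fp)) {
			line[strcspn(line, "\n")] = '\0';
			if (strcmp(line, "focus-next") == 0)
				puts("would focus the next window");
			else if (strncmp(line, "spawn ", 6) == 0)
				printf("would spawn: %s\n", line + 6);
			else
				printf("unknown command: %s\n", line);
		}
		fclose(fp);	/* writer went away; reopen and keep listening */
	}
}

The input programs then stay completely independent: a gesture
recognizer just does the equivalent of `echo focus-next > /tmp/wmctl`
when it detects a swipe, and a voice recognizer can write to the same
pipe.
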
On Wed, May 09, 2012 at 08:02:26AM -0400, Peter Hartman wrote:
> Hi Folks:
>
> For some time now I've been cobbling together a kind of suckless
> ecosystem for use on tablet or hybrid touch devices. I thought I'd
> release some of the code to the public, although most of it is still
> rough, proof-of-concept work.
>
> GOAL
>
> Operate without a keyboard.
>
> OVERVIEW
>
> SLUT combines a patched, gesture-aware dwm with a patched, mouse-aware
> dmenu. A certain gesture, or a click on the status text, launches
> dmenu, letting you start the things you want, e.g., svkbd, surf,
> zathura, etc. Gesture mode is toggled
> by way of a physical button on the tablet (in this case, the power
> button, although the volume buttons would work). Switching between
> applications is achieved via dmenu + a patched version of lsw. You
> should also modify the apps that you play with to be "gesture" aware,
> that is, to respond to Control-F1 through Control-F6 (see dwm's
> config.h).
>
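For what it's worth, making an app respond to Control-F1 through
Control-F6 could look something like this in a plain Xlib event loop
(the window setup and the printed "actions" are placeholders; what each
key actually does, and how dwm's config.h maps gestures onto these
keys, is up to you):

#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/keysym.h>

int main(void) {
	Display *dpy = XOpenDisplay(NULL);
	if (!dpy)
		return 1;

	/* Placeholder window just to receive key events. */
	Window win = XCreateSimpleWindow(dpy, DefaultRootWindow(dpy),
	                                 0, 0, 200, 200, 0, 0, 0);
	XSelectInput(dpy, win, KeyPressMask);
	XMapWindow(dpy, win);

	for (;;) {
		XEvent ev;
		XNextEvent(dpy, &ev);
		if (ev.type == KeyPress && (ev.xkey.state & ControlMask)) {
			KeySym sym = XLookupKeysym(&ev.xkey, 0);
			if (sym >= XK_F1 && sym <= XK_F6)
				/* React to "gesture" 1..6 here. */
				printf("gesture action %ld\n",
				       (long)(sym - XK_F1 + 1));
		}
	}
}

Build with something like: cc -o gestureapp gestureapp.c -lX11
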
> All the best,
> Peter
>
>
>
> --
> sic dicit magister P
> University of Toronto / Fordham University
> http://individual.utoronto.ca/peterjh
> gpg 1024D/ED6EF59B (7D1A 522F D08E 30F6 FA42 B269 B860 352B ED6E F59B)
> gpg --keyserver pgp.mit.edu --recv-keys ED6EF59B