> > > Why? Most gesture systems work by pressing some button to activate
> > > them. This is possible in E too. So like some other person wrote, it
> > > could be possible to press a modifier or a key, and then do mouse
> > > gestures that modify a window (raise, lower, iconify...)
> > >
> > hmm, i never thought of doing it without a pressed button (this would be
> > real fun, moving your mouse and oops, active app killed), but i
> > understood raster's answer as if it's not gonna happen...?
> > and it would be really nice because sometimes 2500px (dual screen) can be a
> > long, long way to go just to close/min/max an app using the mouse
> > 
> > i even have 2 mouse buttons left, so i would find a gesture system really
> > useful, and there 'could' be some nice effect while using the gesture
> > (painting onscreen, like a wizard in an MMO)
> > 
> > so i would say: do it, if you can ;)
> 
> but it's not happening because 1. it's complex (we'd need to write gesture
> training and recognition code) or 2. we'd have to suck in another dependency
> and even then write all the infrastructure code. it's not in the TODO and
> won't go there. you'll have to wait for e18, e19, etc.

i agree. i did not ask for any specific e support.
however, i did ask if the current e infrastructure could make life a bit
easier....
i am sure as hell not going to implement yet another
recognition subsystem, as there are good ones (even GPLed) out there that
i can use.
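
(for reference, the trick most of those recognizers seem to use, at least as
i understand it, is to quantize the pointer trail into a short string of
direction codes and match that against a table of known gestures. this is
not any particular library's code, every name below is made up; just a toy
version in plain C to show the shape of it:)

/* toy direction-quantization recognizer: turn a pointer trail into a
 * string of direction codes, then compare against known gestures */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* quantize one movement into one of 4 directions: R, L, D, U
 * (screen coordinates, so y grows downward) */
static char dir_code(int dx, int dy)
{
    if (abs(dx) >= abs(dy)) return (dx >= 0) ? 'R' : 'L';
    return (dy >= 0) ? 'D' : 'U';
}

/* build a direction string from a trail, collapsing repeated codes */
static void trail_to_string(const int *xs, const int *ys, int n,
                            char *out, int outlen)
{
    int i, len = 0;

    for (i = 1; i < n && len < outlen - 1; i++) {
        char c = dir_code(xs[i] - xs[i - 1], ys[i] - ys[i - 1]);
        if (len == 0 || out[len - 1] != c) out[len++] = c;
    }
    out[len] = 0;
}

int main(void)
{
    /* a trail that goes right, then down - could map to "close window" */
    int xs[] = {0, 10, 20, 30, 30, 30};
    int ys[] = {0,  0,  0,  0, 10, 20};
    char buf[64];

    trail_to_string(xs, ys, 6, buf, sizeof(buf));
    if (!strcmp(buf, "RD")) printf("gesture: close\n");
    else printf("unknown gesture: %s\n", buf);
    return 0;
}

a real recognizer would use more directions and fuzzy matching, but that is
the general idea, and exactly the part i'd rather take from an existing
library than rewrite.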

what i really thought of is using ecore to help implement a module/stand-alone
program which would be better than existing ones.
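
to make that a bit more concrete, here is the rough shape i have in mind. it
is an untested sketch written against recent Ecore/Ecore_X headers; exact
callback signatures and return conventions have shifted between releases, so
treat every call as approximate. polling the pointer is just the simplest
thing to sketch, a real module would hook pointer events or a grab instead:

/* untested sketch: a stand-alone ecore program that samples the pointer
 * position on the root window and buffers it as a trail, which would then
 * be handed to an existing recognition library */
#include <Ecore.h>
#include <Ecore_X.h>

#define MAX_POINTS 1024

static int xs[MAX_POINTS], ys[MAX_POINTS], npoints = 0;
static Ecore_X_Window root;

static Eina_Bool _poll_cb(void *data)
{
    int x, y;

    /* ask X where the pointer is right now */
    if (ecore_x_pointer_xy_get(root, &x, &y) && npoints < MAX_POINTS) {
        xs[npoints] = x;
        ys[npoints] = y;
        npoints++;
        /* here: detect end-of-gesture (modifier released, pointer idle...),
         * feed xs/ys to the recognizer, then reset npoints */
    }
    return ECORE_CALLBACK_RENEW; /* keep the timer running */
}

int main(void)
{
    ecore_init();
    ecore_x_init(NULL);

    root = ecore_x_window_root_first_get();

    /* sample the pointer roughly 50 times a second */
    ecore_timer_add(0.02, _poll_cb, NULL);

    ecore_main_loop_begin();

    ecore_x_shutdown();
    ecore_shutdown();
    return 0;
}

the interesting parts are exactly the ones the comments gloss over: deciding
when a gesture starts and ends, and how to hand the result back to e.
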
as for my example, i agree, it was lame, but as you pointed out there are some
really good applications for such a feature.

i think i'll try to come up with something, and if you can give a few 
technical pointers it would be greatly appreciated.

thanks for your input.
cheers,

-- 
========================================================================
nir.
