I had the idea that it would be nice to have sketch recognition and sketch beautification as part of a paint stroke. You could call it a "gesture brush". If you draw a square, it would recognize it and put you in a rectangle drawing mode (similarly for a circle). That seemed a bit gimmicky to me, so I thought it might be better to have a variety of gestures that you could match with textures; after making a gesture it would put you in "drag dot" mode with that texture, so you could place the resulting shape precisely and confirm the gesture by clicking. You could probably even allow the user to train new gestures instead of just selecting from a set of existing ones.
This still didn't really seem general enough. Instead of just selecting textures and putting you in one stroke mode, gestures should really be used to select brushes. This leaves out sketch beautification, but it means that gestures become just another event.

I know Blender has some very basic gestures, but I want to expand that number to as many as can be reliably distinguished. I also think that Blender should be able to look for gestures continuously (meaning you do not have to hold a key) and give good visual feedback when one is recognized (like an afterimage). Gesture segmentation is also a very real concern if I want to recognize strokes continuously without a button. I have some ideas on that, but I'd like to keep them to myself and test them before sharing.

As for sketch beautification, gesture events would provide extra information, specifically some normalized representation of their path, that an operator could use to provide initial values. E.g., the size of a circle gesture could be calculated and used to draw an initial circle. Of course, this allows you to do more than sketch beautification; I imagine it should be possible to use Blender completely with "hot gestures" instead of hot keys. I had an image of somebody furiously tweaking an Android device as if they were polishing the screen with a fingertip. That would require a very good gesture recognizer. However, I do not have to start with a good one, just a simple one, like "$1", which is easy to understand, so I can use it for research: https://depts.washington.edu/aimgroup/proj/dollar/

I do not know how much this overlaps with multitouch. I imagined that this would be restricted to one touch and one stroke so that it can be used by mouse/pen users. I would think that multitouch gestures are, at this point in history, very specialized. We do not write with multiple fingers.
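To make the "normalized path" idea concrete, here is a minimal sketch of the $1-style pipeline in plain Python: resample the stroke to a fixed point count, rotate so the centroid-to-first-point angle is zero, scale to a reference square, translate the centroid to the origin, then match against templates by mean point-to-point distance. This is a simplification for illustration (the real $1 also searches over rotation with golden-section search, and this version assumes 2D gestures with a non-degenerate bounding box); the template names are hypothetical.

```python
import math

N = 64  # fixed resample count, as in the $1 paper

def path_length(pts):
    return sum(math.dist(pts[i - 1], pts[i]) for i in range(1, len(pts)))

def resample(pts, n=N):
    """Resample a stroke to n points spaced equally along its path."""
    interval = path_length(pts) / (n - 1)
    pts = list(pts)
    new_pts = [pts[0]]
    d = 0.0
    i = 1
    while i < len(pts):
        seg = math.dist(pts[i - 1], pts[i])
        if seg > 0 and d + seg >= interval:
            t = (interval - d) / seg
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            new_pts.append(q)
            pts.insert(i, q)  # the interpolated point becomes the next segment start
            d = 0.0
        else:
            d += seg
        i += 1
    while len(new_pts) < n:  # guard against floating-point shortfall
        new_pts.append(pts[-1])
    return new_pts[:n]

def centroid(pts):
    return (sum(p[0] for p in pts) / len(pts),
            sum(p[1] for p in pts) / len(pts))

def rotate_to_zero(pts):
    """Rotate so the angle from the centroid to the first point is zero."""
    cx, cy = centroid(pts)
    theta = math.atan2(pts[0][1] - cy, pts[0][0] - cx)
    c, s = math.cos(-theta), math.sin(-theta)
    return [((x - cx) * c - (y - cy) * s + cx,
             (x - cx) * s + (y - cy) * c + cy) for x, y in pts]

def scale_and_translate(pts, size=250.0):
    """Scale nonuniformly to a reference square, then center on the origin."""
    xs, ys = [p[0] for p in pts], [p[1] for p in pts]
    w = max(max(xs) - min(xs), 1e-9)  # degenerate (1D) strokes are not handled well
    h = max(max(ys) - min(ys), 1e-9)
    pts = [(x * size / w, y * size / h) for x, y in pts]
    cx, cy = centroid(pts)
    return [(x - cx, y - cy) for x, y in pts]

def normalize(pts):
    """The 'normalized representation of the path' a gesture event could carry."""
    return scale_and_translate(rotate_to_zero(resample(pts)))

def recognize(stroke, templates):
    """Return (name, distance) of the nearest pre-normalized template."""
    candidate = normalize(stroke)
    best, best_d = None, float("inf")
    for name, tmpl in templates.items():
        d = sum(math.dist(a, b) for a, b in zip(candidate, tmpl)) / N
        if d < best_d:
            best, best_d = name, d
    return best, best_d
```

An operator could also read initial values straight off the raw stroke before normalization, e.g. use the bounding box of a recognized circle gesture to set the initial radius of the circle it draws.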
I mean, Chinese has tens of thousands of single-touch gestures :-) Maybe in the far future we'll have thousands of multitouch gestures.

On Tue, Oct 16, 2012 at 5:07 PM, Nicholas Rishel <rishel.n...@gmail.com> wrote:
> Jason,
>
> Could you go into detail on what you're thinking? There is a good chance
> that it overlaps what I have researched for multitouch, though I think what
> you are describing is tangent to what I'm working on.
>
> Nick
> _______________________________________________
> Bf-committers mailing list
> Bf-committers@blender.org
> http://lists.blender.org/mailman/listinfo/bf-committers