> For this to be employed in a Molecular Playground operation it needs
> to be:
>
> a) upside down
> b) behind the participant
>
> Does that work?
I've always had the sensor in front of me - that's the way they do it
in all the PrimeSense and Kinect literature.  The most important gesture
in OpenNI is Click, which involves moving a hand directly toward and
then directly away from the sensor.  It's difficult to perform
accurately when the sensor is behind the user.  It's also important to
remember that there's no way to start tracking a hand until it performs
a gesture OpenNI recognizes - probably Click.  After that, a custom
gesture recognizer could take over based on the coordinates provided.
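Just to sketch what I mean by a custom recognizer taking over - nothing
like this exists yet, and every name below is made up - it could be as
simple as watching the z coordinate of the tracked hand for a quick push
toward the sensor:

// Illustrative only: a minimal "push toward the sensor" detector fed with
// the hand coordinates OpenNI reports once tracking has started.
class PushDetector {
  private static final float PUSH_DISTANCE_MM = 100f; // how far the hand must approach
  private static final long PUSH_TIME_MS = 400;       // ...and within how long
  private float startZ = Float.NaN;
  private long startTime;

  /** Call on every hand update; z is the distance from the sensor in mm. */
  boolean update(float z, long timeMs) {
    if (Float.isNaN(startZ) || z > startZ || timeMs - startTime > PUSH_TIME_MS) {
      startZ = z;          // first sample, hand moving away, or timed out:
      startTime = timeMs;  // start measuring from here
      return false;
    }
    if (startZ - z >= PUSH_DISTANCE_MM) {
      startZ = Float.NaN;  // report the push once, then wait for the next one
      return true;
    }
    return false;
  }
}
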
> What I'd like to do -- with your assistance -- is create a few files
> in org.jmol.multitouch that allows this to be used within the applet
> or Jmol application directly. Please take a look at what we did with
> SparshUI. Note that the SparshUI driver (which only works on an HP
> TouchSmart computer) is a far simpler driver and so it requires
> substantially more gesture processing. The point here is that Jmol
> already has quite nice support for multitouch. I think we just need to
> tap into that.
OK, I've had a look at the source and javadocs for the stuff in
org.jmol.multitouch.  If I'm understanding things correctly, I should
have an adapter class that extends JmolMultiTouchClientAdapter like in
JmolSparshClientAdapter.  Then I should call
actionManager.processEvent() from ActionManagerMT?  With what parameters?
> So PrimeSense is the hardware that Microsoft employed for the Kinect
> sensor. Right? Does the code work with a Kinect sensor from Microsoft?
Right - I've been developing with a Kinect, but OpenNI is generalized to
work with a range of sensors - that's why I also needed to install
SensorKinect to make it work with Kinect.
> So I think I'm catching on....
> gesture_map_default.put("Click", "select");
>
> gesture_map_default.put("Wave", "rotate_all");
> But, where are the others in this list defined?
> "rotate", "rotate_all", "select", "select_all", "select_none", "select_atom", 
> "select_molecule", "translate", "translate_all"
Easy explanation - they're not.  OpenNI alone only supports two gestures
- Click and Wave; NITE adds a few more, but I haven't tried them yet. 
gesture_map_default is mainly there for testing.  When OpenNI recognizes
a new hand, I put gesture_map_default in gesture_map for that hand_id. 
Each entry in gesture_map corresponds to a hand_id and consists of a
String identifying a gesture mapped to a String identifying an action.
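Spelled out (this is a simplified sketch, not the code verbatim), the
structure is basically:

// default gesture -> action mapping given to every newly recognized hand
Map<String, String> gesture_map_default = new HashMap<String, String>();
gesture_map_default.put("Click", "select");
gesture_map_default.put("Wave", "rotate_all");

// per-hand mappings: hand_id -> (gesture -> action)
Map<Integer, Map<String, String>> gesture_map =
    new HashMap<Integer, Map<String, String>>();

// when OpenNI reports a new hand, start it off with the defaults
// (a copy here, so one hand's mapping can later be changed without
// affecting the others)
void handRecognized(int handId) {
  gesture_map.put(Integer.valueOf(handId),
      new HashMap<String, String>(gesture_map_default));
}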

In this way, it's possible to have different hands performing different
actions even if they're performing the same gesture.  So if OpenNI
recognizes hand 1 and hand 2, you could have something like

Map<String, String> hand1 = new HashMap<String, String>();
hand1.put("Click", "select");          hand1.put("Wave", "rotate_all");
gesture_map.put(Integer.valueOf(1), hand1);
Map<String, String> hand2 = new HashMap<String, String>();
hand2.put("Click", "select_molecule"); hand2.put("Wave", "translate_all");
gesture_map.put(Integer.valueOf(2), hand2);

One problem is that if OpenNI loses track of a hand and then recognizes
it again, the hand has a different id.  But you could check the size of
gesture_map to see how many hands are being tracked and then iterate
over it and set the gesture->action mappings accordingly.
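Something along these lines (again just a sketch) could re-apply the
intended mappings whenever the set of tracked ids changes:

// When a hand is lost and picked up again under a new id, reassign the
// gesture -> action maps by position in the current id order instead of
// relying on the original (now stale) ids.
void reassignMappings(List<Map<String, String>> intendedMappings) {
  List<Integer> handIds = new ArrayList<Integer>(gesture_map.keySet());
  Collections.sort(handIds); // any stable order will do; the ids are arbitrary
  for (int i = 0; i < handIds.size() && i < intendedMappings.size(); i++)
    gesture_map.put(handIds.get(i), intendedMappings.get(i));
}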

As I work on the adapter, it will probably be useful to switch from
String representations of actions to using the actions defined in
ActionManagerMT.
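For example (the ACTION_* names below are placeholders - I haven't checked
what ActionManagerMT actually defines), the per-hand maps would just become
gesture -> int:

// Placeholder constant names - substitute whatever ActionManagerMT really defines.
Map<String, Integer> actions = new HashMap<String, Integer>();
actions.put("Click", Integer.valueOf(ActionManagerMT.ACTION_select));
actions.put("Wave", Integer.valueOf(ActionManagerMT.ACTION_rotate));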


Benn
