Hi Troy,

I don't know if you've succeeded in configuring Embodiment to control the robot using ImplicationLinks, but at http://www.opencog.org/wiki/Embodiment#Language_Comprehension you can find documentation about the Language Comprehension module, which was integrated into OpenCog a few days ago. In the Reference and Command resolution sections you will find how a specific command can be implemented so that it is fired by a simple English sentence. If you have any questions, please feel free to send an email to the list or post them on the #opencog channel at irc.freenode.net.
Samir

On Tue, Aug 11, 2009 at 9:28 AM, Ben Goertzel <[email protected]> wrote:
>
> By popular demand, I'm forwarding this email to the list...
>
> ben g
>
> ---------- Forwarded message ----------
> From: Samir Araújo <[email protected]>
> Date: 2009/8/10
> Subject: Coding an OpenCog implication for the robot to carry out a command
> like "Do Tai Chi"
> To: Troy Huang <[email protected]>
> Cc: Ben Goertzel <[email protected]>
>
> Hi Troy,
>
> In order to convert a sentence into a command, you will need to create an
> ImplicationLink whose preconditions test for the presence of the desired
> sentence in the AtomTable, e.g.:
>
> ImplicationLink
>     EvaluationLink ; precondition (use an AndLink to evaluate more than one)
>         PredicateNode "actionDone"
>         ListLink
>             ExecutionLink
>                 GroundedSchemaNode "say"
>                 ListLink
>                     AvatarNode "RobotOwner"
>                     SentenceNode "to:robot_id: do tai chi"
>     ExecutionLink ; effect
>         GroundedSchemaNode "tai_chi"
>         ListLink
>
> (You can write ImplicationLinks directly in C++ or using a bind script
> language like Scheme.)
>
> You must pass the Handle of the defined ImplicationLink as an argument to
> the pattern matcher (opencog/query/PatternMatch::imply(Handle)). If the
> preconditions are satisfied, the PatternMatch will create the effect (in
> this case an ExecutionLink containing the schema that will execute the
> desired action) in the AtomTable and then return the handle of that
> ExecutionLink. Finally, you must evaluate the returned ExecutionLink and
> call the appropriate library to make the robot execute the action. If
> you're implementing the action procedure using Combo, for example, you
> will need to define a built-in Combo procedure (written in C++) and send
> the GroundedSchemaNode to the Combo Procedure Interpreter to be executed.
> This process must be repeated every time a new sentence is said to the
> robot.
>
> Well, this is a very simplified description of the whole process.
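To make the quoted precondition/effect flow concrete, here is a minimal stand-alone C++ sketch of the same logic. It does not use the real OpenCog API: the AtomTable is modeled as a plain set of fact strings, and imply() is a toy stand-in for PatternMatch::imply(Handle); the Rule struct and the string encodings are purely illustrative.

```cpp
// Hedged sketch: a toy stand-in for the pattern-match step described in the
// email. "AtomTable" here is just a set of fact strings, and "Rule" is an
// illustrative stand-in for an ImplicationLink. None of this is OpenCog API.
#include <string>
#include <set>
#include <optional>

// A toy AtomTable: facts such as
// "actionDone say RobotOwner to:robot_id: do tai chi".
using AtomTable = std::set<std::string>;

struct Rule {
    std::string precondition; // fact that must be present in the table
    std::string effect;       // e.g. "ExecutionLink tai_chi"
};

// If the precondition is satisfied, create the effect in the table and
// return it (mimicking the handle returned by the real pattern matcher).
std::optional<std::string> imply(AtomTable& table, const Rule& rule) {
    if (table.count(rule.precondition) == 0)
        return std::nullopt;   // precondition not satisfied, nothing fires
    table.insert(rule.effect); // effect is materialized in the table
    return rule.effect;        // caller then executes the returned link
}
```

In real OpenCog code the table holds Atoms and the rule is itself an ImplicationLink referenced by its Handle, but the control flow is the same: check the preconditions, create the effect in the AtomTable, and return its handle for the caller to execute.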
> Obviously you will need to build an ImplicationLink that detects not only
> the presence of a SentenceNode in the AtomTable, but also which sentence
> was said most recently, who said it, etc. So, if you send me more
> information about how you're integrating OpenCog with the robot, I can
> help you prepare the module that will evaluate the rules
> (ImplicationLinks) and execute the actions on the robot. (Are you using
> Embodiment? Are you using RelEx to parse the said sentences? If you're not
> using RelEx, how does the sentence come from the proxy? etc.)
>
> Samir
>
> --
> Ben Goertzel, PhD
> CEO, Novamente LLC and Biomind LLC
> Director of Research, SIAI
> [email protected]
>
> "Truth is a pathless land" -- Jiddu Krishnamurti
>
> _______________________________________________
> Mailing list: https://launchpad.net/~opencog-dev
> Post to     : [email protected]
> Unsubscribe : https://launchpad.net/~opencog-dev
> More help   : https://help.launchpad.net/ListHelp
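The email's point about detecting the latest sentence, and re-running the match every time a new one arrives, can be sketched in the same stand-alone style. Again, SentenceLog, record(), and newCommands() are hypothetical names for illustration only, not OpenCog code; the idea shown is just giving each sentence an arrival sequence number so a rule fires at most once per sentence.

```cpp
// Hedged sketch of the "repeat for every new sentence" loop. Each incoming
// sentence gets a monotonically increasing sequence number, and a rule is
// only evaluated against sentences it has not yet seen. All names here are
// illustrative; none of this is OpenCog API.
#include <string>
#include <vector>
#include <cstddef>

struct Sentence {
    std::size_t seq;      // arrival order, so "latest" is well defined
    std::string speaker;  // who said it
    std::string text;     // e.g. "to:robot_id: do tai chi"
};

class SentenceLog {
    std::vector<Sentence> log_;
    std::size_t next_seq_ = 0;
public:
    void record(const std::string& speaker, const std::string& text) {
        log_.push_back({next_seq_++, speaker, text});
    }
    // Return the actions triggered by sentences newer than `cursor`,
    // advancing the cursor so each sentence fires at most once.
    std::vector<std::string> newCommands(std::size_t& cursor,
                                         const std::string& trigger,
                                         const std::string& action) {
        std::vector<std::string> fired;
        for (; cursor < log_.size(); ++cursor)
            if (log_[cursor].text == trigger)
                fired.push_back(action);
        return fired;
    }
};
```

In a real deployment the trigger test would be the ImplicationLink's precondition (including who spoke, recency, etc.) and the fired action would be the returned ExecutionLink handed to the Combo Procedure Interpreter, but the bookkeeping problem, never re-firing on an old sentence, is the same.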

