On Wed, Mar 15, 2017 at 1:54 PM, Linas Vepstas wrote:
> What we cannot do is build little pieces that are disconnected from the
> chatbot: it's great that we now have some modal-logic-with-correct-formulas
> code, but it's disconnected from the "reality" of a working, demo-able
> chatbot.
Yah, th
Well, not to pointlessly prolong the discussion, but ...
Theory-of-mind is not that different from theory-of-world. Although I
believed that it was Pumpkin that jumped out the car window, I now have to
revise my beliefs based on new evidence.
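That sort of revision can be sketched numerically in the PLN style: merge the
old and new (strength, confidence) estimates by converting each confidence
into an evidence count and taking a count-weighted average of the strengths.
To be clear, the conversion and weighting below are illustrative assumptions
for a toy sketch, not the actual formulas from Sumit's code.

```python
# Toy sketch of revising a (strength, confidence) truth value when new
# evidence arrives. The confidence <-> count conversion c = n / (n + K)
# and the count-weighted average are assumptions for illustration only.

K = 1.0  # "personality parameter": evidence count giving confidence 0.5

def confidence_to_count(c):
    return K * c / (1.0 - c)

def count_to_confidence(n):
    return n / (n + K)

def revise(tv1, tv2):
    """Merge two (strength, confidence) estimates of the same statement."""
    (s1, c1), (s2, c2) = tv1, tv2
    n1, n2 = confidence_to_count(c1), confidence_to_count(c2)
    n = n1 + n2
    s = (n1 * s1 + n2 * s2) / n  # count-weighted average of strengths
    return (s, count_to_confidence(n))

# Prior: "Pumpkin jumped out the car window", strength 0.9, confidence 0.5.
# New, more confident evidence points the other way: strength 0.1, confidence 0.8.
prior = (0.9, 0.5)
evidence = (0.1, 0.8)
print(revise(prior, evidence))  # stronger new evidence pulls the strength down
```

The point is only that belief revision here is the same arithmetic whether the
statement is about the world or about another agent's mind.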
So, I start with a statement about objective reality:
> Well, but the very first example on the wiki page is "I tell you that small
> dogs can fly" which is not the same as "I believe that small dogs can
> fly"...
>
> This promptly goes down a rabbit-hole of a theory of mind: "I believe that
> Ben thinks that small dogs can fly" or more likely: "I b
On Wed, Mar 15, 2017 at 11:47 AM, Ben Goertzel wrote:
I guess the argument for having a link type "BeliefLink" is that Sumit
created specific quantitative truth value formulas to deal with
BeliefLink. Of course these formulas could also be used together
with "PredicateNode 'belief' " as well, but so far typically when we
have specific math formu
I updated the wiki page to mention it, and also to mention that a BeliefLink
is the same as EvaluationLink Predicate "belief". Not only would a port
into the current PLN infrastructure be useful, but that should be followed
by a tutorial/example, and there should also be a hookup into the chatbot.
T
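For concreteness, that equivalence would look something like this (a sketch
only; the believed statement and the atom names are illustrative, not taken
from the wiki page):

    BeliefLink
        ConceptNode "Ben"
        EvaluationLink
            PredicateNode "fly"
            ConceptNode "small-dogs"

would carry the same meaning as:

    EvaluationLink
        PredicateNode "belief"
        ListLink
            ConceptNode "Ben"
            EvaluationLink
                PredicateNode "fly"
                ConceptNode "small-dogs"

The first form gives the type system (and Sumit's formulas) something to key
off of; the second keeps the link-type inventory small.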
Hmm, OK, it's been a while since that work was done and almost as long
since I looked at it.
The crux of Sumit's work was to modify the PLN truth value formulas to
work sensibly for these modal-logic operators (belief, etc.). That
part was solid and I remember it. But the choice of link types he
I just skimmed that code, and it does not seem to make use of the KR
structures described in http://wiki.opencog.org/w/Claims_and_context and
instead invents new link types, e.g. BeliefLink. This leads to a
proliferation: you'd need BeliefLink, SayLink,
TellLink, UseToBelieveInThePastLink, LieLi
Code and theory for extending PLN to handle modal reasoning regarding
beliefs, knowledge and so forth is here:
https://github.com/sumitsourabh/opencog/tree/patch-1/opencog/reasoning/pln/rules/epistemic-reasoning
This was carefully worked out by Sumit Sourabh and Matt Iklé a few years ago.
The c
Here:
this page tells you how to represent the internal state of other
speakers (this includes beliefs, demands, etc.)
http://wiki.opencog.org/w/Claims_and_contexts
---linas
On Wed, Mar 15, 2017 at 11:07 AM, Linas Vepstas wrote:
> On Sat, Mar 11, 2017 at 10:25 PM, Alex wrote:
On Sat, Mar 11, 2017 at 10:25 PM, Alex wrote:
> Hi!
>
> There can be modalities (which are usually expressed as diamonds or boxes
> (operators) in modal logic):
> DUTY_TO_PERFORM_ACTION(agent, action, time horizon) - agent should
> perform action within time horizon
> BELIEF(agent, statement, ti
Hi!
There can be modalities (which are usually expressed as diamonds or boxes
(operators) in modal logic):
DUTY_TO_PERFORM_ACTION(agent, action, time horizon) - agent should perform
action within time horizon
BELIEF(agent, statement, time instant) - agent believes in statement at the
time inst
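A time-stamped modality like BELIEF(agent, statement, time instant) could be
sketched in Atomese along these lines (one plausible encoding, not a settled
convention; the statement and names are illustrative, and an AtTimeLink
wrapping would be an alternative way to attach the timestamp):

    EvaluationLink
        PredicateNode "belief"
        ListLink
            ConceptNode "Alex"
            EvaluationLink
                PredicateNode "fly"
                ConceptNode "small-dogs"
            TimeNode "2017-03-11"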
What Ben said. And some other remarks:
On Sun, Mar 12, 2017 at 4:31 AM, wrote:
> Hi!
>
> There can be modalities (which are usually expressed as diamonds or boxes
> (operators) in modal logic):
> DUTY_TO_PERFORM_ACTION(agent, action, time horizon) - agent should
> perform action within time hori
When to write
BlahBlahLink
A
B
or
EvaluationLink
PredicateNode "BlahBlah"
ListLink
A
B
is something that we don't have a clear policy for at this point.
The OpenCog architecture lets you do either one, and they basically
have the same semantics...
Hi!
There can be modalities (which are usually expressed as diamonds or boxes
(operators) in modal logic):
DUTY_TO_PERFORM_ACTION(agent, action, time horizon) - agent should perform
action within time horizon
BELIEF(agent, statement, time instant) - agent believes in statement at the
time inst