On Tue, Sep 23, 2008 at 7:26 PM, Abram Demski <[EMAIL PROTECTED]> wrote:
> Wow! I did not mean to stir up such an argument between you two!!

Abram: This argument has been going on for about 10 years, with some
"on" periods and "off" periods, so don't feel responsible for it ---
you just raised the right topic at the right time to turn it "on"
again. ;-)

> Pei,
>
> What if instead of using "node probability", the knowledge that "wrote
> an AGI book" is rare was inserted as a low frequency (high confidence)
> truth value on "human" => "wrote an AGI book"? Could NARS use that to
> do what Ben wants? More specifically, could it do so with only the
> knowledge:
>
> Ben is agi-author <high, high>
> guy is agi-author <high, high>
> Ben is human <high, high>
> guy is human <high, high>
> human is agi-author <low, high>
>
> If this was literally all NARS knew, what difference would
> adding/removing the last item make to the system's opinion of "guy is
> Ben"?

Not much, since "Ben" and "guy" play symmetric roles in the knowledge.

> To answer your earlier question, I am still ignoring confidence. It
> could always be calculated from the frequencies, of course.

Not really. The frequency and confidence of a judgment are independent
(in my sense) of each other. If you mean "from the frequencies of the
premises", then that is true, but you still need to provide confidence
values for the premises, too.
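To make the independence concrete, here is a minimal Python sketch using the standard NARS evidence-based definitions (frequency f = w+/w, confidence c = w/(w+k), with k the "evidential horizon" personality parameter). Two judgments can share the same frequency while differing sharply in confidence:

```python
def truth_value(w_plus, w, k=1.0):
    """NARS-style truth value from evidence counts.

    w_plus: amount of positive evidence
    w:      total amount of evidence (w >= w_plus > 0)
    k:      evidential horizon (a "personality parameter")
    """
    frequency = w_plus / w      # proportion of positive evidence
    confidence = w / (w + k)    # stability of that proportion
    return frequency, confidence

# Same frequency, very different confidence: the two are independent.
f1, c1 = truth_value(1, 2)      # 1 positive out of 2   -> f = 0.5, low c
f2, c2 = truth_value(50, 100)   # 50 positive out of 100 -> f = 0.5, high c
assert f1 == f2 == 0.5
assert c2 > c1                  # more evidence -> higher confidence
```

So knowing a judgment's frequency tells you nothing about its confidence; the latter depends on the total amount of evidence, not the proportion that is positive.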

> But, that
> does not justify using them in the calculations the way you do.
> Perhaps once I figure out the exact formulas for everything, I will
> see if they match up to a particular value of the parameter k. Or,
> perhaps, a value of k that moves according to the specific situation.
> Hmm... actually... that could be used as a fudge factor to get
> everything to "match up"... :)

Probably not. The choice of k doesn't change the overall nature of the
uncertainty calculus.
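A small (illustrative, not definitive) check of this point: since c = w/(w+k) is monotonically increasing in w for any k > 0, changing k rescales every confidence value but never reorders judgments, so the overall shape of the calculus is unchanged:

```python
def confidence(w, k):
    # NARS confidence: total evidence w discounted by evidential horizon k
    return w / (w + k)

evidence_amounts = [1, 3, 10, 100]
for k in (0.5, 1.0, 10.0):
    cs = [confidence(w, k) for w in evidence_amounts]
    # For any k > 0, more evidence always yields higher confidence,
    # so the ranking of judgments is unaffected by the choice of k.
    assert cs == sorted(cs)
```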

> Also, attached is my latest revision. I have found that NARS deduction
> does not quite fit with my definitions. Induction and abduction are OK
> so far. If in the end I merely have something "close" to NARS, I will
> consider this a success-- it is an interpretation that fits well
> enough to show where NARS essentially differs from probability theory.

I'll find time to read it carefully.

Pei


-------------------------------------------
agi
Archives: https://www.listbox.com/member/archive/303/=now