Pei,

Thanks for reading. Some comments attached. I don't know when I will
have time to work on the next version. My estimate is this weekend.

-Abram

On Wed, Sep 24, 2008 at 5:32 PM, Pei Wang <[EMAIL PROTECTED]> wrote:
> Abram,
>
> I've added some comments to your writing after a first reading. It
> seems I need to read it again.
>
> Pei
>
> On Tue, Sep 23, 2008 at 7:26 PM, Abram Demski <[EMAIL PROTECTED]> wrote:
>> Wow! I did not mean to stir up such an argument between you two!!
>>
>> Pei,
>>
>> What if, instead of using "node probability", the knowledge that "wrote
>> an AGI book" is rare were inserted as a low-frequency (high-confidence)
>> truth value on "human" => "wrote an AGI book"? Could NARS use that to
>> do what Ben wants? More specifically, could it do so with only the
>> knowledge:
>>
>> Ben is agi-author <high, high>
>> guy is agi-author <high, high>
>> Ben is human <high, high>
>> guy is human <high, high>
>> human is agi-author <low, high>
>>
>> If this was literally all NARS knew, what difference would
>> adding/removing the last item make to the system's opinion of "guy is
>> Ben"?
>>
>> To answer your earlier question, I am still ignoring confidence. It
>> could always be calculated from the frequencies, of course. But that
>> does not justify using them in the calculations the way you do.
>> Perhaps once I figure out the exact formulas for everything, I will
>> see if they match up to a particular value of the parameter k. Or,
>> perhaps, a value of k that moves according to the specific situation.
>> Hmm... actually... that could be used as a fudge factor to get
>> everything to "match up"... :)
>>
>> Also, attached is my latest revision. I have found that NARS deduction
>> does not quite fit with my definitions. Induction and abduction are OK
>> so far. If in the end I merely have something "close" to NARS, I will
>> consider this a success-- it is an interpretation that fits well
>> enough to show where NARS essentially differs from probability theory.
>>
>> On Tue, Sep 23, 2008 at 5:54 PM, Pei Wang <[EMAIL PROTECTED]> wrote:
>>> Yes, I know them, though I don't like any of the ones I've seen. I
>>> wonder whether Abram can find something better.
>>>
>>> To tell you the truth, my whole idea of confidence actually came from
>>> a probabilistic formula, after my re-interpretation of it.
>>>
>>> Pei
>>>
>>> On Tue, Sep 23, 2008 at 4:35 PM, Ben Goertzel <[EMAIL PROTECTED]> wrote:
>>>>
>>>> Note that formally, the
>>>>
>>>> c = n/(n+k)
>>>>
>>>> equation also exists in the math of the beta distribution, which is used
>>>> in Walley's imprecise probability theory and also in PLN's indefinite
>>>> probabilities...
>>>>
>>>> So there seems some hope of making such a correspondence, based on
>>>> algebraic evidence...
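[The algebraic correspondence Ben points at can be made concrete. A sketch, on my reading: NARS defines confidence c = n/(n+k) from the evidence count n, so its "ignorance" is 1 - c = k/(n+k); in Walley's imprecise-beta model with prior weight s, the width of the probability interval after n observations is s/(n+s). Setting s = k makes the two coincide exactly.]

```python
# NARS confidence from evidence count n and personality parameter k
def nars_confidence(n, k=1.0):
    return n / (n + k)

# Width of Walley's imprecise-beta probability interval after n
# observations with prior weight s:
#   lower = w_plus / (n + s),  upper = (w_plus + s) / (n + s)
#   width = upper - lower = s / (n + s), independent of w_plus
def walley_interval_width(n, s=1.0):
    return s / (n + s)

# with s = k, NARS ignorance (1 - c) equals the interval width
for n in [0, 1, 10, 100]:
    assert abs((1 - nars_confidence(n, 2.0)) - walley_interval_width(n, 2.0)) < 1e-12
```

Whether this identity extends to an interpretation of the full NARS truth-value calculus is exactly the open question in this thread; the algebra above only covers the c = n/(n+k) definition itself.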
>>>>
>>>> ben
>>>>
>>>> On Tue, Sep 23, 2008 at 4:29 PM, Pei Wang <[EMAIL PROTECTED]> wrote:
>>>>>
>>>>> Abram,
>>>>>
>>>>> Can your approach give the confidence measurement a probabilistic
>>>>> interpretation? That is what really distinguishes NARS from the other
>>>>> approaches.
>>>>>
>>>>> Pei
>>>>>
>>>>> On Mon, Sep 22, 2008 at 11:22 PM, Abram Demski <[EMAIL PROTECTED]>
>>>>> wrote:
>>>>> >>> This example also shows why NARS and PLN are similar on deduction, but
>>>>> >>> very different in abduction and induction.
>>>>> >>
>>>>> >> Yes.  One of my biggest practical complaints with NARS is that the
>>>>> >> induction
>>>>> >> and abduction truth value formulas don't make that much sense to me.
>>>>> >
>>>>> > Interesting in the context of these statements that my current
>>>>> > "justification" for NARS probabilistically justifies induction and
>>>>> > abduction but isn't as clear concerning deduction. (I'm working on
>>>>> > it...)
>>>>> >
>>>>> > --Abram Demski
>>>>> >
>>>>> >
>>>>> > -------------------------------------------
>>>>> > agi
>>>>> > Archives: https://www.listbox.com/member/archive/303/=now
>>>>> > RSS Feed: https://www.listbox.com/member/archive/rss/303/
>>>>> > Modify Your Subscription: https://www.listbox.com/member/?&;
>>>>> > Powered by Listbox: http://www.listbox.com
>>>>> >
>>>>>
>>>>>
>>>>
>>>>
>>>>
>>>> --
>>>> Ben Goertzel, PhD
>>>> CEO, Novamente LLC and Biomind LLC
>>>> Director of Research, SIAI
>>>> [EMAIL PROTECTED]
>>>>
>>>> "Nothing will ever be attempted if all possible objections must be first
>>>> overcome " - Dr Samuel Johnson
>>>>
>>>>
>>>
>>>
>>
>>
>>
>
>
>

