Ben,

Thanks. Now the others can see that I'm not attacking a straw man.

My solution to Hume's problem, as embedded in the experience-grounded
semantics, is to assume no predictability, but to justify induction as
adaptation. However, that is a separate topic, which I've explained in my
other publications.

Here I just want to point out that the original and basic meaning of
Occam's Razor and those two common (mis)usages of it are not
necessarily the same. I fully agree with the former, but not the
latter, and I haven't seen any convincing justification of the latter.
Instead, they are often taken as granted, under the name of Occam's
Razor.

Pei

On Tue, Oct 28, 2008 at 12:37 PM, Ben Goertzel <[EMAIL PROTECTED]> wrote:
>
> Hi Pei,
>
> This is an interesting perspective; I just want to clarify for others on the
> list that it is a particular and controversial perspective, and contradicts
> the perspectives of many other well-informed research professionals and deep
> thinkers on relevant topics.
>
> Many serious thinkers in the area *do* consider Occam's Razor a standalone
> postulate.  This fits in naturally with the Bayesian perspective, in which
> one needs to assume *some* prior distribution, so one often assumes some
> sort of Occam prior (e.g. the Solomonoff-Levin prior, the speed prior, etc.)
> as a standalone postulate.
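The Occam-prior idea mentioned above can be sketched very roughly in code. This is only an illustration under a simplifying assumption: description length (in symbols) stands in for Kolmogorov complexity, since the true Solomonoff-Levin prior is uncomputable, and the hypothesis strings here are made up:

```python
def occam_prior(hypotheses):
    """Weight each hypothesis by 2^(-description length), then normalize.

    A crude, computable stand-in for the Solomonoff-Levin prior, which
    weights each hypothesis by 2^(-K), where K is its (uncomputable)
    Kolmogorov complexity.
    """
    weights = {h: 2.0 ** -len(h) for h in hypotheses}
    total = sum(weights.values())
    return {h: w / total for h, w in weights.items()}

# Two made-up hypotheses encoded as symbol strings: before any evidence
# is seen, the shorter one receives exponentially more prior mass.
prior = occam_prior(["0101", "010101010101"])
```

Under such a prior, simplicity is rewarded exponentially: the 4-symbol hypothesis gets 2^8 = 256 times the mass of the 12-symbol one, before any evidence is consulted.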
>
> Hume pointed out that induction (in the old sense of extrapolating from the
> past into the future) is not solvable except by introducing some kind of a
> priori assumption.  Occam's Razor, in one form or another, is a suitable a
> priori assumption to plug into this role.
>
> If you want to replace the Occam's Razor assumption with the assumption that
> "the world is predictable by systems with limited resources, and we will
> prefer explanations that consume fewer resources", that seems unproblematic
> as it's basically equivalent to assuming an Occam prior.
>
> On the other hand, I just want to point out that to get around Hume's
> complaint you do need to make *some* kind of assumption about the regularity
> of the world.  What kind of assumption of this nature underlies your work on
> NARS (if any)?
>
> ben
>
> On Tue, Oct 28, 2008 at 8:58 AM, Pei Wang <[EMAIL PROTECTED]> wrote:
>>
>> Triggered by several recent discussions, I'd like to make the
>> following position statement, though won't commit myself to long
>> debate on it. ;-)
>>
>> Occam's Razor, in its original form, goes like "entities must not be
>> multiplied beyond necessity", and it is often stated as "All other
>> things being equal, the simplest solution is the best" or "when
>> multiple competing theories are equal in other respects, the principle
>> recommends selecting the theory that introduces the fewest assumptions
>> and postulates the fewest entities" --- all from
>> http://en.wikipedia.org/wiki/Occam's_razor
>>
>> I fully agree with all of the above statements.
>>
>> However, to me, there are two common misunderstandings associated with
>> it in the context of AGI and philosophy of science.
>>
>> (1) To take this statement as self-evident or a stand-alone postulate
>>
>> To me, it is derived from, or implied by, the insufficiency of resources. If
>> a system has sufficient resources, it has no good reason to prefer a
>> simpler theory.
>>
>> (2) To take it to mean "The simplest answer is usually the correct
>> answer."
>>
>> This is a very different statement, which cannot be justified either
>> analytically or empirically.  When theory A is an approximation of
>> theory B, usually the former is simpler than the latter, but less
>> "correct" or "accurate", in terms of its relation with all available
>> evidence. When we are short on resources and have a low demand for
>> accuracy, we often prefer A over B, but it does not mean that by doing
>> so we judge A as more correct than B.
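The approximation point can be made concrete with a toy example (an illustration of my own, not from the thread): let theory B be the exact rule y = x^2 over the observed data, and theory A the simpler one-parameter claim that y is constant. A is simpler but fits the evidence less accurately:

```python
# Toy illustration: theory B is the exact rule y = x**2 over the
# observed points; theory A is the simpler one-parameter claim that y
# is a constant. Simplicity and accuracy come apart: A has fewer
# parameters, but larger error on the available evidence.

data = [(x, x * x) for x in range(5)]             # evidence: y = x^2

def sq_error(predict, data):
    """Sum of squared prediction errors over the evidence."""
    return sum((predict(x) - y) ** 2 for x, y in data)

def theory_b(x):                                  # exact, more complex
    return x * x

mean_y = sum(y for _, y in data) / len(data)      # best constant fit

def theory_a(x):                                  # simpler, approximate
    return mean_y

err_b = sq_error(theory_b, data)                  # 0: fully accurate
err_a = sq_error(theory_a, data)                  # positive: less accurate
```

Preferring A here would be a judgment about resource cost, not about correctness: on the available evidence, A is strictly less accurate than B.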
>>
>> In summary, in choosing among alternative theories or conclusions, the
>> preference for simplicity comes from a shortage of resources, though
>> simplicity and correctness are logically independent of each other.
>>
>> Pei
>>
>>
>> -------------------------------------------
>> agi
>> Archives: https://www.listbox.com/member/archive/303/=now
>> RSS Feed: https://www.listbox.com/member/archive/rss/303/
>> Modify Your Subscription: https://www.listbox.com/member/?&;
>> Powered by Listbox: http://www.listbox.com
>
>
>
> --
> Ben Goertzel, PhD
> CEO, Novamente LLC and Biomind LLC
> Director of Research, SIAI
> [EMAIL PROTECTED]
>
> "A human being should be able to change a diaper, plan an invasion, butcher
> a hog, conn a ship, design a building, write a sonnet, balance accounts,
> build a wall, set a bone, comfort the dying, take orders, give orders,
> cooperate, act alone, solve equations, analyze a new problem, pitch manure,
> program a computer, cook a tasty meal, fight efficiently, die gallantly.
> Specialization is for insects."  -- Robert Heinlein
>
>

