The question that worries me is: what does it matter if AIXI *is*
optimal, given that it uses infinitely many resources?

And what does it matter if AIXItl is near-optimal, given that it uses
an infeasible amount of resources?

These are nice theoretical systems addressing nice math problems ... but why
do you think these math problems relate closely to the REAL problem of
creating AGI given feasible computational resources?

That is the missing link.

Schmidhuber (with OOPS) and Legg (with his work on reinforcement learning)
have tried to build links between the abstract AIXI-type theory and pragmatic
AGI, but without real success so far, I feel...
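For concreteness, here is a rough sketch of the AIXI decision rule from
Hutter's book (my own paraphrase of the standard formulation, written as
LaTeX; k is the current cycle and m the horizon):

    a_k := \arg\max_{a_k} \sum_{o_k r_k} \cdots \max_{a_m} \sum_{o_m r_m}
           (r_k + \cdots + r_m)
           \sum_{q \,:\, U(q, a_1 \ldots a_m) = o_1 r_1 \ldots o_m r_m} 2^{-\ell(q)}

The inner sum ranges over *all* programs q for a universal Turing machine U
that reproduce the interaction history, each weighted by the Occam prior
2^{-\ell(q)}; that sum over all programs is what makes AIXI incomputable.
AIXItl replaces it with a search over policies of length at most l, each run
for at most t steps per cycle, which is computable but needs on the order of
t * 2^l time per action, hence "infeasibly huge" rather than infinite.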

-- Ben G

On Fri, Oct 31, 2008 at 10:07 AM, Matt Mahoney <[EMAIL PROTECTED]> wrote:

> Right, but I am not talking about AIXI^tl. I agree AIXI^tl is not a
> practical approach to AGI because it has exponential time complexity. The
> important results are the non-computability of AIXI and its proof of Occam's
> Razor as a general principle (if physics is Turing computable).
>
> -- Matt Mahoney, [EMAIL PROTECTED]
>
> --- On Fri, 10/31/08, Ben Goertzel <[EMAIL PROTECTED]> wrote:
>
> From: Ben Goertzel <[EMAIL PROTECTED]>
> Subject: Re: [agi] "the universe is computable" ..PS
> To: agi@v2.listbox.com
> Date: Friday, October 31, 2008, 9:58 AM
>
>
> I was referring to AIXItl, which is also covered in Hutter's papers/book
> and operates with merely infeasibly huge, rather than infinite, resources...
>
> Hutter's theorems say *nothing* about the optimal way to achieve
> intelligence (according to his definition, and under the assumption of a
> computable universe) given feasibly limited computational resources
>
> ben g
>
> On Fri, Oct 31, 2008 at 9:49 AM, Matt Mahoney <[EMAIL PROTECTED]> wrote:
>
>> --- On Fri, 10/31/08, Ben Goertzel <[EMAIL PROTECTED]> wrote:
>>
>> Hutter's proof that Occam's Razor (in a certain form) is key to
>> intelligence depends on
>>
>> 1) a specific definition of what "intelligence" is
>>
>> 2) a restriction to intelligent systems with a huge amount of
>> computational resources
>>
>> as well as
>>
>> 3) an assumption that the universe is in-principle computable
>>
>> To me, personally, 2 is the biggest worry.  I'm willing to accept 3 as an
>> interesting working hypothesis, and 1 as a guide for ongoing work, but it
>> seems likely to me that for intelligent systems with feasibly modest
>> computational resources, other fundamental principles are required along
>> with Occam-like ones.
>>
>> ---
>>
>> No, Hutter does not say (2). AIXI says that optimal intelligence is not
>> computable at all.
>>
>> -- Matt Mahoney, [EMAIL PROTECTED]
>>



-- 
Ben Goertzel, PhD
CEO, Novamente LLC and Biomind LLC
Director of Research, SIAI
[EMAIL PROTECTED]

"A human being should be able to change a diaper, plan an invasion, butcher
a hog, conn a ship, design a building, write a sonnet, balance accounts,
build a wall, set a bone, comfort the dying, take orders, give orders,
cooperate, act alone, solve equations, analyze a new problem, pitch manure,
program a computer, cook a tasty meal, fight efficiently, die gallantly.
Specialization is for insects."  -- Robert Heinlein


