AIXI shows a couple of interesting things...

-- truly general AI, even assuming the universe is computable, is impossible
for any finite system

-- given any finite level L of general intelligence that one desires,
there exist some finite R and M such that a computer with less than R
processing speed and less than M memory capacity can achieve level L
of general intelligence (a rough sketch of the AIXI formulation behind
both claims follows below)
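
(For reference, here is a rough sketch of the AIXI formulation these
claims come from -- written loosely from memory, so see Hutter's book
for the exact statement.  At each step k the agent picks the action
maximizing expected total reward out to horizon m, summed over all
environment programs q consistent with the interaction history so far,
with each program weighted by 2^{-\ell(q)}:

  a_k = \arg\max_{a_k} \sum_{o_k r_k} \cdots \max_{a_m} \sum_{o_m r_m}
        (r_k + \cdots + r_m)
        \sum_{q : U(q, a_1 \ldots a_m) = o_1 r_1 \ldots o_m r_m} 2^{-\ell(q)}

The sum over all programs q on a universal machine U is what makes the
ideal agent incomputable, hence the first point; resource-bounded
variants of the same scheme are what lie behind the second point.)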

This doesn't tell you *anything* about how to make AGI in practice.  It does
tell you that, in principle, creating AGI is a matter of *computational
efficiency* ... assuming the universe is computable.

The computability of the universe is something that can't really be proved,
but I argue that it's an implicit assumption underlying the whole scientific
method.  If the universe can't be usefully modelable as computable then the
whole methodology of gathering finite datasets of finite-precision data is
fundamentally limited in what it can tell us about the universe ... which
would really suck...
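
(To make the Occam connection below explicit: the standard Solomonoff
prior -- again sketched loosely from memory -- assigns to a data
sequence x the probability

  M(x) = \sum_{p : U(p) = x*} 2^{-\ell(p)}

i.e. the sum over all programs p whose output on a universal machine U
begins with x, each weighted by 2^{-length}.  Shorter programs dominate,
which is Occam's Razor in quantitative form -- but the whole construction
presupposes that the data source is computable.)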

-- Ben G

On Sat, Oct 25, 2008 at 7:21 PM, Matt Mahoney <[EMAIL PROTECTED]> wrote:

> --- On Sat, 10/25/08, Mark Waser <[EMAIL PROTECTED]> wrote:
>
> > Ummm.  It seems like you were/are saying then that because
> > AIXI makes an
> > assumption limiting its own applicability/proof (that
> > it requires that the
> > environment be computable) and because AIXI can make some
> > valid conclusions,
> > that that "suggests" that AIXI's limiting
> > assumptions are true of the
> > universe.  That simply doesn't work, dude, unless you
> > have a very loose
> > inductive-type definition of "suggests" that is
> > more suited for inference
> > control than anything like a logical proof.
>
> I am arguing by induction, not deduction:
>
> If the universe is computable, then Occam's Razor holds.
> Occam's Razor holds.
> Therefore the universe is computable.
>
> Of course, I have proved no such thing.
>
> -- Matt Mahoney, [EMAIL PROTECTED]
>
>
>



-- 
Ben Goertzel, PhD
CEO, Novamente LLC and Biomind LLC
Director of Research, SIAI
[EMAIL PROTECTED]

"A human being should be able to change a diaper, plan an invasion, butcher
a hog, conn a ship, design a building, write a sonnet, balance accounts,
build a wall, set a bone, comfort the dying, take orders, give orders,
cooperate, act alone, solve equations, analyze a new problem, pitch manure,
program a computer, cook a tasty meal, fight efficiently, die gallantly.
Specialization is for insects."  -- Robert Heinlein


