Is an AGI necessarily "super human", or is the equivalent of a "smart
human" good enough?

I rarely find humans who "adapt by themselves to unfamiliar
situations".  Take a random person, place them in some remote location
without adequate training, and how long would they last?  A week,
maybe.  I wouldn't call that adapting very well to an unfamiliar
situation, would you?

Most humans (99.999%, IMHO) never create anything absolutely new in
terms of human knowledge, so why would this be a criterion for an AGI?

I agree that an AGI must be able to learn.  I agree that an AGI must be
able to reason and solve problems without just resorting to a stored
lookup table.  BUT I don't agree that an AGI has to create itself, when
it is obvious that humans can't either.  Even though Ben believes in
"emergent intelligence", he has always said that training by humans, at
least to some level, is absolutely necessary.
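
To make the lookup-table distinction concrete, here is a toy Python
sketch (my own illustration, not anything from this thread; the names
sum_table, lookup_sum and computed_sum are invented for the example).
The table answers only the questions it has memorized, while the
general procedure handles inputs it has never seen:

    # Toy contrast (illustrative only): stored lookup vs. computation.

    # Lookup-table "solver": answers only the cases it has memorized.
    sum_table = {(1, 2): 3, (2, 2): 4}

    def lookup_sum(a, b):
        # Returns None for any pair that was never stored.
        return sum_table.get((a, b))

    # General procedure: computes an answer for arbitrary inputs.
    def computed_sum(a, b):
        return a + b

    print(lookup_sum(1, 2), lookup_sum(40, 2))      # 3 None
    print(computed_sum(1, 2), computed_sum(40, 2))  # 3 42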

Even though I agree that "generalizing" is a very desirable quality for
an AGI, is this property necessary for creating an AGI?  Most people
don't generalize all that well, in my opinion.

David Clark

> -----Original Message-----
> From: William Pearson [mailto:[EMAIL PROTECTED]]
> Sent: March-03-08 8:21 PM
> To: agi@v2.listbox.com
> Subject: Re: [agi] Thought experiment on informationally limited
> systems
> 
> On 04/03/2008, Mike Tintner <[EMAIL PROTECTED]> wrote:
> > David: > I was specifically referring to your comment ending in
> > "BY ITSELF".
> >
> >  >> Jeez, Will, the point of Artificial General Intelligence is
> >  >> that it can start adapting to an unfamiliar situation and
> >  >> domain BY ITSELF.
> >  >
> >  > I believe this statement is just plain incorrect.
> >
> > David,
> >
> >  I find that extraordinary, but I accept your sincerity. The
> >  definition I gave is an essential part of an AGI - if it can't
> >  adapt by itself sometimes to unfamiliar situations (as humans
> >  can), and can only act on others' instructions, then it's a
> >  narrow AI. I wonder whether anyone else shares your view.
> 
> There are a number of threads here that need disentangling. All these
> answers are my opinion only.
> 
> Is a system that can adapt by itself to unfamiliar situations
> necessary for AGI? I would answer yes.
> 
> Is it the only thing an AGI needs to be able to do? No. If I had a
> system that could build houses out of bricks, stones and straw, it
> would not be an AGI if it could not be taught, or learn,
> cryptography. General, for me, means the ability to learn many
> different skills, including working on its own.
> 
> Is generalising a skill logically the first thing that you need to
> make an AGI? Nope; the means and a sufficient architecture to
> acquire skills and competencies are more useful early on in AGI
> development. I see generalising, in the way you talk about it, as a
> skill that can be acquired and improved upon. We certainly can
> change our ability to do so throughout our lifetimes. If a skill
> can be changed and/or improved, then something in the system must
> change: either data, a program, or something else. It is the manner
> and nature of these changes, very low-level subconscious stuff
> (google neuroplasticity for what I am talking about in humans),
> that I think needs to be worked on first. Otherwise you are going
> to create a static and crystalline system.
> 
>  Will Pearson
