--- John Scanlon <[EMAIL PROTECTED]> wrote:

> Alright, I have to say this.
> 
> I don't believe that the singularity is near, or that it will even occur.  I
> am working very hard at developing real artificial general intelligence, but
> from what I know, it will not come quickly.  It will be slow and
> incremental.  The idea that very soon we can create a system that can
> understand its own code and start programming itself is ludicrous.
> 
> Any arguments?

Not very soon, maybe 10 or 20 years.  General programming skills will first
require an adult-level language model and intelligence, something that could
pass the Turing test.

Currently we can write program-writing programs only in very restricted
environments with simple, well defined goals (e.g. genetic algorithms).  This
is not sufficient for recursive self improvement.  The AGI will first need to
be at the intellectual level of the humans who built it.  This means
sufficient skills to do research, and to write programs from ambiguous natural
language specifications and have enough world knowledge to figure out what
the customer really wanted.
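
To illustrate what "restricted environments with simple, well defined goals" means in practice, here is a toy sketch (my own example, not anything from the original post) of a genetic algorithm: it can "write" a string only because the fitness function spells out the goal exactly, which is precisely the property real software requirements lack.

```python
import random

random.seed(0)

TARGET = "print"  # the well-defined goal: evolve exactly this string
GENES = "abcdefghijklmnopqrstuvwxyz"

def fitness(candidate):
    # Count positions matching the target -- a simple, fully specified objective.
    return sum(c == t for c, t in zip(candidate, TARGET))

def mutate(candidate):
    # Replace one random position with a random gene.
    i = random.randrange(len(candidate))
    return candidate[:i] + random.choice(GENES) + candidate[i + 1:]

def evolve(pop_size=50, generations=200):
    # Start from random strings, then select and mutate.
    population = ["".join(random.choice(GENES) for _ in TARGET)
                  for _ in range(pop_size)]
    for gen in range(generations):
        population.sort(key=fitness, reverse=True)
        if fitness(population[0]) == len(TARGET):
            return population[0], gen
        # Keep the fitter half, refill with mutated copies of survivors.
        survivors = population[:pop_size // 2]
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in range(pop_size - len(survivors))]
    return population[0], generations

best, gen = evolve()
print(best, gen)
```

The search only works because `fitness` encodes the complete answer.  Replace the target string with an ambiguous natural-language requirement and there is nothing for selection to optimize against, which is the gap the post is pointing at.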


-- Matt Mahoney, [EMAIL PROTECTED]

-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=303