On 10/22/07, albert medina <[EMAIL PROTECTED]> wrote:
> My question is:  AGI, as I perceive your explanation of it, is when a
> computer gains/develops an ego and begins to consciously plot its own
> existence and make its own decisions.

That would be one form of AGI, but it should also be possible to
create systems of human-equivalent and human-surpassing intelligence
that

(1) don't have instincts, goals or values of their own, and

(2) may not even be conscious, even though they carry out superhuman
cognitive processing (whether this is possible is not yet known, at
least to me).

> Do you really believe that such a thing can happen?

Yes, but I think most of us would prefer that potentially superhuman
systems not have goals or values of their own.

> If so, is this the phenomenon you are calling "singularity"?

These days people are referring to several different things when they
use this word. For an explanation of this, see:

http://www.singinst.org/blog/2007/09/30/three-major-singularity-schools/

-- 
Aleksei Riikonen - http://www.iki.fi/aleksei
