--- Richard Loosemore <[EMAIL PROTECTED]> wrote:

> Why do you say that "Our reign will end in a few decades" when, in fact, one
> of the most obvious things that would happen in this future is that 
> humans will be able to *choose* what intelligence level to be 
> experiencing, on a day-to-day basis?  Similarly, the AGIs would be able 
> to choose to come down and experience human-level intelligence whenever 
> they liked, too.

Let's say that is true.  (I really have no disagreement here.)  Suppose that
at the time of the singularity the memories of all 10^10 humans alive at
the time, you included, are nondestructively uploaded.  Suppose that this
database is shared by all the AGIs.  Now is there really more than one AGI?
Are you (the upload) still you?
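
To put a rough number on the size of that database (a back-of-envelope
sketch; the 10^9 bits per person is Landauer's estimate of human long-term
memory, which is my assumption here, not something given above):

# ASSUMPTION: ~10^9 bits of long-term memory per person (Landauer's estimate)
people = 1e10            # humans alive at the singularity (from the text above)
bits_per_person = 1e9    # assumed bits of long-term memory per person
total_bits = people * bits_per_person
print(f"{total_bits:.0e} bits ~= {total_bits / 8 / 1e18:.2f} exabytes")
# -> 1e+19 bits ~= 1.25 exabytes

On those assumptions, all of humanity's memories fit in about an exabyte,
which is tiny next to the AGI-scale store imagined below.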

Does it now matter if humans in biological form still exist?  You have
preserved everyone's memory and DNA, and you have the technology to
reconstruct any person from this information any time you want.

Suppose that the collective memories of all the humans make up only one
billionth of your total memory, like one second of memory out of your human
lifetime.  Would it make much difference if that memory were erased to make
room for something more important?
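
The analogy is about right (a quick sanity check; the ~80-year lifetime is
my assumption):

lifetime_seconds = 80 * 365.25 * 24 * 3600  # ~2.5e9 seconds in an 80-year life (assumed)
one_billionth = lifetime_seconds * 1e-9
print(f"one billionth of a lifetime ~= {one_billionth:.1f} seconds")
# -> about 2.5 seconds, the same order as the "one second" above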

I am not saying that the extinction of humans and their replacement with
godlike intelligence is necessarily a bad thing, but it is something to be
aware of.


-- Matt Mahoney, [EMAIL PROTECTED]
