Matt Mahoney wrote:
> --- Richard Loosemore <[EMAIL PROTECTED]> wrote:
>> Why do you say that "Our reign will end in a few decades" when, in fact,
>> one of the most obvious things that would happen in this future is that
>> humans will be able to *choose* what intelligence level to be
>> experiencing, on a day to day basis? Similarly, the AGIs would be able
>> to choose to come down and experience human-level intelligence whenever
>> they liked, too.
> Let's say that is true. (I really have no disagreement here.) Suppose
> that at the time of the singularity the memories of all 10^10 humans
> alive at that time, you included, are nondestructively uploaded. Suppose
> that this database is shared by all the AGIs. Now is there really more
> than one AGI? Are you (the upload) still you?
Well, the first point is that we all get to choose whether or not this
upload happens: I don't particularly want to duplicate myself in this
way, and I think many others would also be cautious, so your scenario is
less than likely. I do not have the slightest desire to become nothing
but a merged copy of myself within a larger entity, and I don't think
many other people would want to be nothing but that, so this merge (if
it happened at all) would just take place in parallel with everything else.
But if they did, and it was implemented exactly as you describe, then
all 10^10 minds would be merged (is that what you meant?) and that
merged mind would be a single individual with rather a lot of baggage.
There would also be 10^10 humans carrying on as normal (per your
description of the scenario) and I cannot see any reason to call them
anything other than themselves.
> Does it now matter if humans in biological form still exist? You have
> preserved everyone's memory and DNA, and you have the technology to
> reconstruct any person from this information any time you want.
What counts is the number of individuals, whatever form they transmute
themselves into. I suspect that most people would want to stay
individual. Whether they use human form or not, I don't know, but I
suspect that the human form will remain a baseline, with people taking
trips out to other forms, for leisure. Whether that remains so forever,
I cannot say, but you are implying here that (for some reason that is
completely obscure to me) there would be some pressure to upload
everyone's minds to a central databank and then wipe out the originals
and reconstruct them occasionally. There would be no pressure for
people to do that, so why would it happen?
So when you say "does it matter if humans in biological form still
exist?" I say: it will matter to those humans, probably, so they will
still exist.
> Suppose that the collective memories of all the humans make up only one
> billionth of your total memory, like one second of memory out of your
> human lifetime. Would it make much difference if it was erased to make
> room for something more important?
This question is not coherent, as far as I can see. "My" total memory?
Important to whom? Under what assumptions are you suggesting this situation?
You seem to be presenting me with a scenario out of the blue, talking as
if it were in some sense an inevitability (which it clearly is not) and
then asking me to comment on it. That's a loaded question, surely?
> I am not saying that the extinction of humans and their replacement with
> godlike intelligence is necessarily a bad thing, but it is something to
> be aware of.
Nothing in what I have said implies that there would be any such thing
as an "extinction of humans and its replacement with godlike
intelligence", and you have not tried to establish that this is
inevitable or likely, so the issue strikes me as pointless. We might as
well discuss the pros and cons of all the humans becoming fans of only
one baseball team, for all eternity, and the terrible pain this would
cause to all the other teams ..... :-)
Richard Loosemore