Stathis Papaioannou wrote:
On 15 April 2015 at 02:36, John Clark <johnkcl...@gmail.com> wrote:
Whenever somebody says X is an illusion, the first question that should be
asked is: how would things be different if X were not an illusion? So how
would things be different if persisting through time were not an illusion?
If being a unique entity persisting through time were not an illusion
then in the case of duplication experiments we could definitely say
that you would persist as one of the copies rather than the other.
I think that thought experiments can only take one so far in this
duplication scenario. Suppose you had a fully functional AI with an
appropriate physical body, through which its consciousness could receive
normal sensory input and interact with the world in physical ways -- in
other words, functionally equivalent to flesh-and-blood creatures like
you and me.
I use an AI so that copying is unproblematic -- we have equivalent
functional bodies ready to accept the uploaded mental state:
consciousness, memories, emotional characteristics, value system, and
whatever else one considers goes into the makeup of a person.
We then create two such copies from one individual and then remove the
original from the scene (turn it off, or whatever). If we now put these
two copies together in a comfortable setting for a chat, what is likely
to be the direction of the conversation?
Are they going to start arguing: "I'm him!" "No, I'm him!" while they
gesture towards the absent original? Or are they going to talk and find
that they have a lot in common? They can prompt each other towards
shared memories, political opinions, lifestyle preferences, and so on.
Unless they were informed of their common origin, there is no real
reason they would ever suppose themselves any closer than two siblings,
or twins brought up together, who had shared many experiences in life.
In other words, I am suggesting that if this happened, then for all
practical purposes you would have created two new persons who happened
to share a lot of personal characteristics and memories. Nothing is
really gained by claiming that they are still the *same* person. We know
that, inevitably, because they occupy distinct bodies in different
volumes of space, they are rapidly going to diverge -- even during the
supposed period of initial conversation (in MWI terms, the copies
rapidly decohere). They are not both going to end up saying exactly the
same things at the same time. The copies are independent, not
lock-stepped, and just as you don't have *all* your memories and *all*
your views on things to the fore at any one moment, these two copies
are not going to appear identical to each other, despite all the
similarities.
Bruce
--
You received this message because you are subscribed to the Google Groups
"Everything List" group.