On 7/10/2015 7:40 pm, Stathis Papaioannou wrote:
On 7 October 2015 at 18:16, Bruce Kellett <bhkell...@optusnet.com.au> wrote:

    On 7/10/2015 5:30 pm, Stathis Papaioannou wrote:
    On 6 October 2015 at 16:54, Bruce Kellett <bhkell...@optusnet.com.au> wrote:

        You might be able to make copies -- you certainly can for AI.
        A copy means that information from the original was used to
        construct the copy.

    Where I use "copy" you might use a different word. Let's define
    "doppelganger" as something which resembles the original
    arbitrarily closely but was not constructed using information
    from the original, while a "copy" is the same except constructed
    using information from the original. Using these definitions, are
    you saying that if you are replaced by a copy you will survive,
    whereas if you are replaced by a doppelganger you will not?

    I am saying, among other things, that you cannot establish that
    such 'doppelgangers' exist. Even if you had a purported
    doppelganger, you could not establish that it was identical to you.

    Theorizing on the basis of such doppelgangers is unsound.


You could establish that the doppelganger exists and is identical as easily as you could establish this about the copy, or about the original compared to its earlier self. And even if you could not, you could theorise about it even if you were a logical positivist.

That is simply false. You can establish the validity of the copy by a process of technological audit, if you like that term. You know how you gathered the data on which the copy is based, and the process by which that data was instantiated in the copy. These processes are available for testing and verification.

You have none of these facilities for doppelgangers, especially if they are outside your light cone. Logical positivists would label all such theorizing about unverifiable, untestable speculations empty, even if they are too polite to call it meaningless.

Bruce
