Re: [singularity] critiques of Eliezer's views on AI

2007-06-29 Thread Randall Randall
On Jun 29, 2007, at 6:54 PM, Matt Mahoney wrote: --- Randall Randall <[EMAIL PROTECTED]> wrote: On Jun 28, 2007, at 7:51 PM, Matt Mahoney wrote: --- Stathis Papaioannou <[EMAIL PROTECTED]> wrote: How does this answer questions like, if I am destructively teleported to two different location

Re: [singularity] critiques of Eliezer's views on AI

2007-06-29 Thread Matt Mahoney
--- Randall Randall <[EMAIL PROTECTED]> wrote: > > On Jun 28, 2007, at 7:51 PM, Matt Mahoney wrote: > > --- Stathis Papaioannou <[EMAIL PROTECTED]> wrote: > >> How does this answer questions like, if I am destructively teleported > >> to two different locations, what can I expect to experience?

Re: [singularity] critiques of Eliezer's views on AI

2007-06-29 Thread Heartland
Stathis: > Although you make an exception when the copying takes place gradually > inside your own head, switching atoms in your brain for new ones > obtained from environmental raw materials, and excreting the original > atoms. There is no exception because the two cases are not equivalent whe

Re: [singularity] critiques of Eliezer's views on AI

2007-06-29 Thread Stathis Papaioannou
On 29/06/07, Heartland <[EMAIL PROTECTED]> wrote: Stathis: > Although you make an exception when the copying takes place gradually > inside your own head, switching atoms in your brain for new ones > obtained from environmental raw materials, and excreting the original > atoms. There is no exce

Re: [singularity] critiques of Eliezer's views on AI

2007-06-29 Thread Heartland
On 29/06/07, Heartland <[EMAIL PROTECTED]> wrote: The contradiction exists only in the minds of those who can't see or are unable to accept that "consciousness" doesn't transfer to a copy regardless of anything else. Once this is clear, the imaginary paradox disappears. This paradox has alw

Re: [singularity] critiques of Eliezer's views on AI

2007-06-29 Thread Stathis Papaioannou
On 29/06/07, Heartland <[EMAIL PROTECTED]> wrote: The contradiction exists only in the minds of those who can't see or are unable to accept that "consciousness" doesn't transfer to a copy regardless of anything else. Once this is clear, the imaginary paradox disappears. This paradox has alway

Re: [singularity] critiques of Eliezer's views on AI

2007-06-29 Thread Tom McCabe
I'm going to let the zombie thread die. - Tom --- Stathis Papaioannou <[EMAIL PROTECTED]> wrote: > On 29/06/07, Tom McCabe <[EMAIL PROTECTED]> > wrote: > > > But when you talk about "yourself", you mean the > > "yourself" of the copy, not the "yourself" of the > > original person. While all th

Re: [singularity] critiques of Eliezer's views on AI

2007-06-29 Thread Stathis Papaioannou
On 29/06/07, Tom McCabe <[EMAIL PROTECTED]> wrote: But when you talk about "yourself", you mean the "yourself" of the copy, not the "yourself" of the original person. While all the copied selves can only exist in one body, the original self can exist in more than one body. You can pull this off