--- Vladimir Nesov <[EMAIL PROTECTED]> wrote:
> I intentionally don't want to define exactly what S is, as it describes a
> vaguely defined 'subjective experience generator'. I instead leave it at
> the description level.

If you can't define what subjective experience is, then how do you know it
exists?

Monday, September 10, 2007, Matt Mahoney wrote:

MM> Perhaps I misunderstand, but to make your argument more precise:
MM> X is an implementation of a mind, a Turing machine.

No. The whole argument is about why a Turing-machine-like implementation
of an uploaded brain doesn't seem to do the trick. X is

Sunday, September 9, 2007, Matt Mahoney wrote:

MM> Also, Chalmers argues that a machine copy of your brain must be conscious.
MM> But he has the same instinct to believe in consciousness as everyone else.
MM> My claim is broader: that either a machine can be conscious or that
MM> consciousness

--- Stathis Papaioannou <[EMAIL PROTECTED]> wrote:
> On 09/09/07, Matt Mahoney <[EMAIL PROTECTED]> wrote:
>
> > > > Your dilemma: after you upload, does the original human then become a
> > > > p-zombie, or are there two copies of your consciousness? Is it
> > > > necessary to kill the

--- Nathan Cook <[EMAIL PROTECTED]> wrote:
> >
> > What if the copy is not exact, but close enough to fool others who know
> > you? Maybe you won't have a choice. Suppose you die before we have
> > developed the technology to scan neurons, so family members customize an
> > AGI in your l