Re: advice needed for Star Trek talk

2010-12-15 Thread Bruno Marchal


On 14 Dec 2010, at 20:24, Brent Meeker wrote:


On 12/14/2010 7:30 AM, Jason Resch wrote:


Ron,

I think the path to seeing the mind as a program is easier in this way:
1. It's not what the parts of the brain are made of but how they function that determines behavior.
2. This leads to the idea of multiple realizability (http://en.wikipedia.org/wiki/Multiple_realizability): brains can be made in different ways so long as the parts function the same.
3. Accordingly, one could replace each neuron, or each atom (or whatever), with a device that behaved like what it was replacing. A man made out of antimatter and antiparticles would still be a man.
4. Philosophical zombies (http://en.wikipedia.org/wiki/Philosophical_zombie) are not possible. A zombie's brain/mind would have all the same beliefs, and all the same information, as the equivalently organized and behaving brain it replaced, so in what sense could one say this one's beliefs are wrong but that one's beliefs are right?  There would be no way to ever prove that one is conscious and the other is not; it would be wrong for no reason at all.  That is what it takes for the idea of zombies to be consistent.  Further, the real brain and the zombie brain could never even report feeling any different: since both brains contain the same information and the same knowledge, how could one of them report differences in experience?  This addresses your question of whether there would be an impact on one's consciousness if the brain were swapped for a device with equivalent processing of information.
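
To make points 2 and 3 concrete, here is a small Python toy. The "neuron" models and all names below are illustrative inventions, not anything proposed in the thread; the point is only that two parts built in completely different ways can realize the same input/output function, so a circuit assembled from either behaves identically.

def biological_neuron(inputs, threshold=2):
    # Stand-in for the "wet" implementation: fire when enough inputs are on.
    return 1 if sum(inputs) >= threshold else 0

def silicon_neuron(inputs, threshold=2):
    # A differently built part with the same input/output behavior:
    # here, a precomputed lookup over the same decision rule.
    table = {n: int(n >= threshold) for n in range(len(inputs) + 1)}
    return table[sum(inputs)]

def tiny_network(neuron, a, b, c):
    # Two units feeding a third; only the units' functions matter,
    # not what they are made of.
    hidden1 = neuron([a, b])
    hidden2 = neuron([b, c])
    return neuron([hidden1, hidden2, 1], threshold=2)

for bits in [(0, 0, 0), (0, 1, 1), (1, 1, 0), (1, 1, 1)]:
    assert tiny_network(biological_neuron, *bits) == tiny_network(silicon_neuron, *bits)

Swapping one implementation for the other, neuron by neuron, never changes what the network does, which is all that point 3 asks for.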


I don't disagree with any of the above.  But there is a complexity that is passed over.  Having information, and being able to identify it as the *same* information across realizations, implies that the processes in the brain are about something, something that the differently realized brains can both agree on.  I think this requires an external world with which they both can interact.



The problem, and it is *the* mind-body problem in the mechanist frame, is that a digital mechanism cannot distinguish a local simulation of an external world from an external world. This eventually leads to making external worlds into a statistical sum over all the computations going through our current state.
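
A crude way to see the first half of that point in code: a deterministic digital agent only ever sees its input stream, so a step-by-step emulation of the environment and the "real" environment are indistinguishable from the inside. Everything below (the update rule, the two drivers, the names) is an illustrative toy, not the UDA argument itself.

def agent_step(state, percept):
    # Some fixed, deterministic update rule: the agent's "mind".
    return (state * 31 + percept) % 1000003

def run(agent_state, percept_stream, steps=10):
    history = []
    for _, percept in zip(range(steps), percept_stream):
        agent_state = agent_step(agent_state, percept)
        history.append(agent_state)
    return history

def external_world():
    # Stands in for percepts produced by a physical environment.
    x = 7
    while True:
        x = (1103515245 * x + 12345) % (2 ** 31)
        yield x % 256

def local_simulation():
    # A step-by-step emulation of that same environment, run "inside".
    x = 7
    while True:
        x = (1103515245 * x + 12345) % (2 ** 31)
        yield x % 256

# The agent's entire history of states is identical in both cases, so
# nothing available to the agent distinguishes the two situations.
assert run(0, external_world()) == run(0, local_simulation())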


Bruno




http://iridia.ulb.ac.be/~marchal/





Re: advice needed for Star Trek talk

2010-12-15 Thread Bruno Marchal


On 14 Dec 2010, at 16:30, Jason Resch wrote:


Ron,

I think the path to seeing the mind as a program is easier in this way:
1. It's not what the parts of the brain are made of but how they function that determines behavior.
2. This leads to the idea of multiple realizability (http://en.wikipedia.org/wiki/Multiple_realizability): brains can be made in different ways so long as the parts function the same.
3. Accordingly, one could replace each neuron, or each atom (or whatever), with a device that behaved like what it was replacing. A man made out of antimatter and antiparticles would still be a man.
4. Philosophical zombies (http://en.wikipedia.org/wiki/Philosophical_zombie) are not possible. A zombie's brain/mind would have all the same beliefs, and all the same information, as the equivalently organized and behaving brain it replaced, so in what sense could one say this one's beliefs are wrong but that one's beliefs are right?  There would be no way to ever prove that one is conscious and the other is not; it would be wrong for no reason at all.  That is what it takes for the idea of zombies to be consistent.  Further, the real brain and the zombie brain could never even report feeling any different: since both brains contain the same information and the same knowledge, how could one of them report differences in experience?  This addresses your question of whether there would be an impact on one's consciousness if the brain were swapped for a device with equivalent processing of information.
5. If zombies are impossible, then any device containing the same information and processing it in the same way as another mind should have the same consciousness.
6. By the Church-Turing thesis, a Turing machine (a computer) can process information in any way that information can be processed.
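
To make "Turing machine" and "Turing emulable" concrete, here is a minimal Turing-machine interpreter in Python. The particular machine (it increments a binary number) and every name are illustrative choices, not anything from the thread.

def run_tm(rules, tape, state="start", blank="_", max_steps=10000):
    # rules maps (state, symbol) -> (new_state, new_symbol, move), move in {-1, +1}.
    tape = dict(enumerate(tape))
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, blank)
        state, tape[head], move = rules[(state, symbol)]
        head += move
    cells = [tape.get(i, blank) for i in range(min(tape), max(tape) + 1)]
    return "".join(cells).strip(blank)

# A machine that increments a binary number written on the tape
# (the head starts on the leftmost digit).
rules = {
    ("start", "0"): ("start", "0", +1),   # scan right to the end of the number
    ("start", "1"): ("start", "1", +1),
    ("start", "_"): ("carry", "_", -1),   # step back onto the last digit
    ("carry", "1"): ("carry", "0", -1),   # 1 + carry -> 0, keep carrying
    ("carry", "0"): ("halt",  "1", -1),   # 0 + carry -> 1, done
    ("carry", "_"): ("halt",  "1", -1),   # ran off the left end: prepend a 1
}

print(run_tm(rules, "1011"))   # 11 in binary -> 1100, i.e. 12

The interpreter itself is the interesting part: one short, fixed program can run any machine you describe in its rule table, which is the sense in which a computer can process information in any way it can be processed.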


But then a digital machine cannot see the difference between its brain being emulated by a physical device, or by the true existence of the proof of the corresponding Sigma_1 relation, which exists independently of us in arithmetic. Some will argue that a physical universe is needed, but either they add a magic, non-comp-emulable relation between mind and matter, or, if that relation is emulable, they just pick out a special universal number (the physical universe) or introduce an ad hoc physical supervenience thesis.
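
For readers who have not met the jargon: a Sigma_1 sentence has the form "there exists an n such that P(n)" where P is mechanically checkable, so its truth is witnessed by a finite object that a blind search can find; halting computations are the standard example. The sketch below, including the particular P, is my own toy illustration.

def sigma1_witness(P, bound=10 ** 6):
    # Search for a witness n with P(n); return it, or None if none is found
    # below the bound.  A true Sigma_1 sentence has such a witness; a false
    # one never yields any, no matter how long you search.
    for n in range(bound):
        if P(n):
            return n
    return None

# "There exists an n such that n*n = 1764" -- true, with witness 42.
print(sigma1_witness(lambda n: n * n == 1764))   # -> 42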


Note that to say the mind is emulable by a computer says very little about a mind; it essentially says only that the mind is a process.  The analogy is that a computer can process information in any possible way given the appropriate programming, just as a record player can produce any possible sound given the appropriate record.  Saying the mind is emulable by a computer is like saying voice is emulable by a record player.  (It is not a very big leap, conceptually.)


I agree. But the consequence seems to be a big leap for many. "Seems", because the results are more ignored than criticized.
The problem (for many) is that mechanism is used by materialists, but in the end mechanism is not compatible with materialism. Mechanism makes matter an emergent pattern in the elementary arithmetical truth seen from inside. That makes mechanism a testable hypothesis, and it can already explain many qualitative features of the observable worlds, like indeterminacy, non-locality, the non-clonability of matter, and some more quantitative quantum tautologies.


A key idea not well understood is the difference between proof/belief  
and computation/emulation. I will send a post on this.





It doesn't matter if the process is like parallel programs, networked computers, etc.; a single computer can process information in the same way as a whole bunch of computers running in parallel without any difficulty.  The things computers have difficulty with are infinities.  Questions which take an infinite amount of processing or an infinite amount of information to answer can't realistically be simulated.  On this Bruno has said: if you don't believe the neuron requires an infinite amount of information to decide whether or not to fire, then you are a mechanist.
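
A minimal sketch of that interleaving trick, with illustrative names: a single sequential loop gives each "process" one step per cycle, and the combined run does everything the separate machines would have done.

def counter(name, upto):
    # A toy "process": it just emits a labelled sequence of values.
    for i in range(upto):
        yield f"{name}:{i}"

def run_interleaved(processes):
    # Round-robin scheduler: give each live process one step per cycle.
    trace = []
    while processes:
        still_running = []
        for proc in processes:
            try:
                trace.append(next(proc))    # one step of this process
                still_running.append(proc)
            except StopIteration:
                pass                        # this process has finished
        processes = still_running
    return trace

print(run_interleaved([counter("A", 3), counter("B", 2), counter("C", 4)]))
# -> ['A:0', 'B:0', 'C:0', 'A:1', 'B:1', 'C:1', 'A:2', 'C:2', 'C:3']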


Jason

On Mon, Dec 13, 2010 at 6:13 PM, ronaldheld ronaldh...@gmail.com  
wrote:

Bruno:
 Thanks for the weekend wishes.
  I believe the Brain runs programs, in parallel, but are they the
Mind, and are they able to be run as Turing-emulable programs with no
impact on one's consciousness?
 Ronald

On Dec 11, 7:51 am, Bruno Marchal marc...@ulb.ac.be wrote:
 On 11 Dec 2010, at 01:01, ronaldheld wrote:

  Bruno:
  I stand corrected on steps 6 and 7. I believe I understand your UDA
diagrams.

 OK.  Thanks for saying.

  Before I can comment, I need to decide what programs are and
are not Turing emulable,

 All programs are Turing-emulable. That is a consequence of the Church
 thesis.
 Many computer scientists tend to consider that the Church Thesis is
 trivially ...

Re: advice needed for Star Trek talk

2010-12-15 Thread ronaldheld
Jason:
   I do not think a neuron takes more than a finite amount of voltage
to be able to fire. I do wonder whether, in merely replacing the bio
parts with processing hardware, you lose part of the complexity of the
mind. No problem with an antimatter man and mind.
 
Ronald


  On Dec 11, 7:51 am, Bruno Marchal marc...@ulb.ac.be wrote:
   On 11 Dec 2010, at 01:01, ronaldheld wrote:

Bruno:
 I stand corrected on steps 6 and 7. I believe I understand your UDA
diagrams.

   OK.  Thanks for saying.

Before I can comment, I need to decide what programs are and
are not Turing emulable,

   All programs are Turing-emulable. That is a consequence of the Church
   thesis.
   Many computer scientists tend to consider that the Church Thesis is
   trivially true, but when you study it you might realize that CT is, on
   the contrary, quite miraculous. As Gödel saw, it is a miracle that
   the Cantor-like diagonalization procedure does not lead outside the
   class of partial recursive functions. The gift is a very robust notion
   of universality. The price to pay for that is also very big: the
   abandonment of any complete TOE (unless ultrafinitism, ...). But
   psychologically that price is a relief: it prevents computer science
   from being reductionist.
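
A toy rendering of that point about diagonalization. The tiny hand-made list below stands in for a real enumeration of all programs, and a raised exception stands in for a computation that never returns; both are illustrative simplifications.

# Diagonalizing over *total* functions builds a function outside the list
# (Cantor).  Over *partial* functions it does not escape: wherever f_n(n)
# is undefined, the diagonal d(n) is undefined too, so d is just another
# partial computable function and can itself appear in the enumeration.

partial_functions = [
    lambda n: n + 1,           # f_0: total
    lambda n: 1 // (n - 1),    # f_1: "undefined" (raises) at n = 1
    lambda n: n * n,           # f_2: total
]

def diagonal(n):
    # d(n) = f_n(n) + 1
    return partial_functions[n](n) + 1

print(diagonal(0))   # -> 2
print(diagonal(2))   # -> 5
# diagonal(1) is undefined: evaluating f_1(1) never yields a value to add 1 to.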

and if the brain runs a program, parallel
programs, or something else.

    Brains, and other biological organs and organisms, run parallel
    programs. But all digitalizable parallel programs can be made
    equivalent, by dovetailing, to non-parallel programs. The UD, for
    example, runs an infinity of programs in parallel. So brain
    parallelism does not change anything, unless the brain is not a
    digitalizable physical process.
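
A sketch of what dovetailing looks like on a single sequential machine: start program 1 and give it a step; start program 2 and give both a step; and so on, so that every program, halting or not, eventually receives unboundedly many steps. The enumeration below is a toy stand-in for a real enumeration of all programs, and all names are illustrative.

from itertools import count, islice

def program(k):
    # Toy "program number k": an endless computation emitting its outputs.
    def run():
        for step in count():
            yield f"p{k} produced {k + step}"
    return run()

def dovetail():
    started = []                        # programs we have begun executing
    programs = (program(k) for k in count())
    for stage in count(1):
        started.append(next(programs))  # start one more program each stage
        for proc in started:            # ...then give every started program
            yield next(proc)            #    one more step of execution

for line in islice(dovetail(), 10):
    print(line)

No program is ever starved: program k is started at stage k and then stepped at every later stage, which is how one sequential machine runs an infinity of programs "in parallel".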