On 8/26/2012 2:09 PM, Bruno Marchal wrote:
On 25 Aug 2012, at 15:12, benjayk wrote:
Bruno Marchal wrote:
On 24 Aug 2012, at 12:04, benjayk wrote:
But this avoids my point that we can't imagine that levels, context and ambiguity don't exist, and this is why computational emulation does not mean that the emulation can substitute the original.
But here you make a confusion of levels, as I think Jason was trying to point out: a similar one to the one Searle makes in the Chinese Room.
As an emulator (a computing machine), Robinson Arithmetic can simulate Peano Arithmetic exactly, even as a prover. So, for example, Robinson Arithmetic can prove that Peano Arithmetic proves the consistency of Robinson Arithmetic.
But you cannot conclude from that that Robinson Arithmetic can prove
its own consistency. That would contradict Gödel II. When PA uses the
induction axiom, RA might just say "huh", and apply it for the sake of
the emulation without any inner conviction.
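One standard way to write that point, as a rough sketch (Prov_PA stands for PA's provability predicate, Con(RA) for the arithmetized consistency of RA): "PA proves Con(RA)" is a true Sigma_1 sentence, and RA is Sigma_1-complete, so

  RA ⊢ Prov_PA("Con(RA)")   and yet (this is the appeal to Gödel II above)   RA ⊬ Con(RA).

Emulating PA's proof is not the same as endorsing PA's conclusion.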
I agree, so I don't see how I confused the levels. It seems to me you have just stated that Robinson Arithmetic indeed cannot substitute for Peano Arithmetic, because RA's emulation of PA only makes sense with respect to PA (in cases where PA does a proof that RA can't do).
Right. It only makes first-person sense to PA. But then RA has succeeded in making PA alive, and PA could realize a posteriori that the RA level was enough.
It is like my conversing with Einstein's brain's book (à la Hofstadter), just by manipulating the pages of the book. I don't become Einstein by carrying out that process, but I can have a genuine conversation with Einstein through it. He will know that he has survived, or that he survives, through that process.
Dear Bruno,
Please explain this statement! How is there an "Einstein", the person, who will know anything in that case? How is such an entity capable of "knowing" anything that can be communicated? Surely you are not considering a consistently solipsistic version of Einstein! I don't have a problem with that possibility per se, but you must come clean about this!
That is, it *needs* PA to make sense, and so we can't ultimately substitute one for the other (only in some relative way, if we are using the result in the right way).
Yes, because that would be like substituting one person for another on the pretext that they both play the same role. But comp substitutes the lower-level process, not the high-level one, which can indeed be quite different.
Is there a spectrum or something similar to it for substitution levels?
It is like how the word "apple" cannot really substitute for a picture of an apple in general (still less for an actual apple), even though in many contexts we can indeed use the word "apple" instead of a picture of an apple, because we don't want to be shown how it looks but just want to know that we are talking about apples; but we still need an actual apple, or at least a picture, to make sense of it.
Here you make an invalid jump, I think. If I play chess on a computer, make a backup of it, and then continue on a totally different computer, you can see that I will be able to continue the same game with the same chess program, even though the computer is totally different. I just have to re-implement it correctly. Same with comp. Once we bet on the correct level, functionalism applies to that level and below, but not above (unless of course I am willing to accept some change in my consciousness, like amnesia, etc.).
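A minimal Python sketch of the chess point (the function and file names here are my own illustration, not part of the thread): what gets backed up and restored is the abstract game state, not anything about the particular machine.

import json

def save_game(moves, path):
    # The backup records only the abstract state of the game (the move list),
    # nothing about the hardware it was played on.
    with open(path, "w") as f:
        json.dump({"moves": moves}, f)

def resume_game(path):
    # Any other computer that correctly re-implements the chess program
    # can reload this state and continue the very same game.
    with open(path) as f:
        return json.load(f)["moves"]

# On machine A:
save_game(["e4", "e5", "Nf3"], "game_backup.json")

# On machine B, with totally different hardware, the same game goes on:
moves = resume_game("game_backup.json")
moves.append("Nc6")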
But this example implies that a physical implementation must at least be possible; what is universal is that no particular physical system is required for the chess program.
With comp, to make things simple, we are high-level programs. What they do is 100% emulable by any computer, by definition of programs and computers.
I agree with this, but anything that implies interactions between separate minds implies a separation of implementations, and this only happens in the physical realm. Therefore the physical realm cannot be dismissed!
Bruno Marchal wrote:
With the Church thesis, computing is an absolute notion: all universal machines compute the same functions, and can compute them in the same manner as all other machines, so that the notion of emulation (of processes) is also absolute.
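A toy Python sketch of what "emulation is absolute" amounts to (my own illustration, under the assumption of a made-up register-machine instruction set): the program computes the same function no matter which interpreter, on which hardware, runs it.

def run(program, registers):
    # Minimal interpreter for a toy register machine.
    # Instructions: ("inc", r), ("dec", r, jump_if_zero), ("jmp", target), ("halt",)
    pc = 0
    while program[pc][0] != "halt":
        op = program[pc]
        if op[0] == "inc":
            registers[op[1]] += 1
            pc += 1
        elif op[0] == "dec":
            if registers[op[1]] == 0:
                pc = op[2]
            else:
                registers[op[1]] -= 1
                pc += 1
        elif op[0] == "jmp":
            pc = op[1]
    return registers

# Addition as a machine-independent program: move register 1 into register 0.
add = [("dec", 1, 3), ("inc", 0), ("jmp", 0), ("halt",)]
print(run(add, {0: 9, 1: 2}))   # {0: 11, 1: 0}, whichever computer interprets it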
OK, but the Church-Turing thesis is not proven, and I don't necessarily consider it true.
That's fair enough. But personally I find CT very compelling. I doubt it less than the "yes doctor" part of comp, to be specific.
How is Deutsch's version different?
I don't consider it false either; I believe it is just a question of the level at which we think about computation.
This I don't understand. Computability does not depend on any level
(unlike comp).
I don't understand either.
Also, computation is only absolute relative to other computations, not with respect to other levels, and not even with respect to the instantiation of computations through other computations. Because here the instantiation and the description of the computation matter: IIIIIIIII+II=IIIIIIIIIII and 9+2=11 describe the same computation, yet they are different for practical purposes (because of a different instantiation), and are not even the same computation if we take a sufficiently long computation to describe what is actually going on (so the computations take instantiation into account in their emulation).
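A small Python sketch of that 9+2 example (my own illustration): both instantiations compute the same function, but they run through different representations and different numbers of elementary steps.

def add_decimal(a, b):
    # One instantiation: the machine's native arithmetic on decimally written numbers.
    return a + b

def add_unary(a, b):
    # Another instantiation: concatenating tally marks, as in IIIIIIIII + II.
    return len("I" * a + "I" * b)

assert add_decimal(9, 2) == add_unary(9, 2) == 11
# Extensionally the same computation (same result for the same inputs),
# but the step-by-step descriptions of what actually goes on differ.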
Comp just bets that there is a level below which any functionally correct substitution will preserve my consciousness. It might be that such a level does not exist, in which case I am an actually infinite being, and comp is false. That is possible, but outside the scope of my study.
Bruno, this is exactly my argument against step 8; it fails exactly in the infinite case. COMP is omega-inconsistent.
Bruno Marchal wrote:
It is not a big deal; it just means that my ability to emulate Einstein (cf. Hofstadter) does not make me into Einstein. It only makes me able to converse with Einstein.
Apart from the question of whether brains can be emulated at all (due to possible entanglement with their own emulation; I think I will write a post about this later), that is still not necessarily the case. It is only the case if you know how to make sense of the emulation. And I don't see that we can assume that this takes less than being Einstein.
No doubt that's true for the first-person sense, even with comp. You might clarify your point a bit more.
I am interested in benjayk's answer too.
Bruno
http://iridia.ulb.ac.be/~marchal/
--
Onward!
Stephen
http://webpages.charter.net/stephenk1/Outlaw/Outlaw.html