On 03 Sep 2012, at 16:12, benjayk wrote:



Bruno Marchal wrote:


On 25 Aug 2012, at 15:12, benjayk wrote:



Bruno Marchal wrote:


On 24 Aug 2012, at 12:04, benjayk wrote:

But this avoids my point that we can't imagine that levels, context and ambiguity don't exist, and this is why computational emulation does not mean that the emulation can substitute the original.

But here you make a level confusion, as I think Jason tried to point out.

A similar one to the one made by Searle in the Chinese Room.

As an emulator (computing machine), Robinson Arithmetic can simulate Peano Arithmetic exactly, even as a prover. So, for example, Robinson Arithmetic can prove that Peano Arithmetic proves the consistency of Robinson Arithmetic.
But you cannot conclude from that that Robinson Arithmetic can prove its own consistency; that would contradict Gödel II. When PA uses the induction axiom, RA might just say "huh" and apply it for the sake of the emulation, without any inner conviction.
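In standard notation, just as a sketch (writing Prov_PA for a usual provability predicate of PA, and Con for the corresponding consistency statements):

\[ \mathrm{RA} \vdash \mathrm{Prov}_{\mathrm{PA}}(\ulcorner \mathrm{Con}(\mathrm{RA}) \urcorner) \quad\text{while}\quad \mathrm{RA} \nvdash \mathrm{Con}(\mathrm{RA}) \]

The first statement only says that RA can check, step by step, the PA proof; it does not give RA the conviction expressed by the second.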
I agree, so I don't see how I confused the levels. It seems to me you have just stated that Robinson Arithmetic indeed cannot substitute Peano Arithmetic, because RA's emulation of PA only makes sense with respect to PA (in cases where PA does a proof that RA can't do).

Right. It only makes first-person sense to PA. But then RA has succeeded in making PA alive, and PA could realize a posteriori that the RA level was enough.
Sorry, but it can't. It can't even abstract itself out to see that the RA
level "would be" enough.

Why?



I see you doing this all the time; you take some low level that can be made sense of by something that transcends it, and then claim that the low level is enough.

For the ontology. Yes.



This is precisely the claim that I don't understand at all. You say that we only need the natural numbers and + and *, and that the rest emerges from that as the 1-p viewpoint of the numbers.

I say that this follows from comp.



Unfortunately the 1-p viewpoint itself can't be found in the numbers; it can only be found in what transcends the numbers, or in what the numbers really are / refer to (which is also completely beyond our conception of numbers).

?



That's the problem with Gödel as well. His unprovable statement about numbers is really a meta-statement about what numbers express, which doesn't even make sense if we only consider the definition of numbers. He really just shows that we can reason about numbers, and with numbers, in ways that can't be captured by numbers (though in this case what we do with them has little to do with the numbers themselves).

Gödel already knew that the numbers (theories) can do that. He bet that the second incompleteness theorem is itself a theorem of PA. This was later proved by Hilbert and Bernays. Then Löb generalized it, etc.
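Roughly, and only as a sketch, with P(x) a standard provability predicate for PA, the Hilbert-Bernays-Löb derivability conditions are:

\[
\begin{aligned}
&\text{D1: } \mathrm{PA} \vdash \varphi \ \Longrightarrow\ \mathrm{PA} \vdash P(\ulcorner \varphi \urcorner) \\
&\text{D2: } \mathrm{PA} \vdash P(\ulcorner \varphi \rightarrow \psi \urcorner) \rightarrow (P(\ulcorner \varphi \urcorner) \rightarrow P(\ulcorner \psi \urcorner)) \\
&\text{D3: } \mathrm{PA} \vdash P(\ulcorner \varphi \urcorner) \rightarrow P(\ulcorner P(\ulcorner \varphi \urcorner) \urcorner)
\end{aligned}
\]

From these one gets the formalized second incompleteness theorem, \(\mathrm{PA} \vdash \mathrm{Con}(\mathrm{PA}) \rightarrow \neg P(\ulcorner \mathrm{Con}(\mathrm{PA}) \urcorner)\), and Löb's generalization: if \(\mathrm{PA} \vdash P(\ulcorner \varphi \urcorner) \rightarrow \varphi\), then \(\mathrm{PA} \vdash \varphi\).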




I agree that computations reflect many things about us (infinitely many
things, even), but we still transcend them infinitely.

Numbers can do that too, relative to universal numbers. That is the whole (technical) point.



Strangely you agree for the 1-p viewpoint. But given that's what you *actually* live, I don't see how it makes sense to then claim that there is a meaningful 3-p point of view where this isn't true. This "point of view" is really just an abstraction occurring in the 1-p point of view.

Yes.




Bruno Marchal wrote:

Like I converse with Einstein's brain's book (à la Hofstadter), just by manipulating the pages of the book. I don't become Einstein by carrying out that process, but I can have a genuine conversation with Einstein through it. He will know that he has survived, or that he survives, through that process.
On some level, I agree. But not far from the level that he survives in his
quotes and writings.

He does not survive in writings and quotes. That is only a metaphor. But he does survive in the usual sense in the emulation, assuming comp.




Bruno Marchal wrote:

That is, it *needs* PA to make sense, and so we can't ultimately substitute one for the other (only in some relative way, if we are using the result in the right way).

Yes, because that would be like substituting one person for another on the pretext that they both play the same role. But comp substitutes the lower process, not the high-level one, which can indeed be quite different.
Which assumes that the world is divided into low-level processes and high-level processes.

Like arithmetic.





Bruno Marchal wrote:

It is like how the word "apple" cannot really substitute for a picture of an apple in general (still less an actual apple), even though in many contexts we can indeed use the word "apple" instead of a picture of an apple, because we don't want to be shown how it looks but just want to know that we are talking about apples - but we still need an actual apple, or at least a picture, to make sense of it.

Here you make an invalid jump, I think. If I play chess on a computer, and make a backup of it, and then continue on a totally different computer, you can see that I will be able to continue the same game with the same chess program, even though the computer is totally different. I just have to re-implement it correctly. Same with comp. Once we bet on the correct level, functionalism applies to that level and below, but not above (unless of course I am willing to accept some change in my consciousness, like amnesia, etc.).
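A toy sketch of that backup-and-continue idea, in hypothetical Python (the file name and the moves are made up; the game is represented simply as its move list):

import json

# Machine A: the game so far, recorded at the level that matters - the moves.
game = {"moves": ["e4", "e5", "Nf3", "Nc6"], "to_move": "white"}

# Backup: serialize the relevant state to any medium.
with open("game_backup.json", "w") as f:
    json.dump(game, f)

# Machine B, possibly totally different hardware: restore and continue.
with open("game_backup.json") as f:
    restored = json.load(f)

restored["moves"].append("Bb5")   # the same game simply goes on
restored["to_move"] = "black"

The only thing that has to be preserved is the state at the chosen level (the game itself); everything below that level can be swapped freely.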

Your chess example only works because chess is already played on a computer. Yes, you can often substitute one computer for another (though even this often comes with problems), just as you can practically substitute apple
juice with orange juice as a healthy morning drink. You still can't
substitute it with fuel though, no matter what you do with it.

No problem with this.




Bruno Marchal wrote:

With comp, to make things simple, we are high-level programs. Their doing is 100% emulable by any computer, by definition of programs and computers.
OK, but in this discussion we can't assume COMP. I understand that you take it for granted when discussing your paper (because it only makes sense in that context), but I don't take it for granted, and I don't consider it
plausible, or honestly even meaningful.

Then you have to tell me what is not Turing emulable in the functioning of the brain.

Also, I don't take comp for granted, I assume it. It is quite different.

I am mute on my personal beliefs, except they change all the time.

Yet you seem to believe that comp is inconsistent or meaningless, but you don't make your point.





Bruno Marchal wrote:

I don't consider it false either; I believe it is just a question of at what level we think about computation.

This I don't understand. Computability does not depend on any level
(unlike comp).
Assuming the Church-Turing thesis ;).

In my opinion that's precisely where it goes wrong. It wants to abstract from levels, but really just trivializes computation in the process (reducing it to the lowest-level aspect of computation).

I think what a computer computes only makes sense in the context of the machine. E.g. if one Turing machine emulates another, the emulation only makes sense if we consider the Turing machine that is emulated. Otherwise we can't state that it emulates anything (because its computation doesn't have to be interpreted as an emulation).
This is also an argument against CT: if we take it to be true, the notion of emulation ceases to make sense (because emulation is not an absolute computational notion, but relates one computation to another).
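A rough sketch of what I mean, in hypothetical Python (the machine and all the names are made up): the step-by-step activity of an interpreter is just "a computation" until we pair it with the description of the machine M being emulated.

# A toy interpreter: it just follows whatever transition table it is
# given; nothing in its own activity says "I am emulating M".
def run(table, tape, state="q0", steps=100):
    pos, tape = 0, dict(enumerate(tape))   # sparse tape, blank symbol 0
    for _ in range(steps):
        sym = tape.get(pos, 0)
        if (state, sym) not in table:      # no applicable rule: halt
            break
        new_sym, move, state = table[(state, sym)]
        tape[pos] = new_sym
        pos += 1 if move == "R" else -1
    return tape, state

# Description of a machine M: write 1s over 0s, halt after reading a 1.
M = {("q0", 0): (1, "R", "q0"),
     ("q0", 1): (1, "R", "halt")}

trace = run(M, [0, 0, 1, 0])
# Whether this run counts as "emulating M" depends on relating it to M's
# description; the bare dictionary updates don't say so by themselves.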

?


Even the computation 1+1=2 doesn't make sense apart from context. What do one thing and two things even mean if we try to completely abstract from
things? Nothing.

Dreams can occur without context/environment. Context is only needed for having a deep cosmology, not for being conscious in the "here and now".




Bruno Marchal wrote:



Bruno Marchal wrote:

It is not a big deal; it just means that my ability to emulate Einstein (cf Hofstadter) does not make me into Einstein. It only makes me able to converse with Einstein.
Apart from the question of whether brains can be emulated at all (due to possible entanglement with their own emulation; I think I will write a post about this later), that is still not necessarily the case.
It is only the case if you know how to make sense of the emulation. And I don't see that we can assume that this takes less than being Einstein.

No doubt that's true in the first-person sense, even with comp. You might clarify your point a bit more.
Apparently you know what I mean if you say it's true from the first person. But then, considering that this is what we *actually experience*, I don't see how it makes any sense to try to abstract from that (postulating a "3-p perspective").

It makes sense to accept an artificial brain, like it makes sense to accept an artificial heart.




In which way does one thing substitute for another thing if the correct interpretation of the substitution actually requires the original? It is like saying "No, you don't need the calculator to calculate 24,3^12. You can substitute it with pen and paper, where you write down 24,3^12=X and then insert the result of the calculation (using your calculator) as X."
If COMP does imply that interpreting a digital Einstein needs a real Einstein (or more), then it contradicts itself (because in this case we can't *always* say YES doctor, since then there would be no original left to interpret the emulation).
Really it is quite a simple point. If you substitute the whole universe with an emulation (which is possible according to COMP)

It is not.
You need a real Einstein to do a local relative copy, but you don't need it in principle, as it already exists in arithmetic, trivially (assuming comp).



then there is nothing left to interpret the emulation. We couldn't even say whether it is an emulation or not (because a computation itself is not an emulation, just its relation with the original).

An emulation is a computation. You forget that we throw the original brain in the trash after the doctor's operation.



If there were something outside the universe to interpret the simulation, then that would be the level at which we can't be substituted (and if that were substituted, then the level used to interpret this substitution couldn't be substituted, etc.).
In any case, there is always a non-computational level at which no digital substitution is possible - and we would be wrong to say YES with regard to that part of us, unless we consider that level "not-me" (and this doesn't make any sense to me).


Indeed we are not our material body. We are the result of the activity of the program supported by that body. That's comp.

I don't have a clue why you believe this is senseless or inconsistent.

Bruno

http://iridia.ulb.ac.be/~marchal/


