On 07 Aug 2011, at 21:50, benjayk wrote:



Bruno Marchal wrote:



Bruno Marchal wrote:



Bruno Marchal wrote:

Then computer science provides a theory of consciousness, and explains how consciousness emerges from numbers,

How can consciousness be shown to emerge from numbers when it is already assumed at the start?

In science we assume at some meta-level what we try to explain at some level. We have to assume the existence of the moon to try theories about its origin.

That's true, but I think this is a different case. The moon seems to have a past, so it makes sense to say it emerged from its constituent parts. In the past, it was already there as a possibility.

OK, I should say that it emerges arithmetically. I thought you already understood that time is not primitive at all. More on this below.
Yeah, the problem is that "consciousness emerging from arithmetic" means just that we manage to point to its existence within the theory.

Er well, OK. But arithmetic also explains why it exists, why it is undoubtable yet not definable, how it brings matter into the picture, etc.



We have no reason to suppose this expresses something more fundamental, that is, that consciousness literally emerges from arithmetic. Honestly, I don't even know how to interpret this literally.


It means that the arithmetical reality "is full" of conscious entities of many sorts, so that we don't have to postulate the existence of consciousness, nor matter, in the ontological part of the TOE. We recover them, either intuitively, with the non-zombie rule, or formally, in the internal epistemology canonically associated with self-referring numbers.




Bruno Marchal wrote:


But consciousness as such has no past, so what would it mean that it emerges from numbers? Emerging is something taking place within time. Otherwise we are just saying we can deduce it from a theory, but this in and of itself doesn't mean that what is derived is prior to what it is derived from.

To the contrary, what we call numbers just emerges after consciousness has been there for quite a while. You might argue that they were there before, but I don't see any evidence for it. What the numbers describe was there before, this is certainly true (or you could say they were implicitly there).

OK. That would be a real disagreement. I just assume that the arithmetical relations are true independently of anything. For example, I consider the truth of the Goldbach conjecture as already settled in Platonia. Either it is true that every even number bigger than 2 is the sum of two primes, or it is not true, and this independently of any consideration of time, space, humans, etc.
Humans can easily verify this for small even numbers: 4 = 2+2, 6 = 3+3, 8 = 3+5, etc. But we have not found a proof of it, despite many people having searched for one.
I can see that the expression of such a statement needs humans or some thinking entity, but I don't see how the fact itself would depend on anything (but the definitions).
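
A quick way to check the small cases just listed, in Python (a minimal sketch, my illustration, not part of Bruno's argument):

    # Verify Goldbach's conjecture for small even numbers, printing one
    # prime decomposition per case, as in "4 = 2+2, 6 = 3+3, 8 = 3+5".
    def is_prime(n: int) -> bool:
        return n > 1 and all(n % d for d in range(2, int(n**0.5) + 1))

    for n in range(4, 21, 2):
        p = next(p for p in range(2, n) if is_prime(p) and is_prime(n - p))
        print(f"{n} = {p} + {n - p}")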
My point is subtle, and I wouldn't necessarily completely disagree with what you said. The problem is that in some sense everything is already there in some form, so in this sense 1+1=2 and 2+2=4 are independently, primarily true, but so is everything else.

The theory must explain why and how relative contingencies happen, and it has to explain the necessities (natural laws), etc.



Consciousness is required for any meaning to exist,

That is ambiguous. If you accept that some propositions can be true independently of us, it can mean that some meanings are true independently of us. If not, you need someone to observe the big bang to make it happen, or the numbers to make them exist.



and ultimately is equivalent to it (IMO), so we derive from the meaning in numbers that meaning exists. It's true, but ultimately trivial.

No, we derive from numbers+addition+multiplication a theory of meaning, consciousness, matter. You should not confuse a theory with its meaning, interpretation, etc. It happens that we can indeed explain how numbers develop meanings for number relations, etc.




Either everything is independently true, which doesn't really seem to be the case, or things are generally interdependent. 1+1=2 is just true because 2+2=4, and I can just be conscious because 1+1=2, but 1+1=2 is just true because I am conscious, and 1+1=2 is true because my mouse pad is blue, etc...

This view makes sense to me, because it is so simple. One particular true statement is true only because every particular true statement is true, and because what is true is true. In this sense every statement is true because of every other statement. If we derive something, we just explain how we become aware of the truth (of a statement). There is no objective hierarchy of emergence (but apparently necessarily a subjective progression: we will first understand some things and later some other things).

I can follow you on this.



That's why it makes little sense to me to say consciousness as such arises
out of numbers.

It means that we have a theory with some simple primitive terms, actually 0, s(0), s(s(0)), ..., plus the laws of addition and multiplication, and from this, and only from this (not from our interpretation of those symbols, just by applying the laws of addition and multiplication, plus definitions), we can derive propositions concerning observers, their consciousness, meaning, the mass of their bodies, etc. I might miss something, but your criticism here resembles "we cannot understand how the brain functions, because we need a brain to do the understanding". That problem has been solved *in* arithmetic. It is not entirely obvious.
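
For concreteness: the "laws of addition and multiplication" invoked here are standardly the recursion equations of arithmetic, which I render as

\[
x + 0 = x, \qquad x + s(y) = s(x + y),
\]
\[
x \cdot 0 = 0, \qquad x \cdot s(y) = (x \cdot y) + x.
\]

Everything appealed to below is meant to be derived from these, together with first-order logic and definitions.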




Subjectively we first need consciousness to make sense of
numbers.

Yes, but the numbers themselves do not need consciousness primitively. They do not need us to make sense of them. On the contrary, we need the numbers to address the question of how to define the higher-level notion of subjectivity, and then we can make sense of your correct sentence that an entity needs to have consciousness to make sense, personally, of the numbers. If not, you will need an observer outside the universe to make sense of the universe.


But certainly understanding of numbers can lead us to become more
conscious.


Bruno Marchal wrote:

Bruno Marchal wrote:

Yet, consciousness is not assumed as something primitive in the TOE itself.

But this doesn't really matter, as we already assume that it's primitive, because we use it before we can even formulate anything.

We already assumed it exists, sure. But why would that imply that it exists primitively? It exists fundamentally, in the sense that once you have all the true arithmetical relations, consciousness exists. So, consciousness is not something which appears or emerges in time or space, but it is not primitive in the sense that its existence is a logical consequence of arithmetical truth (provably so when we assume comp and accept some definition).

Sometimes I sketch this in the following manner. The arrows are logico-arithmetical deductions:

NUMBERS => CONSCIOUSNESS => PHYSICAL REALITY => HUMANS => HUMANS' NUMBERS
I accept this deduction. But just because it can be deduced does not mean it is more primary. To me there is no reason to suspect that consciousness does not exist primitively.

That is like: I completely understand how a car engine functions, but I do not see any reason why this would prevent the car from being pulled by invisible horses. If the numbers can explain why the numbers believe correctly in the existence of consciousness, without postulating consciousness at the start, the theory NUMBERS is preferable to the theory NUMBERS+CONSCIOUSNESS, especially as consciousness is hard to define and is at the origin of controversies. It is just a use of the traditional weak form of OCCAM in a theoretical framework.





Bruno Marchal wrote:

You can't just ignore what you already know, by not making your assumptions explicit in your theory.

It is just not an assumption in the theory, but a derived existence.
With comp, consciousness is implicit in the arithmetical truth.
Maybe, but it seems arithmetical truth is implicit in consciousness also.

This I doubt. But it is very ambiguous. Arithmetical truth is *very big*. Consciousness needs only a tiny part of it, although matter might need a much bigger part of it, yet both conscious and material things will only scratch the surface of arithmetical truth.

Hmm, I might begin to see, below, what your problem is.




Bruno Marchal wrote:


Bruno Marchal wrote:

In theory, even one symbol can represent every statement in any language,

That does not make sense to me (or it is trivial).
Yes, it is trivial. We just encode statements with numbers expressed with one symbol (e.g. + is I, 1 is II, ...).

How will you encode with one symbol the statement 1+1=2? I think you will need two symbols, unless you fix a bijection between a language and the natural numbers.
Right, this is what I mean.
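
A minimal sketch of such a bijection in Python (my illustration, following the "+ is I, 1 is II" example above):

    # Encode each symbol of "1+1=2" as a run of the single symbol "I",
    # via a fixed bijection between the alphabet and positive integers.
    SYM = {"+": 1, "1": 2, "=": 3, "2": 4}

    def unary(statement: str) -> list[str]:
        return ["I" * SYM[ch] for ch in statement]

    print(unary("1+1=2"))  # ['II', 'I', 'II', 'III', 'IIII']

Note that keeping the runs apart already smuggles in a separator, i.e. a second symbol, which is exactly Bruno's point.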


Bruno Marchal wrote:

But you will need a richer language to describe that bijection.

But as you said below, the same is true for expressing points with natural numbers. It only makes sense if we encode the points in the numbers and have an external decoding mechanism.

Not at all: the "external decoding" is, in the case of arithmetic (numbers + the laws of addition and multiplication), entirely internal. That is why we don't need to suppose anything else ontologically. That is a fundamental difference. It is the numbers that interpret the numbers. An interpretation is itself a relation between numbers, a complex one involving universal numbers, relative coding, etc. But everything needed for that exists as a consequence of addition and multiplication.





Bruno Marchal wrote:



Bruno Marchal wrote:

but still it's not as powerful as the language it represents.

Similarly, if you use just natural numbers as a TOE, you won't be able to directly express important concepts like dimensionality.

Why? If you prove this, I abandon comp immediately.
Hm, how do you express the point (3,4) on a two-dimensional plane with natural numbers?

I might use a Gödel-like coding for the string "(s(s(s(0))), s(s(s(s(0)))))", like coding "(" by 2, "s" by 3, "0" by 4 and ")" by 5, and then coding the string itself, using the prime numbers, by 2^2 * 3^3 * 5^2 * 7^3 * etc. That is, each prime number is raised to the code of the corresponding symbol. Or something like that, where I can code an axiomatization of the plane by a number too, etc.
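
A sketch of this coding in Python (Bruno's symbol table, extended with codes for "," and " ", which his "etc." leaves unspecified; my illustration):

    # Gödel-like coding: the i-th prime is raised to the code of the
    # i-th symbol, giving 2^2 * 3^3 * 5^2 * 7^3 * ... for the string.
    CODE = {"(": 2, "s": 3, "0": 4, ")": 5, ",": 6, " ": 7}

    def primes():
        # Yield 2, 3, 5, 7, ... by trial division (fine at this scale).
        n = 2
        while True:
            if all(n % d for d in range(2, int(n**0.5) + 1)):
                yield n
            n += 1

    def godel(string: str) -> int:
        g = 1
        for p, ch in zip(primes(), string):
            g *= p ** CODE[ch]
        return g

    print(godel("(s(s(s(0))), s(s(s(s(0)))))"))  # one (huge) natural number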
But then you failed to directly express the concept! You just represented it in a less rich language.

The concept itself is expressed through some arithmetical relation, that is, a sentence built in the language of first-order logic with the symbols s, +, * and 0.
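
For a concrete instance (my example, not Bruno's): "p is prime" is directly such a sentence, with no coding involved, once one abbreviates x < y :≡ ∃z (y = x + s(z)):

\[
\mathrm{prime}(p) \;:\equiv\; s(0) < p \,\land\, \forall a\,\forall b\,\big(p = a \cdot b \rightarrow a = s(0) \lor b = s(0)\big).
\]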





Bruno Marchal wrote:

It seems we have to interpret the numbers in a certain way to do this, and can't express it directly. If we used Gaussian integers we could simply describe the point as 3+4i.

That's OK, but 3+4i can itself be coded, by 2^(code of 3) * 3^(code of +) * 5^(code of 4) * 7^(code of i).
But that's the point! It can be *coded*. But everything can be coded with the symbol "I" as well. In both cases we need some intelligent decoding to retrieve the meaning.


No, we don't need it. The intelligent being is coded, but not just coded: it is fully represented by arithmetical relations, and fully emulated by arithmetical relations. So it has its personal points of view, and from its points of view it does not matter how it is represented. Well, it matters for its physics, which will result from the existence of infinitely many such codings and relational representations. We all have an infinity of bodies and programs in a tiny part of arithmetical truth.
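
To illustrate that the decoding itself is plain arithmetic, here is a sketch inverting the toy coding given earlier (my illustration; nothing "intelligent" is appealed to beyond arithmetic on naturals):

    # Decoding is factorization: strip out each prime in turn and map
    # its exponent back through the symbol table.
    SYMBOL = {2: "(", 3: "s", 4: "0", 5: ")", 6: ",", 7: " "}

    def decode(g: int) -> str:
        out, p = [], 2
        while g > 1:
            e = 0
            while g % p == 0:
                g, e = g // p, e + 1
            if e:
                out.append(SYMBOL[e])
            p += 1
        return "".join(out)

    # decode(godel("(s(0))")) == "(s(0))"  -- a round trip, given godel() above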




Bruno Marchal wrote:



Bruno Marchal wrote:

From comp you can derive the whole of physics, and this should be easy to understand if you get the UDA 1-7.
Well, I get that if we accept COMP we need to associate sheaves of computations to mind-states, but I have no clue how natural numbers can be used to derive physics, or even formulate anything related to physics, without using a meta-level of interpretation. It seems we always need a more powerful language to do that.

So physics becomes a first-person uncertainty calculus associating to each computational state a collection of computations, hopefully with a reasonable measure (which has to be derived by the self-reference logic).

The meta-level of comprehension can be embedded in the arithmetical truth, in the same way that Gödel discovered that metamathematics can be embedded in (and retrieved from) arithmetic.
It all comes down to the same thing: we encode statements in arithmetic. But for this to make sense we need some external thing to make sense of the encoded statements.

I see you miss the "real thing", which is tedious to explain (but well understood by logicians). You don't need to interpret the coding and the decoding. The coded entities do it by themselves. Arithmetization of metamathematics is not just coding; it is also given by the (true) arithmetical formulas corresponding to the emulation of those codes/programs. Universal systems are intrinsically dynamical systems, even if the "time" aspect is an internal view. The UD has a notion of step attached to it. Your consciousness here and now exists because the existence of your computational state here and now is a true arithmetical statement, and because it has many proofs (which will provide the weight for the relative measure on the computations). A (tiny) part of arithmetical truth gives the block-mindscape of the 'matrix'.





Bruno Marchal wrote:



Bruno Marchal wrote:

Comp remains incomplete on God, consciousness and souls, and can explain why, but physics, including dimensionality, is entirely explained. To be sure, comp is still "hesitating" between dimension 2 and dimension 24 for the shadow of the notion of space, but this is a very complex mathematical problem, and it assumes that the Z1* logic (the "divine" third-person plural points of view) gives rise to some mathematical structure (Temperley-Lieb algebra, braid groups).
But how can you formulate dimension 2 / 24 or Z1* logic in arithmetic?

Z1* is the logic of Bp & Dt & p; the p are arithmetic propositions, and B and D are the Beweisbar arithmetical predicate and its dual (D = ~B~). The Gödel-like arithmetization does the remaining work.
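
Spelled out (my rendering of the definitions just given, with B the arithmetized provability predicate, so that Dt expresses consistency):

\[
Dq \;:\equiv\; \neg B \neg q, \qquad Z_1^*\colon\; \Box p \;:\equiv\; Bp \,\land\, Dt \,\land\, p.
\]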
But then the result of the arithmetization makes no sense by itself, does it?

Arithmetization makes sense *in* arithmetic. It makes sense for the internal creatures.


So natural numbers are not sufficient after all? It seems to me we have to know how the arithmetization worked, and what it arithmetized, to make sense of it.

No, the sense of it is an internal building by the creature itself. To assume that we need an external observer would be like saying that your brain can function only if it is observed by ... another observer-with-a-brain, and that leads to an infinite regression or a god of the gaps, which is ridiculous in the comp theory: brains and self-referential numbers do the job by obeying only the laws of addition and multiplication (which are Turing universal).





Bruno Marchal wrote:

Remember that I do assume comp, and whatever your conception of space and dimension is, it is already represented in your brain through neuronal relations (say), and those neuronal relations are themselves represented, even emulated, in arithmetic.
So, they are represented? But you can represent anything with anything.

Not at all. The representations have to be faithful and as rich as what they represent. My body represents me in this reality, and your body represents you. It is the same with the numbers. Perhaps I should explain this explicitly, but then you will have a lot of math work for your holiday. I guess by conversing I might point exactly at what you still seem to misunderstand. You are confusing coding and representation, I think.
With the numbers and addition only, I cannot represent all computable functions. With numbers and multiplication only, I cannot represent all computable functions. But it happens that with addition and multiplication together I *can* represent all computable functions. This is done in the good textbooks on theoretical computer science (Boolos and Jeffrey; Epstein and Carnielli; Mendelson, for example). The representation works through some coding, but the emulation is represented through the coding and the laws of addition and multiplication.



This is just trivial. I can just say that this letter "A" represents the axioms of Peano arithmetic, and that's my TOE. Of course, arithmetic representation is much more clever and expressive, but that's beside the point.

The big, enormous, ultra-fundamental difference is that arithmetic represents itself, and all observers, without any further ado. The codings themselves exist in arithmetic, independently of any observer. The numbers code and decode themselves, because they obey precise laws we agree on (+ and *). The letter A just does nothing without the lexicon saying that "A is PA", and a human who can understand this.





Bruno Marchal wrote:

I mean, you don't have to explain it precisely, but can you give a hint how this could even be conceived to be possible?

I hope that what I say above helps a bit. Let me try again.

We assume comp. So you can imagine that all the "reality" we observe is secondary: we might be dreaming, or sharing "a video game", or being in a matrix (a giant computer handling the whole game and the software of our minds), OK?
OK.


Bruno Marchal wrote:

Now, what is not obvious, but is rather well known by logicians, is that such a matrix is emulated by the arithmetical consequences of the laws of addition and multiplication. Step 8 of the UDA explains that, assuming we are physical computers, we cannot distinguish a physical computer from its infinitely many arithmetical emulations, and that in fine, by taking into account the first-person indeterminacy, whatever we can observe below our substitution level results from a sort of competition among *all* universal machines/numbers. That set of numbers is not a computable set, and only God knows the winner. Yet, machines can backtrack from observation and introspection to get a better and better picture.
OK.


Bruno Marchal wrote:

I am not proposing an explanation of "reality"; on the contrary, I show that a very common hypothesis, mechanism (made clear through Church's thesis and computer science), makes the mind-body problem two times more difficult than is usually understood.
It makes the physical laws more mysterious, and it leads to a purely arithmetical body problem.
And at first sight, it does look like a refutation of comp, because if we just look at the computations, we can expect an inflation of possibilities (the white rabbit problem). It looks like even if we were in one winning computation, perhaps physical, we are first sent into a solipsistic mental space, and then get dissolved in white noise. And that, admittedly, is not confirmed by the experiments nor by experience, except with salvia perhaps :).
OK.... Well, everything you said was natural language, not numbers, so in some sense you unfortunately missed my point (even though it was interesting) :). It seems to me it is impossible to formulate this in arithmetic without postulating some more powerful language first, and then representing it in arithmetic. But in this case arithmetic is hardly fundamental anymore.

It is here that you *might* be deadly wrong, or not. The point is that, accepting that the truth of 1+1=2 is independent of any observers, what you call the more "powerful languages" are in fact the internal Löbian machines/numbers. They exist *in* arithmetic independently of any external observer, and they do their job of coding, decoding, interpreting, finding meaning, ... independently of any observer. That would not be the case without the independent laws of addition and multiplication. This comes from the (amazing) fact that addition+multiplication (of natural numbers) is already Turing universal. Without this, we would have to postulate consciousness as a primitive in the TOE, even without comp. A brain would not generate meaning without an external observer.

OK, I think, benjayk, that we are progressing. We might have to delve a bit more deeply into the difference between coding (a relatively trivial notion) and representing something (a notion which needs more familiarity with math and logic). In our case, we can replace representations by emulations (exact simulations). You have to grasp that the arithmetical (and static) relations between numbers (made true by the laws of addition and multiplication) do emulate the computations of (all) observers, entirely by themselves. This makes not just 1+1=2 true independently of you and me; it makes "Benjayk believes 1+1=2" true, independently of you and me. I think that if you persist in making your misunderstanding as clear as you do now, we will eventually come to a point where I will tell you something objective (third-personal) about the numbers where you will just tell me "I don't believe you", and where I will be able to answer, "ah, but this has been proved in the literature, see Boolos & Jeffrey page ...".

Bruno

http://iridia.ulb.ac.be/~marchal/


