Dear John,
On 21 Jul 2010, at 22:03, John Mikes wrote:
Dear Bruno,
on diverse lists I bounce into the 'numbers' idea - in different
variations. I wonder if your position states that the world
(whatever) has been 'erected' (wrong word) based on integer numbers
and their additive multiplicity, or it can be 'explained' by such?
The answer is "it can be explained by such". The world is not
computable. It is not a number, nor is it made of numbers.
It is not so much a position of mine (which I keep for myself) as a
point, or proof, or argument. All I say is that if we are Turing
emulable, then the physical laws are no longer fundamental, and are a
consequence of the way the numbers are related to each other. But
the comp non-locality and the comp indeterminacy entail that matter
is, in principle, a highly non-computational stuff.
The fact that the world seems partially computable has to be
explained. We can no longer (assuming the comp hyp) take the existence
of laws for granted.
Instead of numbers, we have the choice of taking the terms of any
Turing-complete theory. I would take the combinators or the lambda
terms if people were not so freaked out by new mathematical symbols.
At least numbers are taught in school.
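For the curious, here is what such lambda terms look like. A minimal Python sketch (purely illustrative, and the names are mine) of Church numerals, where a number n is the term that applies a function n times, and addition is repeated application of the successor:

```python
# Church numerals: the number n is the lambda term that applies f n times to x.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))

# Addition: apply succ m times to n.
add = lambda m, n: m(succ)(n)

# Convert a Church numeral back to an ordinary int, for inspection only.
to_int = lambda n: n(lambda k: k + 1)(0)

one = succ(zero)
two = succ(one)
three = add(one, two)
print(to_int(three))  # 3
```

Nothing here depends on Python in particular; any Turing-complete formalism would do, which is the point.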
It makes a big difference in my agnostic views (I dunno) because to
explain is human logic (never mind which kind)
All right. Sure.
while to erect means ontological bind - what I cannot condone in its
entire meaning.
Consciousness came up as being primary or not: I hope thought of in
my version, as "response to information" - with response in ANY way
and information as our acquired knowledge of relations among
components of the totality (unlimited wholeness).
OK.
Numbers, however, as I referred to earlier - quoting David Bohm, are
'human inventions' - unidentified further.
I think it is a human discovery. I find it a bit pretentious, the idea
that "we" have made them. You may say so, but assuming comp, you would
have to say that galaxies and dinosaurs are human inventions too. That
would be confusing, to say the least. I put in the hypothesis of comp
(if only to make sense of it) that I take a truth like "1+2=3" as being
a non-local, atemporal, and aspatial statement. It does not depend on
the appearance of humans. Of course the symbols "1" and "2" are human
inventions, but they should not be confused with the abstract objects
they are pointing to. I could have written the same assertion in
English with a sentence like "the successor of zero added to the
successor of the successor of zero gives the successor of the successor
of the successor of zero".
When we do theories, we have to start from something. If you agree
that 1+2=3, we can proceed.
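That successor sentence can even be checked mechanically. A minimal sketch in Python (my illustration, not part of the argument), encoding numbers as nested applications of a successor S to a zero Z, with addition defined by the usual two Peano-style equations:

```python
# Peano-style numerals: a number is how many times S (successor) wraps Z (zero).
def S(n):
    return ("S", n)

Z = "Z"

def add(m, n):
    # add(Z, n) = n ; add(S(m'), n) = S(add(m', n))
    return n if m == Z else S(add(m[1], n))

one = S(Z)
two = S(S(Z))
three = add(one, two)
print(three)  # ('S', ('S', ('S', 'Z')))
```

The symbols are arbitrary; the relation "one added to two gives three" is what they point to.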
Now I got additional news from Keith Devlin (Stanford U.; "The Math
Gene: How Mathematical Thinking Evolved and Why Numbers Are Like
Gossip", plus some two dozen other books)
I read with interest his book on information.
who stated that:
"Numbers are so ubiquitous and seem so concrete, it is easy to forget
they are a human invention, and a recent one at that, dating back only
10,000 years. Though the things we count are often in the world, the
numbers we use to count them are figments of our imagination. For that
reason we should not be surprised (though we usually are) to discover
they are usually influenced by the way our brains work. <...> When we
try to attach numbers to things in the world, as William Poundstone
describes, we find psychology gets into the mix. Numbers may be - I
think they are - among the most concrete and precise ways to describe
our world, but they are still a human creation, and as such they
reflect us as much as the things in our environment."
~2,500 years ago 'math', with the then recently acquired 'numbers-
knowledge', had but a little domain to overcome, and in our awe for the
wisdom of the old Greeks we accepted the numbers as 'GOD'. I have no
problem using numbers to explain most of the world (the only
exceptions I carried earlier were the 'non-quantizable' concepts -
earlier, I said, because lately I condone in my agnosticism that
there may be ways (beyond our knowledge of yesterday) to find
quantitative characteristics in those as well)
Both are true. Some qualitative things can have quantitative features.
And numbers themselves have a lot of qualitative features, some of
them having no quantitative features at all. After Gödel's
incompleteness result, humans assuming comp can say: already about the
numbers we can only scratch the surface. Comp kills reductive thinking
at its root. Digital mechanism is the most modest and humble
hypothesis in the field.
but in our 'yesterday's views' I don't want to give up on finding
something more general and underlying, upon which even the numbers
can be based and applied to the world, of which our human mind - the
mind that invented the numbers - is a part.
That is the idea. My TOE is already a part of every physical theory;
comp shows that the physical theories themselves are redundant. We
don't have to posit a primary physical world. We can explain it from
addition and multiplication, and doing so in the right way provides an
explanation of both quanta and qualia and their relations.
Another question arose in my mind about the discussion with Rex
Allen: the postulate that the world is Turing Emulable - as per your
not too thoroughly detailed response to me some time ago - would
refer to 'more than just the binary contraptions we presently use as
"Turing Machines" - but - maybe - a Universal Machine (Computer)
that covers all. This position would make the thing volatile:
meaning that the world is "emulable" by some construct that makes it
- well, emulable. (We know precious little about the (technical)
workings of the so called Universal Machine). In that case I would
write the name of Turing at least in lower case as a type: 'turing'
to eliminate the reference to the very invention of Alan Turing.
Careful. If comp is true, the world is NOT "turing-emulable". The
first person plural observations are NOT turing emulable. It is not
digitally emulable at all, nor is the first person (singular), whose
consciousness is related to "all computations at once" (see the UDA).
I use "universal machine" in the rather precise mathematical sense
given by Church, Turing, Post, etc. I try to make it clear by
explaining "Church thesis".
Also, I once used "turing" instead of "Turing", and this was counted as
a spelling mistake. Even adjectives, like Platonist or Loebian, need a
capital, I have been told (it is the contrary in French, where no
adjective takes a capital!). I understand your point, and it is more a
type than a person.
Best regards,
Bruno
http://iridia.ulb.ac.be/~marchal/
--
You received this message because you are subscribed to the Google Groups
"Everything List" group.