Re: Applied vs. Theoretical

2002-12-03 Thread Osher Doctorow
From Osher Doctorow [EMAIL PROTECTED], Tues. Dec. 3, 2002 1326

Tim May gives a very detailed account of his ideas on category and topos
theories, and I will only comment on a few of his ideas and some of Ben
Goertzel's, because of space and time limitations.

I think that Tim and I, and hopefully Ben, do not differ on the extreme
usefulness of being able to generalize concepts across many different fields
and subfields.  MacLane and Lawvere's category theory TRIED to do that, and
the effort is certainly commendable.   Perhaps it is more than commendable.
As one who seldom receives commendations, I may tend to give them less often
to others than I did when I started out in mathematics/science.

Nevertheless, I perceive or understand what Ben refers to as a certain lack
of deep results in category theory as compared with my Rare Event Theory for
example - although Ben understands it relative to his own experiences.  The
theorems that Tim has cited are one counterexample class to this, but where
are the great predictions, where is there anything like the Einstein Field
Equation, the Schrodinger Equation, Newton's Laws, Fermat's numerous
results, Maxwell's Equations, the Gauss-Bonnet Theorem and its associated
equation that ties together geometry and topology, Non-Euclidean/Riemannian
Geometry, Euclidean Geometry, the Jacobson radical, Gauss-Null and related
sets in geometric nonlinear functional analysis, Godel's theorems, or even
Hoyle's Law or the Central Limit Theorems or the almost incredible theorems
of Nonsmooth Analysis and Kalman filters/predictors and Dynamic Programming
and the Calculus of Variations and Cantor's cardinals and ordinals and
Robinson's infinitesimals and Dirac's equations and Dirac's delta functions
and Feynman's path history integrals and diagrams and the whole new
generation of continuum force laws and on and on.

Sure, category theory can go into many fields and find a category and then
take credit for the field being essentially a category, and I can go into
many fields and find plus and minus and division and multiplication analogs
and declare the field as an example of Rare Event Theory [RET] or Fairly
Frequent Event Theory [FFT or FET] or Very Frequent Event Theory [VFT or
VET] or a plus field or a minus field or a division field or a
multiplication field.   And both Category and RET-FET-VET theories can show
that many of their concepts cross many fields.   This is very commendable,
although to me it is old hat to notice that something like a generalization
of a group crosses many branches of mathematics, whereas RET-FET-VET classes
such as GROWTH, CONTROL, EXPANSION-CONTRACTION,
KNOWLEDGE-INFORMATION-ENTROPY tend to cross not only branches of mathematics
but branches of physics and biology and psychology and astrophysics and on
and on.

But string and brane theory are suffering from precisely what category
theory is suffering from - a paucity of predictions of the Einstein and
Schrodinger kind mentioned in the second paragraph back, and a paucity of
depth.  Now, Tim, you certainly know very, very much, but how are you at
depth?

I will give an example.  Socrates would rank in my estimation as a Creative
Genius of Maximum Depth.  The world of Athens was very superficial,
facially and bodily and publicly oriented but with relatively little depth,
and when push came to shove, rather than ask what words meant, it preferred
to kill the person making the inquiries.   What it was afraid of was going
deep, asking what the gods really were, why so-called democracy ended at the
boundaries of Athens and even was inapplicable to all people in Athens, what
democracy really was, why the individual and the group/humanity were not
equally important, when the Golden Mean and the Golden Extreme as I would
call it [for example, valuing Knowledge rather than compromising between
Knowledge and Ignorance] applied.

You mentioned, Tim, that the Holographic Model is still very hypothetical.
Are we to understand that G. 't Hooft obtained the Nobel Prize for a very
hypothetical idea, among others?   I have actually
generalized the Holographic Principle and it follows from RET-FET-VET
Theory.   But it happens to be an example of DEPTH of a type that Category
Theory does not know how to handle.  It says that LOWER DIMENSIONS CONTAIN
MORE KNOWLEDGE-INFORMATION THAN HIGHER DIMENSIONS - IN FACT, ALL OF IT, with
appropriate qualifications.

I will conclude this rather long posting with an explanation of why I think
Lawvere and MacLane and incidentally Smolin and Rovelli went in the wrong
direction regarding depth.  It was because they were ALGEBRAISTS - their
specialty and life's work in mathematics was ALGEBRA - very, very advanced
ALGEBRA.  Now, algebra has a problem with depth because IT HAS TOO MANY
ABSTRACT POSSIBILITIES WITH NO [MORE CONCRETE OR NOT] SELECTION CRITERIA
AMONG THEM.   It is somewhat

Re: Mathematics and the Structure of Reality

2002-12-03 Thread Osher Doctorow
From Osher Doctorow [EMAIL PROTECTED], Tues. Dec. 3, 2002 1601

Tim,

I quote first your comment early in your posting on my RET theory.

[TIM]
I don't think the world's nonacceptance of RET means it is on par
 with category theory, just because some here don't think much of it.


[OSHER]
Next, I quote your own apparent sensitivity to your belief that somebody
might be attacking you, from later in the same posting.

[TIM] On your second point, about how are you at depth?, I hope this
wasn't a cheap shot. Assuming it wasn't, I dig in to the areas that
interest me

[OSHER] What in the world are you talking about - what does depth have to do
with a cheap shot?  And what kind of a pun is it that some here not thinking
much of RET thereby does not put it on a par with category theory?  This
last sentence, if it is not a cheap shot, is definitely worthy of all the
scientific research that can be brought to bear on the typist of the
sentence.  First of all, there has been no discussion with me participating
in which somebody previously told me that they don't think much of RET
theory.  Second, in the same posting, you claim in effect not to understand
RET theory.  Those two sentences are approximately all you say in this
posting on RET theory.  Then you go on to generalize to quote some here not
thinking much of RET unquote, which gives the impression that there may be
more than you - and you have not even stated that you do not think much of
it, since you claim not to understand it.  Incidentally, RET is far easier
to explain than category theory [and all my explanations of it are easier -
that is the advantage of knowing fuzzy multivalued logics, which apparently
you do not], and so you have the distinction of understanding what is harder
to explain and not understanding what is easier to explain - a phenomenon
definitely worthy of the fullest research to which science can be put.

Still, your replies are worthy of commenting upon because they pioneer new
directions in errors but also give some interesting references as a
redeeming feature.  Your picture of Socrates is perhaps the funniest of
all your pictures, and that is especially interesting because you are
already in the non-fuzzy-multivalued-logic camp, so now you move into the
non-philosopher camp - which, believe it or not, is also where Smolin,
Rovelli, and MacLane are.  So you think that in essence Socrates was an
idiot, the citizens of Athens were heroes, and Plato was a hero, and Sir
Roger Penrose was a hero.  If you had the slightest background in philosophy
beyond philosophy 1 and 2, you would know that Plato wrote the biography of
Socrates - that nothing is known of Socrates beyond what his STUDENT Plato
wrote, and that Plato literally worshipped Socrates, and that Plato
described in detail how the horrible citizens of Athens destroyed Socrates
and forced him to take poison.  Then, toward the end of your posting, you
claim that this Athens was nothing like the Athens you studied - even though
you cite Plato as one of the great philosophers in effect.  One would have
to believe that you were there before Plato, somehow interposed between
Plato and his teacher Socrates, studying all this wisdom [from where, if I
may ask] and going around polling Athenians [certainly not examining the
psychology or culture or history of Athens - a poll would more likely
reflect the beliefs of conformists whether in science or politics].

Your master step, however, was your discovery that G. 't Hooft obtained the
Nobel Prize not from the Holographic Principle but from his work on the
electroweak force, contrary to my claim that he obtained it for the
Holographic Principle among other things.   Most Nobel Laureates in physics
obtain their prizes for the so-called sum total of their work, although
specific highlights are often mentioned and apparently may be the only ones
mentioned.  I assumed that 't Hooft's Holographic Principle was included.
In any case, if you think as you claim that the Holographic model is still
very hypothetical, you are out of line with what is one of the core concepts
of string, brane, supersymmetry, TQFT, loop theory, and on and on at the
present time.   Since you keep referring me to your references, glance
through the arXivs in physics and mathematics from 1997 through 2002.
Or read one of my explanations on one of the sites that I have cited in
previous postings.

I will conclude this posting with a summary of what I think your orientation
is.   You, and your apparently closest hero John Baez, are COMPUTER PEOPLE.
Computer people, in my 64 years of experience in life, almost always have
near-zero verbal ability and about 50 percent quantitative ability on a
scale of 0 to 100.   They have NO philosophical ability, which computers also
don't have, and their use of logic is confined to reading proofs of theorems
that have already been invented by somebody else and making stupid machines

Re: Applied vs. Theoretical

2002-12-01 Thread Osher Doctorow
From Osher Doctorow [EMAIL PROTECTED], Sunday Dec. 1, 2002 1243

Sorry for keeping prior messages in their entirety in my replies.

Let us consider the decision of category theory to use functors and
morphisms under composition and objects and commuting diagrams as their
fundamentals.  Because of the functor-operator-linear transformation and
similar properties, composition and its matrix analog multiplication
automatically take precedence over anything else, and of course so-called
matrix division when inverses are defined - that is to say, matrix inversion
and multiplication.

It was an airtight argument, it was foolproof by all that preceded it from
the time of the so-called Founding Fathers in mathematics and physics, and
it was wrong - well, wrong in a competitive sense with addition-subtraction
rather than multiplication-division.  There is, of course, nothing really
wrong with different models, and at some future time maybe the
multiplication-division model will yield more fruit than the
addition-subtraction models.   And, of course, each model uses the other
model secondarily to some extent - nobody excludes subtraction from the
usual categories or multiplication from the subtractive models.

What do I mean when I say it was relatively wrong, then, in the above
sense?

Consider the following subtraction-addition results - in fact, subtraction
period.

1. Discriminates the most important Lukasiewicz and Rational Pavelka fuzzy
multivalued logics from the other types which are divisive or identity in
their implications.
2. Discriminates the most important Rare Event Type [RET] or Logic-Based
Probability [LBP] which describes the expansion-contraction of the universe
as a whole, expansion of radiation from a source, biological growth,
contraction of galaxies, etc., from Bayesian and Independent
Probability-Statistics which are divisive/identity function/multiplicative.
3. Discriminates the proximity function across geometry-topology from the
distance-function/metric, noting that the proximity function is enormously
easier to use and results in simple expressions.

It sounds or reads nice, but the so-called topper or punch line to the story
is that ALL THREE subtractive items above have the form f(x, y) = 1 + y - x.
ALL THREE alternative division-multiplication forms have the form
f(x, y) = y/x or y or xy.

Category theory has ABSOLUTELY NOTHING to say about all this.

So where are division and multiplication mainly used?  It
turns out that they are used in medium to zero [probable] influence
situations, while subtraction is used in high to very high influence
situations.
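
In standard fuzzy-logic terms, the subtractive form above is the Lukasiewicz
implication, min(1, 1 + y - x), while the divisive alternative corresponds
to the Goguen (product) implication.  A minimal sketch contrasting the two
(the function names and the truncation-to-[0, 1] conventions are my own, for
illustration):

```python
def lukasiewicz_implication(x, y):
    # Subtractive form: 1 + y - x, truncated to lie in [0, 1]
    return min(1.0, 1.0 + y - x)

def goguen_implication(x, y):
    # Divisive form: y / x, truncated to [0, 1]; conventionally 1 when x == 0
    return 1.0 if x == 0 else min(1.0, y / x)

# When the premise is weaker than the conclusion, both forms cap at 1.0
print(lukasiewicz_implication(0.2, 0.9))  # 1.0
# When the premise exceeds the conclusion, the two forms disagree
print(lukasiewicz_implication(0.8, 0.3))
print(goguen_implication(0.8, 0.3))
```

Both functions agree on the boundary cases (full truth, full falsity) and
differ precisely in how they penalize a strong premise with a weak
conclusion, which is the contrast drawn above.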

Come to your own conclusions, so to speak.

Osher Doctorow


- Original Message -
From: Tim May [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Sent: Sunday, December 01, 2002 10:44 AM
Subject: Applied vs. Theoretical




Re: Alien science

2002-11-30 Thread Osher Doctorow
From Osher Doctorow [EMAIL PROTECTED], Sat. Nov. 30, 2002 1005

I agree generally with Tim May on mathematics and physics vs computers and
AI.   My most amusing example is something of a Jonathan Swift parody of all
four of these fields.   Gulliver lands on an island inhabited by
mathematicians, physicists, computer scientists/computer engineers, and AI
people, all competing.  He notices that they are all rushing ahead to greater
and greater complication and complexity and so on, and it occurs to him that
this might be their weak point.   Could they all have overlooked something
simple?   He discovers, as it so happens that I discovered
some 20 or so years ago, that they are all using division and multiplication
to formulate relationships involving influence and causation [I omit
calculus limits for those unfamiliar with them], and minimally using
subtraction and addition.   Gulliver then reformulates all of their theories
using subtraction and/or addition, and it turns out that all of the
resulting theories are completely different from the old ones.   Not one
person among all the island's so-called geniuses had come up with the very
tiny idea of changing from division/multiplication to subtraction/addition
in all of their work.   Upon presenting this fact to the gathered people of
the island, the people debate for a long time, and then decide that Gulliver
knows more, so they decide to drop their entire four fields and start all
over again much more slowly, this time not racing to greater complication
before analyzing the simple concepts that they are using.  The name Socrates
is mentioned as an example.   The predictions of several popular science
writers on the island such as Professor Kaku to the effect that computers
are going to almost literally swamp everything else are accordingly
considerably modified to say the least.

Osher Doctorow

- Original Message -
From: Tim May [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Sent: Saturday, November 30, 2002 2:37 PM
Subject: Alien science



 On Saturday, November 30, 2002, at 01:32  PM, Ben Goertzel wrote:
  ...
 
  I think this is certainly a plausible prediction of the future, but I
  see it
  as an unlikely one.
 
  I think that intelligent software programs will be brought into
  existence
  within the next 10-50 years, and that among other effects, this will
  cause a
  physics revolution.  Furthermore, it will be a revolution in a
  direction now
  wholly unanticipated.

 It will be interesting and exciting if you are right, but I think the
 kind of AI you describe above and below is further off than 10-30
 years, though perhaps not 50 years.

 
  Right now we analyze data about the microworld in a very crude way.
  For
  example, we scan Fermilab data for events -- but what about all the
  other
  data that isn't events but contains meaningful patterns?
 
  Create an AI mind whose sensors and actuators are quantum level, and
  allow
  it to form its own hypotheses, ideas, concepts, ontologies  Do you
  really think it's going to come up with anything as awkward and
  overcomplex
  as our current physics theories?

 I have no idea. True, it may come up with all sorts of weird theories.
 But, absent new experimental evidence, will these new theories actually
 tell us anything new?

 Your point about AIs exploring physics is an interesting one. And you
 are right that Egan has his AIs, his uploaded Orlandos and even his
 computer-produced Yatima, looking very much like humans. Not at all
 like the Entities of Vinge's Deepness, Zindell's Neverness, or
 Stephenson's Hyperion series. But let us imagine that an advanced AI
 were to be turned loose on a Newtonian world. I can well imagine that
 such an entity, left to its own devices, might come up with weird names
 for inertia, mass, friction, etc. Perhaps even synthetic combinations
 of what we take to be the basic vectors of classical mechanics. Instead
 of 3-space being so primal, phase spaces of 6, 18, and even many more
 dimensions would perhaps be more natural to such a mind. (Needless to
 say, given that today's best AI programs and computers are having a
 very hard time even doing naive physics, a la ThingLab and its
 descendants, I'm not expecting progress very quickly. And ThingLab is
 more than 20 years old now, so expecting massive breakthroughs in the
 next 10-20 years seems overly optimistic.)

 More importantly, would an AI version of classical physics, complete
 with incomprehensible (to us) phase spaces and n-categories and so on,
 including constructs with no known analogs in our current universe of
 discourse, would this version give any predictions which differ from
 our own? In short, would the AI's version of physics give us any new
 physics?

 My hunch is no. It might be better at solving some problems, just as
 the mental architecture of birds may give them much better abilities to
 solve certain kinds of 3D problems than we have had to evolve, and so

Re: Good summary of Bogdanov controversy

2002-11-10 Thread Osher Doctorow
From: Osher Doctorow [EMAIL PROTECTED], Sunday Nov. 10, 2002 5PM

Thanks to Tim May for the site reference.  I read the story, and it's quite
interesting.  It's the first time I've looked at this in detail, although I
heard a rumor about it.  I have a few comments that I'd like to make now.

1.  The acceptance of nonsense for publishing or Ph.D.s or M.A.s or M.S.s is
obviously wrong.
2.  The cause of the acceptance needs to be investigated by scientists and
philosophers and others.
3.  History tells us a few things about nonsense if we study it carefully,
especially the history of Creative Geniuses like Beethoven, Shakespeare,
Paul Dirac, Einstein, Schrodinger, Socrates, Plato, Mozart, etc.   I will
itemize these below beginning with 4, but I'll just mention that they fall
under Mediocrity, Ingenious Imitation, and Creative Genius.
4. Mediocre scientific people in my definition don't even have the ability
to imitate (see below).
5. Ingenious Imitators in science (and similarly for music, literature,
etc.) imitate other scientists but only go 0 or 1 step ahead of whomever
they are imitating.
6. Creative Geniuses go more than 1 step ahead of anybody else working on
the same or similar problem or anybody else in the field or subfield.
7. Having spent most of my 63 years of life in Academia, both as a student
and as a teacher/researcher in mathematics including statistics and
mathematical physics, it is my opinion that more than 99% of mathematicians
and physicists are Ingenious Imitators, and I have a strong suspicion that
this is the case in most other academic fields.
8. Peer review is the usual way of determining which papers are published in
scientific journals, and it follows from 7 if I am correct that most peer
reviewers are Ingenious Imitators, and therefore that what gets published in
most journals is at most one step ahead of the previous person (and possibly
0 steps ahead).
9. The solution to the problem of 8 and similar difficulties with Ph.D. and
Masters Degrees is in my opinion a positive one rather than a negative one,
namely, to foster more Creative Geniuses in Mathematics and Physics (and
other fields).
10. In my opinion, Ingenious Imitators can become Creative Geniuses with
sufficient education, tolerance, practice in accepting and thinking up new
ideas, learning tranquility rather than anger or fear, and guidance from
other Creative Geniuses or Creative Problem Solvers (a sort of borderline
type between Creative Genius and Ingenious Imitators, which I'll explain
another time hopefully).   Giving up Materialism, including Money-Related
Materialism, Power Materialism, and Sensation Materialism, which includes
giving up bureaucracy or the interest in becoming part of it, is key in
this.

Osher Doctorow, Ph.D.
One or more of California State Universities and Community Colleges
(Mathematics, Statistics)

- Original Message -
From: Tim May [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Sent: Sunday, November 10, 2002 12:44 PM
Subject: Good summary of Bogdanov controversy



 A good summary of the Bogdanov controversy is in the New York Times
 today. URL is

 http://www.nytimes.com/2002/11/09/arts/09PHYS.html

 Some of the folks we like to quote here are quoted in the article,
 including Lee Smolin, John Baez, Carlo Rovelli, etc.

 Also, the latest Wired print issue has a fairly good survey article
 by Kevin Kelly about theories of the universe as a cellular automaton.
 Konrad Zuse gets prominent mention, along with Ed Fredkin. I didn't
 read the article closely, so I didn't notice if either Tegmark or
 Schmidhuber were mentioned. The usual stuff about CA rules, Wolfram's
 book, etc.

 Things have been quiet here on the Everything list. I haven't been
 commenting on my own reading, which is from books such Physics Meets
 Philosophy at the Planck Scale and Entanglement. Isham's collection
 of essays on QM should arrive momentarily at my house. My interest
 continues to be in topos theory, modal logic, and quantum logic.



 --Tim May
 (.sig for Everything list background)
 Corralitos, CA. Born in 1951. Retired from Intel in 1986.
 Current main interest: category and topos theory, math, quantum
 reality, cosmology.
 Background: physics, Intel, crypto, Cypherpunks





Re: Good summary of Bogdanov controversy

2002-11-10 Thread Osher Doctorow
From: Osher Doctorow [EMAIL PROTECTED], Sunday Nov. 10, 2002 5:45PM

Duraid,

Well said!   I am very happy that some Australians have a sense of humor,
which I hadn't realized until now.   I know that British and Irish humor are
excellent.   USA humor varies between the mediocre and the sublime.

This reminds me of the last time that I wrote similarly about Creative
Genius on the internet to a forum of rather incompetent (mostly) teachers,
after which one teacher replied with a hysterical email accusing me of
implying that I am a Creative Genius and everybody else is ___ (expletive
deleted).   Her argument was that teachers are so dedicated and loving and
kind and generous and...etc., that to criticize them was tantamount to
blasphemy.   I hesitated to tell her (and I did not) that expletives deleted
as a way of life are more common among the Mediocre than other categories in
my opinion.  My wife, Marleen J. Doctorow, Ph.D., a licensed clinical
psychologist for over 30 years, would be very proud of me if she had any
time left after her patients.   Oops! Did I imply anything about her?  If
so, I withdraw my last sentence.  :  )

Osher

- Original Message -
From: Duraid Madina [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Cc: [EMAIL PROTECTED]
Sent: Sunday, November 10, 2002 5:39 PM
Subject: Re: Good summary of Bogdanov controversy



  1.  The acceptance of nonsense for publishing or Ph.D.s or M.A.s or
M.S.s is
  obviously wrong.
 
  4. Mediocre scientific people in my definition don't even have the
ability
  to imitate (see below).

 Why are you being so hard on yourself??


 Tongue firmly pressed against cheek,
 Duraid




Enormous Body of *Evidence* For Analysis-Based TOES

2002-09-23 Thread Osher Doctorow

From: Osher Doctorow [EMAIL PROTECTED], Mon. Sept. 23, 2002 12:32PM

I refer readers to http://www.superstringtheory.com/forum, especially to the
String - M Theory - Duality subforum of their Forum section (membership is
free, and archives are open to members, and many of my postings are in the
archives), during the last few days, in which I have provided literature
references and sites from very recent research mostly that puts Analysis via
determinants and/or negative exponentials at the forefront of science - not
merely the interesting Fredholm type determinants and the Slater
determinants, but determinant maximization in general (with constraints).
Fields crossed by these include quantum theory, general relativity,
information theory, communications theory, experimental design, system
identification, statistics as a whole, geometry, computer programming,
entropy, algorithms including path-finding algorithms
for convex optimization, etc.

Let me very briefly recapitulate why determinants are Analysis-based rather
than Algebra-based.   The expression 1 + y - x, which can be generalized to
c + y - x for arbitrary real constant c (or even to non-real expressions,
but that is another story) or simply written y - x with incorporation of c
into y or x, is continuous and CRITICAL to outgrowths of Analysis including
probability-statistics (for Rare Event scenarios), fuzzy multivalued logics
(see below for those who believe that logic is algebraic), proximity
functions, geometry-topology based on proximity functions.   Determinants
generalize y - x to a finite alternating series.   Alternating series in
general generalize determinants.   The same site (earlier postings) explains
why fuzzy multivalued logics and logics in general are Analysis-based rather
than Algebra-based, although many mathematical and non-mathematical
logicians unfamiliar with Analysis have believed otherwise.
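
The sense in which a determinant is a finite alternating series is the
Leibniz expansion, det(A) = sum over permutations s of sign(s) times the
product of A[i][s(i)]; for a 2x2 matrix this is literally the subtractive
pair ad - bc.  A small sketch of that expansion (my own illustration, not
from the posting):

```python
from itertools import permutations

def det_leibniz(m):
    # Determinant as a finite alternating series (Leibniz formula):
    # det(A) = sum over permutations s of sign(s) * prod_i A[i][s[i]]
    n = len(m)
    total = 0
    for perm in permutations(range(n)):
        # Sign of the permutation: (-1) raised to the number of inversions
        inversions = sum(1 for i in range(n) for j in range(i + 1, n)
                         if perm[i] > perm[j])
        prod = 1
        for row, col in enumerate(perm):
            prod *= m[row][col]
        total += (-1) ** inversions * prod
    return total

# For a 2x2 matrix this reduces to the alternating pair ad - bc
print(det_leibniz([[1, 2], [3, 4]]))  # -2
```

Each term in the sum alternates in sign with the parity of the permutation,
which is exactly the "finite alternating series" generalizing y - x.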

Osher Doctorow




Re: Tegmark's TOE &amp; Cantor's Absolute Infinity

2002-09-22 Thread Osher Doctorow

From: Osher Doctorow [EMAIL PROTECTED], Sat. Sept. 21, 2002 11:38PM

Hal,

Well said.   I really have to have more patience for questioners, but
mathematics and logic are such wonderful fields in my opinion that we need
to treasure them rather than throw them out like some of the Gung-Ho
computer people do who only recognize the finite and discrete and mechanical
(although they're rather embarrassed by quantum entanglement - but not
enough not to try to deal with it in their old plodding finite-discrete
way).

Mathematics and Physics are Allies, more or less equal.   I prefer not to
call the concepts of one inferior directly or to indirectly indicate
something of the sort, unless they really are contradictory or something
very, very, very close to that more or less.   As for a computer, maybe
someday it will be *all it can be*, but right now I have to quote a retired
Assistant Professor of Computers Emeritus at UCLA (believe it or not,
bureaucracy can create such a position - probably the same bureaucratic
mentality that created witch hunts and putting accused thieves' heads into
wooden blocks so that they could be flogged by passers-by in olden times),
who said: *Computers are basically stupid machines.*   We knew what he
meant.   They're very fast stupid machines, and sometimes we need speed,
like me getting away from the internet or I'll never get to sleep.

Osher Le Doctorow (*Old*)


- Original Message -
From: Hal Finney [EMAIL PROTECTED]
To: [EMAIL PROTECTED]; [EMAIL PROTECTED]
Sent: Saturday, September 21, 2002 7:18 PM
Subject: Re: Tegmark's TOE &amp; Cantor's Absolute Infinity


 Dave Raub asks:
  For those of you who are familiar with Max Tegmark's TOE, could someone
tell
  me whether Georg Cantor's  Absolute Infinity, Absolute Maximum or
Absolute
  Infinite Collections represent mathematical structures and, therefore
have
  physical existence.

 I don't know the answer to this, but let me try to answer an easier
 question which might shed some light.  That question is, is a Tegmarkian
 mathematical structure *defined* by an axiomatic formal system?  I got
 the ideas for this explanation from a recent discussion with Wei Dai.

 Russell Standish on this list has said that he does interpret Tegmark in
 this way.  A mathematical structure has an associated axiomatic system
 which essentially defines it.  For example, the Euclidean plane is defined
 by Euclid's axioms.  The integers are defined by the Peano axioms, and
 so on.  If we use this interpretation, that suggests that the Tegmark
 TOE is about the same as that of Schmidhuber, who uses an ensemble of
 all possible computer programs.  For each Tegmark mathematical structure
 there is an axiom system, and for each axiom system there is a computer
 program which finds its theorems.  And there is a similar mapping in the
 opposite direction, from Schmidhuber to Tegmark.  So this perspective
 gives us a unification of these two models.

 However we know that, by Godel's theorem, any axiomatization of a
 mathematical structure of at least moderate complexity is in some sense
 incomplete.  There are true theorems of that mathematical structure
 which cannot be proven by those axioms.  This is true of the integers,
 although not of plane geometry as that is too simple.

 This suggests that the axiom system is not a true definition of the
 mathematical structure.  There is more to the mathematical object than
 is captured by the axiom system.  So if we stick to an interpretation
 of Tegmark's TOE as being based on mathematical objects, we have to say
 that formal axiom systems are not the same.  Mathematical objects are
 more than their axioms.

 That doesn't mean that mathematical structures don't exist; axioms
 are just a tool to try to explore (part of) the mathematical object.
 The objects exist in their full complexity even though any given axiom
 system is incomplete.

 So I disagree with Russell on this point; I'd say that Tegmark's
 mathematical structures are more than axiom systems and therefore
 Tegmark's TOE is different from Schmidhuber's.

 I also think that this discussion suggests that the infinite sets and
 classes you are talking about do deserve to be considered mathematical
 structures in the Tegmark TOE.  But I don't know whether he would agree.

 Hal Finney





Re: Tegmark's TOE &amp; Cantor's Absolute Infinity

2002-09-21 Thread Osher Doctorow

From: Osher Doctorow [EMAIL PROTECTED], Sat. Sept. 21, 2002 10:39PM

I've glanced over one of Tegmark's papers and it didn't impress me much, but
maybe you've seen something that I didn't.

As for your question (have you ever been accused of being over-specific?),
the best thing for a person not familiar with Georg Cantor's work in my
opinion would be to read Garrett Birkhoff and Saunders MacLane's A Survey of
Modern Algebra or any comparable modern textbook in what's called Abstract
Algebra, Modern Algebra, Advanced Algebra, etc., or look under transfinite
numbers, Georg Cantor, the cardinality/ordinality of the continuum, etc.,
etc. on the internet or in your mathematics-engineering-physics research
library catalog or internet catalog.

To answer even more directly, here it is.   *Absolute infinity* if
translated into mathematics means the *size* of the real line or a finite
segment or half-infinite segment of the real line and things like that, and
it is UNCOUNTABLE, whereas the number of discrete integers, e.g., -1, 0, 1,
2, 3, ..., is called COUNTABLE.   If you accept a real line or a finite line
segment or a finite planar geometric figure like a circle or a 3-dimensional
geometric figure like a sphere as being *physical*, then *absolute infinity*
would be physical.   If you don't accept these as being physical, then you
can't throw them out either - if you did, you'd throw physics out.  So there
are *things* in mathematics that are related to physical things by
*approximation*, in the sense that a mathematical straight line approximates
the motion of a Euclidean particle in an uncurved universe or a region far
enough from other objects as to make little difference to the problem.
There are also many things in mathematics, including the words PATH and
CURVE and SURFACE, that also approximate physical dynamics.   Do you see
what the difficulty is with over-simplifying or slightly misstating the
question?
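The countable/uncountable distinction above can be made concrete with Cantor's diagonal construction. A minimal sketch in Python (the function name and the toy enumeration are mine, purely illustrative):

```python
def diagonal(rows):
    """Cantor's diagonal argument in miniature: given an enumeration of
    binary sequences (finite here, infinite in the real argument), build
    a sequence that differs from the n-th row at position n, so it
    cannot appear anywhere in the enumeration."""
    return [1 - rows[n][n] for n in range(len(rows))]

# A toy "enumeration" of binary sequences:
rows = [
    [0, 1, 0, 1],
    [1, 1, 0, 0],
    [0, 0, 0, 0],
    [1, 0, 1, 1],
]
d = diagonal(rows)  # differs from row n in column n, hence appears in no row
```

Because the same construction defeats every proposed enumeration, the binary sequences (and hence the points of the real line) cannot be counted off like the integers - which is the sense in which the continuum is UNCOUNTABLE.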

Osher Doctorow
- Original Message -
From: [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Cc: [EMAIL PROTECTED]
Sent: Saturday, September 21, 2002 6:59 PM
Subject: Tegmark's TOE & Cantor's Absolute Infinity


 For those of you who are familiar with Max Tegmark's TOE, could someone
tell
 me whether Georg Cantor's  Absolute Infinity, Absolute Maximum or
Absolute
 Infinite Collections represent mathematical structures and, therefore
have
 physical existence.

 Thanks again for the help!!

 Dave Raub





A New Start

2002-09-06 Thread Osher Doctorow

From: Osher Doctorow [EMAIL PROTECTED], Fri. Sept. 6, 2002 8:36AM

After my discouragement of yesterday, I have decided to give myself one more
chance to try to be compatible with everything-list.   I have just
downloaded J. Schmidhuber's *A computer scientist's view of life, the
universe, and everything,* (1997), and it is well enough written that I
apparently will be able to understand it.   I also have one of the other two
papers that were available on the everything-list site, which I understood
fairly well several weeks ago.

I will take this opportunity to write/type a few words about Knowledge (K
for short) rather than information (I), distinguishing between them as the
semantic (meaning) part of that whose syntactic part is information vs
information itself.

There are several directions in which I have developed K, but the simplest
way is to consider that K contains primitive pointlike or stringlike
elements which may have fuzzy truth values (on a scale between 0 and 1 for
the non-trivial cases) or probabilities or both assigned to them.  Let us
call these K-points for brevity.   Each K-point has MEANING, and I will
regard this as a primitive undefined concept in this presentation, although
one can develop things from several other viewpoints.   The word *MEANING*,
however, is to be used in practice rather similarly to its dictionary
definition(s) and intuitively is like ideas, thoughts, cognitions, provided
that they are accompanied by *understanding* rather than merely regarded as
sounds or sights or perceptions with nothing that can be specified behind
them.   I do not relate it here at all to the computational linguistics idea
of *meaning*.
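The pointlike elements just described can be sketched as a data structure. A minimal illustration (the class name, field names, and example meaning string are my own hypothetical choices, not part of any published formalism):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class KPoint:
    """A primitive pointlike Knowledge element: it carries a MEANING
    (treated as primitive and undefined, so stored only as a label here)
    plus an optional fuzzy truth value and/or probability, each on [0, 1]."""
    meaning: str
    truth: Optional[float] = None   # fuzzy truth value; non-trivial cases lie in (0, 1)
    prob: Optional[float] = None    # probability, if one is assigned

    def __post_init__(self):
        # Enforce the [0, 1] scale for whichever values are assigned.
        for v in (self.truth, self.prob):
            if v is not None and not 0.0 <= v <= 1.0:
                raise ValueError("truth values and probabilities must lie in [0, 1]")
```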

Knowledge (K) is regarded as continuous or piecewise continuous and
connected or piecewise connected, and could theoretically either increase or
decrease or remain constant, although in fact I postulate, somewhat
analogously to (though apparently not structurally related to) entropy in
thermodynamics, that K increases in time in the universe.   In fact, letting
E symbolize entropy, I postulate that the rate of increase of K in time
exceeds the rate of increase of E in time, symbolically:

(1)  Dt(K - E) > 0

where Dt is the partial derivative with respect to time, although I am open
to generalizing it to covariant or gauge derivatives and so on.   Equation
(1) has an interesting interpretation, namely, that instead of disorder
increasing overall in the universe with time, the ordered part of Knowledge
increases with time - possibly by matter converting to radiation in whole or
in part, or possibly by some other scenario.   Even the notion of a
radiation form of life is not excluded in these considerations - in fact, it
may be indicated.   It might be in some places combined with material form
of life, as in the human brain, where the global aspects may relate more to
radiation and the local aspects more to matter.   There is a well accepted
physical theory of the initial radiation-dominated era of the universe
succeeded by a matter-dominated era in which radiation still plays an
important part, and several theorists consider that a radiation type of era
will eventually constitute a third era.

Does this mean that digital computers do not do anything?   No.   They
calculate very fast.  They store discrete steps and discrete Knowledge
representations (or attempted representations via syntax) and discrete
syntax.   When they calculate very fast, they sometimes produce numerical
approximations to solutions of differential or integral equations which we
do not know how to produce otherwise, and this helps increase Knowledge,
although I think that it is qualitatively somewhat inferior to CAUSAL
KNOWLEDGE.   There is factual knowledge (details) about the real world,
there is causal knowledge about what causes or influences what in the real
world, and there is speculative or even fictional or fantasy *knowledge*
about things or events or processes that are not considered to be real or to
have real analogs in the real physical or even psychological worlds.
Digital computers can help factual knowledge, but so far they have not
helped causal knowledge much.
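The kind of fast numerical approximation meant here can be illustrated with the simplest such scheme, Euler's method; a sketch with my own choice of example equation, dy/dt = y, whose exact solution at t = 1 is e:

```python
import math

def euler(f, y0, t0, t1, steps):
    """Approximate the solution of dy/dt = f(t, y) by taking many small
    discrete steps - the discrete machine approximating the continuous."""
    h = (t1 - t0) / steps
    t, y = t0, y0
    for _ in range(steps):
        y += h * f(t, y)
        t += h
    return y

# dy/dt = y with y(0) = 1: the discrete approximation approaches math.e
# as the number of steps grows.
approx = euler(lambda t, y: y, 1.0, 0.0, 1.0, 100000)
```

The error shrinks in proportion to the step size, which is exactly the sense in which such discrete calculations approximate, but do not replace, the continuous solution.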

Does K (Knowledge) relate to multiple universes, multiple histories, etc.?
This is a more advanced question than I can deal with here.   I think that
multiple universes and multiple histories are interesting ideas, but that at
the present time their logical and physical and philosophical structures
have not been well established.   If they exist, then I have no doubt
intuitively that K applies to them as well.

Finally, the mathematical formulation of Causal Knowledge in my Rare Event
Theory (RET) resides in fuzzy multivalued logical implication x-->y or its
probability-statistics analogs or proximity function - geometry-topology
analogs.   There are 3 types of x-->y, which correspond respectively to
Rare Events (Lukasiewicz and Rational Pavelka fuzzy multivalued logics (FML)
in the non-trivial case), Fairly Frequent Events (Product

Serious *Mistake* by Schmidhuber

2002-09-06 Thread Osher Doctorow

From: Osher Doctorow [EMAIL PROTECTED], Fri. Sept. 6, 2002 11:45AM

I have read about half of J. Schmidhuber's *A computer scientist's view of
life, the universe, and everything,* (1997), and he has interesting ideas
and clarity of presentation, but I have to disagree with him on a number of
places where he uses conditional probability including his section
Generalization and Learning.   I hasten to add that I do not view
alternative theories as *wrong* but as competing and that they should almost
all survive for competition, motivation, and also because many of them turn
out to have useful contributions long after they have been regarded as
*discredited*.

Schmidhuber (S for short) concludes that generalization is impossible in
general by using a proof based on conditional probability, and similarly he
concludes that the learner's life in general is limited, also by a
conditional probability proof.  Most readers will undoubtedly stare at this
statement in bewilderment, since as far as they know nothing is wrong with
conditional probability.

They are partly correct and partly wrong.   Nothing is wrong with
conditional probability, which is the main tool of the Bayesian school (or
as I abbreviate it, the BCP or Bayesian Conditional Probability-Statistics
school), for Fairly Frequent Events.   For Rare Events, something very
strange happens.   This was how my wife Marleen and I began our exploration
of Rare Events in 1980.   Conditional probability divides two probabilities
and regards that as an indication of the probability of one event *given*
another event, where *given* is used in the sense of *freezing the other
event in place*.   Some real analysis experts will argue that this is all
justified by the Radon-Nikodym derivative of the Lebesgue-Radon-Nikodym
theorem(s), not quite realizing that the proofs of those theorems hold only
up to equivalence classes outside sets of measure ZERO.   But events of
probability zero are the Rarest Events.   Moreover, division of
probabilities blows up even in small (one-sided) neighborhoods of
probability 0 since division by 0 is impossible.   Thus, not only can
conditional probability not model events of probability 0, but it cannot
even model events of probability close to 0 (Rare Events).
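The blow-up being described is easy to exhibit numerically. A small sketch (the probability values are made-up illustrative numbers, not from the text):

```python
# Ratio form y/x (the shape of conditional probability) versus the
# Lukasiewicz form min(1, 1 + y - x), as the conditioning probability
# x shrinks toward 0 with a fixed small probability y.
y = 1e-6                            # made-up small probability
for x in (1e-1, 1e-3, 1e-5, 1e-7):
    ratio = y / x                   # exceeds 1 once x < y: no longer a probability
    lbp = min(1.0, 1.0 + y - x)     # always stays inside [0, 1]
    print(f"x={x:.0e}  y/x={ratio:.4g}  1+y-x={lbp:.7f}")
# At x = 0 exactly, y/x raises ZeroDivisionError,
# while 1 + y - x is still perfectly well defined.
```

The ratio form is unstable throughout the one-sided neighborhood of zero, not just at zero itself, which is the point being made about Rare Events.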

Is there a simple solution?   Yes!   Product/Goguen fuzzy multivalued
logical implication x-->y is defined as y/x for x not 0.   So it corresponds
to conditional probability where x and y are carefully chosen probabilities
in the probability-statistics analog.   Lukasiewicz and Rational Pavelka
fuzzy multivalued logical implications (Rational Pavelka is the predicate
logic generalization of Lukasiewicz propositional logic) are x-->y = 1 + y -
x for y <= x, the non-trivial case.   The latter does not involve
division by 0 and does not blow up in any (one-sided) neighborhood of zero.
Logic-Based Probability (LBP) uses precisely the same definition of 1 + y -
x in place of y/x for exactly the same probabilities x, y which BCP uses.
My wife and I introduced LBP in 1980.   It may be remarked here that the Godel
fuzzy multivalued logic, which we showed applies to Very Frequent (Very
Common) Events, uses x-->y = y and refers in the probability-statistics
analog to INDEPENDENT events, and since in general events are not
independent unless that can be established in special cases, LBP is the
correct result to use.
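The three implication operators just contrasted are standard in fuzzy logic and can be sketched directly (the function names are mine; each returns 1 in the trivial case x <= y):

```python
def lukasiewicz(x, y):
    """Lukasiewicz implication: 1 + y - x in the non-trivial case y <= x,
    else 1.  This is the form LBP uses in place of y/x."""
    return min(1.0, 1.0 + y - x)

def goguen(x, y):
    """Product/Goguen implication: y/x in the non-trivial case y < x
    (so x > 0), else 1.  Mirrors the shape of conditional probability."""
    return 1.0 if x <= y else y / x

def godel(x, y):
    """Godel implication: y in the non-trivial case y < x, else 1.
    Corresponds to INDEPENDENT events in the probability analog."""
    return 1.0 if x <= y else y
```

Note that lukasiewicz(x, y) is defined and bounded even at x = 0, whereas the ratio that goguen mirrors is exactly the form that fails for Rare Events.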

So when S claims that generalization is impossible in general and that the
learner's life is limited in general, he has to be referring to Fairly
Frequent Events, not Rare Events or even Very Frequent Events (which use the
Godel analog).

But surely that leaves much room for S to maneuver in?   In a way, yes, and
in a way, no.  S is very interested in the Great Programmer or even a
decreasing sequence of Great Programmers each delegating authority to the
other in different universes and so on.   The Great Programmer thinks on the
level of the Universe or All Universes or the particular Universe in the
sequence.   So we have to ask: which type of fuzzy multivalued logic or its
probability-statistics analog (or proximity function - geometry - topology
analog, which we developed as exact analogs of the above) most influences
the Universe(s)?

The answer turns out to be very simple, namely Lukasiewicz/Rational Pavelka
(Rare Event) or its probability-statistics analog LBP.   This is because in
our universe it is generally agreed that a Rare Event called a Big Bang
occurred (I have proven that even if it did not, as in Steinhardt-Turok and
Gott-Li cyclic or backward time loop cosmological theories, LBP is the key
influence probability), and that very rare events such as inflation, the
transition from radiation-dominated to matter-dominated eras, and the fairly
recent transition from a non-accelerating to an accelerating universe - that
all of these Rare Events played critical roles in the development of the
Universe.

I should also mention that Shannon Information-Entropy and its

Re: Serious *Mistake* by Schmidhuber

2002-09-06 Thread Osher Doctorow

From: Osher Doctorow [EMAIL PROTECTED], Fri. Sept. 6, 2002 6:17PM

Bill Jefferys says:

 Nonsense. It's done all the time for events of low probability.

If *doing something all the time* is your reply to nonsense, then can I
assume that not doing something is your reply to *sense*?   Ah well, the
subtleties of logic!

Do you really want to argue about division by 0 and near 0 denominator?
Why don't you think about it for a few days or weeks.   I would hate to see
you lose so easily.

Osher Doctorow


- Original Message -
From: Bill Jefferys [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Sent: Friday, September 06, 2002 2:19 PM
Subject: Re: Serious *Mistake* by Schmidhuber


 At 12:20 PM -0700 on 9/6/02, Osher Doctorow wrote:

 Thus, not only can
 conditional probability not model events of probability 0, but it cannot
 even model events of probability close to 0 (Rare Events).

 Nonsense. It's done all the time for events of low probability.

 Bill





Re: Schmidhuber II implies FTL communications

2002-09-05 Thread Osher Doctorow

From: Osher Doctorow [EMAIL PROTECTED], Thurs. Sept. 5, 2002 5:07PM

Wei Dai,

Good!   I will try to access the paper almost immediately.   I have long
been partial to FTL as a conjecture.   When Professor Nimtz of U.
Koln/Cologne came up with his results, or shortly thereafter, and
interpreted them favorably toward FTL, I emailed him, and he was kind enough
to send me copies of some of his papers by regular (*snail*) mail/post.

Some of the non-Analysis school have indicated here and on other forums that
the pendulum has swung too far away from algebra/arithmetic/number theory,
but the loop theorists like Smolin and Ashtekar and a number of people in
string/brane/duality theories who follow their leads, not to neglect the
MacLane/Lawvere Category theorists in mathematics and physics, actually
constitute an extremely large Mainstream today rather than a downtrodden
minority (although the Gauge Field Theorists still claim the *largest
Mainstream* title).   My tendency is to follow the least popular path in
science and in several other fields.   That was the way of life of Socrates,
and also of many of the greatest Creative Geniuses in history - including
Kurt Godel, who is still being berated by conformists shuddering at the
thought that there might be limitations as to what assumptions can lead to.

Osher Doctorow

- Original Message -
From: Wei Dai [EMAIL PROTECTED]
To: Russell Standish [EMAIL PROTECTED]
Cc: Hal Finney [EMAIL PROTECTED]; [EMAIL PROTECTED]
Sent: Thursday, September 05, 2002 4:50 PM
Subject: Schmidhuber II implies FTL communications


 On Mon, Sep 02, 2002 at 12:51:09PM +1000, Russell Standish wrote:
  This set of all descriptions is the Schmidhuber approach, although he
  later muddies the water a bit by postulating that this set is generated
  by a machine with resource constraints (we could call this Schmidhuber
  II :). This latter postulate has implications for the prior measure
  over descriptions, that are potentially measurable, however I'm not
  sure how one can separate these effects from the observer selection
 effects due to resource constraints of the observer.

 I just found a paper which shows that if apparent quantum randomness has
 low algorithmic complexity (as Schmidhuber II predicts), then FTL
 communications is possible.

 http://arxiv.org/abs/quant-ph/9806059

 Quantum Mechanics and Algorithmic Randomness
 Authors: Ulvi Yurtsever
 Comments: plain LaTeX, 11 pages
 Report-no: MSTR-9801

 A long sequence of tosses of a classical coin produces an apparently
 random bit string, but classical randomness is an illusion: the
 algorithmic information content of a classically-generated bit string lies
 almost entirely in the description of initial conditions. This letter
 presents a simple argument that, by contrast, a sequence of bits produced
 by tossing a quantum coin is, almost certainly, genuinely
 (algorithmically) random. This result can be interpreted as a
 strengthening of Bell's no-hidden-variables theorem, and relies on
 causality and quantum entanglement in a manner similar to Bell's original
 argument.





Re: Schmidhuber II implies FTL communications

2002-09-05 Thread Osher Doctorow

From: Osher Doctorow [EMAIL PROTECTED], Thurs. Sept. 5, 2002 5:43PM

I have accessed the paper by Yurtsever, and I want to mention that I have
been pursuing the algorithmic incompressibility thread on
[EMAIL PROTECTED] in connection with supersymmetric theories of
memory.   The reception there was partly one of interest from a member of
the Royal Statistical Society, but lately two members have complained about
(a) off-topic, and (b) too lengthy emails of mine.  This is definitely
progress toward the Socratic position, and I am encouraged.   :  )

I am very impressed by the algorithmic incompressibility viewpoint of
randomness, although I should point out that it is only one (but a very
good) viewpoint.   Now I will continue reading the Yurtsever paper.

Osher Doctorow




Page 2 of Yurtsever (relates to Schmidhuber II implies FTL communications)

2002-09-05 Thread Osher Doctorow

From: Osher Doctorow [EMAIL PROTECTED], Thurs. Sept. 5, 2002 6:17PM

I have now read page 2 of Yurtsever, having previously read page 1, and I must
confess that his style does not quite have the clarity of my style - his is
more like the clarity of Sigmund Freud's style  :  )   However, I am happy
to see that he recognized the role of Godel's incompleteness theorem on his
page 1.

On page 2, Yurtsever put the cart before the horse in a sense by telling us
what would happen if his theory turns out to be correct, but since he plans
to prove it in pages 3ff, he can be forgiven for that.   I notice in
connection with his last 2 paragraphs of page 2, which run over into the
first 2 paragraphs of page 3, that he seems to agree with Sir Roger Penrose
and me (independently - I have never met Sir Roger) that brain activity
cannot be faithfully simulated on a digital computer.  Sir Roger, by the
way, like me (I have been told) rather dislikes computers and does not (or
at least when last I heard about it) even answer email on computers.  I am
slightly different in that I both write and answer email, but I rather
dislike digital computers although I will defend to the death their right to
have their own opinions. :  ).  I have not yet decided about quantum
computers, analog computers, molecular computers, laser/light computers,
etc.   My argument about brain activity is far simpler than Sir Roger's - I
derive it from mathematical fuzzy multivalued logics and their
probability-statistics and proximity function-geometry-topology analogs,
which does not make use of randomness as incompressibility or even computer
randomness at all.

Speaking of randomness, I pointed out that incompressibility randomness is
only one interpretation of randomness.   To those of us who grew up and
spent at least half of our lives in the non-computer world (or at least, the
not heavily computerized world), probability and statistics vs computer
viewpoints are not quite the same thing.  When somebody in one of my
statistics classes tells me that something is random, I tend to be slightly
put off.   You see, everything is random in a sense in
probability-statistics.   Even the so-called non-random world is random;
only the probability of the random part is near or at zero - which,
strangely enough, does not mean impossible or the null set.

Let me clarify the latter.   The probability of an impossible event, like
the probability of the null set, is zero.   But uncountably many things
have probability zero.   In n-dimensional Euclidean space or even
spaces that are rather similar to it, any n-k dimensional subset (k = 1, 2,
3, ..., n - 1) has probability zero provided that a continuous random
variable has a distribution on that space or on a volume of space containing
the events in question.   The proof is the same as the corresponding proof
for Lebesgue measure.   Moreover, the same is true for time, not just space,
since an event that occurs at only one point in time has dimension 0 in
time, and so has dimension one less than the time dimension of 1, and so the
above result holds.   So in 3-dimensional Euclidean-like space or 3+1
Euclidean-like spacetime, points, strings, planes, plane figures or their
approximations (laminae), curves, lines, line segments, curve segments, and
2-dimensional surfaces of 3-dimensional objects (e.g., the surface of the
human brain, the surface of a human being, which is usually skin, the surface
of an organ, the surface of the earth, etc.) all have probability 0
under the rather general assumption that a continuous random variable has a
distribution on space(-time), e.g., the Gaussian/normal distribution.
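A quick Monte Carlo sketch of the point (the parameters and function name are mine): sample uniformly from the unit square and count hits within distance eps of the diagonal line y = x. The fraction shrinks roughly in proportion to eps, so the one-dimensional line itself (eps = 0) receives probability exactly zero under the continuous distribution.

```python
import random

def frac_near_diagonal(n, eps, seed=0):
    """Fraction of n uniform points in the unit square with |x - y| < eps,
    i.e. lying in a thin band around the lower-dimensional set y = x."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        x, y = rng.random(), rng.random()
        if abs(x - y) < eps:
            hits += 1
    return hits / n

# The exact band probability is 1 - (1 - eps)**2, roughly 2*eps for
# small eps; halving eps roughly halves the fraction, and eps = 0
# (the bare line, a 1-dimensional subset of the plane) gives 0.
```

This is the Lebesgue-measure argument in miniature: the line is a Rare Event, not an impossible one.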

The events at or near probability zero, and likewise processes of those
characteristics, are RARE EVENTS/PROCESSES (RARE EVENTS for short).

Now that I have started elaborating, I will conclude with one other note of
caution.  In what might look like an Old Testament prohibition, I should say
that *ALL IS NOT IN CONCATENATED STRINGS OF SYMBOLS.*   In fact, it might be
more accurate to say that almost nothing is in strings, but that might be
misunderstood, so I restrain myself.   In my theory, which I refer to as
Rare Event Theory (RET), I distinguish between SYNTAX and SEMANTICS.  Of
course, computer people do that too, and computational linguists.   But when
push comes to shove, they mostly regard information as SYNTAX.   In order
not to confuse myself with computer people or computational linguists, I
distinguish between information, which is syntactic, and KNOWLEDGE, which is
semantic in the usual dictionary sense of MEANING - what symbols and words
and propositions and sentences MEAN.   I am not at all sure that
incompressibility captures KNOWLEDGE so much as SYNTAX.   However, we will
let that pass for now, except for the slight detail that Knowledge, Memory,
and Rare Events appear to coincide - although part of it is a well-motivated
and well-indicated conjecture.   In any case, I will continue

Re: Schmidhuber II implies FTL communications

2002-09-05 Thread Osher Doctorow

From: Osher Doctorow [EMAIL PROTECTED], Thurs. Sept. 5, 2002 10:25PM

I don't know whether Hal Finney is right or wrong after reading pages 5-8 of
Yurtsever, since Yurtsever writes like David Deutsch and Julian Brown and so
many other members of the quantum entanglement school - no matter how many
words they put in, they always leave out interconnecting logic and physics.
Most mathematical psychology models have in the past been of this type,
believe it or not, which is probably why mathematical psychology is today
one of the most backward fields.  I think that, despite NASA's alleged use
of chaos avoidance in some satellite or missile, chaos theory is more or
less in the same boat.

I spent some time on an internet forum discussing David Deutsch's work some
time ago, and neither Deutsch nor his friends had the faintest idea what I
was talking about, and the feeling is mutual.   I used to think that
misunderstandings between scientists (including mathematicians) are not
usually deliberate, but I am beginning to even question that in reference to
quantum entanglement because such dogmatism and intolerance and lack of
spelling out steps characterizes the field.   And it's OK with some people,
because they've been doing that as a way of life with less complicated stuff
and getting away with it!

If nothing else, entanglement as a continuous or connected process/event
can't be as easily faked or double-talked as entanglement as a bunch of
discrete steps.   Unless somebody has some comments to make about my work,
much of which is at http://www.superstringtheory.com/forum, I'll go back to
the forum where I can continue my continuous or piecewise continuous
approach.  Actually, they can reach me at [EMAIL PROTECTED] if they have
any useful comments.

Osher Doctorow

- Original Message -
From: Hal Finney [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Sent: Thursday, September 05, 2002 7:32 PM
Subject: Re: Schmidhuber II implies FTL communications


 Wei writes:
  I just found a paper which shows that if apparent quantum randomness has
  low algorithmic complexity (as Schmidhuber II predicts), then FTL
  communications is possible.
 
  http://arxiv.org/abs/quant-ph/9806059

 This was an interesting paper but unfortunately the key point seemed
 to pass by without proof.  On page 5, the proposal is to use entangled
 particles to try to send a signal by measuring at one end in a sequence
 of different bases which are chosen by an algorithmically incompressible
 mechanism.  The assumption is that this will result in an algorithmically
 incompressible set of results at both ends, in contrast to the state
 where stable measurements are done, which we assume for the purpose of
 the paper produces algorithmically compressible results.

 The author writes: This process of scrambling with the random template T
 guarantees that Bob's modified N-bit long string of quantum measurements
 is almost surely p-incompressible..., and that Alice's corresponding
 string (which is now different from Bob's) is also (almost surely)
 p-incompressible

 It's not clear to me that this follows.  Why couldn't Bob's measurement
 results, when using a randomly chosen set of bases, still have a
 compressible structure?  And why couldn't Alice's?

 Also, does this result depend on the choice of an unbalanced system
 with alpha and beta different from 1/2?  This short description of
 the signalling process doesn't seem to refer explicitly to special
 alpha/beta values.

 If not, could the procedure be as simple as choosing to measure in
 the X vs + bases, as is often done in quantum crypto protocols?  If we
 choose between X and + using an algorithmically incompressible method,
 will that guarantee that the measured values are also incompressible?

 Hal Finney





Re: Time as a Lattice of Partially-Ordered Causal Events or Moments

2002-09-03 Thread Osher Doctorow

From: Osher Doctorow [EMAIL PROTECTED], Tues. Sept. 3, 2002 8:26AM

It also depends on the logic that one chooses (e.g., Lukasiewicz/Rational
Pavelka and Product/Goguen and Godel fuzzy multivalued logics - see P. Hajek,
Metamathematics of Fuzzy Logics, Kluwer: Dordrecht 1998, for an excellent
exposition except for his mediocre probability section).   See my
contributions to http://www.superstringtheory.com/forum, especially to the
String - M Theory - Duality subforum of their forum section (most of which
is archived, but membership is free, and archives are accessible to
members).   Or my paper in B. N. Kursunuglu et al (Eds.) Quantum Gravity,
Generalized Theory of Gravitation, and Superstring Theory-Based Unification,
Kluwer Academic: N.Y. 2000, 89-97, which has some further references to my
earlier work.

Analysis including nonsmooth analysis does combine the discrete and the
connected/continuous, but in my opinion it generally regards the discrete as
an approximation to the continuous/connected or piecewise
continuous/piecewise connected (pathwise, etc.).

One confusing point, I think, is the tendency of many mathematical logicians
to identify with algebra and in fact to claim that their field is a branch
or outgrowth of algebra.   This was originally claimed by *Clifford
Algebra,* but Clifford himself and many of his wisest descendants/followers
such as Hestenes of Arizona State U. realized that the opposite is true -
*Clifford Analysis,* *Spacetime Algebra,* and so on are typical terminology
used by the latter and others to indicate that they are really dealing with
analysis and geometry and related things.  Why do so many mathematical
logicians identify with algebra?   Largely, in my opinion, because algebra
is much more mainstream-accepted than mathematical logic (and popular, and
respected, etc.), but also because algebra is abstract and mathematical
logic seems to many of its practitioners to be more abstract than concrete.
I have cautioned in various places that even in pure mathematics there needs
to be a balance between abstractness/abstraction and concreteness/physical
application.  Analysis historically has had much more of this balance (rough
equality of abstraction and concreteness).

There are also many built-in biases in mathematical and theoretical physics,
and one of them in my opinion is the bias toward dissolution of geometry at
the sub-Planck level.Part of this is the pre-quantum computer bias
toward the discrete and finite or at most countably infinite and the digital
vs analog computer bias (in favor of digital computers).   The real line and
real line segments of course are uncountably infinite and connected, and
there would essentially be no applied mathematics or mathematical physics,
for example, without them - and not much pure mathematics either.   It helps
to occasionally look back in mathematical history, especially to Georg
Cantor's Contributions to the Founding of the Theory of Transfinite Numbers,
which even Birkhoff and
MacLane in their algebra textbooks made sure to include.   Of course,
Birkhoff ended up in applied differential equations and hydrodynamics
largely, but MacLane has never been accused of being Analysis-inclined to my
knowledge, and Birkhoff started out at least algebraic.

I hope that we can resist the temptation to go into absolutes.   I am glad
to see that you started your reply with a tolerant and compromising tone,
and I will end this posting with a similar tone.  The discrete and the
connected are in my opinion different theories or parts of different
theories overall, and they are also parts of different interpretations.   My
view is that science progresses by tolerating different theories and
different interpretations for competition and because many supposedly wrong
theories or interpretations end up much later having something useful to
contribute.   The majority of scientists (the *Mainstream* as I refer to
them) do not subscribe to this view, but consider that science advances in a
spiral by *killing off* the wrong theories or by only generalizing
(including generalizing in the limit) the partly correct theories - the Law
of the Jungle viewpoint of competition by (intellectual) warfare or
*cannibalistic* absorption as opposed to the Competing Teams idea of
competition in which one keeps other teams alive in order to keep competing
and for motivation and ultimately because one respects them and regards them
as like oneself trying to achieve the *Impossible Dream*.   You may find my
contributions to math-history (see the Math Forum and epigone sites) to be
interesting in this regard.

Osher Doctorow



- Original Message -
From: Tim May [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Sent: Monday, September 02, 2002 11:07 PM
Subject: Re: Time as a Lattice of Partially-Ordered Causal Events or Moments



 On Monday, September 2, 2002, at 09:22  PM, Osher Doctorow wrote:

  From: Osher Doctorow [EMAIL PROTECTED], Mon. Sept. 2, 2002 9:29PM
 
  It is good to hear

Re: Time

2002-08-31 Thread Osher Doctorow

From: Osher Doctorow [EMAIL PROTECTED], Sat. Aug. 31, 2002 9:52PM

Hal Finney, John Mikes, and the others on the parts of this thread that I
have read have contributed some interesting ideas and questions.

I have not read the *time* articles in Scientific American, but I would like
to put in a good word for at least the principle of inter-translating
between quantitative and verbal languages, including the question of what to
do about *rough* translations.  Many quantitative people are very hesitant
to publish in or contribute to the *popular* journals and literature
including books and public-directed internet because they seem to feel that
they would lose something important (*rigor*) in the translation and that
their audience won't amount to *research material* anyway.I've taught
mathematics/statistics and done research in mathematical physics and
mathematical modeling at the college level (and occasionally Secondary
levels) since the 1970s (I'm 63 years old), and I'm of the opinion that
Creative Geniuses of the Leonardo Da Vinci and Pierre De Fermat level were
verbal-quantitative geniuses (they happened to also be several hundred years
ahead of their times, which is not quite true of many Nobel Prize winners).
I would also say that Kurt Godel, Paul Dirac, Steven Weinberg (well, until
recently anyway), Lord Francis Bacon, Shakespeare, Bach, Beethoven, Mozart,
Vivaldi, Cervantes, Erwin Schrodinger, Einstein, G. 't Hooft, and Socrates
especially reveal strong verbal-quantitative Creative Genius abilities and
skills.   In the process of translating back and forth between verbal and
quantitative modalities, one stimulates associations and memories and ideas
in both types of memory storage, and they seem to influence each other into
further combinations and ideas.   Largely because of this, I consider that
it is better to translate *roughly* than not at all, and that if the main
idea is conveyed, the details can wait to some extent for later if ever -
the main ideas may well contribute to someone's Creativity.  Scientific
American and also various internet forums and discussion groups have done
that mostly, and I like to point out that good side to them.

I also think that the tendency to label *time* schools by individuals' names
would better be changed to describing time schools by brief labels as to
what they do.   For example, *computer-time* versus *no special
time-orientation* hardly seems a basis for categorizing time, although they
could well contribute to some other categorization of time or something
else.   For myself, I think discrete versus continuous time and spacelike vs
non-spacelike (in the sense of the 3+1 labellings vs the 4-dimensional ideas
which just regard time as another spacelike axis or dimension) are more
useful.   Of course, it is interesting to ask how computers relate to time -
and I think that we will eventually have to tackle the question of how
quantum computers and analog computers for example differ from digital
computers on this question.

Osher Doctorow Ph.D.
One or More of California State Universities and Community Colleges
- Original Message -
From: Hal Finney [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Sent: Saturday, August 31, 2002 2:25 PM
Subject: Re: Time


 John Mikes writes:
  would it be too strenuous to briefly (and understandably???)
  summarize a position on time which is in the 'spirit' of the
  'spirited' members of this list?

 It seems to me that there are two views of time which we have considered,
 which I would classify as the Schmidhuber and the Tegmark approaches.
 In the Schmidhuber view time is of fundamental importance, and in the
 Tegmark view it is basically unimportant.

 Schmidhuber models the multiverse as the output of a computational
 process operating on all possible programs.  Since computation is
 inherently sequential, it imposes a time ordering on the output.  It is
 natural to identify the time ordering of a computation with the time
 ordering of events in our universe.  So the simplest interpretation of
 the Schmidhuber model as an explanation of our universe is to picture
 the computer as generating successive instants of time as it operates.
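
 A minimal sketch of that picture, under my own assumptions rather than
 anything Schmidhuber specifies: a single loop dovetails steps across
 many "programs" (toy counters here, not a real universal machine), and
 the global step counter plays the role of time, giving every output
 event a definite position in one sequential order.

 ```python
 def dovetail(num_programs, rounds):
     """Interleave steps of several toy programs.

     Yields (t, i, state): the global step count t, the program index i,
     and that program's state after its step.  The toy "program" i just
     advances its counter by i + 1 each step (a hypothetical stand-in).
     """
     states = [0] * num_programs
     t = 0
     for _ in range(rounds):
         for i in range(num_programs):
             states[i] += i + 1          # one "step" of program i
             yield (t, i, states[i])     # event stamped with global time t
             t += 1

 # The global counter t totally orders every event of every program:
 # this is the sense in which sequential computation imposes a time.
 events = list(dovetail(num_programs=3, rounds=2))
 print(events)
 ```

 The point of the sketch is only that the ordering is *total*: any two
 events, even from unrelated programs, are comparable by t, which is
 exactly what relativity (below) calls into question.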

 An obvious problem with this is that time appears to have a more
 complex structure in our universe than in the classical Newtonian
 block model.  Special relativity teaches us that simultaneity is not
 well defined.  And general relativity even introduces the theoretical
 possibility of time loops and other complex temporal topologies.  It
 is hard to see how a simple interpretation of Schmidhuber computation
 could incorporate these details.
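
 A toy illustration of why simultaneity fails (my own example, not from
 the thread, with hypothetical event coordinates and units where c = 1):
 two events are causally ordered only if a light-speed signal could
 connect them, so causal order is a *partial* order, and spacelike-
 separated events are simply incomparable rather than simultaneous.

 ```python
 def causally_precedes(a, b):
     """True if event a = (t, x) lies in the past light cone of b.

     With c = 1, a can influence b only if b is later and no farther
     away in space than light could travel in the elapsed time.
     """
     (ta, xa), (tb, xb) = a, b
     return tb > ta and abs(xb - xa) <= (tb - ta)

 origin = (0.0, 0.0)
 later_near = (2.0, 1.0)   # inside the future light cone of origin
 far_away = (1.0, 5.0)     # spacelike-separated from origin

 print(causally_precedes(origin, later_near))   # causally ordered
 print(causally_precedes(origin, far_away))     # not ordered this way...
 print(causally_precedes(far_away, origin))     # ...nor the other way
 ```

 The last two checks both fail: origin and far_away have no causal
 order at all, which is the lattice-of-partially-ordered-events picture
 from the thread subject, and exactly what a single sequential
 computation's total time ordering cannot directly reproduce.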

 Stephen Wolfram considers some related issues in his book, A New Kind
 of Science.  He is trying to come up with a simple computational model
 of our universe (not of the multiverse, but the same issues arise).
 In order to deal with special relativity he shows how a certain kind of
 computational network can have consistent causality even