On Sat, Sep 11, 2021 at 9:01 AM Ben Goertzel via AGI <[email protected]>
wrote:

> James,
>
> ***
> My assertion is that the notion of "type" is rescued by the notion of
> "unit" and that "abstract type" is rescued by the notion of
> "dimension" within the relational paradigm. That this might be the
> case should be no surprise as the natural sciences (particularly
> physics) most rigorously address "the empirical world".
>
> ***
>
> I don't get what you mean by '"abstract type" is rescued by the
> notion of "dimension" within the relational paradigm'?
>
> Take e.g. a complex dependent type as expressed in Idris, or a
> probabilistic type as expressed in
>
> https://arxiv.org/abs/1602.06420


My point is that *all* "type theories", as currently conceived, are
ill-founded because they aren't grounded in the requirement that they
express an arithmetic of *observation* which, in multiplicity, becomes
quantities called *measurements* represented as *numbers* that carry with
them *dimensions* -- what Russell was trying to get at with "relation
numbers" -- grounded in the relational dimensions of the empirical world.
Physics has had to explore this *metaphysics* for the obvious reason that
the axioms of physical theory are observations -- these observations
qualify as axioms because they are the "givens" of the formal systems of
physics.  There is something profound about this reversal.  By "reversal"
I mean we ordinarily think of "induction" as producing "axioms" from
observations so as to produce a computer program (the Kolmogorov
Complexity program).  This reversal relates to the way dynamical systems
theories provide time-reversible laws of nature via complex-valued
functions.

Taking this from another angle:  In model theory terminology, there must be
an *interpretation* for any *meaningful* "type theory" -- where "meaningful
interpretation" is what model theory is (supposedly) all about:  a model.
I say "supposedly" because what we usually see in model theory is
ungrounded interpretations -- interpreting one theory in terms of another
theory.  All this beating-around-the-bush provides employment for
mathematicians but gets us nowhere.

> How are these rescued or reformulated or ?? as dimensions?

Think about it like this:

A test suite for any formal system purporting relevance requires that
dimensionless numbers be a special case of the default case: dimensioned
numbers, aka measurements.  Moreover, its arithmetic must do inherent
"type checking" by asking whether two numbers being added/subtracted have
the same dimensions (and conserve their case counts), and whether any
multiplication/division results in the expected addition/subtraction of
their dimensional exponents.
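A minimal sketch of such a test suite's core, with dimensions carried as
exponent sets (all class and function names here are illustrative, not
from any existing library):

```python
# Sketch of dimensioned arithmetic: addition demands identical dimensions,
# multiplication adds dimensional exponents. Names are illustrative.
from dataclasses import dataclass


@dataclass(frozen=True)
class Measurement:
    value: float
    dims: frozenset  # frozenset of (dimension, exponent) pairs

    def __add__(self, other):
        # Inherent "type checking": addition requires identical dimensions.
        if self.dims != other.dims:
            raise TypeError("cannot add measurements with different dimensions")
        return Measurement(self.value + other.value, self.dims)

    def __mul__(self, other):
        # Multiplication adds dimensional exponents.
        d = dict(self.dims)
        for name, exp in other.dims:
            d[name] = d.get(name, 0) + exp
        # Exponents that cancel to zero drop out (the dimensionless case).
        return Measurement(self.value * other.value,
                           frozenset((k, v) for k, v in d.items() if v != 0))


def measure(value, **dims):
    return Measurement(value, frozenset(dims.items()))


def inv(m):
    # Reciprocal: negate every dimensional exponent.
    return Measurement(1.0 / m.value, frozenset((k, -v) for k, v in m.dims))


distance = measure(10.0, length=1)
time = measure(2.0, time=1)
velocity = distance * inv(time)  # length^(+1) * time^(-1)
```

Dimensionless numbers then fall out as exactly the special case where the
exponent set is empty.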

> I have noted that the 4 truth values identified by Kauffmann/Collins
>
> https://arxiv.org/abs/1905.12891


Yes, that approach is getting closer to the metaphysics I'm talking about.


> map into Patterson's Constructible Duality (paraconsistent) logic's
> truth values, and that logic maps into a pair of Heyting algebras
> which means that expressions in CD logic correspond to pairs of
> programs in dependently typed languages without continuation.   In
> this sense LoF expressions are isomorphic to pairs of dependently
> typed expressions, but I'm not sure who is rescuing whom from what ;)
>

Again, appealing to my notion of a formal system's "test suite":

Such a test suite -- such a "set of requirements" -- for a formal system
"rescues" the formal system from mental masturbation in the same way that a
programmer's test suite or set of requirements rescues the programmer.


>
> ben
>
> On Fri, Sep 10, 2021 at 8:39 AM James Bowery <[email protected]> wrote:
> >
> > Any approach to AGI needs a mathematical metaphysics.  The most
> widely-accepted such metaphysics at present is the Turing Machine's
> foundation for Algorithmic Information Theory.  While some Theories of
> Everything argue against the implied causal structure's unidirectional
> "arrow-of-time" it is not unreasonable to attempt to elaborate the Turing
> Machine approach to AGI.  In that elaboration, what might be thought of as
> a "standard library" must be constructed.  In so-doing, "the relational
> dimensions of the empirical world", represented as "number" with implied
> dimensionality must fall out quite early and naturally as applicable to
> physical dimensions (length, mass, etc.) or we are on the wrong track.  In
> what follows I summarize a lifetime of professional support of, if not my
> own work on programming languages toward this end.  This is not a complete
> picture, as the metaphysical assumption about time's arrow, implied by the
> Turing machine, precludes modeling the deeper relational structures from
> which time itself may yet emerge.  Geometric algebra may yet place
> Turing's metaphysical assumption about time in its proper perspective as
> emergent from a mathematical metaphysics "standard library" that
> better-compresses our empirical observations.  In that process of
> philosophical discovery, I fully expect that AIT's other-half, sequential
> decision theory, will find its utility function specified and, by
> implication, provide mathematical structures for "awareness", "qualia",
> etc., if not "consciousness".
> >
> > First I'm going to make a few radical assertions:
> >
> > A real-world relation is best-regarded as a random variable.  Think of
> measurement.  This is consistent with SQL's default allowance of duplicate
> rows in an extension.  These duplicate-row counts represent the probability
> distribution of the random variable.  Each relationship (row) in an
> extension is, therefore, best thought of as a single measurement, or case.
> The duplicate row counts are therefore case-counts.  A probability density
> function results from simply dividing each case's count by the total counts
> in the relation.
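To make the duplicate-rows-as-case-counts reading concrete, a small
sketch (the extension and its values are invented for illustration):

```python
# Duplicate rows as case counts: a toy "extension" of (time, distance)
# measurements, with one repeated row standing for a repeated measurement.
from collections import Counter

extension = [
    (1.0, 5.0),
    (1.0, 5.0),   # duplicate row: the same case measured twice
    (2.0, 9.0),
    (2.0, 11.0),
]

case_counts = Counter(extension)          # duplicate-row (case) counts
total = sum(case_counts.values())

# Each row's probability is its case count divided by the total count.
pdf = {row: n / total for row, n in case_counts.items()}
```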
> > The properties of a measurement (say, time and distance) are the
> dimensions of the measurement and these correspond to the columns of the
> extension.
> > Any measurement can be thought of as a low-dimensional selected
> projection of the empirical world: the universe. The universal extension
> has a single row -- a row with as many dimensions as the entire history of
> the universe has properties:  We might call this row "That which is The
> Case."
> >
> >
> >
> > Now, accepting all of that (which philosophers may well argue against --
> particularly if they don't like Descartes, etc.):
> >
> > In order for the random variable to have meaning, its dimensions must
> have counts, just as do its duplicate rows.  For instance, we might think
> of a relation whose composite dimension is velocity, with columns: time and
> distance.  Although there might be meaning to a physical dimension of
> time*distance (time^(+1) * distance^(+1)) that is not the physical
> dimension we call "velocity".  To obtain a velocity relation, we need
> distance/time which is time^(-1) * distance^(+1).  Note that these terms
> commute because multiplication (like 'and') commutes.  Column order is
> meaningless, just as is row order meaningless since addition (like 'or')
> also commutes.
> >
> > Now consider the relational dimension of energy where we join the
> velocity relation with a mass relation and assign column counts thus:
> time^(-2) * distance^(+2) * mass^(+1).
> >
> > Note that thus far, I have not talked about "units", nor of "types".
> First a down-to-earth comment about "units":  It is important to regard
> "units" as I/O formats (or "representations") with isomorphic
> transformations between them (1:1 correspondence between a distance
> measurement in inches and one in feet).  Second is a more philosophical
> comment about "types" vs "dimensions" that gets to the heart of what I
> believe is a huge mistake in the foundation of computer science dating to
> Russell and Whitehead's Principia Mathematica:
> >
> > PM's type theory (and elaborations/variations thereof) is the current
> foundation of computer science.  Russell used it to develop Relation
> Arithmetic.  In "My Philosophical Development", of Principia Mathematica
> Part IV "Relation Arithmetic", Bertrand Russell laments:
> >
> > "I think relation-arithmetic important, not only as an interesting
> generalization, but because it supplies a symbolic technique required for
> dealing with structure. It has seemed to me that those who are not familiar
> with mathematical logic find great difficulty in understanding what is
> meant by 'structure', and, owing to this difficulty, are apt to go astray
> in attempting to understand the empirical world [emphasis JAB]. For this
> reason, if for no other, I am sorry that the theory of relation-arithmetic
> has been largely unnoticed."
> >
> >
> >
> > However, the ultimate project of Principia Mathematica was directed at
> "the empirical world" in the conclusion of PM: Part VI "Quantity".
> "Quantity" consists of 3 sections the last of which, section "C", is about
> "Measurement" in terms of a generalization of the concept of number
> (section "A"), to include units of measurement (mass, length, time, etc.)
> as commensurable (dimensioned) quantities ("B" "Vector-Families").
> >
> > Yet, other than *314:
> >
> > "Relational real numbers are useful in applying measurement by means of
> real numbers to vector-families, since it is convenient to have real
> numbers of the same type as ratios."
> >
> > I see nothing in Part VI that references anything like "relation
> numbers" as defined in Part IV.
> >
> > Before I get into a resolution strategy, I want to add one final issue
> that is key to understanding relational structure in terms of measurement:
> >
> > Any value that we assign to a cell in a table has what is called
> "measurement error".  Note, I'm talking here not of a relation (table) nor
> of a relationship (row), but of a relatum (cell value) of that
> relationship.  Take, for instance, a table of velocities with time and
> distance columns.  Each case (row, or relationship between measured
> properties) has two measurements for that case: a measured distance and a
> measured time.  What we call "measurement error" is an estimate of the
> probability distribution that would prospectively obtain with repeated
> measurements of the same conditions.  In other words, assigning measurement
> error, or "fuzziness", is best thought of as imputing missing data -- those
> prospective measurements just mentioned.  In any rigorous attempt to deal
> with the fuzziness of the real world, it is important to keep in mind the
> relational structure of the measurements so that propagation of measurement
> error is understood in terms of relational composition (aka 'JOIN' to use
> database jargon).
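One way to make "imputing missing data" concrete: represent each fuzzy
measurement by sampled prospective repeat measurements and let the
composition (here velocity = distance/time) propagate the error.  A Monte
Carlo sketch with invented values:

```python
# Measurement error as imputed prospective repeat measurements, propagated
# through relational composition. All means and deviations are illustrative.
import random

random.seed(0)


def fuzzy(mean, sd, n=10_000):
    # Impute n prospective repeat measurements of the same conditions.
    return [random.gauss(mean, sd) for _ in range(n)]


distances = fuzzy(10.0, 0.1)   # measured distance with its error
times = fuzzy(2.0, 0.05)       # measured time with its error

# Composing the two fuzzy measurements propagates the error into velocity.
velocities = [d / t for d, t in zip(distances, times)]
mean_v = sum(velocities) / len(velocities)
```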
> >
> > Now to proceed to the resolution strategy:
> >
> > Late in Russell's life he admitted he regretted Type Theory, stating it
> was the most arbitrary thing he and Whitehead did and that it was more of a
> stopgap than a theory.
> >
> > As it turns out, Russell admitted this because he was relieved and
> delighted he lived long enough to see the matter resolved in the late 1960s
> book titled "The Laws of Form" by G. Spencer Brown.  The resolution was to
> include what logicians think of as "paradox" as a, if not the, primary
> foundation of mathematical logic:
> >
> > Russell's Paradox (The set of all sets that don't contain themselves as
> members.) which motivated PM's Type Theory, is only one form of this
> protean "paradox".  The most Laconian form is:
> >
> > "This sentence is false."
> >
> > The resolution provided in GSB's LoF was to introduce the square
> root of -1 as primary in mathematical logic.  This is otherwise known as
> the imaginary identity 'i' found throughout all of dynamical systems
> theory.  Dynamical systems are about changes.  In relational database
> terms, these are updates.  Relational updates are addition and subtraction
> of rows.
> >
> > Under the notion of row-as-relationships-as-case, subtraction entails
> negative case counts.
> >
> > Interestingly, negative case counts permit the emergence of something
> called Link Theory, which Paul Allen's think tank Interval Research
> supported until its demise, at which point I supported it at HP's "Internet
> Chapter II" project aka "eSpeak" until _its_ demise, at which point
> Federico Faggin (co-founder of Intel's microprocessor division) underwrote
> its final support at Boundary Institute.
> >
> > Link Theory utilized negative case counts to provide a relational
> description of physics including the core of quantum mechanics -- and was
> therefore of interest in the quantum computing field.  This is due to the
> fact that quantum measurement involves projection (as do all measurements
> -- see my prior invocation of "That which is The Case.") that included not
> only ordinary probabilities, but also what are called "probability
> amplitudes".  Quantum probability amplitudes have complex values on the
> unit circle of the complex plane.  Complex values have imaginary
> components.  Link Theory accommodated QM's imaginary components with a
> particular symmetry used by George W. Mackey in his 1963 book "Mathematical
> Foundations of Quantum Mechanics" representing 'i' as a 2x2 spinor matrix:
> >
> >  0  1
> > -1  0
> >
> > See Appendix A of "Link Theory -- From Logic to Quantum Physics".
> >
> > The -1 in this spinor corresponds to the negative case counts required
> for relational structure to encompass quantum measurement.
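The claim that this matrix stands in for 'i' can be checked directly: it
squares to the negative of the identity matrix, mirroring i^2 = -1.  A
dependency-free sketch:

```python
# The 2x2 spinor matrix standing in for 'i': squaring it yields -I,
# just as i^2 = -1.
def matmul(a, b):
    # Plain 2x2 matrix multiplication, no dependencies.
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]


i_spinor = [[0, 1],
            [-1, 0]]

square = matmul(i_spinor, i_spinor)
# square is [[-1, 0], [0, -1]], i.e. -1 times the identity matrix
```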
> >
> > Federico Faggin supported this work because hardware design languages
> needed a formal theory other than conventional logic to model digital
> circuits with feedback (i.e., memory, state change, etc.).  George Spencer
> Brown developed his mathematics as a result of inventing minimal circuits
> in the early days of the transistor -- and found he was working with
> imaginary logic values.
> >
> > So, tying this all together to address the original point:  It would
> appear that the computer science notion of "type" is not only ill-founded
> -- leading to all manner of confusion regarding "the empirical world" (in
> Russell's apt descriptive phrase) but is recognized as being ill-founded by
> its founder!
> >
> > My assertion is that the notion of "type" is rescued by the notion of
> "unit" and that "abstract type" is rescued by the notion of "dimension"
> within the relational paradigm. That this might be the case should be no
> surprise as the natural sciences (particularly physics) most rigorously
> address "the empirical world".
> >
> > Once we accept the framework of dimensionality as relational structure,
> we can see, further, the potential for new modes of schema analysis based
> on the scientific discipline of dimensional analysis.
> 
> --
> Ben Goertzel, PhD
> http://goertzel.org
> 
> “He not busy being born is busy dying" -- Bob Dylan

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Ta03542805b689301-M84ca090c527bd72f916b73a0
Delivery options: https://agi.topicbox.com/groups/agi/subscription
