Martin Baker wrote:
On Friday 20 November 2009 23:17:25 Tim Daly wrote:
There is an excellent talk by Rich Hickey about modelling time, identity, values, perception, state, memory, etc.
Tim,
While I was watching this talk I was wondering about the difference between mainstream computing issues and mathematical computing issues.
I get the impression that the mainstream issue, from this talk, is how to run multiple algorithms in parallel?
If we are trying to solve a set of equations, is there a natural parallelism?
For the reasons discussed in the talk, should a rule-based method be preferred wherever possible, and should explicit coding of algorithms be discouraged?
Martin Baker
Martin,
I don't know, but I have some thoughts on the subject.
The first comment is that I have deep experience with rule-based programming. I was a team member on a commercial rule-based programming product at IBM; we also built a huge expert system (FAME) on a combined rule-based/knowledge-representation system I built (KROPS); one of my two thesis topics was on the subject of rules; and I use them in work on our current Function Extraction project.
I fear rule-based programming. It has the subtle, siren-song appeal of being very easy to state: "WHEN this DO that". When you run into a problem, the solution is to add another rule to handle that particular case. Ultimately, you end up in the situation of the dinosaur in the tar pit... he can lift any one leg, but he cannot get out of the pit.
Rule-based systems are subject to two general classes of failure: either they simply stop because no rule applies, or they go into an infinite loop because a prior state repeats.
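Here is a minimal Python sketch of both failure modes (my own toy example, not any particular rule engine): a working memory of facts, rules of the form (condition, action), and a loop that either halts because nothing fires or detects that a prior state has come back around.

# A tiny forward-chaining loop (illustrative only, not a real rule engine).
# Each rule is (condition, action); the engine fires the first applicable rule.
def run(state, rules, max_steps=20):
    seen = {frozenset(state.items())}
    for _ in range(max_steps):
        fired = next((act for cond, act in rules if cond(state)), None)
        if fired is None:
            return "stuck: no rule applies", state        # failure mode 1
        state = fired(state)
        key = frozenset(state.items())
        if key in seen:
            return "looping: prior state repeats", state  # failure mode 2
        seen.add(key)
    return "still running", state

# Two rules that undo each other: guaranteed to revisit a prior state.
rules = [
    (lambda s: s["light"] == "off", lambda s: {**s, "light": "on"}),
    (lambda s: s["light"] == "on",  lambda s: {**s, "light": "off"}),
]
print(run({"light": "off"}, rules))   # looping: prior state repeats
print(run({"light": "dim"}, rules))   # stuck: no rule applies

Every rule in that toy set is "correct" on its own; the failures only show up in the way they interact.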
I do not know of a general way to verify and validate a rule-based program (which was the subject of the thesis). In fact, they are extremely hard to debug. You can't use "print" statements or debuggers. Tracing is a swamp of output. Every rule could be perfectly correct and the program is still wrong (witness the dinosaur). Worse yet, the whole working system can become unhinged by the addition of just one "obviously correct" rule.
The second problem with rule-based programming is that rule sets are not generally designed to be "theory-aware". For instance, you can write rules in Axiom to do simplification by pattern matching. You could write a rule such as "divide each side by a constant". The problem is: what if the constant is zero? Ok, we can fix that with a rule... But now someone wants to use your simplification ruleset in a different domain (e.g. a domain which is non-associative). Where does your ruleset assume associativity? How does that assumption affect other rules? What if I want to apply an "obviously correct" formula such as x = sqrt(x^2)? Is that still correct if x = -1? (A small sketch below makes these two examples concrete.) Theory-aware systems need to be built on a consistent world based on consistent axioms. That kind of effort feels like a "Principia" approach, which Gödel undermined.
I don't think rules will operate correctly in parallel either (although I have not tried). For rules to operate effectively they need to perceive (to use Hickey's term) the world in some consistent state, but a parallel system undermines that assumption. A parallel dinosaur could be in a state with all of its feet out of the tar, since each process is "lifting" one foot (the dinosaur effectively "jumped"), but the problem isn't solved.
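To make the consistency point concrete, here is a hypothetical sketch (my own bank-balance version of the dinosaur, written as an explicit interleaving rather than real threads): each rule instance perceives a consistent snapshot and fires correctly in isolation, yet the combined result is a state no single rule application would have produced.

# Two instances of the rule "WHEN balance >= 100 DO withdraw 100" each read
# the same stale snapshot of the world, fire, and write back.
balance = 100

seen_by_a = balance        # rule instance A checks its condition: 100 >= 100
seen_by_b = balance        # rule instance B checks the same stale snapshot

balance = seen_by_a - 100  # A fires: balance becomes 0
balance = seen_by_b - 100  # B fires on its old view: balance is 0 again

print(balance)             # 0 after two "withdrawals" of 100 from 100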
Hickey advocates pure functions which move from state to state. These look a lot like rules, but they can be much more theory-aware and they can be applied in a procedural way.
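A minimal Python sketch of that style (my own function names; Hickey's material is in Clojure): each step is a pure function from an old state to a new one, its assumptions are stated up front rather than buried in a rule base, and the steps are applied procedurally while every intermediate value stays available.

# Each step takes a state and returns a new state; nothing is mutated.
def divide_both_sides(eq, c):
    lhs_coeff, rhs = eq                  # eq represents lhs_coeff * x = rhs
    assert c != 0, "explicit assumption, not a hidden rule"
    return (lhs_coeff / c, rhs / c)

def apply_steps(eq, steps):
    states = [eq]
    for step, arg in steps:
        states.append(step(states[-1], arg))
    return states

print(apply_steps((4, 8), [(divide_both_sides, 2), (divide_both_sides, 2)]))
# [(4, 8), (2.0, 4.0), (1.0, 2.0)]  -- old states are untouched values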
Curiously, Hickey does not define functions on Identities, claiming they are an emergent property of states. But we do reason about Identities also (they are meta-states, I guess). The river may not be the same from moment to moment, except before the river flows and after the river dries up. The Identity of a river has a lifetime above and beyond its states.
Tim