Hi Julian,

I heartily agree that building your own implementations is important.
Although I have also found it illuminating to read and study the
implementations of others, there is nothing that brings the issues
home like making the implementation choices yourself.  Then you can
see exactly what results from the interactions among those choices,
leading to a deeper understanding of the fundamental issues.  I
blogged about part of my own journey in a 4-part series starting with
http://www.dalnefre.com/wp/2011/11/fexpr-the-ultimate-lambda/ (where I
reference both Maru and Kernel).

One insight that I would like to share is that semantics should be
considered separately from syntax.  Not that syntax is unimportant;
it is important!  Only that, at least for me, considering the
different elements of meaning and how they interact is more important
than how they are expressed textually (or visually).  My approach to
this has been to build an object/actor model for each semantic
element, then consider their responsibilities and their protocols for
interacting with each other in order to achieve a computational
result.

I keep coming back to "minimal" sets of conceptual "primitives", where
"minimal" and "primitive" are relative to which model you choose.
Three of these sets keep recurring in my work: actors, functions and
patterns.  For actors, the primitives are "create", "send" and
"become".  For functions, the primitives are "variable", "abstraction"
and "application".  For patterns, the primitives are "predicate",
"sequence" and "alternative".  I continue to explore productive
combinations of these concepts.  I hope you enjoy your journey of
discovery as much as I am enjoying mine!
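To make the three actor primitives concrete, here is a minimal sketch in
Python (my own hypothetical toy, not any particular actor library):
"create" makes an actor, "send" queues a message, and "become" replaces
the behavior that will receive the next message.

```python
from queue import Queue

class Actor:
    """A minimal actor: a mailbox plus a current behavior."""
    def __init__(self, behavior):
        self.behavior = behavior            # current behavior ("become" state)
        self.mailbox = Queue()

    def send(self, message):                # primitive: "send"
        self.mailbox.put(message)

    def become(self, behavior):             # primitive: "become"
        self.behavior = behavior

    def run_one(self):
        """Deliver one queued message to the current behavior."""
        self.behavior(self, self.mailbox.get())

def create(behavior):                       # primitive: "create"
    return Actor(behavior)

# A counter that changes behavior as it counts.
def counter(n):
    def beh(self, msg):
        if msg == "inc":
            self.become(counter(n + 1))     # replace own behavior
        elif isinstance(msg, list):
            msg.append(n)                   # crude "reply": write into a list
    return beh

a = create(counter(0))
a.send("inc"); a.send("inc")
out = []
a.send(out)
for _ in range(3):
    a.run_one()
# out == [2]
```

The state lives entirely in the behavior closure: "become" is the only
way it changes, which is exactly what makes the three primitives a
complete set.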

Take care,
Dale

On Tue, Apr 10, 2012 at 12:04 PM, Alan Kay <alan.n...@yahoo.com> wrote:
> Hi Julian
>
> (Adding to Ian's comments)
>
> Doing as Ian suggests and trying out variants can be an extremely
> illuminating experience (for example, BBN Lisp (1.85) had three or four
> choices for what was meant by a "lambda closure" -- three of the options I
> remember were (a) "do Algol" -- this is essentially what Scheme wound up
> doing 15 years later, (b) make a private a-list for free variables, (c) lock
> the private a-list to the values of the free variables at the time of the
> closure).
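For a flavor of the difference between those options (a hypothetical
Python sketch, not BBN Lisp itself): option (a) shares the live binding
of the free variable, while option (c) snapshots its value at the moment
the closure is made.

```python
x = 1

# (a) "do Algol": the closure sees the live binding of x.
f_live = lambda: x

# (c) lock the a-list to the value of the free variable at closure time.
#     (Python's default-argument trick snapshots x's value right now.)
f_locked = lambda x=x: x

x = 99
f_live()    # -> 99: follows the current binding
f_locked()  # -> 1:  frozen at closure time
```

Option (b), a private a-list for the free variables, sits in between:
the closure gets its own environment, shared with nothing else.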
>
> I suggest not trying to write your eval in the style that McCarthy used
> (it's too convoluted and intertwined). The first thing to do is to identify
> and isolate the separate cases that have to be taken care of -- e.g. what does
> it mean to eval the "function position" of an expression (LISP keeps on
> evaling until a lambda is found ...). Write these cases as separately as
> possible.
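One way to isolate that case (a hypothetical Python sketch, not LISP
itself, using nested lists for expressions): the application case keeps
re-evaluating the operator position until a lambda appears, completely
separate from the other cases.

```python
def evaluate(exp, env):
    """A tiny eval with the cases pulled apart."""
    if isinstance(exp, str):                      # variable: look it up
        return env[exp]
    if not isinstance(exp, list):                 # constant: self-evaluating
        return exp
    if exp[0] == "lambda":                        # ["lambda", [params], body]
        return exp
    # Application: keep evaling the function position until a lambda appears.
    op = evaluate(exp[0], env)
    while not (isinstance(op, list) and op[0] == "lambda"):
        op = evaluate(op, env)
    args = [evaluate(a, env) for a in exp[1:]]
    _, params, body = op
    return evaluate(body, {**env, **dict(zip(params, args))})

evaluate([["lambda", ["x"], "x"], 42], {})                      # -> 42
# "f" names "id", which names a lambda: eval until the lambda is found.
evaluate(["f", 7], {"f": "id", "id": ["lambda", ["x"], "x"]})   # -> 7
```

Each case stands alone, so a change to one (say, how the operator is
resolved) cannot silently entangle itself with the others.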
>
> Dave Fisher's thesis "A Control Definition Language" CMU 1970 is a very
> clean approach to thinking about environments for LISP like languages. He
> separates the "control" path from the "environment" path, etc.
>
> You have to think about whether "special forms" are a worthwhile idea (other
> ploys can be used to control when and if arguments are evaled).
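One such ploy, sketched hypothetically in Python: instead of a special
form that suppresses evaluation, pass unevaluated arguments explicitly
as thunks, so the callee decides when (and whether) each one runs.

```python
# "if" as an ordinary function taking thunks, not a special form.
def if_(cond, then_thunk, else_thunk):
    return then_thunk() if cond else else_thunk()

x = 0
# Only the chosen branch is ever evaluated, so no division by zero:
result = if_(x == 0, lambda: "zero", lambda: 1 / x)
# result == "zero"
```

The cost is that every caller must wrap the arguments; a special form
moves that burden into the evaluator once and for all.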
>
> You will need to think about the tradeoffs between a pure applicative style
> vs. being able to set values imperatively. For example, you could use
> "Strachey's device" to write "loops" as clean single assignment structures
> which are actually tail recursions. Couple this with "fluents" (McCarthy's
> "time management") and you get a very clean non-assignment language that can
> nonetheless traverse through "time". Variants of this idea were used in
> Lucid (Ashcroft and Wadge).
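The flavor of the idea in a hypothetical Python sketch: the "loop" is a
tail call, and what looks like assignment is really just binding the
next iteration's arguments, so no variable is ever mutated.

```python
def total(xs, acc=0):
    """Sum as a single-assignment 'loop': each step rebinds, never mutates."""
    if not xs:
        return acc
    # The "next time around the loop" is a tail call with new bindings.
    return total(xs[1:], acc + xs[0])

total([1, 2, 3, 4])   # -> 10
```

Each recursive call plays the role of one trip through time: the new
`acc` is the old `acc`'s "fluent" value at the next instant, yet the
program text contains no assignment at all.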
>
> Even if you use a recursive language to write your eval in, you might also
> consider taking a second pass and writing the eval just in terms of loops --
> this is also very illuminating.
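A partial sketch of that exercise (hypothetical Python, my own toy
representation): the tail of the evaluation becomes a loop that rebinds
`exp` and `env` instead of calling itself, though argument evaluation
here still recurses.

```python
def eval_loop(exp, env):
    """Eval with the tail turned into a loop: in tail position, rebind
    exp/env and go around again instead of making a recursive call."""
    while True:
        if isinstance(exp, str):                   # variable
            return env[exp]
        if not isinstance(exp, list):              # constant
            return exp
        if exp[0] == "lambda":                     # ["lambda", [params], body]
            return ("closure", exp[1], exp[2], env)
        if exp[0] == "if":                         # ["if", test, then, else]
            exp = exp[2] if eval_loop(exp[1], env) else exp[3]
            continue                               # tail position: loop
        fn = eval_loop(exp[0], env)
        args = [eval_loop(a, env) for a in exp[1:]]
        _, params, body, closed_env = fn
        env = {**closed_env, **dict(zip(params, args))}
        exp = body                                 # tail call: loop, don't recurse

eval_loop([["lambda", ["x"], "x"], 5], {})   # -> 5
eval_loop(["if", 0, 1, 2], {})               # -> 2
```

Turning the remaining recursive calls into loops as well forces an
explicit stack into the open, which is exactly the illumination: the
machinery the recursive version was hiding becomes visible.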
>
> What one gets from doing these exercises is a visceral feel for "great power
> with very little mechanics" -- this is obtained via "mathematical thinking"
> and it is obscured almost completely by the standard approaches to
> characterizing programming languages (as "things in themselves" rather than
> a simple powerful kernel "with decorations").
>
> Cheers,
>
> Alan
>
>
>
> ________________________________
> From: Ian Piumarta <piuma...@speakeasy.net>
> To: Julian Leviston <jul...@leviston.net>
> Cc: Fundamentals of New Computing <fonc@vpri.org>
> Sent: Monday, April 9, 2012 8:58 PM
> Subject: Re: [fonc] Kernel & Maru
>
> Dear Julian,
>
> On Apr 9, 2012, at 19:40 , Julian Leviston wrote:
>
>> Also, simply, what are the "semantic inadequacies" of LISP that the "Maru
>> paper" refers to (http://piumarta.com/freeco11/freeco11-piumarta-oecm.pdf)?
>> I read the footnoted article (The Influence of the Designer on the Design—J.
>> McCarthy and Lisp), but it didn't elucidate things very much for me.
>
> Here is a list that remains commented in my TeX file but which was never
> expanded with justifications and inserted into the final version.  (The ACM
> insisted that a paper published online, for download only, be strictly
> limited to five pages -- go figure!)
>
> %%  Difficulties and omissions arise
> %%  involving function-valued arguments, application of function-valued
> %%  non-atomic expressions, inconsistent evaluation rules for arguments,
> %%  shadowing of local by global bindings, the disjoint value spaces for
> %%  functions and symbolic expressions, etc.
>
> IIRC these all remain in the evaluator published in the first part of the
> LISP-1.5 Manual.
>
>> I have to say that all of these papers and works are making me feel like a
>> 3 year old making his first steps into understanding about the world. I
>> guess I must be learning, because this is the feeling I've always had when
>> I've been growing, yet I don't feel like I have any semblance of a grasp on
>> any part of it, really... which bothers me a lot.
>
> My suggestion would be to forget everything that has been confusing you and
> begin again with the LISP-1.5 Manual (and maybe "Recursive Functions of
> Symbolic Expressions and Their Computation by Machine").  Then pick your
> favourite superfast-prototyping programming language and build McCarthy's
> evaluator in it.  (This step is not optional if you want to understand
> properly.)  Then throw some expressions containing higher-order functions
> and free variables at it, figure out why it behaves oddly, and fix it
> without adding any conceptual complexity.
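To make the "behaves oddly" step concrete (a hypothetical Python sketch
of the dynamically-scoped flavor of the LISP-1.5 evaluator, not the
manual's actual code): with a single a-list and lambdas that capture
nothing, a function with a free variable picks up its caller's binding
instead of its definer's -- the classic funarg problem.

```python
def eval_dynamic(exp, env):
    """LISP-1.5-style eval: dynamic scope, one a-list, no closures."""
    if isinstance(exp, str):                    # variable: look up in the a-list
        return env[exp]
    if not isinstance(exp, list):               # constant
        return exp
    if exp[0] == "lambda":
        return exp                              # note: no environment captured!
    fn = eval_dynamic(exp[0], env)
    args = [eval_dynamic(a, env) for a in exp[1:]]
    _, params, body = fn
    return eval_dynamic(body, {**env, **dict(zip(params, args))})

# ((lambda (f) ((lambda (n) (f 0)) 99))
#  ((lambda (n) (lambda (x) n)) 5))
# The inner (lambda (x) n) should remember n = 5 from its definer...
prog = [["lambda", ["f"], [["lambda", ["n"], ["f", 0]], 99]],
        [["lambda", ["n"], ["lambda", ["x"], "n"]], 5]]
result = eval_dynamic(prog, {})   # -> 99, not 5: the funarg problem
```

Fixing this without adding conceptual complexity means making lambda
capture its defining environment -- at which point the lambda case and
the application case change, and nothing else does.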
>
> A weekend or two should be enough for all of this.  At the end of it you
> will understand profoundly why most of the things that bothered you were
> bothering you, and you will never be bothered by them again.  Anything that
> remains bothersome might be caused by trying to think of Common Lisp as a
> dynamically-evaluated language, rather than a compiled one.
>
> (FWIW: Subsequently fixing every significant asymmetry, semantic
> irregularity and immutable special case that you can find in your evaluator
> should lead you to something not entirely unlike the tiny evaluator at the
> heart of the original Maru.)
>
> Regards,
> Ian
>
> _______________________________________________
> fonc mailing list
> fonc@vpri.org
> http://vpri.org/mailman/listinfo/fonc