Hello folks,
Further to Joe Marshall's comments about REBOL & tail-call optimisation and
continuations, here's some more general info, this time
from the Python Mailing List Archive.

------ SNIP ---------------------------------------------------------

Thread: Scheme's Infinite Zen Was Python's Glorious Future
> Most Scheme stuff can optimize tail-recursion.
In order to actually qualify as Scheme, implementations are required to
be properly tail recursive. It's a short step... if you're using a
Scheme that can't optimize tail-recursion, find another Scheme. (Or
better yet, write your own!)
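
To make that concrete, here's a rough sketch (plain standard Scheme, nothing
lifted from the thread) of what proper tail calls buy you:

;; Both loops below sum the numbers up to n.  In a properly
;; tail-recursive Scheme the second one runs in constant stack space,
;; because the recursive call is the last thing the procedure does.

;; Not a tail call: the addition happens *after* the recursive call,
;; so every call frame has to stick around until the bottom is reached.
(define (sum-to-n n)
  (if (= n 0)
      0
      (+ n (sum-to-n (- n 1)))))

;; Tail call: the accumulator carries the partial result, so the
;; recursive call can reuse the current frame -- it is just a loop.
(define (sum-to-n-iter n acc)
  (if (= n 0)
      acc
      (sum-to-n-iter (- n 1) (+ n acc))))

(display (sum-to-n-iter 1000000 0))   ; fine in any conforming Scheme
(newline)
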
I wouldn't say that Scheme is impoverished by comparison to Perl or
Python... in fact, once people really know how to milk the key
abstractions in Scheme for all they're worth, I think it's easier to
write equivalent programs in less code than other languages. It's kind
of like UNIX, in that there're just a few incredibly powerful
abstractions, uniformly applied. The following are possible in Scheme:
whole OS kernel schedulers in less than a page... interpreters in a
page or two... compilers in about the same... (note: not necessarily
for Scheme-based syntaxes / semantics, either!) equation solvers in a
page and a half... whole OO runtimes in a couple of pages...
unification engines for theorem proving and type inferencing systems in
about a page... FFT in half a page. These are just some of the
canonical examples from (for instance) the essential Scheme texts like
Dybvig (see below.)
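
To give a flavour of the "scheduler in less than a page" claim, here's a
rough sketch of a round-robin cooperative scheduler built on first-class
continuations (call/cc) -- my own toy version, not one taken from Dybvig:

;; A toy cooperative scheduler.  Each "process" is a thunk; yield saves
;; the current continuation on the ready queue and runs the next one.

(define ready-queue '())                 ; procedures waiting to run

(define (spawn thunk)
  (set! ready-queue
        (append ready-queue
                (list (lambda (ignored) (thunk) (quit))))))

(define (yield)
  (call-with-current-continuation
    (lambda (k)
      (set! ready-queue (append ready-queue (list k)))
      (quit))))

(define (quit)
  (if (null? ready-queue)
      'done
      (let ((next (car ready-queue)))
        (set! ready-queue (cdr ready-queue))
        (next #f))))

;; Two "processes" taking turns:
(spawn (lambda ()
         (for-each (lambda (i) (display (list 'a i)) (newline) (yield))
                   '(1 2 3))))
(spawn (lambda ()
         (for-each (lambda (i) (display (list 'b i)) (newline) (yield))
                   '(1 2 3))))
(quit)   ; prints (a 1) (b 1) (a 2) (b 2) (a 3) (b 3)
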
The one area that Perl and, to a lesser extent, Python have an edge on
Scheme in is text processing / string processing. Having the equivalent
of sed built into the language syntax for Perl gives it a real edge;
any sort of good integration with regexps is a bonus for that. OTOH, if
you can step back from that and abstract your problem into some kind of
symbolic processing, that edge is ameliorated to some extent and IMO
Scheme becomes a much better choice.
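
By "abstract your problem into symbolic processing" I mean something like the
following rough sketch -- once the data is read in as s-expressions, plain
list operations do the work you'd otherwise throw regexps at (the record
format here is made up purely for illustration):

;; Hypothetical records -- in real life these could come straight from
;; (read) on a file or a socket.
(define requests
  '((request (user alice) (path "/index.html") (status 200))
    (request (user bob)   (path "/missing")    (status 404))
    (request (user carol) (path "/index.html") (status 200))))

;; filter is in SRFI-1; defined here to keep the sketch self-contained.
(define (filter keep? lst)
  (cond ((null? lst) '())
        ((keep? (car lst)) (cons (car lst) (filter keep? (cdr lst))))
        (else (filter keep? (cdr lst)))))

;; Pull a named field out of a record.
(define (field name record)
  (cadr (assq name (cdr record))))

;; "Who got 404s?" -- no pattern syntax needed, just list operations.
(define (failed-users reqs)
  (map (lambda (r) (field 'user r))
       (filter (lambda (r) (= (field 'status r) 404))
               reqs)))

(display (failed-users requests))   ; prints (bob)
(newline)
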
I wrote up this little piece this morning in a private response to John
Klassa... it's germane to the discussion and has a few good pointers,
so I thought I'd repost it to the list at large.
--




You know, my major epiphany about programming and programming languages
didn't really happen until about 1990, when I was at Sun.  My group was
building a large distributed system which had a need to support dynamic
updates to the client across a geographically dispersed user base. (This
is the system Sun uses to handle service calls.)  We came up with the
notion of using downloadable code --- this is pre-Java --- the client
was actually just an engine which would download the actual "functional"
client code from a database at runtime.  We designed a little (C++)--
language called "Evil" (don't ask ;-) and designed a little interpreter
system for it.  (In some ways, I've always wondered if we influenced Oak
a.k.a. Java at all.  Surely not. ;-)  I learned more than I could've
imagined by picking apart a bunch of different implementations of Scheme
in C.  It's amazing how little C it often takes to implement a Scheme
--- sometimes in as little as, say, 2-4k LoC.  Check some of these out,
especially SIOD.


Diving into interpreters --- actually implementing one --- was a real
eye-opener.  I'd used Lisps a lot in school, and had a lot of formal
stuff, but never really saw the connection between the mathematical
underpinnings of programming languages and reality until then.  Since
then, I've come to realize that there's, like, this "infinitely deep
Zen" in the lambda calculus and particularly in Scheme.  My sort of
"holy grail quest" ever since then has been in trying to synthesize an
object calculus analog to Church's calculus.  (The reason for this is
that such a formalism would make it possible to produce efficient
compilers for dynamic object-oriented languages, and would avoid lots of
runtime messiness in generated code that often exists today.)
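
The pocket version of that lambda-calculus-to-objects connection is the old
trick of building objects out of nothing but closures -- a rough sketch of my
own, not any particular formalism:

;; An "object" encoded as a closure: a dispatch procedure closed over
;; its own mutable state.  Lambda already gives you encapsulation and
;; message dispatch.

(define (make-counter start)
  (let ((count start))
    (lambda (msg . args)
      (cond ((eq? msg 'value) count)
            ((eq? msg 'add!)  (set! count (+ count (car args))) count)
            (else (error "unknown message" msg))))))

(define c (make-counter 10))
(c 'add! 5)              ; => 15
(display (c 'value))     ; prints 15
(newline)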


At any rate, here're some books and stuff I'd heartily recommend if
you're interested in this kind of thing.


    The Scheme Programming Language by Kent Dybvig
    Structure and Interpretation of Computer Programs by Abelson and
Sussman


These two are general Scheme and "deep programming" texts.  Dybvig is
the K&R of Scheme, and a really pretty little book...  it even covers
areas like meta-circular interpreters, designing multitasking engines /
schedulers for operating systems, etc.  It's a real eye-opener to see
how really complicated concepts like these can be implemented in a page
or two of code in a powerful language like Scheme.  (Aside - I worked
with a woman who was a grad student of Dybvig's --- she said he was the
smartest person she'd ever met.)  Abelson and Sussman is a bit broader
book, and may well be the best general programming book in existence.
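
If "a meta-circular interpreter in a page or two" sounds implausible, here's
a toy one of mine -- just quote, if, lambda, variables and application,
nowhere near a real Scheme, but enough to show the shape of the thing:

;; A toy evaluator for a tiny Scheme subset.  Environments are plain
;; association lists of (name . value) pairs.

(define (lookup var env)
  (cond ((null? env) (error "unbound variable" var))
        ((eq? (caar env) var) (cdar env))
        (else (lookup var (cdr env)))))

(define (extend env vars vals)
  (if (null? vars)
      env
      (cons (cons (car vars) (car vals))
            (extend env (cdr vars) (cdr vals)))))

(define (evaluate exp env)
  (cond ((symbol? exp) (lookup exp env))
        ((not (pair? exp)) exp)                          ; self-evaluating
        ((eq? (car exp) 'quote) (cadr exp))
        ((eq? (car exp) 'if)
         (if (evaluate (cadr exp) env)
             (evaluate (caddr exp) env)
             (evaluate (cadddr exp) env)))
        ((eq? (car exp) 'lambda)                         ; single-expression body
         (lambda args
           (evaluate (caddr exp) (extend env (cadr exp) args))))
        (else (apply (evaluate (car exp) env)
                     (map (lambda (e) (evaluate e env)) (cdr exp))))))

;; A tiny starter environment and a test drive:
(define base-env (list (cons '+ +) (cons '* *) (cons '< <)))
(display (evaluate '((lambda (x y) (if (< x y) (* x y) (+ x y))) 3 4)
                   base-env))   ; prints 12
(newline)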


    Programming Languages:  An Interpreter-Based Approach by Samuel
Kamin


This book talks in great detail about how all programming languages
share a common set of abstractions, and how the differences in certain
language features drive implementation choices.  It builds up
implementations of interpreters for several "languages" - a "basic" (not
BASIC) interpreter, Lisp, APL, Scheme, SASL, CLU, Smalltalk, and Prolog
(note that he's really illustrating the key unique abstractions in each;
he's actually re-worked the syntax for them all into a sort of
s-expression / Scheme-like syntax so as not to obscure the issues.)  The
implementations in the book are done in Pascal, ick.  However, there's a
guy out on the net, Timothy Budd, who some years ago did an adjunct to
this book called "The Kamin Interpreters in C++" which shows how each of
these can be approached from an OOP / structural perspective and
implemented in C++ --- in effect he builds up a really nice class
library for building interpreters in C++.  You can find that here.


    Essentials of Programming Languages by Daniel Friedman et al.
    Compiling with Continuations by Andrew Appel


This first one is the major mindblower.  I can't even say enough about
EOPL.  It is the most mathematically rigorous and interesting book of
the set.  The implementation language is Scheme;  Friedman is a big name
in Scheme, and one of the deepest thinkers about programming languages
I've ever encountered.  Also check out Friedman's numerous papers
detailing his investigations into making various language elements
first-class:  continuations, interpreters themselves, extents.
Essentially, anything by Friedman here.  Once you get past the "lambda
calculus gestalt" you run into a second gestalt regarding continuations
and the usefulness of something called "continuation-passing style."
This book closes in that direction, illustrating how programs can be
formally transformed into CPS, from which it is trivial to generate
pseudo-assembly for von Neumann processors.  Wow.  Math to assembly
without all the Aho et al. ;-)  Maybe those Bell Labs guys weren't so
smart after all. ;-)  A much deeper look at that topic alone - compiling
with continuations - can be had in CC above.
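
If CPS is new to you, the flavour of the transformation is easy to show:
every procedure takes an extra argument -- "what to do with the result" --
and nothing ever needs to return.  A hand-done sketch on factorial (mine,
not an example from EOPL or CC):

;; Direct style:
(define (fact n)
  (if (= n 0)
      1
      (* n (fact (- n 1)))))

;; The same function after a hand-done CPS transform: every call takes
;; an extra continuation k, and every call is a tail call -- which is
;; exactly why CPS maps so cleanly onto jumps and registers.
(define (fact-cps n k)
  (if (= n 0)
      (k 1)
      (fact-cps (- n 1)
                (lambda (result) (k (* n result))))))

(display (fact 10))                          ; prints 3628800
(newline)
(display (fact-cps 10 (lambda (x) x)))       ; prints 3628800
(newline)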


Enjoy!  I'm convinced that there are at most a few dozen people on the
planet that really have a fundamental grasp of programming.  I don't,
yet [despite years of cranking code in all sorts of languages] --- but
I'm determined to get there.  These books have helped me immensely along
my way.


$0.02,


jb

------------- END SNIP ----------------------------------------------------

I've more to say on Joe Marshall's thoughts & comments soon, but that will have
to wait until I can find sufficient time to write down my thoughts on some
of the finer points & issues he raised.

As this applies to REBOL 1.x it is all theoretical now, but for me it's
interesting and important nonetheless when considering REBOL as a language
and its possible implementations & capabilities.

More soon,

Mark Dickson



