On Wed, Sep 16, 2009 at 10:42 AM, John Cowan <[email protected]> wrote:
> Lynn Winebarger scripsit:
>
>> As for static vs dynamic, it's more like static as in "frozen on the page
>> of the standard" versus a system which facilitates
>> metaprogramming.  `call/cc' is a metaprogramming feature, just as
>> macros are, and as "library management" would be.
>
> If call/cc is a metaprogramming feature, then so is procedure call, for
> one is simply the dual of the other.
>

I'd like to clarify this, because I used "static" in an odd way.  What I
meant was something akin to "syntactic".

The simplest meaning of "a language" is its syntactic extent:  the set of
all strings that would be accepted as part of the language.  A
programming language extends this to map those syntactic constructs
to a semantic model.  That's one of the purposes of the language
standard: to define the language syntactically and then assign
meaning to its programs.

I see two basic views about the nature of programming languages
in relation to their users [*].  They are clearly too
simplistic to describe any particular individual's views.

One is that the only part of the language that the user should
know (or needs to know) is the syntactic part - the semantic
model is normative denotationally but not operationally.  There
is a high value placed on making the meaning of a program
depend solely on the text of the program.

The other is that the language is merely an interface to the
semantic model.  The program author often has expectations
both that a certain operational model is at work for time/space
considerations and that certain syntactic constructions will
affect that operational model in certain ways, even if they don't
affect the denotational "meaning" of the program.

When I see someone emphasize "language" in talking about
Scheme, I take it they are going to be arguing from the
former point of view.  There will be a preference for
limiting the part of the semantic model exposed to the
user to what is explicitly present in the syntactic part
of the language.

For example, in Scheme almost all types of values are
constructable in the syntax, either at the lexical level
(numbers, symbols, vectors, etc.) or through constructors
in the syntax (lambda).  I point this out because even
though most data types have operators and constructors
that are purely procedural, they are also part of the
language at the syntactic level.  Application, the only
mandatory operator for procedures, is, of course, built
into the syntax of the language.  In a language without
first-class continuations, a continuation is simply an
inseparable piece of the application operator from the
POV of the language user.
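
To make that concrete (plain Scheme, nothing
implementation-specific; a sketch, not a novel point):

  ;; Values written directly at the lexical level:
  42            ; a number
  'foo          ; a symbol
  '#(1 2 3)     ; a vector
  ;; A value whose only constructor is a piece of syntax:
  (define inc (lambda (x) (+ x 1)))
  ;; Application is itself syntax; the continuation of (inc 41)
  ;; is implicit in where the expression sits in the program text.
  (inc 41)      ; => 42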

You can argue that continuations are implicit in the syntax,
and certainly with call/cc you can syntactically construct
at least part of the continuation of your choice, but you
can't just write a continuation as a value.  In Scheme minus
the call/cc procedure, continuations simply don't exist as
values that can be referenced by a program author.
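
The standard REPL demonstration of that reification (nothing
here is specific to any one implementation):

  ;; Capture the continuation of the (+ 1 []) context as a value.
  (define saved-k #f)

  (+ 1 (call-with-current-continuation
         (lambda (k)
           (set! saved-k k)   ; the continuation, reified
           0)))
  ;; => 1

  (saved-k 41)   ; re-enter the captured context
  ;; => 42

  ;; There is no literal syntax that would let us write saved-k
  ;; down directly.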

A substantial literature exists on the difficulties
incurred in implementing first-class continuations compared
to a language that has only the application operator.  Even
though it exists in the language only as a value, and not
syntactically, call/cc radically alters the semantics of
the language.

Metaprogramming is about reflection and reification.  In
a language design that constrains itself to its syntactic
extent, the reflection and reification happen as macros
that reflect procedures into the parsing phase and
extend the syntax available to the programmer.  However,
components of the semantic model can also be the
subject of reflection and reification.
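
Concretely, in R6RS terms (a minimal syntax-case sketch):
the transformer below is an ordinary procedure, but it is
reflected into the expansion phase and extends the syntax
available to the programmer.

  ;; The transformer procedure runs at expansion time.
  (define-syntax swap!
    (lambda (stx)
      (syntax-case stx ()
        ((_ a b)
         #'(let ((tmp a))
             (set! a b)
             (set! b tmp))))))

  ;; Usage: after (define x 1) (define y 2),
  ;; (swap! x y) leaves x = 2 and y = 1.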

To summarize:

Macros allow us to reflect procedures into the parsing phase.

"3D" macros allow us to reify "compile-time" values into our source
code (think compiled procedures appearing as literal constants in
the operator position of a S-expr).

Call/cc allows us to reify a part of the run-time semantic model
to which we would otherwise have no access.

While we can achieve the effect of reflecting an arbitrary
procedure into the "current continuation", we don't have
a direct way of performing that reflection.
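
What I mean by the indirect effect is roughly the sketch
below (the name is mine, not a standard procedure):

  ;; Behaves like continuation k with f spliced in front of it,
  ;; but the result is only an ordinary closure -- there is no
  ;; primitive that installs f "into" the continuation directly.
  (define (extend-continuation k f)
    (lambda (v) (k (f v))))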

A feature like first-class environments, if it applies to arbitrary
closures, is unpleasant because it takes an aspect of the
syntax of the language and forces it to exist in the semantic
model when it is unnecessary[**].  Some equivalent of
a continuation is guaranteed to exist in the semantic model
thanks to the CPS transform, though its extent changes from
compiler-controlled to arbitrary.  That's why the latter is
acceptable, if potentially annoying, to at least some
optimization-minded compiler-writers, while the former is
likely to be anathema.
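
Spelled out with a toy example: after a CPS transform every
call passes its continuation explicitly, so some
representation of the continuation already has to exist in
the model whether or not call/cc exposes it.

  ;; Factorial in continuation-passing style: k is the (now
  ;; explicit) continuation that direct style leaves implicit.
  (define (fact-cps n k)
    (if (zero? n)
        (k 1)
        (fact-cps (- n 1)
                  (lambda (r) (k (* n r))))))

  ;; (fact-cps 5 (lambda (x) x))  ; => 120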

If a Scheme system allows you to import libraries, those
libraries must exist somewhere that can be referenced.
It may be the case that the user has no right, or no free
space, to add library code to that accessible storage, but
it does not change the semantic model simply to acknowledge
that such resources must exist and to allow the user
abstracted control over them.  If an operation fails, the
system can throw an exception or simply abort with an
error.
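
As a purely hypothetical sketch of what "abstracted control"
might look like (these names are made up for illustration,
not a proposal for particular procedures):

  ;; (library-available? '(rnrs lists))    ; => #t or #f
  ;; (install-library! '(my app extras))   ; raises an exception
  ;;                                       ; if storage is denied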

That's my take on it anyway.
Lynn

[*] There is nothing inherently functional or imperative about
    either of these viewpoints, though they are undoubtedly
    correlated with the sides of that argument.
[**] First-class environments for use with eval or with
     top-level code containing free variables are fine by me.
