On 25 Oct 2002, Marco Baringer wrote:
: Luke Palmer <[EMAIL PROTECTED]> writes:
: >                        But think of what macros in general provide:
: > 
: >       * Multi-platform compatibility
: >       * Easier maintenance
:           * Creating/Embedding custom languages. aka - adapting the
:             language to your problem domain.
: 
: common lisp macros allow you to locally extend the vocabulary of
: common lisp in the same way engineers have locally (i.e. within the
: engineering domain) extended english with new syntax/semantics to deal
: with engineering problems. 

Only up to a point.  Engineers sometimes muck with the language at
the parse level, before the macro processor even has a shot at it.
Lisp gets away with this only because its syntax unambiguously
distinguishes verbs from nouns.  But engineers are always messing
around with their word categories.  How about using a verb as a
predicate adjective:

    All systems are go for launch.

That's probably derived from something more like:

    All systems are "go" for launch.

So a macro system that takes preparsed text is still not powerful
enough.  It could be argued that you could just pass in a string of
data tokens without parentheses to get any arbitrary language,
but you still can't parse a sentence like:

    All systems are ( for launch.

: macros are functions which are run when the source code is read
: (parsed). the argument to a macro is source code (expressed as a data
: structure and not simple text) and the return value is source code
: (not text). this is a fundamental difference from C's text-processing
: macros; without it, macros lose most of their power and become too
: hard to write to be used.

Yes, source filters have the same problem.
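
To make the code-as-data point above concrete, here's a minimal Common
Lisp sketch (the SWAP macro is purely illustrative, not anything
standard):

    ;; A macro is just a function over list structure: its argument is
    ;; already-parsed source code, and its return value is more source.
    (defmacro swap (a b)
      "Expand (swap x y) into code that exchanges the values of X and Y."
      (let ((tmp (gensym)))
        `(let ((,tmp ,a))
           (setf ,a ,b)
           (setf ,b ,tmp))))

    ;; The expansion is ordinary Lisp, and you can inspect it directly
    ;; (the gensym's printed name will vary):
    ;;   (macroexpand-1 '(swap x y))
    ;;   => (LET ((#:G42 X)) (SETF X Y) (SETF Y #:G42))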

: - what macros are really good at is embedding mini-languages and
:   creating new idioms; this often goes with, but is not necessarily
:   related to, reducing lines of code. example: CLOS/MOP (common lisp
:   object system/meta object protocol) are implemented as macros on top
:   of non-OO lisp (this statement may be a lie if you go deep enough
:   into the specifics of some implementations).

Support for mini-languages is a major design goal for Perl 6.
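
To give a flavor of what that means in miniature, here's a toy checking
notation built out of a single Common Lisp macro (the CHECK name and
its output format are made up for illustration):

    ;; CHECK expands each form into code that evaluates it and reports
    ;; pass/fail next to the original source text of the form.
    (defmacro check (&body forms)
      `(progn
         ,@(loop for f in forms
                 collect `(format t "~:[FAIL~;pass~] ... ~s~%" ,f ',f))))

    ;; Usage:
    ;;   (check (= (+ 1 2) 3)
    ;;          (string= "foo" "FOO"))
    ;; prints:
    ;;   pass ... (= (+ 1 2) 3)
    ;;   FAIL ... (STRING= "foo" "FOO")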

: - the power of lisp's macros is that they allow you to perform
:   arbitrary code transformations by working with code as if it were
:   data. at one point there was discussion about having perl subs with
:   "auto-args" (i forget where i read about this) whereby the
:   arguments to the sub were determined by parsing the body of the sub
:   itself and looking at what variables were used; this is a trivial
:   macro in lisp. adding this to perl5 required a source filter which
:   took forever to write and was never used because it was never
:   reliable enough (this may say more about my capabilities as a
:   programmer than about perl5 source filters).

But we want auto-args by marking the args themselves, not by
mentioning a special macro name in front.  So support has to be
built-in.
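
For reference, the Lisp version of that trick is roughly a code walk
wrapped in a macro.  This is only a sketch; the FN name and the _1/_2
placeholder convention are invented here, and a real version would need
to be smarter about quoted forms and nesting:

    ;; FN scans its body for symbols whose names start with an
    ;; underscore and wraps the body in a lambda taking them as
    ;; arguments, sorted by name.
    (defmacro fn (&body body)
      (let ((args '()))
        (labels ((walk (form)
                   (cond ((and (symbolp form)
                               (plusp (length (symbol-name form)))
                               (char= (char (symbol-name form) 0) #\_))
                          (pushnew form args))
                         ((consp form)
                          (walk (car form))
                          (walk (cdr form))))))
          (walk body))
        `(lambda ,(sort args #'string< :key #'symbol-name)
           ,@body)))

    ;; Usage:
    ;;   (funcall (fn (+ _1 (* 2 _2))) 3 4)   ; => 11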

: - everything you can do with macros you can do without, since macros
:   always get expanded (translated) into "regular" common lisp
:   code. however, sometimes (like with CPS) hand-writing the output is
:   prohibitively difficult.

Sure.

: - some people consider macros to actually reduce maintainability since
:   they perform arbitrary code manipulations, so you have _no_ idea
:   what is going on if you don't know what the macro does. macros which
:   introduce new symbols are especially prone to this.

Well, the same is true of any built-in.  But macros get really nasty if
they cause your program to throw error messages that are impossible
to understand.

: - any sufficiently powerful tool can be used to shoot yourself in the
:   foot (or blow off your head). i doubt this java-esque argument
:   (let's "protect" the programmers from themselves) has any weight
:   with perl programmers, but it's something i've heard more than once.

Actually, it has a lot of weight, but not in the sense of preventing
Perl programmers from using the powerful features.  What we really
try to do is to avoid requiring the novice programmer to know about the
powerful features before they need to know them.  If a Perl programmer
has to do grammar munging in order to write a CGI script, something
is terribly wrong.  They might use a module that does grammar munging
on their behalf, but that's different, because presumably someone
else with more expertise wrote that module.  So grammar munging is
there to make life easier for today's source filter writers, not to
make life harder for the novice.

: - writing reliable/robust source filters is hard (doable, but hard,
:   even with The Damian's Filter::Simple). writing grammars is better,
:   but still hard, and more importantly, both require a radically
:   different mindset from "regular" programming. the ease of writing
:   lisp macros is largely due to the fact that lisp has no syntax
:   (almost), and that lisp's syntax is programmable. perl6 will have
:   the second and can't do much about the first (sort of goes against
:   "different things should look different").

Interestingly (you will appreciate this as an Italian, or at least
a resident of Italy), I was reading Umberto Eco's _The Search for the
Perfect Language_, and he makes the point that, over the centuries,
many of the designers of "perfect" languages have fallen into the trap
of trying to make similar things look similar.  He goes on to argue
that similar things should look different, because when you don't,
you end up with too little redundancy for effective communication.

Suppose you have a system in which all farm animals are classified
into the same category, and distinguished by one letter in their
name.  All farm animals begin with, say, "snarfu".  So we get:

    snarfum     "cow"
    snarfun     "horse"
    snarfux     "chicken"
    snarfuk     "pig"
    snarfuh     "goose"
    ...

As you can see, it would be really easy in such a system for you to
tell your spouse to kill the goose, and have them think you wanted them
to kill the horse instead, especially if the next word happened to
start with a nasal sound.  You might send someone to milk the horse
instead of the cow.  Or you might think you're going to have chicken
pot pie for dinner, and get pig pot pie instead.

Of course, if you mishear a different letter, you might get a category
error, and end up baking a spouse pot pie.  But you'd probably notice
that miscommunication...

This problem doesn't arise as much when words for similar things look
and sound quite different, as they do in natural languages.  But it
does make
those languages harder to learn.

Written languages (such as computer languages) are perhaps a bit more
forgiving, since you don't have to parse them on the fly with only
one chance to hear--you can stare at them until you see that you have
a -n instead of a -m, or a qx// instead of a qw//.  Or a !! instead
of a ||, for that matter...

But it does mean that it would probably be a mistake to force all
user-defined quote constructs to be of the form qX//.  Up till now
I've carefully chosen letters for X that don't have the same overall
shape, so that qr, qx, qq, etc. all look quite different.  Extending
that after the manner of "perfect" languages would reduce readability.

Larry
