Let me be a little pedantic.

The 9fans know that, given the haphazard nature of a hobbyist's knowledge, I am extremely bad at this, but let me give it a try.

FYI, it's been Lisp for a while.

As long as Britannica and Merriam-Webster call it LISP, I don't think calling it LISP would be strictly wrong. Has "LISt Processing" become a stigma in the Lisp/LISP community?

Like what? The if statement, which was invented by Lisp? The loop
statement, for expressing loops? It sounds like you got a dose of Scheme
rather than Lisp to me.

I just read on Wikipedia that "Lisp's original conditional operator, cond, is the precursor to later if-then-else structures," without any citations. Assuming that to be true, conditional branching is a fundamental element of control flow, and it has existed in machine languages since the earliest days. There's really very little to brag about.

Regardless, I offer the following comparison:

19.2. How to Use Defstruct
<http://www.cs.cmu.edu/Groups/AI/html/cltl/clm/node170.html>

Struct (C programming language)
<http://en.wikipedia.org/wiki/Struct_(C_programming_language)>

In the (small-minded?) mental model of a small computer there's a row of pigeonholes and a stencil you may slide along the row for "structured" access to its contents. I leave it to you to decide which of the above better corresponds to that picture. My opinion you already know.

Indeed, my only encounter with LISP has been Scheme and through a failed attempt to read SICP.

This hasn't been true for a while. Common Lisp is a general purpose
language like any other. The only thing I have ever found obnoxious about
CL was the filesystem API. Most CL implementations are compilers these
days and they produce surprisingly efficient machine code. The Scheme
situation is more diverse but you can definitely find performance if
that's what you're alluding to.

I was alluding to the expressive power of C versus LISP, considered with respect to the primitives available on one's computing platform and the primitives in which solutions to one's problems are best expressed. It isn't a matter of whether the language you use is supplemented by good libraries, or of how fast the binary image you produce can run; I have little doubt that lightning-fast implementations of complex algorithms exist in LISP. I was trying to give a personal example of why I managed to learn C and failed to learn LISP.

If you have a scrawny x86 on your desktop and are trying to implement, say, a bubble sort--yes, the notorious bubble sort; it's still the first thing that comes to a learner's mind--then C seems quite apt for expressing your (embarrassing) solution in terms of what is available on your platform: loops, arrays, swapping, with _minimal_ syntactic distraction. Simple, naive algorithms should end up as simple, immediately readable (and refutable) code. Compare the two implementations and decide for yourself:

<http://en.literateprograms.org/Bubble_sort_(Lisp)>
<http://en.literateprograms.org/Bubble_sort_(C)>

Its claim to fame as the language for "wizards" remains.

I think this has more to do with Lisp users being assholes than anything
intrinsic about Lisp. This is one of the nice things about Clojure. It's
a break from tradition in this regard, as well as many others.

I really did mean "wizards" by "wizards." I intended no insult--merely sort of an awed jealousy.

It's as though you have the up-to-date negative propaganda, but not the
up-to-date facts.

Of course. Propaganda has a wider reach than facts, particularly when for every textbook on a subject there are, I don't know, ten (or more?) on the competing subject.

The main benefits it had in AI were features that came from garbage
collection and interactive development.

More importantly, LISt Processing, which used to be an element of the expert-systems approach to AI and which is now defunct (as a way of making machines intelligent, whatever that means). While "expert systems" continue to exist, the term carries enough reverb of failure to have been replaced by other buzzwords: knowledge-based systems, automated knowledge bases, and whatnot.

I think, and may be dead wrong, that LISP's ominous appearance came from adhering to an AI paradigm. Now that the paradigm is no longer viable, why should the appearance persist?

An advantage it has these days is that it produces code that performs
better than, say, Python or Perl.

I cannot comment on this. I have no knowledge of Python, and I beg to disagree about Perl. The entry barrier for learning Perl was low enough for me to learn and use it, unlike LISP.

I definitely would not call being a "general purpose system" and
suitability for "application programming" a "specific application area."

Well, for one thing, I believe you have misread me. I said C was a general-purpose language good for "system programming"--you seem to call that "being a good OS language"--and low-level application programming. I probably should have taken more care and written the precise term: systems programming.

This is like saying agglutinative languages are worse for conquering the
world with than isolating languages because the Ottoman empire fell
before the English empire.

Correlation doesn't imply causation--that's true. But there _are_ ways to ascertain that a correlation is due to a causal relationship. One such way is to identify known causes of success or failure. _If_ one claims a language costs more to learn and rewards similarly to, or even less than, another language, one has already identified a known cause of failure. If failure does occur, causation by the language itself, rather than by its surrounding elements (marketers, users, designers, climate, serendipity), cannot be ruled out.

I think it's mostly happenstance. Lots of languages succeed despite
having a killer app or app area. Python's a good example.

Despite _not_ having those, you mean, right? I think it's too early to talk about Python's success. It has barely lived half as long as C and one-third as long as LISP. If you're really going to call Python successful, I don't know how you're going to describe Java.

Please don't interpret this as "Lisp kicks C's ass."

I don't, and I certainly wasn't implying "C kicks LISP's ass." I don't qualify for that sort of assertion.

There are simply too many variables to lay the blame at Lisp's alleged
functional basis.

That's a very good point. I did say "LISP represents a programming paradigm," but I don't think its (perceived?) failure has to do with the paradigm itself; rather, with whether mere mortals can find application areas where the cost of assimilating that paradigm (and therefore learning the language) is justified by measurable gains.




--On Friday, September 04, 2009 15:36 -0600 Daniel Lyons <fus...@storytotell.org> wrote:

Let me be a little pedantic.

On Sep 4, 2009, at 2:18 PM, Eris Discordia wrote:
Above says precisely why I did. LISP is twofold hurtful for me as a
naive, below average hobbyist.

FYI, it's been Lisp for a while.

For one thing the language constructs do not reflect the small
computer primitives I was taught somewhere around the beginning of
my education.

Like what? The if statement, which was invented by Lisp? The loop
statement, for expressing loops? It sounds like you got a dose of Scheme
rather than Lisp to me.

For another, most (simple) problems I have had to deal with are far
better expressible in terms of those very primitives. In other
words, for a person of my (low) caliber, LISP is neither suited to
the family of problems I encounter nor suited to the machines I
solve them on.

This hasn't been true for a while. Common Lisp is a general purpose
language like any other. The only thing I have ever found obnoxious about
CL was the filesystem API. Most CL implementations are compilers these
days and they produce surprisingly efficient machine code. The Scheme
situation is more diverse but you can definitely find performance if
that's what you're alluding to.

Its claim to fame as the language for "wizards" remains.

I think this has more to do with Lisp users being assholes than anything
intrinsic about Lisp. This is one of the nice things about Clojure. It's
a break from tradition in this regard, as well as many others.

Although, mind you, the AI paradigm LISP used to represent is long
deprecated (Rodney Brooks gives a good overview of this deprecation,
although not specifically targeting LISP, in "Cambrian Intelligence:
The Early History of the New AI"). One serious question today would
be: what's LISP _really_ good for? That it represents a specific
programming paradigm is not enough justification. Ideally, a
language should represent a specific application area, as does C,
i.e. general-purpose system and (low-level) application programming.


It's as though you have the up-to-date negative propaganda, but not the
up-to-date facts. Lisp is "really good for" the same kinds of things
other general purpose languages are good for. The main benefits it had in
AI were features that came from garbage collection and interactive
development. You get those benefits today with lots of systems, but that
doesn't mean they aren't still there in Lisp. An advantage it has these
days is that it produces code that performs better than, say, Python or
Perl. I definitely would not call being a "general purpose system" and
suitability for "application programming" a "specific application area."
This is like saying agglutinative languages are worse for conquering the
world with than isolating languages because the Ottoman empire fell
before the English empire.

Please don't interpret this as "Lisp kicks C's ass." I'm saying, you're
only seeing the negative half of the situation, and seeing too much
causality. I think it's mostly happenstance. Lots of languages succeed
despite having a killer app or app area. Python's a good example.
Isolating the exact ingredients for the success of any language is
probably impossible. I'd say only with C is it really clear what led to
success, and it wasn't exclusively features of the language itself
(though it was a part of it), but also that it came with Unix along with
the source code. If the quacks had chosen C instead of Lisp for their "AI
research" perhaps C would have taken a big hit during the so-called AI
winter instead of Lisp. Perhaps if the Lisp machine vendors hadn't
misunderstood basic economics so thoroughly, their machines would have
become more common and taken Lisp with them the way Unix brought C. There
are simply too many variables to lay the blame at Lisp's alleged
functional basis. Especially today when languages like Haskell exist that
take functional so much further they make Lisp look like a procedural
language by comparison.

—
Daniel Lyons


