On 4/12/2012 4:50 PM, Andre van Delft wrote:
FYI: Michael Nielsen wrote a long article, "Lisp as the Maxwell’s equations of
software", about the famous page 13 of the LISP 1.5 Programmer’s Manual; see
http://www.michaelnielsen.org/ddi/lisp-as-the-maxwells-equations-of-software/

The article is discussed on Reddit:
http://www.reddit.com/r/programming/comments/s5jzt/lisp_as_the_maxwells_equations_of_software/

a partial counter-assertion:
it is one thing to implement something;
it is quite another to implement it effectively.

although writing a simple interpreter for a language is entirely possible, the result is far from competitive.
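
to make the "simple" half concrete, here is a rough sketch of such an interpreter, for a tiny made-up Lisp-like subset, in Python (the subset, names, and special forms are my own illustration, not anything from the 1.5 manual):

def tokenize(src):
    # split source into parens and atoms
    return src.replace("(", " ( ").replace(")", " ) ").split()

def parse(tokens):
    # read one expression: a nested list, an int, or a symbol (str)
    tok = tokens.pop(0)
    if tok == "(":
        lst = []
        while tokens[0] != ")":
            lst.append(parse(tokens))
        tokens.pop(0)  # drop the closing ")"
        return lst
    try:
        return int(tok)
    except ValueError:
        return tok

def evaluate(expr, env):
    if isinstance(expr, str):    # symbol: look it up
        return env[expr]
    if isinstance(expr, int):    # number: self-evaluating
        return expr
    op, *args = expr
    if op == "quote":
        return args[0]
    if op == "if":
        test, then, alt = args
        return evaluate(then if evaluate(test, env) else alt, env)
    if op == "lambda":
        params, body = args
        return lambda *vals: evaluate(body, {**env, **dict(zip(params, vals))})
    fn = evaluate(op, env)       # application: evaluate operator and operands
    return fn(*[evaluate(a, env) for a in args])

env = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}
prog = parse(tokenize("((lambda (x) (* x (+ x 1))) 4)"))
print(evaluate(prog, env))       # prints 20

and that is about the whole trick; then the questions start: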

what about performance?
what about libraries?
what about interoperability?
...

much like writing a 3D renderer:
it isn't too hard to get a few polygons on the screen;
it is much harder to get it to handle scene complexity like that in commercial games, with a similar feature set and comparable performance.
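
for a sense of scale on the "few polygons" end, a brute-force triangle fill fits in a handful of lines (a made-up Python sketch; a real renderer adds transforms, clipping, depth, texturing, shading, and so on):

W, H = 40, 20
fb = [["." for _ in range(W)] for _ in range(H)]   # character "framebuffer"

def edge(ax, ay, bx, by, px, py):
    # signed area; the sign says which side of edge a->b the point p is on
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def fill_triangle(a, b, c, ch):
    # test every pixel against the three edges (slow, but trivially simple)
    for y in range(H):
        for x in range(W):
            w0 = edge(*b, *c, x, y)
            w1 = edge(*c, *a, x, y)
            w2 = edge(*a, *b, x, y)
            if (w0 >= 0 and w1 >= 0 and w2 >= 0) or \
               (w0 <= 0 and w1 <= 0 and w2 <= 0):
                fb[y][x] = ch

fill_triangle((3, 2), (36, 8), (12, 18), "#")
print("\n".join("".join(row) for row in fb))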

the eventual result is that a "simple" core will often end up hideously more complex, and considerably less flexible, than it could otherwise be.


now, as far as Lisp itself, it faces a few problems:
generally unfamiliar language constructs and control flow;
relatively few people are terribly fond of S-Expressions;
implementations with significant library and interoperability problems;
...

but its supporters are fairly dedicated, and not really open to much in the way of change or variation (suggest throwing a C-style syntax on it, and many will balk...).


a problem, I think, is that many people tend to operate in a mindset of "either it is perfect or it is unacceptable", and to assume that "perfection" is somehow part of an objective "ontology" or similar; however, many people disagree, even regarding which options are "better" or "worse".

part of the problem, I think, is that many people are prone to classify things along a single "axis of interest" or similar, and then to assume that this axis is an "absolute" ranking of the options. typically this classification is done in the absence of any consideration of other aspects or relative tradeoffs.


better I think would be to think more in terms of how "locally optimized" something is for a given "problem domain" or "common special case".

for example, the problem domains C is optimal for are very different from those of Java and C#, different again from those of C++, which are very different from those of ECMAScript, which are all very different from those of Lisp and Scheme.


never mind such peculiarities (in math land) as set theory, which is (for whatever reason) commonly used despite being similarly incomprehensible to both humans and machines, and despite lacking any obvious/direct application in computing [1]. presumably math people have some sort of reason for doing things the way they do, throwing set notation and operations at pretty much everything, whether or not the topic in question really has anything to do with sets ("well, I have a hammer, and that screw sure does look like a nail").

[1]: there are some debatable "indirect" cases, like SQL and algorithms built on walking or culling linked lists, but I don't really consider these to apply much beyond being "vaguely similar".
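
as a rough illustration of what I mean by "vaguely similar" (a made-up Python sketch: cons-style list walking on one hand, direct set operations on the other):

def to_cons(xs):
    # build a linked list of (value, next) pairs, nil as None
    node = None
    for x in reversed(xs):
        node = (x, node)
    return node

def walk_intersect(node, keep):
    # walk the list, culling every node whose value is not in `keep`
    out = []
    while node is not None:
        val, node = node
        if val in keep:
            out.append(val)
    return out

a = to_cons([1, 2, 3, 4, 5])
print(walk_intersect(a, {2, 4, 9}))   # [2, 4]
print({1, 2, 3, 4, 5} & {2, 4, 9})    # {2, 4}, the set-theoretic version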


better, I think, would be to try to isolate which elements are more foundational, and to figure out what sorts of things can be built from those elements, rather than trying to make everything a direct manifestation of them.

sort of like chemistry: the pure elements are all fine and well, but focusing solely on them would ignore the large and diverse world of compounds and material properties.


or such...
