> A wise man once warned about the danger of premature optimisation. I often
> spend ages labouring over efficiency aspects of my code (GHC, for example)
> that turn out to be nowhere near the critical path. Language choice is
> another example.
> My biased impression is that raw speed is seldom the determining factor in
> language choice.
Etc. ...
> That said, there are undoubtedly reasons why a high level language is
> fundamentally never going to be as fast as a low level one.
Well, ... what does "a fast language" actually mean?
My favourite example, some centuries old, goes back to the times when I was a happy, young physicist. We wanted to implement a combinatorial problem: the global effect of individual nucleon-nucleon scattering in alpha-alpha collisions. Some 256 digraphs to compute, and many more for the final result. A friend of mine wrote a Pascal/Fortran program for a CDC Cyber mainframe, and I took microProlog and a Sinclair Spectrum with 48K of main storage. His program gave the result after 3 seconds. Mine: after, hm... well, after some hours I had to use a cassette player to store the intermediate results; finally, the next morning, I got something usable.
So what? - you will ask.
Well, the crux of the matter is that I wrote my program in two days. My friend spent about two weeks completing his.
Now, who could drink more bottles of beer before obtaining the final results?
===================
And now, again, more and more people curse laziness, fight against boxed, shareable data, want to produce lightspeed database interfaces, etc. I am too lazy to get nervous (unless I do so for theatrical reasons, as an old teacher), but I sincerely think that it is time somebody wrote a book on Haskell as a language for the FAST DESIGN of lousy algorithms...
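In the spirit of that book-to-be, a minimal (and purely illustrative) sketch of a lousy algorithm designed fast: the textbook naive Fibonacci, exponential in running time but written in the time it takes to open a beer. Nothing here comes from the original anecdote; it only illustrates the trade-off the post describes.

```haskell
-- A lousy algorithm, designed fast: naive, exponential-time Fibonacci.
-- Five lines to write; the runtime grows roughly as the golden ratio to
-- the power n, so large inputs give you ample time for refreshments.
fib :: Integer -> Integer
fib 0 = 0
fib 1 = 1
fib n = fib (n - 1) + fib (n - 2)

main :: IO ()
main = print (fib 25)
```

Two days versus two weeks, in miniature: the elegant definition mirrors the mathematics exactly, and pays for it at run time.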
Jerzy Karczmarczuk
Caen, France
_______________________________________________
Glasgow-haskell-users mailing list
[EMAIL PROTECTED]
http://www.haskell.org/mailman/listinfo/glasgow-haskell-users