On Fri, 19 Feb 1999 01:19:18 -0500
Leon Smith <[EMAIL PROTECTED]> wrote:
> In this particular case, pretty much any Haskell compiler will
> automatically perform optimizations that will transform the first
> definition into the second definition.  So, the first definition will
> create the same object code as the second.  Even if the implementation
> doesn't optimize it, the fact that Haskell is lazy means that there isn't a
> huge difference between the two.  The following is how an implementation
> would reduce our un-optimized expression:

Leon Smith's instructive letter advocated not only the use of lists, but
also the use of combinators on lists. I would like to ask a question I have
long had in mind: to what extent can we trust compiler optimization?

Two examples I encountered recently are of particular interest to me. In
his famous paper "Why Functional Programming Matters", John Hughes
mentions a function 'within' which takes an infinite list of
approximations and returns the first element whose difference from
the previous element is smaller than a fixed epsilon:

within eps (x:y:xs) | abs (y-x) > eps = within eps (y:xs)
                    | otherwise       = y

I would like to code it in an alternative style:

within eps xs = (snd . head . dropWhile (\(a,b) -> abs (b-a) > eps)) (zip xs (tail xs))
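
(To make the example self-contained: the infinite list of approximations
can come, say, from Newton's iteration for a square root, as in Hughes'
paper; 'approximations' is just my own name for it, not anything standard:)

-- An infinite list of approximations to sqrt n by Newton's iteration,
-- starting from a0; either version of 'within' can then be applied to it,
-- e.g.  within 1e-9 (approximations 1 2)  gives roughly 1.4142135...
approximations :: Double -> Double -> [Double]
approximations a0 n = iterate (\x -> (x + n/x) / 2) a0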

Or, in one stage of a merge sort, I would like to merge every two adjacent
elements of a list (of lists). In recursive style it would be

ms [] = []
ms [xs] = [xs]
ms (xs:ys:xss) = merge xs ys : ms xss
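
(Here merge is the usual function that combines two sorted lists into one;
it is not given above, so for completeness this is the standard definition
I have in mind:)

-- Merge two already-sorted lists into one sorted list.
merge :: Ord a => [a] -> [a] -> [a]
merge []     ys = ys
merge xs     [] = xs
merge (x:xs) (y:ys)
  | x <= y    = x : merge xs (y:ys)
  | otherwise = y : merge (x:xs) ys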

It's also possible to rephrase it this way

ms xs = lzipWith merge (odds xs) (evens xs)

where evens and odds return the even- and odd-indexed elements of a list
respectively, and lzipWith is a "long" version of zipWith (which I think is
missing from Haskell's standard library... is it?).

I would say that the second version of both examples is the style I prefer
and would use to show off to my friends. But I hesitate to use them in
production code because I do not know how clever the compiler is.
Transforming the two cases into their efficient counterparts involves
several levels of deforestation, as well as inlining of pair and list
operations (pattern matching on the second element of a list). What price
must a compiler pay to perform all these transformations on every function
in a large program? What price would we pay if it does not?
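
To make the question concrete, the efficient counterpart I hope the
compiler derives from the second 'within' is essentially the first one,
with the intermediate pairs fused away; done by hand (my own derivation,
not compiler output) it looks like:

-- Hand-fused version of the combinator-style 'within': the list of pairs
-- built by zip and consumed by dropWhile never materialises.
withinFused :: (Num a, Ord a) => a -> [a] -> a
withinFused eps xs = go xs
  where
    go (x:y:ys) | abs (y-x) > eps = go (y:ys)
                | otherwise       = y
    go _        = error "withinFused: need at least two approximations"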

I would like to know whether state-of-the-art Haskell compilers are able to
handle these cases, and how it is done, other than by trying to deforest
everything massively.

sincerely,
Shin-Cheng Mu
Academia Sinica, Taiwan.

