Re: [Haskell-cafe] Re: Functional programming for processing of large raster images

2006-06-21 Thread Brian Hulley

Joel Reymont wrote:

I think the issue wasn't using functional programming for large image
processing, it was using Haskell. OCaml is notoriously fast and
strict. Haskell/GHC is... lazy.

Everyone knows that laziness is supposed to be a virtue. In practice,
though, I'm one of the people who either can't wrap their heads
around it or just find themselves having to fight it from the start.


Perhaps laziness is more foundational, in that you can write

 if2 c x y = if c then x else y

However:

1) What's the advantage of being able to define if2?

2) Many control constructs, like >>=, foldl', map etc don't require laziness
because they already take a function as argument rather than a plain
expression. In fact, I can't think of any useful user defined control
constructs that actually use laziness in the same way as if2, and the
presence of laziness in foldr, foldl etc just seems to be a real pain
leading to excessive heap consumption. (Counterexample? One candidate is
sketched below.)
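
One candidate, which Neil Mitchell also raises later in this thread:
short-circuit boolean operators take plain expressions and only behave as
expected because the second argument is lazy. A minimal sketch, using a
hypothetical (&&&) to avoid clashing with the Prelude:

    -- Short-circuit conjunction as an ordinary user-defined function;
    -- 'False &&& undefined' evaluates to False only because the second
    -- argument is never evaluated.
    (&&&) :: Bool -> Bool -> Bool
    True  &&& y = y
    False &&& _ = False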


3) Lazy lists as glue can easily be replaced by force/delay lists + an
extension to pattern matching where pattern matching against [a] forces the
argument and the syntax [h|t] is used as in Prolog, instead of h:t. (This
would also free ':' to be used for "with type" or "with partial type"
instead of '::'.)
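
To make the comparison concrete, force/delay lists can be modelled in
today's Haskell roughly as follows (DList, force and takeD are illustrative
names, not part of the proposal):

    -- A force/delay list: the tail is an explicit thunk, written as a
    -- function from (), as one would in a strict language.
    data DList a = Nil | Cons a (() -> DList a)

    -- 'force' plays the role of the proposed pattern-match-forces rule.
    force :: (() -> DList a) -> DList a
    force t = t ()

    -- An infinite list of ones, with the delay written out by hand.
    ones :: DList Int
    ones = Cons 1 (\() -> ones)

    takeD :: Int -> DList a -> [a]
    takeD 0 _          = []
    takeD _ Nil        = []
    takeD n (Cons x t) = x : takeD (n - 1) (force t)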


4) Other examples of the utility of laziness can turn out to be impractical
chimeras. For example, the famous repmin replaces two traversals of a tree
with the dubious advantage of a single traversal that builds up a cluster
of expensive thunks instead, and since the thunks effectively encode the
structure of the tree, evaluating them effectively constitutes the second
traversal. So nothing is gained except the difficulty of understanding the
all-too-clever code (very bad software engineering practice imho), more
heap consumption, and more time consumption.
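
For readers who haven't met it, the circular repmin program being
criticised looks roughly like this (a standard rendering of Bird's example,
not Brian's own code):

    data Tree = Leaf Int | Fork Tree Tree

    -- Replace every leaf by the global minimum in a single traversal.
    -- The minimum 'm' is used to build the new tree before it has been
    -- computed, which only works because of lazy evaluation.
    repmin :: Tree -> Tree
    repmin t = t'
      where
        (t', m) = go t
        go (Leaf n)   = (Leaf m, n)
        go (Fork l r) = (Fork l' r', min ml mr)
          where
            (l', ml) = go l
            (r', mr) = go r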




Should we all switch to OCaml? I wish I had a criterion to determine
when to use Haskell and when to use OCaml.


Everything else about Haskell is so great and well thought out (eg type
classes, no side effects, higher rank polymorphism, existentials) that it
seems a pity to throw all this away just because of one unfortunate
feature. An alternative would be to write a converter that reads a file of
Haskell source which is to be interpreted strictly and outputs a file of
lazy Haskell source with strictness annotations everywhere, with desugaring
of [a] into the force/delay representation (*). (Also, in general, !x could
mean "force x", ie x (), and ~x could mean \() -> x (in value syntax) or
() -> x (in type syntax), and !x could be allowed in patterns also, when
the type at that position is ~x.)


   foo : ~Int -> Int  -- also taking the opportunity to replace :: by :
   foo !x = x

desugared to

   foo :: (() -> Int) -> Int
   foo cx = case cx () of
              x -> x

The main challenge, I think, would be in converting bindings in let and
where to appropriate case bindings, to ensure that as far as possible no
redundant bindings are made on any given path of control flow, and in
analysing mutually recursive bindings to ensure that everything is bound
before being used.
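
An illustrative instance of the binding problem (hypothetical code, not the
output of any actual converter):

    -- Lazy original: 'expensive' is only demanded on the True branch.
    f :: Bool -> Int -> Int
    f b x = if b then expensive + expensive else 0
      where expensive = slowComputation x   -- stand-in for costly work

    -- A strict translation must avoid evaluating it on the other path,
    -- e.g. by moving the binding into the branch that uses it (in the
    -- proposed strict language, a case binding evaluates eagerly):
    f' :: Bool -> Int -> Int
    f' b x = if b then case slowComputation x of e -> e + e else 0

    slowComputation :: Int -> Int
    slowComputation n = n * n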


Some issues would need to be resolved about what the user can expect. For
example, whole-program optimization could allow some expressions to be
optimized out (eg by splitting a function returning a pair into two
functions, one for each element) that would otherwise be non-terminating,
whereas with lazy Haskell there is an easy rule that (effectively)
expressions are only evaluated when needed, regardless of optimization.
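
An example of the kind of thing meant (illustrative only):

    -- Under strict evaluation this pair can never be built, because its
    -- second component diverges:
    pairUp :: Int -> (Int, Int)
    pairUp n = (n + 1, loop n)
      where loop m = loop (m + 1)

    -- A whole-program optimiser that split pairUp into one function per
    -- component would make 'fst (pairUp 3)' terminate, so termination
    -- would depend on the optimiser; lazy Haskell sidesteps this because
    -- the second component is simply never demanded.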


Then an interesting investigation would be to see how easy it is to port 
lazy Haskell to the new, totally strict, language, and if there are any 
situations where existing code has used laziness (for algorithmic reasons or 
just for efficiency) without being aware of it.


(*) So that eventually a new compiler could be written to take full
advantage of the side-effect-free nature of the language plus the
strictness, which means that values can be accessed without having to go
through the "if unevaluated then evaluate, store, and return, else just
return" check for each unoptimized access. Because OCaml and SML have side
effects, which limit optimizations due to possible aliasing etc, it might
be possible to compile such a language to code which is certainly no
slower, and possibly *even faster*, than OCaml!!! :-)
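
The per-access check being described can be modelled in Haskell itself
along these lines (a toy model, not how GHC actually represents thunks):

    import Data.IORef

    -- A mutable cell that is either an unevaluated thunk or a value.
    data Cell a = Unevaluated (() -> a) | Evaluated a

    -- Every access must perform this test; a strict language could
    -- just read the value directly.
    access :: IORef (Cell a) -> IO a
    access ref = do
      c <- readIORef ref
      case c of
        Evaluated v    -> return v
        Unevaluated th -> do
          let v = th ()
          writeIORef ref (Evaluated v)   -- store, so later reads are cheap
          return v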


Regards, Brian.

--
Logic empowers us and Love gives us purpose.
Yet still phantoms restless for eras long past,
congealed in the present in unthought forms,
strive mightily unseen to destroy us.

http://www.metamilk.com 




Re: [Haskell-cafe] Re: Functional programming for processing of large raster images

2006-06-21 Thread Ralf Hinze
On Wednesday 21 June 2006 at 21:30, Brian Hulley wrote:
 Perhaps laziness is more foundational, in that you can write
 
       if2 c x y = if c then x else y
 
 However:
 
 1) What's the advantage of being able to define if2?

Well, you can easily define your own control structures (when,
unless etc). This feature is widely used in combinator libraries.
Also, in a non-strict language recursive definitions are not
limited to function types. Users of parser combinators rely heavily
on this feature; just try to define/use parsing combinators
in a strict language. The problem with eager evaluation is
that it is too eager (expressions are sometimes evaluated too
early), just as the problem with lazy evaluation is that it
is too lazy (evaluations sometimes happen too late).
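
Two standard examples of recursive definitions at non-function type, the
same trick parser-combinator libraries depend on:

    -- Recursive list values; these are only productive because (:) is
    -- lazy in its tail.
    nats :: [Integer]
    nats = 0 : map (+ 1) nats

    fibs :: [Integer]
    fibs = 0 : 1 : zipWith (+) fibs (tail fibs)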

Cheers, Ralf


Re: [Haskell-cafe] Re: Functional programming for processing of large raster images

2006-06-21 Thread Neil Mitchell

Hi,

I happen to like laziness, because it means that when I'm not thinking
about performance, I don't have to think about evaluation order _at
all_. And since my computer is a 750MHz Athlon with Hugs, I never find
any need to worry about performance :) If it ever becomes an issue I
can move to GHC or buy a faster computer without too much hassle.


1) What's the advantage of being able to define if2?

What about (&&) and (||)? Should they be built in? What about 'and', which
is just (&&) a lot of times; should that be lazy? At what point do you
say no? Should I be able to define 'implies' correctly?
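
For instance, a minimal rendering of the 'implies' point:

    -- 'implies False undefined' is True only if the second argument is
    -- non-strict, which a user-defined function gets for free in Haskell.
    implies :: Bool -> Bool -> Bool
    implies p q = not p || q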


3) Lazy lists as glue can easily be replaced by force/delay lists + an
extension to pattern matching where pattern matching against [a] forces the
argument and the syntax [h|t] is used as in Prolog, instead of h:t. (This
would also free ':' to be used for "with type" or "with partial type"
instead of '::'.)

That seems like more thought when writing the program; maybe it's
worth it, maybe it's not, but it doesn't seem as neat as what we
already have.


4) Other examples of the utility of laziness can turn out to be impractical
chimeras. For example, the famous repmin replaces two traversals of a tree
with the dubious advantage of a single traversal that builds up a cluster
of expensive thunks instead, and since the thunks effectively encode the
structure of the tree, evaluating them effectively constitutes the second
traversal. So nothing is gained except the difficulty of understanding the
all-too-clever code (very bad software engineering practice imho), more
heap consumption, and more time consumption.

Laziness doesn't have to be exploited in complex ways - minimum = head
. sort is a nice example. isSubstr x y = any (isPrefixOf x) (tails y) is
another one. Often by just stating a definition, laziness gives you
the performance for free. Of course, if you wanted to think harder
(and I never do), you can write better performing and strict-safe
versions of these, but again it's more effort.
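
Spelled out with the Data.List names (a direct transcription of the two
one-liners above):

    import Data.List (isPrefixOf, sort, tails)

    -- With a lazy sort, taking the head forces only enough sorting to
    -- deliver the least element.
    minimum' :: Ord a => [a] -> a
    minimum' = head . sort

    -- x occurs in y iff x is a prefix of some suffix of y.
    isSubstr :: Eq a => [a] -> [a] -> Bool
    isSubstr x y = any (x `isPrefixOf`) (tails y)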

The other thing you lose when moving to strictness is the ability to
inline functions arbitrarily - consider:

if2 c t f = if c then t else f

Consider the expression:
if2 True 1 undefined

Now let's inline it and expand it, and in Haskell we get 1, which
matches the evaluation. In strict Haskell the inlining is now invalid,
and that's quite a useful optimisation to lose. While it seems that
compilers can get round this, my concern is for the poor programmer -
this nice property of viewing functions as just "replace this with
that" has disappeared.
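
Spelling the inlining example out:

    if2 :: Bool -> a -> a -> a
    if2 c t f = if c then t else f

    -- Inlining and beta-reducing gives 'if True then 1 else undefined',
    -- which is 1 under lazy evaluation; a strict language would evaluate
    -- 'undefined' before entering if2, so inlining would change behaviour.
    example :: Integer
    example = if2 True 1 undefined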

I suspect that in years to come, lazy languages will also have the
upper hand when it comes to theorem proving and formal reasoning, but
I guess that's a matter for future consideration.

While laziness may not be all good, it's certainly not all bad :)

Thanks

Neil


Re: [Haskell-cafe] Re: Functional programming for processing of large raster images

2006-06-21 Thread Robert Dockins


On Jun 21, 2006, at 3:30 PM, Brian Hulley wrote:


Joel Reymont wrote:

I think the issue wasn't using functional programming for large image
processing, it was using Haskell. OCaml is notoriously fast and
strict. Haskell/GHC is... lazy.

Everyone knows that laziness is supposed to be a virtue. In practice,
though, I'm one of the people who either can't wrap their heads
around it or just find themselves having to fight it from the start.


Perhaps laziness is more foundational, in that you can write

 if2 c x y = if c then x else y


[snip some conversation...]


For those who haven't seen this already, here is a presentation by
Simon PJ in which he discusses his views on laziness (among other
things).


http://research.microsoft.com/~simonpj/papers/haskell-retrospective/HaskellRetrospective.pdf



Takeaway point about laziness: Laziness keeps you honest by not
allowing you to slip in side effects.


Bonus takeaway: read Wadler's papers :-)



Rob Dockins

Speak softly and drive a Sherman tank.
Laugh hard; it's a long way to the bank.
  -- TMBG
