Re: [Haskell-cafe] Re: Does laziness make big difference?
apfelmus,

Cool! I really like your examples! Thank you.

Nick.
Re: [Haskell-cafe] Re: Does laziness make big difference?
Peter,

> Roughly, I'd say you can fudge laziness in data structures in a strict
> language without too much bother. (I don't have much experience with
> this, but the existence of a streams library for OCaml is the sort of
> thing I mean. There are plenty of papers on co-iterative streams and
> suchlike that show the general pattern.)

Yes, agreed. And this was my initial point.

>> If you wish to add control structures you would need to use the lazy
>> keyword a lot, e.g.:
>>
>>     if cond then *lazy* S1 else *lazy* S2
>>
>> and for more complicated structures it's not always going to be clear
>> what needs to be suspended.
>
> Laziness is a conservative default here. (If you want to write an EDSL
> in a non-lazy language, you'll need to use some kind of preprocessor /
> macros / ... - in other words, a two-level language - or do thunking by
> hand, as above, or live with doing too much evaluation.)
>
> One way to gauge how useful laziness really is might be to look through
> big ML projects and see how often they introduce thunks manually. A
> thunk there is usually something like "fn () => ..." IIRC. Also IIRC,
> Concurrent ML is full of them.

Probably dealing with macros is not so scary; Paul Graham and Peter Seibel
show that it is quite easy. :-)

OK, let's look at it from the other side: I searched through the Darcs
source and found 17 data structures with strict members, for example

    data Patch = NamedP !PatchInfo ![PatchInfo] !Patch

and 36 occurrences of this dreaded seq. And Darcs is not a speed-critical
application. And if one tries to write, say, a cipher decoder in Haskell,
I guess the program will end up full of '!' and 'seq' (or FFI).

> Dare I say the tradeoff is between a relatively simple operational model
> (so you can judge space and time usage easily) and semantic simplicity
> (e.g. the beta rule is unrestricted, easing program transformation).

Cool! Thank you.

Best regards,
Nick.
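A minimal Haskell sketch of the two strictness tools mentioned above:
strict constructor fields, as in the quoted Darcs Patch type, and seq to
force an accumulator. The type P and the function sumStrict are invented
names for this illustration.

    -- Strict fields, in the style of the Darcs declaration: both Ints
    -- are forced as soon as a P is constructed.
    data P = P !Int !Int

    -- seq forces the accumulator at each step, so the fold runs in
    -- constant space instead of building a million-deep chain of thunks.
    sumStrict :: [Int] -> Int
    sumStrict = go 0
      where
        go acc []     = acc
        go acc (x:xs) = let acc' = acc + x
                        in acc' `seq` go acc' xs

    main :: IO ()
    main = print (sumStrict [1 .. 1000000])

Without the seq, go would accumulate a chain of suspended (+) applications
as long as the input list; forcing acc' at each step keeps the space usage
constant, which is exactly the kind of annotation being counted above.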
Re: [Haskell-cafe] Re: Does laziness make big difference?
Nick,

Roughly, I'd say you can fudge laziness in data structures in a strict
language without too much bother. (I don't have much experience with
this, but the existence of a streams library for OCaml is the sort of
thing I mean. There are plenty of papers on co-iterative streams and
suchlike that show the general pattern.)

> If you wish to add control structures you would need to use the lazy
> keyword a lot, e.g.:
>
>     if cond then *lazy* S1 else *lazy* S2
>
> and for more complicated structures it's not always going to be clear
> what needs to be suspended.

Laziness is a conservative default here. (If you want to write an EDSL in
a non-lazy language, you'll need to use some kind of preprocessor /
macros / ... - in other words, a two-level language - or do thunking by
hand, as above, or live with doing too much evaluation.)

One way to gauge how useful laziness really is might be to look through
big ML projects and see how often they introduce thunks manually. A thunk
there is usually something like "fn () => ..." IIRC. Also IIRC,
Concurrent ML is full of them.

Dare I say the tradeoff is between a relatively simple operational model
(so you can judge space and time usage easily) and semantic simplicity
(e.g. the beta rule is unrestricted, easing program transformation).

cheers
peter
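A minimal Haskell sketch of why no lazy keyword is needed for user-defined
control structures: arguments are only evaluated on demand, so a hand-rolled
conditional behaves like the built-in one. The name myIf is invented for
this illustration.

    -- A user-defined conditional. No annotation is required: the branch
    -- that is not chosen is never evaluated, because an argument is only
    -- forced when the function actually demands it.
    myIf :: Bool -> a -> a -> a
    myIf True  t _ = t
    myIf False _ e = e

    main :: IO ()
    main = print (myIf True 42 (error "the else branch is never forced"))

In a strict language the same definition would evaluate both branches at
the call site, which is exactly where the *lazy* annotations quoted above
come in.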
Re: [Haskell-cafe] Re: Does laziness make big difference?
apfelmus,

>> The question is the following: how big is the gap between strict
>> languages with lazy constructs and Haskell? Does default laziness have
>> an irrefutable advantage over default strictness?
>
> Laziness is needed to achieve true compositionality. This point is
> elaborated in "Why functional programming matters" by John Hughes.

Yes, I like the paper. But the examples can simply be decomposed into a
generator and the rest, and the generator can be made lazy in a strict
language. For instance, the "Newton-Raphson Square Roots" example can be
expressed in a strict language (Scala again):

    def repeat(f: Double => Double, a: Double): Stream[Double] =
      Stream.cons(a, repeat(f, f(a)))

or in a hypothetical language "L" with some syntactic sugar:

    repeat f a = *lazy* cons a (repeat f (f a))

To be clear, I have nothing against laziness at all; I want to know what I
get if I take laziness everywhere as the default semantics.

> I also think that the laziness in Haskell is already so implicit that
> 90% of the Haskell code written so far will simply break irreparably if
> you experimentally remove it.

Yes, I understand that present Haskell code relies heavily on laziness,
but I'm interested in the problem in general: how much do I gain if I
switch from default strictness to default laziness in my hypothetical
language L? Or, from the other side, how much do I throw away in the
reverse case?

> By the way, lazy evaluation is strictly more powerful than eager
> evaluation (in a pure language, that is) with respect to asymptotic
> complexity:
>
>   Richard Bird, Geraint Jones and Oege de Moor. "More Haste, Less Speed."
>   http://web.comlab.ox.ac.uk/oucl/work/geraint.jones/morehaste.html

You gave me a lot of food for thought. Thank you for the link.

Best regards,
Nick.
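For comparison, a minimal Haskell sketch of the Newton-Raphson example from
Hughes' paper. The functions next and within follow the paper's development;
iterate is the Prelude's version of the repeat generator above, and the
starting guess n / 2, the tolerance and the name newtonSqrt are arbitrary
choices made for this illustration.

    -- One Newton-Raphson step towards sqrt n.
    next :: Double -> Double -> Double
    next n x = (x + n / x) / 2

    -- The generator: an infinite, lazily produced list of approximations.
    -- No 'lazy' keyword is needed; the tail of each cons cell is a thunk
    -- by default.
    approximations :: Double -> Double -> [Double]
    approximations n x0 = iterate (next n) x0

    -- The consumer: pull approximations until two successive ones agree.
    within :: Double -> [Double] -> Double
    within eps (a:b:rest)
      | abs (a - b) <= eps = b
      | otherwise          = within eps (b : rest)
    within _ _ = error "within: needs an infinite stream"

    newtonSqrt :: Double -> Double
    newtonSqrt n = within 1e-10 (approximations n (n / 2))

    main :: IO ()
    main = print (newtonSqrt 2)   -- ~1.4142135623730951

The paper's point is that the consumer (within) can be reused unchanged
with other generators, which is where the decomposition into generator and
the rest pays off.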