* John Meacham:
>> Clean has also declined in these benchmarks, but not as much as Haskell.
>> According to John van Groningen, Clean's binary-trees program in the
>> previous shootout version used a lazy data structure, which resulted in
>> lower memory usage and much faster execution. That was removed by the
>> maintainer of the shootout and replaced by a much slower one using a
>> strict data structure.
>
> Why was this done?
I suppose the intent of the binary-trees benchmark is to measure allocation performance in the presence of a fairly large (well, not by today's standards) data structure. Using laziness to prevent that data structure from being built (or to introduce additional sharing) defeats the purpose of the benchmark. Note that these are microbenchmarks, not real applications, so imposing such rules makes sense.

_______________________________________________
Haskell mailing list
[email protected]
http://www.haskell.org/mailman/listinfo/haskell
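The distinction at issue can be sketched in a few lines of Haskell. This is not the shootout program itself, just a minimal illustration: the strict variant (with `!` fields) forces both subtrees when a node is constructed, so the whole tree is allocated eagerly, which is what the benchmark is meant to measure; the lazy variant allocates nodes only as the checksum traversal demands them.

```haskell
{-# LANGUAGE BangPatterns #-}

-- Lazy fields: subtrees are thunks until forced by traversal.
data TreeL = NilL | NodeL TreeL TreeL

-- Strict fields: both subtrees are fully built at construction time.
data TreeS = NilS | NodeS !TreeS !TreeS

makeL :: Int -> TreeL
makeL 0 = NilL
makeL d = NodeL (makeL (d - 1)) (makeL (d - 1))

makeS :: Int -> TreeS
makeS 0 = NilS
makeS d = NodeS (makeS (d - 1)) (makeS (d - 1))

-- Count every node (including Nil leaves) as a stand-in for the
-- benchmark's checksum; a tree of depth d has 2^(d+1) - 1 nodes.
checkL :: TreeL -> Int
checkL NilL         = 1
checkL (NodeL l r)  = 1 + checkL l + checkL r

checkS :: TreeS -> Int
checkS NilS         = 1
checkS (NodeS l r)  = 1 + checkS l + checkS r

main :: IO ()
main = do
  print (checkL (makeL 10))  -- nodes allocated lazily, as traversed
  print (checkS (makeS 10))  -- entire tree allocated up front
```

Both versions compute the same result; the difference is purely in allocation behaviour, which is exactly what the lazy Clean version sidestepped.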
