The problem occurs when the result value is finally needed and the
thunks have to be reduced, starting with the outermost, which can't
be reduced without first reducing the next one, and so on; it's these
pending reduction steps that are pushed onto the stack until its size
causes a stack overflow.
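A minimal sketch of that build-up (the function names are from the
Prelude and Data.List; the small demonstration values are my own):

```haskell
import Data.List (foldl')

main :: IO ()
main = do
  -- foldl builds the nested thunk ((((0+1)+2)+3)+4); nothing is
  -- reduced until the result is demanded, and then the reduction
  -- proceeds outermost-first, pushing one frame per pending step:
  print (foldl (+) 0 [1..4 :: Int])
  -- foldl' forces the accumulator at each step, so the additions
  -- happen incrementally and no thunk chain piles up:
  print (foldl' (+) 0 [1..4 :: Int])
```

With [1..4] both print 10, of course; the difference only becomes
visible (as a stack overflow for foldl) with a long enough list.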
Yes, that's exactly right, and something that's not often pointed out.
Btw, this kind of relative strictness (when is one part of my program
needed to answer demands on another part) is the kind of example
for which the old GHood can be helpful (once you get used to the display).
If you have Java on your machines, try installing GHood [1] (on
hackage thanks to Hugo Pacheco), then things like
ghc -e ':m +Debug.Observe' -e 'printO $ observe "foldr" foldr (+) 0 [1..4] '
ghc -e ':m +Data.List Debug.Observe' -e "printO $ observe \"foldl'\" foldl' (+) 0 [1..4] "
ghc -e ':m +Debug.Observe' -e 'printO $ observe "foldl" foldl (+) 0 [1..4] '
This example is also among those on the GHood home page [2],
so you could try the applet version instead, and it appears in section
4.2 of the paper [3] (as a "well-known strictness problem";-). Page and
paper mention a few other similar examples and discuss some differences
between static strictness (which parts are needed at all) and dynamic
strictness (which parts are needed when, relative to other demands).
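A tiny illustration of that distinction (my own example, not taken
from the paper):

```haskell
-- Static strictness: the second argument of const is never needed at
-- all, so the undefined is never demanded and never evaluated.
staticExample :: Int
staticExample = const 1 undefined

-- Dynamic strictness: every list element is eventually needed, but
-- none is demanded until the final result is; forcing dynamicExample
-- triggers the whole chain of pending reductions at once.
dynamicExample :: Int
dynamicExample = foldl (+) 0 [1..4]

main :: IO ()
main = print (staticExample, dynamicExample)
```

GHood's animation makes exactly this timing visible: you see *when*
each part of the structure gets demanded, not just whether it does.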
Claus
[1] http://hackage.haskell.org/cgi-bin/hackage-scripts/package/GHood
[2] http://www.cs.kent.ac.uk/~cr3/toolbox/haskell/GHood
[3] http://www.cs.kent.ac.uk/~cr3/publications/GHood.html
_______________________________________________
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe