On 1/29/13 4:25 AM, Junior White wrote:
> Hi Cafe,
>     I have two programs for the same problem, the "Eight queens problem";
> the link is http://www.haskell.org/haskellwiki/99_questions/90_to_94.
>     My two programs differ only slightly, but their performance differs
> enormously. This is my solution:

The difference is what's called "dynamic programming" (an utterly non-intuitive and un-insightful name). When we have the program:

    [ f x xs | xs <- g, x <- h ]

we're saying, first get me a partial solution (xs), and then try every possible way of extending it to a larger solution (x). It should be obvious from this description that the computation of each partial solution xs will be shared among all candidates x, but that the computation of the candidates x will not be shared across different xs.
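
To make this concrete for the eight queens problem from the page above, the first ordering looks something like the following sketch (the names queens and safe are mine, not the original poster's code):

    -- Shared ordering: each partial solution qs is produced once and
    -- then extended by every candidate row q.
    queens :: Int -> [[Int]]
    queens 0 = [[]]
    queens k = [ q : qs | qs <- queens (k - 1), q <- [1..8], safe q qs ]

    -- safe q qs: a queen on row q in the next column attacks none of
    -- the queens qs already placed (row numbers, nearest column first).
    safe :: Int -> [Int] -> Bool
    safe q qs = and [ q /= c && abs (q - c) /= d | (d, c) <- zip [1 ..] qs ]

Here length (queens 8) yields the familiar 92 solutions.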

On the other hand, when we have the program:

    [ f x xs | x <- h, xs <- g ]

we're saying, first get me all ways to start a solution (x), and then try to solve the rest of the problem (xs). It should be obvious from this description that the computation of each x will be shared among its extensions xs, but that the computation of the xs will be redone for every x.
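
And the second ordering is the same comprehension with the generators swapped, reusing the safe helper from the sketch above:

    -- Unshared ordering: for each candidate row q the whole recursive
    -- call is restarted, so the same subproblem is re-solved up to
    -- eight times per level. (Compiled with -O, GHC's full-laziness
    -- pass may float the call out and restore the sharing by accident;
    -- in GHCi the slowdown is plainly visible.)
    queens' :: Int -> [[Int]]
    queens' 0 = [[]]
    queens' k = [ q : qs | q <- [1..8], qs <- queens' (k - 1), safe q qs ]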

Imperatively, this is exactly the same distinction as between the following programs:

    for xs in g:
        for x in h:
            yield f(x,xs)

    for x in h:
        for xs in g:
            yield f(x,xs)

This difference in sharing can, as you've seen, cause huge differences in runtime. Usually it's the difference between a polytime algorithm and some exptime algorithm. To see why, just think about the call graph. It may be more helpful here to think about something like Fibonacci numbers. In the memoizing version, you're storing the work from solving smaller problems and sharing that among the different ways of extending the solution; whereas in the naive version, you're recomputing the same thing over and over. The call graph for the former is a DAG (or more generally, a packed forest) whereas the call graph for the latter is the tree you get by unfurling all the shared structure in the DAG.
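
In Haskell, the Fibonacci version of this contrast fits in a few lines (these are the standard textbook definitions, not code from this thread):

    -- Naive version: the call graph is the fully unfurled tree, so the
    -- same subproblems are recomputed exponentially often.
    fibNaive :: Int -> Integer
    fibNaive n | n < 2     = fromIntegral n
               | otherwise = fibNaive (n - 1) + fibNaive (n - 2)

    -- Shared version: the list is defined in terms of itself, so each
    -- entry is computed exactly once and the call graph is a DAG.
    fibs :: [Integer]
    fibs = 0 : 1 : zipWith (+) fibs (tail fibs)

    fib :: Int -> Integer
    fib n = fibs !! n

Try fibNaive 32 against fib 32 in GHCi to feel the difference.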

This distinction has nothing whatsoever to do with Haskell, and has everything to do with Intro Algorithms. Loop ordering matters in every language with loops, from Haskell to C to Python to Prolog.

--
Live well,
~wren

