Dan Weston wrote:
According to Lennart Augustsson (http://haskell.org/pipermail/haskell-cafe/2007-July/029603.html) you can have uninterruptible threads in ghc. If a thread never allocates it will never be preempted.

I am aware of that. I believe I have heard the GHC developers acknowledge that it is indeed a bug, though. Of course, it only shows up in small examples, so it is not high in the queue of bugs to be fixed.
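For concreteness, here is a sketch of the kind of small example involved (my own hypothetical illustration, not from Lennart's post; it assumes compiling with -O and running on a single capability):

   import Control.Concurrent (forkIO, threadDelay)

   -- A non-allocating loop: with -O the counter stays unboxed, so
   -- (per the quoted claim) the thread running it is never preempted.
   spin :: Int -> Int
   spin n = spin (n + 1)

   main :: IO ()
   main = do
     _ <- forkIO (print (spin 0))  -- fork the spinning thread
     threadDelay 1000000           -- yield; the spinning thread takes over
     putStrLn "back in main"       -- on one capability this may never be
                                   -- reached, if spin is never preempted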

So I wonder if what you are proposing really is unsafe, because by the time (take 1) is even invoked you might not regain control of the CPU, which is being held hostage by parBuffer trying to evaluate undefined. After all, undefined can encompass non-halting calculations as well as abortive ones.

In other words, undefined is not an exception (despite the error message). Converting it to a catchable exception is tantamount to pretending that the threading model is fully preemptive.

As it should be, IMO.

Note that other examples do show that exceptions raised in sparked threads are ignored:

   undefined `par` x == x

even if x does perform allocations. Of course, if that undefined is indeed demanded in the main thread, it will raise an exception.
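A minimal, self-contained sketch of that observation (my own example, using GHC's Control.Parallel):

   import Control.Parallel (par)

   main :: IO ()
   main = do
     let x = 42 :: Int
     -- The spark for undefined may or may not get evaluated somewhere,
     -- but the main thread never demands it, so this just prints 42.
     print (undefined `par` x)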

Note that if you take my previous example and replace undefined with plain non-termination, say a well-behaved "allocating" one, I still think that
   take 1 $ parBuffer 10 r0 (1:2:3:nonTerminating)
should terminate, evaluating to [1].
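For concreteness, here is roughly what I mean, with a stand-in nonTerminating of my own (this assumes the list-to-list parBuffer and r0 from the Strategies module):

   import Control.Parallel.Strategies (parBuffer, r0)

   -- A "well-behaved", allocating non-termination: the search keeps
   -- allocating as it goes but never yields an element.
   nonTerminating :: [Integer]
   nonTerminating = filter (< 0) [1 ..]

   main :: IO ()
   main = print (take 1 (parBuffer 10 r0 (1 : 2 : 3 : nonTerminating)))
   -- what I am arguing for: this prints [1]
   -- with parBuffer as it is, per the above, it gets stuck in nonTerminating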

The problem, I think, lies in parBuffer being more strict than needed. The comments in the code claim

   parBuffer n r0 == id

when it is actually stricter than id, as I showed above. This also means there is a loss of parallelism: in

  parBuffer n strat . map f . parBuffer m strat

the second parBuffer will not output any of its results until m of them are "ready", so the first parBuffer cannot start applying f to them.
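To see the "stricter than id" point in isolation (again my own illustration, with the same old-style parBuffer):

   import Control.Parallel.Strategies (parBuffer, r0)

   viaId, viaParBuffer :: Int
   viaId        = head (id              (1 : undefined))  -- fine, yields 1
   viaParBuffer = head (parBuffer 10 r0 (1 : undefined))  -- per the above,
                                                          -- this hits undefined

   main :: IO ()
   main = do
     print viaId          -- prints 1
     print viaParBuffer   -- if parBuffer n r0 really were id, this would
                          -- also print 1; instead it raises the error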

Zun.
