On 24 Oct 2009, at 14:10, Gabriel Genellina wrote:
On Thu, 22 Oct 2009 23:18:32 -0300, Brian Quinlan
<br...@sweetapp.com> wrote:
I don't like a few things in the code:
def _do(i):
    print('Run:', i)
    q = multiprocessing.Queue()
    for j in range(30):
        q.put(i*30+j)
    processes = _make_some_processes(q)

    while not q.empty():
        pass
I'd use time.sleep(0.1) or something instead of this busy wait, but
see below.
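For reference, that polling variant would just be something like this
inside _do (a sketch, assuming the same q as above):

import time

while not q.empty():
    time.sleep(0.1)  # still polling, but yields the CPU between checks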
This isn't my actual code; it's a simplification designed to minimally
demonstrate a possible bug in multiprocessing.
# The deadlock only occurs on Mac OS X and only when these lines
# are commented out:
# for p in processes:
#     p.join()
I don't know how multiprocessing deals with it, but if you don't
join() a process it may become a zombie, so it's probably better to
always join them. In that case I'd just remove the wait for
q.empty() completely.
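A sketch of that join-based version (self-contained; _make_some_processes
isn't shown in the snippet, so the helper and worker here are hypothetical
stand-ins that just drain the queue):

import multiprocessing
import queue

def _worker(q):
    # Hypothetical stand-in worker: drain whatever is on the queue.
    while True:
        try:
            q.get(timeout=0.1)
        except queue.Empty:
            break

def _make_some_processes(q):
    # Hypothetical stand-in: start a few workers reading from q.
    processes = [multiprocessing.Process(target=_worker, args=(q,))
                 for _ in range(5)]
    for p in processes:
        p.start()
    return processes

def _do(i):
    print('Run:', i)
    q = multiprocessing.Queue()
    for j in range(30):
        q.put(i*30+j)
    processes = _make_some_processes(q)
    # Join the workers instead of busy-waiting on q.empty().
    for p in processes:
        p.join()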
I'm actually not looking for workarounds. I want to know whether this is a
multiprocessing bug, or whether I am somehow misunderstanding the
multiprocessing docs and my demonstrated usage pattern is incorrect.
Cheers,
Brian
for i in range(100):
    _do(i)
Those lines should be guarded with: if __name__ == '__main__':
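That is, the driver loop at the bottom would become (with _do defined as
above):

if __name__ == '__main__':
    for i in range(100):
        _do(i)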
I don't know if fixing those things will fix your problem, but at
least the code will look neater...
--
Gabriel Genellina