[issue3789] multiprocessing deadlocks when sending large data through Queue with timeout

2008-09-05 Thread Jesse Noller
Jesse Noller <[EMAIL PROTECTED]> added the comment: No problem, David; you're the 4th person to ask me about this in the past 2 months :) ___ Python tracker <[EMAIL PROTECTED]> ___

[issue3789] multiprocessing deadlocks when sending large data through Queue with timeout

2008-09-05 Thread David Decotigny
David Decotigny <[EMAIL PROTECTED]> added the comment: Thank you, Jesse. When I read this passage, I naively thought that a timeout raised in a get() would not be harmful: that somehow the whole get() request would be aborted. But now I realize that this would make things rather complicated and dangerous.

[issue3789] multiprocessing deadlocks when sending large data through Queue with timeout

2008-09-05 Thread Jesse Noller
Changes by Jesse Noller <[EMAIL PROTECTED]>:
resolution: -> invalid
status: open -> closed

[issue3789] multiprocessing deadlocks when sending large data through Queue with timeout

2008-09-05 Thread Jesse Noller
Jesse Noller <[EMAIL PROTECTED]> added the comment: In a later release, I'd like to massage this in such a way that you do not have to wait for a child's queue to be drained prior to calling join(). One way to work around this, David, is to have the child call Queue.cancel_join_thread() before putting its data.
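Jesse's snippet is cut off in the archive; a minimal sketch of the cancel_join_thread() workaround he describes (the run_demo helper and payload values are mine, for illustration) might look like:

```python
import multiprocessing

def f(datasize, q):
    # Tell the queue's feeder thread not to block process exit on
    # unflushed data; any payload still buffered when the child exits
    # is simply lost instead of deadlocking join().
    q.cancel_join_thread()
    q.put("x" * datasize)

def run_demo(datasize):
    q = multiprocessing.Queue()
    p = multiprocessing.Process(target=f, args=(datasize, q))
    p.start()
    item = q.get(timeout=10)  # drain the queue before joining
    p.join()                  # no longer risks hanging on the feeder
    return item

if __name__ == "__main__":
    print(len(run_demo(1024)))
```

The trade-off is data loss: with cancel_join_thread(), a child that exits before its queue is drained silently drops the undelivered items.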

[issue3789] multiprocessing deadlocks when sending large data through Queue with timeout

2008-09-05 Thread Jesse Noller
Jesse Noller <[EMAIL PROTECTED]> added the comment: See http://docs.python.org/dev/library/multiprocessing.html#multiprocessing-programming, specifically "Joining processes that use queues": bear in mind that a process that has put items in a queue will wait before terminating until all the buffered items have been fed by the "feeder" thread to the underlying pipe.

[issue3789] multiprocessing deadlocks when sending large data through Queue with timeout

2008-09-05 Thread David Decotigny
David Decotigny <[EMAIL PROTECTED]> added the comment: A quick fix in the user code, when we are sure we don't need the child process if a timeout happens, is to call worker.terminate() in an except Empty clause.
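That workaround could be sketched as follows (get_or_terminate and the worker function names are illustrative, not from the thread):

```python
import multiprocessing
import time
from queue import Empty  # Queue.Empty on Python 2

def worker_fn(q):
    q.put("ready")

def slow_worker_fn(q):
    time.sleep(60)  # never produces anything in time

def get_or_terminate(q, worker, timeout):
    try:
        return q.get(timeout=timeout)
    except Empty:
        # We are sure we no longer need the child: kill it outright
        # instead of join()ing a process whose feeder thread may still
        # be blocked writing to the pipe.
        worker.terminate()
        worker.join()
        return None

if __name__ == "__main__":
    q = multiprocessing.Queue()
    w = multiprocessing.Process(target=worker_fn, args=(q,))
    w.start()
    print(get_or_terminate(q, w, timeout=5))  # "ready"
    w.join()
```

terminate() is abrupt (the child gets SIGTERM and shared resources may be left in an inconsistent state), so this only fits when the child's partial work is genuinely disposable.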

[issue3789] multiprocessing deadlocks when sending large data through Queue with timeout

2008-09-05 Thread Benjamin Peterson
Changes by Benjamin Peterson <[EMAIL PROTECTED]>:
assignee: -> jnoller
nosy: +jnoller

[issue3789] multiprocessing deadlocks when sending large data through Queue with timeout

2008-09-05 Thread David Decotigny
New submission from David Decotigny <[EMAIL PROTECTED]>: With the attached script, demo() called with, for example, datasize=40*1024*1024 and timeout=1 will deadlock: the program never terminates. The bug appears on Linux (RHEL4) / Intel x86 with the "multiprocessing" module shipped with Python 2.6b3.
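The attached script is not preserved in the archive; a minimal sketch of the failing pattern (function and variable names are my own, not necessarily the original script's) could look like:

```python
import multiprocessing
from queue import Empty  # Queue.Empty on Python 2

def child(datasize, q):
    # The child pushes one large payload; multiprocessing's hidden
    # "feeder" thread must flush it all into the pipe before the
    # child process can exit.
    q.put("x" * datasize)

def demo(datasize, timeout):
    q = multiprocessing.Queue()
    p = multiprocessing.Process(target=child, args=(datasize, q))
    p.start()
    try:
        data = q.get(timeout=timeout)
    except Empty:
        # With datasize=40*1024*1024 and timeout=1, get() can time out
        # while the feeder thread is still blocked writing to the full
        # pipe...
        data = None
    # ...and then this join() never returns, because nobody drains the
    # pipe the child is still trying to fill: deadlock.
    p.join()
    return data

if __name__ == "__main__":
    # Small payload and a generous timeout, so this demonstration
    # completes instead of deadlocking.
    print(len(demo(1024, 10)))
```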