New submission from anatoly techtonik <techto...@gmail.com>:

There is no way to write a Python program that can process large or unlimited 
output coming from a subprocess stream without risking a deadlock.

http://docs.python.org/library/subprocess.html#subprocess.Popen.communicate
    "Note The data read is buffered in memory, so do not use this method if the 
data size is large or unlimited."

http://docs.python.org/library/subprocess.html#subprocess.Popen.stdin
http://docs.python.org/library/subprocess.html#subprocess.Popen.stdout
http://docs.python.org/library/subprocess.html#subprocess.Popen.stderr
    "Warning Use communicate() rather than .stdin.write, .stdout.read or 
.stderr.read to avoid deadlocks due to any of the other OS pipe buffers filling 
up and blocking the child process."
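
For illustration, this is the kind of naive code that warning is about; 
"chatty_command" is a made-up program that writes plenty of data to both 
streams:

    import subprocess

    # Naive pattern the warning describes (deliberately broken): if the child
    # fills its stderr pipe buffer while we are blocked reading stdout, the
    # child stalls on its write and stdout.read() never reaches EOF.
    proc = subprocess.Popen(["chatty_command"],      # hypothetical program
                            stdout=subprocess.PIPE,
                            stderr=subprocess.PIPE)
    out = proc.stdout.read()   # may block forever
    err = proc.stderr.read()
    proc.wait()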


So, what should I use?
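
One workaround that sidesteps the problem, sketched below under the assumption 
that stderr does not need to be captured separately ("some_command" and 
handle() are placeholders), is to keep a single pipe and read it incrementally:

    import subprocess

    # Sketch, not an official recipe: with only one pipe open there is no
    # second buffer to fill up and block the child, and reading line by line
    # keeps memory use bounded regardless of output size.
    proc = subprocess.Popen(["some_command", "--verbose"],  # placeholder
                            stdout=subprocess.PIPE,
                            stderr=subprocess.STDOUT,   # merge stderr into stdout
                            universal_newlines=True)    # text mode, 2.7 and 3.x
    for line in proc.stdout:
        handle(line)            # hypothetical per-line callback
    proc.stdout.close()
    returncode = proc.wait()

If stderr must stay separate, the usual alternative is a reader thread per 
pipe; that is essentially what communicate() does internally (select()/poll() 
on POSIX, threads on Windows), its only limitation being that it accumulates 
the whole output in memory.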

----------
components: Library (Lib)
messages: 161294
nosy: techtonik
priority: normal
severity: normal
status: open
title: subprocess is not safe from deadlocks
versions: Python 2.7, Python 3.3

_______________________________________
Python tracker <rep...@bugs.python.org>
<http://bugs.python.org/issue14872>
_______________________________________