On Tue, Mar 27, 2018 at 11:18 AM,  <jlada...@itu.edu> wrote:
> I have used multiprocessing before when I wrote some parallelized code.  That 
> program required significant communication between processes, and it's 
> overkill for my purpose here.  I don't need communication between the 
> spawning (live data) program and the spawned program.  In fact, to the extent 
> that the live data program has to pay attention to anything besides the data 
> stream, I think it could be bad.
>
> I have been investigating the subprocess module.  I'm looking for something 
> which behaves like subprocess.run("python3 my_program.py"), but which does 
> not "Wait for command to complete, then return a CompletedProcess instance."
>

Yeah, that sounds like the right solution. So you want a "fire and
forget" system. First, a caveat: you MAY have problems if the parent
process ends before the child does. (Those can be solved, but I don't
know whether naive use of subprocess handles them.)
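For what it's worth, one way to reduce that risk on POSIX is to start
the child in its own session, so a signal aimed at the parent's process
group (say, Ctrl-C in the terminal) doesn't also hit the child. A
minimal sketch; the "-c" one-liner here just stands in for whatever
script you'd actually launch:

```python
import subprocess
import sys

# start_new_session=True runs the child in a new session (via setsid),
# detaching it from the parent's process group and controlling terminal.
child = subprocess.Popen(
    [sys.executable, "-c", "print('child running')"],
    start_new_session=True,
)
# Popen returns immediately; the parent is free to get on with its work.
```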

As far as I know, subprocess.run() will always wait for the process to
complete. But you can use the Popen constructor.

>>> import subprocess
>>> subprocess.run(["/bin/bash", "-c", "sleep 2; echo bash"]); print("Python")
bash
CompletedProcess(args=['/bin/bash', '-c', 'sleep 2; echo bash'], returncode=0)
Python
>>> subprocess.Popen(["/bin/bash", "-c", "sleep 2; echo bash"]); print("Python")
<subprocess.Popen object at 0x7f5d6a5b1240>
Python
>>> bash

You get back a subprocess.Popen object immediately. What I did here
left its standard streams connected to my parent process's streams, so
if the child process tries to print to stdout, it appears in the same
console. If you set those to be pipes or anything, you have to worry
about them filling up; otherwise, this should work for you.
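If you don't care about the child's output at all, you can point its
streams at the null device instead of inheriting or piping them, which
sidesteps the filling-up problem entirely. A sketch along those lines:

```python
import subprocess

# Send the child's stdout/stderr to the null device: nothing piles up
# in a pipe buffer, so the child can never block on a full pipe.
proc = subprocess.Popen(
    ["/bin/sh", "-c", "echo noisy child output"],
    stdout=subprocess.DEVNULL,
    stderr=subprocess.DEVNULL,
)
# Popen returns immediately, as before.
```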

BTW, if you need to run something using the same Python interpreter
that started you, "sys.executable" may help.
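For instance (the "-c" one-liner is just a placeholder for the OP's
my_program.py):

```python
import subprocess
import sys

# sys.executable is the full path of the interpreter running this
# process, so the child gets the same Python version and environment
# (e.g. the same virtualenv) rather than whatever "python3" is on PATH.
proc = subprocess.Popen([sys.executable, "-c", "print('spawned')"])
```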

ChrisA
-- 
https://mail.python.org/mailman/listinfo/python-list