2012/8/1 Laszlo Nagy <gand...@shopzeus.com>:
>> I was just surprised that it worked better than I expected even
>> without Pipes and Queues, but now I understand why..
>>
>> Anyway, now I would like to be able to detach subprocesses to avoid the
>> nasty code reloading that I was talking about in another thread, but
>> things get more tricky, because I can't use queues and pipes to
>> communicate with a running process that is not my child, correct?
>>
> Yes, I think that is correct. Instead of detaching a child process, you can
> create independent processes and use other frameworks for IPC. For example,
> Pyro. It is not as efficient as multiprocessing.Queue, but in return, you
> will have the option to run your service across multiple servers.
>
> The most efficient IPC is usually through shared memory. But there is no
> OS-independent standard Python module that can communicate over shared
> memory -- except multiprocessing, of course, but AFAIK it can only be used
> to communicate between fork()-ed processes.
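As a footnote to the shared-memory point above, here is a minimal sketch of what multiprocessing already provides between fork()-ed processes, using multiprocessing.Value (the worker function and all names are illustrative, not from the thread):

```python
from multiprocessing import Process, Value

def worker(counter):
    # Increment the shared integer 1000 times; Value carries its own
    # lock, which serializes access from the competing processes.
    for _ in range(1000):
        with counter.get_lock():
            counter.value += 1

if __name__ == '__main__':
    counter = Value('i', 0)  # 'i' = C int, allocated in shared memory
    procs = [Process(target=worker, args=(counter,)) for _ in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    print(counter.value)  # 4 workers x 1000 increments = 4000
```

This only works because the children are fork()-ed from (or spawned by) the parent that created the Value; unrelated processes cannot attach to it, which is exactly the limitation mentioned above.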
Thanks. There is another thing which is able to interact with running
processes, in theory: https://github.com/lmacken/pyrasite

I don't know, though, if it's a good idea to use a similar approach for
production code; as far as I understood, it uses gdb.. In theory I could
set up every subprocess with all the data it needs, so I might not even
need to share data between them.

Anyway, now I have another idea that should let me stop the main process
without killing the subprocesses, using multiple forks. Does the
following make sense? I don't really need these subprocesses to be
daemons, since they should quit when done, but is there anything that
can go wrong with this approach?

    from os import fork
    from time import sleep
    from itertools import count
    from sys import exit
    from multiprocessing import Process, Queue


    class LongProcess(Process):
        def __init__(self, idx, queue):
            Process.__init__(self)
            # self.daemon = True
            self.queue = queue
            self.idx = idx

        def run(self):
            for i in count():
                self.queue.put("%d: %d" % (self.idx, i))
                print("adding %d: %d" % (self.idx, i))
                sleep(2)


    if __name__ == '__main__':
        qu = Queue()
        # how do I do a multiple fork?
        for i in range(5):
            pid = fork()
            # if I create all the data structures here I should still
            # be able to do things
            if pid == 0:
                lp = LongProcess(i, qu)
                lp.start()
                lp.join()
                exit(0)
            else:
                print("started subprocess with pid", pid)

-- 
http://mail.python.org/mailman/listinfo/python-list
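On the "how do I do a multiple fork?" question above: the classic way to detach a worker so it survives the launching process is a double fork plus setsid, so the grandchild is reparented to init and drops the controlling terminal. A rough POSIX-only sketch (spawn_detached is a made-up helper name, not an existing API):

```python
import os

def spawn_detached(target):
    """Run target() in a detached grandchild process.

    Sketch of the classic double-fork technique (POSIX only):
    fork, setsid, fork again, and let the intermediate child exit
    so the grandchild is adopted by init.
    """
    pid = os.fork()
    if pid > 0:
        # Original parent: reap the short-lived intermediate child
        # immediately (no zombie), then carry on.
        os.waitpid(pid, 0)
        return
    # First child: start a new session, detaching from the
    # controlling terminal.
    os.setsid()
    if os.fork() > 0:
        # Intermediate child exits; the grandchild is orphaned and
        # adopted by init, so it outlives the original parent.
        os._exit(0)
    # Grandchild: do the real work, then exit without running
    # atexit handlers inherited from the parent.
    try:
        target()
    finally:
        os._exit(0)
```

The catch, as discussed earlier in the thread, is communication: the detached grandchild is no longer your child, so a multiprocessing.Queue created afterwards won't reach it; you would pass everything it needs up front (as suggested above), inherit a pipe across the forks, or fall back to an external IPC mechanism such as Pyro.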