Re: Execute in a multiprocessing child dynamic code loaded by the parent process

2022-03-08 Thread Martin Di Paola
Then, you must put the initialization (dynamically loading the modules) into the function executed in the foreign process. You could wrap the payload function into a class instance to achieve this. In the foreign process, you call the instance, which first performs the initialization and then exe
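
A minimal sketch of the wrapper idea, assuming the plugin callable can be re-resolved by name inside the child (a real plugin engine that loads code from arbitrary files would re-run its own loading step in __call__ instead); the class and names here are illustrative, not the thread's code.

```
import importlib
import multiprocessing as mp

class PluginCall:
    """Picklable wrapper: carries only strings, does the import in the child."""
    def __init__(self, module_name, func_name):
        self.module_name = module_name
        self.func_name = func_name

    def __call__(self, *args, **kwargs):
        # Runs in the foreign process: initialization (the import) first,
        # then the actual payload.
        mod = importlib.import_module(self.module_name)
        return getattr(mod, self.func_name)(*args, **kwargs)

if __name__ == '__main__':
    # 'json.dumps' stands in for a dynamically loaded plugin callable.
    p = mp.Process(target=PluginCall('json', 'dumps'), args=({'hi': 1},))
    p.start()
    p.join()
```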

Re: Execute in a multiprocessing child dynamic code loaded by the parent process

2022-03-07 Thread Dieter Maurer
Martin Di Paola wrote at 2022-3-6 20:42 +: >>Try to use `fork` as "start method" (instead of "spawn"). > >Yes but no. Indeed with `fork` there is no need to pickle anything. In >particular the child process will be a copy of the parent so it will >have all the modules loaded, including the dyna

Re: Execute in a multiprocessing child dynamic code loaded by the parent process

2022-03-07 Thread Martin Di Paola
g: import multiprocessing import multiprocessing.reduction import pickle pickle.dumps(multiprocessing.reduction.ForkingPickler) In a separate Python console run the following: import pickle import sys 'multiprocessing' in sys.modules False pickle.loads() 'multiprocessing
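
The dumped bytes are elided in the preview above, so here is a self-contained reconstruction of the same experiment (not the author's exact transcript): pickling a class stores only its module and qualified name, and unpickling it in a fresh interpreter forces the import.

```
import pickle
import subprocess
import sys
import multiprocessing.reduction

# Pickling a class records a reference (module name + qualified name), not code.
payload = pickle.dumps(multiprocessing.reduction.ForkingPickler)

# A fresh interpreter has no 'multiprocessing' loaded until pickle.loads()
# resolves the reference, which imports the module as a side effect.
checker = (
    "import pickle, sys\n"
    "data = sys.stdin.buffer.read()\n"
    "print('before loads:', 'multiprocessing' in sys.modules)\n"
    "pickle.loads(data)\n"
    "print('after loads: ', 'multiprocessing' in sys.modules)\n"
)
subprocess.run([sys.executable, "-c", checker], input=payload)
```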

Re: Execute in a multiprocessing child dynamic code loaded by the parent process

2022-03-07 Thread Barry
> On 7 Mar 2022, at 02:33, Martin Di Paola wrote: > > Yes but I think that unpickle (pickle.loads()) does that plus > importing any module needed Are you sure that unpickle will import code? I thought it did not do that. Barry -- https://mail.python.org/mailman/listinfo/python-list

Re: Execute in a multiprocessing child dynamic code loaded by the parent process

2022-03-06 Thread Martin Di Paola
Yeup, that would be my first choice but the catch is that "sayhi" may not be a function of the given module. It could be a static method of some class or any other callable. Ah, fair. Are you able to define it by a "path", where each step in the path is a getattr() call? Yes but I think th

Re: Execute in a multiprocessing child dynamic code loaded by the parent process

2022-03-06 Thread Martin Di Paola
that into account. I agree that if the developer uses multiprocessing he/she needs to know its implications. But if I can "smooth" any rough corner, I will try to do it. For example, the main project (developed by me) uses threads for concurrency. It would be simpler to load the pl

Re: Execute in a multiprocessing child dynamic code loaded by the parent process

2022-03-06 Thread Greg Ewing
On 7/03/22 9:36 am, Martin Di Paola wrote: It *would* be my fault if multiprocessing.Process fails only because I'm loading the code dynamically. I'm not so sure about that. The author of the plugin knows they're writing code that will be dynamically loaded, and can therefore expect the kind of

Re: Execute in a multiprocessing child dynamic code loaded by the parent process

2022-03-06 Thread Chris Angelico
On Mon, 7 Mar 2022 at 07:37, Martin Di Paola wrote: > > > > > > >The way you've described it, it's a hack. Allow me to slightly redescribe it. > > > >modules = loader() > >objs = init(modules) > > > >def invoke(mod, func): > ># I'm assuming that the loader is smart enough to not load > >#

Re: Execute in a multiprocessing child dynamic code loaded by the parent process

2022-03-06 Thread Martin Di Paola
Try to use `fork` as "start method" (instead of "spawn"). Yes but no. Indeed with `fork` there is no need to pickle anything. In particular the child process will be a copy of the parent so it will have all the modules loaded, including the dynamic ones. Perfect. The problem is that `fork` is t
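
For reference, a caller can ask for the `fork` start method explicitly through a context object instead of changing the global default; a minimal sketch only, since the rest of the message explains why `fork` is not always an option (it is unavailable on Windows and no longer the default on macOS).

```
import multiprocessing as mp

def work():
    # Under fork the child is a copy of the parent, so any dynamically
    # loaded modules are already present here.
    print("hello from a forked child")

if __name__ == '__main__':
    ctx = mp.get_context('fork')   # raises ValueError where 'fork' is unsupported
    p = ctx.Process(target=work)
    p.start()
    p.join()
```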

Re: Execute in a multiprocessing child dynamic code loaded by the parent process

2022-03-06 Thread Martin Di Paola
may not be a function of the given module. It could be a static method of some class or any other callable. And doing the lookup by hand sounds complex. The thing is that the use of multiprocessing is not something required by me (by my plugin-engine), it was a decision of the developer of

Re: Execute in a multiprocessing child dynamic code loaded by the parent process

2022-03-06 Thread Dieter Maurer
Martin Di Paola wrote at 2022-3-6 12:42 +: >Hi everyone. I implemented time ago a small plugin engine to load code >dynamically. > >So far it worked well but a few days ago an user told me that he wasn't >able to run in parallel a piece of code in MacOS. > >He was using multiprocessing.Process

Re: Execute in a multiprocessing child dynamic code loaded by the parent process

2022-03-06 Thread Chris Angelico
de, we can assume that a module is itself, no matter what; it won't be a perfect clone of itself, it will actually be the same module. If you want to support multiprocessing, I would recommend disconnecting yourself from the concept of loaded modules, and instead identify the target by its mod
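
A sketch of the "identify the target by names, not objects" suggestion, assuming the callable can be addressed as a module name plus a dotted attribute path (so a static method such as 'SomeClass.sayhi' also works); the example target is a stand-in, not the original plugin.

```
import importlib
import multiprocessing as mp
from functools import reduce

def invoke(module_name, attr_path, *args, **kwargs):
    """Re-import the module in the child and walk the attribute path."""
    mod = importlib.import_module(module_name)
    target = reduce(getattr, attr_path.split('.'), mod)   # e.g. 'SomeClass.sayhi'
    return target(*args, **kwargs)

if __name__ == '__main__':
    # 'builtins' / 'print' stand in for the plugin module and its callable.
    p = mp.Process(target=invoke, args=('builtins', 'print', 'hi from the child'))
    p.start()
    p.join()
```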

Execute in a multiprocessing child dynamic code loaded by the parent process

2022-03-06 Thread Martin Di Paola
Hi everyone. A while ago I implemented a small plugin engine to load code dynamically. So far it has worked well, but a few days ago a user told me that he wasn't able to run a piece of code in parallel on macOS. He was using multiprocessing.Process to run the code and on macOS, the default start metho

Re: Data unchanged when passing data to Python in multiprocessing shared memory

2022-02-02 Thread Chris Angelico
On Thu, 3 Feb 2022 at 13:32, Avi Gross via Python-list wrote: > > Jen, > > I would not be shocked at incompatibilities in the system described making it > hard to exchange anything, including text, but am not clear if there is a > limitation of four bytes in what can be shared. For me, a charact

Re: Data unchanged when passing data to Python in multiprocessing shared memory

2022-02-02 Thread Avi Gross via Python-list
@python.org Sent: Wed, Feb 2, 2022 1:27 pm Subject: Re: Data unchanged when passing data to Python in multiprocessing shared memory An ASCII string will not work.  If you convert 32894 to an ascii string you will have five bytes, but you need four.  In my original post I showed the C program I

Re: Data unchanged when passing data to Python in multiprocessing shared memory

2022-02-02 Thread Barry
> On 2 Feb 2022, at 18:19, Jen Kris via Python-list > wrote: > > It's not clear to me from the struct module whether it can actually > auto-detect endianness. It is impossible to auto detect endian in the general case. > I think it must be specified, just as I had to do with int.from_byte

Re: Data unchanged when passing data to Python in multiprocessing shared memory

2022-02-02 Thread Dennis Lee Bieber
On Wed, 2 Feb 2022 19:16:19 +0100 (CET), Jen Kris declaimed the following: >It's not clear to me from the struct module whether it can actually >auto-detect endianness.  I think it must be specified, just as I had to do >with int.from_bytes().  In my case endianness was dictated by how the four

Re: Data unchanged when passing data to Python in multiprocessing shared memory

2022-02-02 Thread Jen Kris via Python-list
y is also not. > > -Original Message- > From: Dennis Lee Bieber > To: python-list@python.org > Sent: Wed, Feb 2, 2022 12:30 am > Subject: Re: Data unchanged when passing data to Python in multiprocessing > shared memory > > > On Wed, 2 Feb 2022 00:40:22 +0100

Re: Data unchanged when passing data to Python in multiprocessing shared memory

2022-02-02 Thread Jen Kris via Python-list
It's not clear to me from the struct module whether it can actually auto-detect endianness.  I think it must be specified, just as I had to do with int.from_bytes().  In my case endianness was dictated by how the four bytes were populated, starting with the zero bytes on the left.  Feb 1, 202

Re: Data unchanged when passing data to Python in multiprocessing shared memory

2022-02-02 Thread Avi Gross via Python-list
m: Dennis Lee Bieber To: python-list@python.org Sent: Wed, Feb 2, 2022 12:30 am Subject: Re: Data unchanged when passing data to Python in multiprocessing shared memory On Wed, 2 Feb 2022 00:40:22 +0100 (CET), Jen Kris declaimed the following: > > breakup = int.from_bytes(byte_va

Re: Data unchanged when passing data to Python in multiprocessing shared memory

2022-02-02 Thread Dennis Lee Bieber
On Wed, 2 Feb 2022 00:40:22 +0100 (CET), Jen Kris declaimed the following: > > breakup = int.from_bytes(byte_val, "big") >print("this is breakup " + str(breakup)) > >Python prints:  this is breakup 32894 > >Note that I had to switch from little endian to big endian.  Python is little >endian by
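
A small illustration of the point, using the value 32894 from the thread but otherwise made-up code: the same four bytes decode to different integers depending on the byte order you declare, and the struct module likewise never auto-detects it.

```
import struct

byte_val = (32894).to_bytes(4, "big")         # b'\x00\x00\x80~': zero bytes first

print(int.from_bytes(byte_val, "big"))        # 32894
print(int.from_bytes(byte_val, "little"))     # 2122317824 -- same bytes, wrong order

# struct requires the byte order in the format string; it cannot guess it.
print(struct.unpack(">I", byte_val)[0])       # 32894       (big-endian unsigned int)
print(struct.unpack("<I", byte_val)[0])       # 2122317824  (little-endian)
```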

Re: Data unchanged when passing data to Python in multiprocessing shared memory

2022-02-02 Thread Barry Scott
> On 1 Feb 2022, at 23:40, Jen Kris wrote: > > Barry, thanks for your reply. > > On the theory that it is not yet possible to pass data from a non-Python > language to Python with multiprocessing.shared_memory, I bypassed the problem > by attaching 4 bytes to my FIFO pipe message from NASM

Re: Data unchanged when passing data to Python in multiprocessing shared memory

2022-02-01 Thread Jen Kris via Python-list
Barry, thanks for your reply.  On the theory that it is not yet possible to pass data from a non-Python language to Python with multiprocessing.shared_memory, I bypassed the problem by attaching 4 bytes to my FIFO pipe message from NASM to Python: byte_val = v[10:14] where v is the message re

Re: Data unchanged when passing data to Python in multiprocessing shared memory

2022-02-01 Thread Barry
> On 1 Feb 2022, at 20:26, Jen Kris via Python-list > wrote: > > I am using multiprocesssing.shared_memory to pass data between NASM and > Python. The shared memory is created in NASM before Python is called. > Python connects to the shm: shm_00 = > shared_memory.SharedMemory(name='shm_

Data unchanged when passing data to Python in multiprocessing shared memory

2022-02-01 Thread Jen Kris via Python-list
I am using multiprocessing.shared_memory to pass data between NASM and Python.  The shared memory is created in NASM before Python is called.  Python connects to the shm:  shm_00 = shared_memory.SharedMemory(name='shm_object_00',create=False).  I have used shared memory at other points in thi
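
A hedged sketch of the Python side, attaching to a segment that some other process (here, the NASM program) has already created; the segment name comes from the post, but the offset, length, and byte order are illustrative assumptions.

```
from multiprocessing import shared_memory

# Attach to an existing segment; create=False means we do not own it.
shm = shared_memory.SharedMemory(name='shm_object_00', create=False)
try:
    raw = bytes(shm.buf[:4])              # copy four bytes out of the memoryview
    value = int.from_bytes(raw, "big")    # byte order must match what the writer used
    print("read from shared memory:", value)
finally:
    shm.close()    # detach only; don't unlink() a segment another process owns
```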

Re: threading and multiprocessing deadlock

2021-12-06 Thread Dieter Maurer
Johannes Bauer wrote at 2021-12-6 00:50 +0100: >I'm a bit confused. In my scenario I a mixing threading with >multiprocessing. Threading by itself would be nice, but for GIL reasons >I need both, unfortunately. I've encountered a weird situation in which >multiprocessing

Re: threading and multiprocessing deadlock

2021-12-06 Thread Barry Scott
> On 5 Dec 2021, at 23:50, Johannes Bauer wrote: > > Hi there, > > I'm a bit confused. In my scenario I a mixing threading with > multiprocessing. Threading by itself would be nice, but for GIL reasons > I need both, unfortunately. I've encount

Re: threading and multiprocessing deadlock

2021-12-06 Thread Johannes Bauer
On 06.12.21 at 13:56, Martin Di Paola wrote: > Hi!, in short your code should work. > > I think that the join-joined problem is just an interpretation problem. > > In pseudo code the background_thread function does: > > def background_thread() >   # bla >   print("join?") >   # bla >   print("j

Re: threading and multiprocessing deadlock

2021-12-06 Thread Martin Di Paola
threads that you spawned (background_thread functions). I hope that this can guide you to fix or at least narrow the issue. Thanks, Martin. On Mon, Dec 06, 2021 at 12:50:11AM +0100, Johannes Bauer wrote: Hi there, I'm a bit confused. In my scenario I a mixing threading with multiprocess

threading and multiprocessing deadlock

2021-12-05 Thread Johannes Bauer
Hi there, I'm a bit confused. In my scenario I am mixing threading with multiprocessing. Threading by itself would be nice, but for GIL reasons I need both, unfortunately. I've encountered a weird situation in which multiprocessing Process()es which are started in a new thread don't

Re: How Do I Get A Bug In Multiprocessing Fixed?

2021-06-18 Thread Terry Reedy
On 6/17/2021 5:02 PM, Michael Boom wrote: The below issue is pretty serious and it is preventing me from using a system I wrote on a larger scale. How do I get this bug fixed? Thanks. https://bugs.python.org/issue43329 Reduce your code to the minimum needed to exhibit the problem. Then run

Re: How Do I Get A Bug In Multiprocessing Fixed?

2021-06-18 Thread Oscar Benjamin
y quickly and I couldn't immediately tell if the problem was a bug in multiprocessing or a mistake in the code shown. Just figuring that out would take more than the very little time I was prepared to spend looking at it so I moved on. If the OP hopes that someone else will use their limited ti

Re: How Do I Get A Bug In Multiprocessing Fixed?

2021-06-17 Thread Barry
Got exception , ConnectionResetError(10054, > 'An existing connection was forcibly closed by the remote host', None, > 10054, None) > Reconnecting > Got exception , ConnectionResetError(10054, > 'An existing connection was forcibly closed by the remote host'

Re: How Do I Get A Bug In Multiprocessing Fixed?

2021-06-17 Thread Alexander Neilson
'No connection could be made because the target machine actively refused it', None, 10061, None) Reconnecting Traceback (most recent call last): File "C:\Users\Alexander\AppData\Local\Programs\Python\Python37\lib\multiprocessing\connection.py", line 619, in SocketClient s.connect(address

How Do I Get A Bug In Multiprocessing Fixed?

2021-06-17 Thread Michael Boom
The below issue is pretty serious and it is preventing me from using a system I wrote on a larger scale. How do I get this bug fixed? Thanks. https://bugs.python.org/issue43329 -- https://mail.python.org/mailman/listinfo/python-list

Re: Threading plus multiprocessing plus cv2 error

2020-09-03 Thread Python
On Sat, Aug 29, 2020 at 06:24:10PM +1000, John O'Hagan wrote: > Dear list > > Thanks to this list, I haven't needed to ask a question for > a very long time, but this one has me stumped. > > Here's the minimal 3.8 code, on Debian testing: > > - >

Re: Threading plus multiprocessing plus cv2 error

2020-09-01 Thread John O'Hagan
rror without the sleep(1), nor if the Process is > >> started before the Thread, nor if two Processes are used instead, > >> nor if two Threads are used instead. IOW the error only occurs if > >> a Thread is started first, and a Process is started a little later. > >>

Re: Threading plus multiprocessing plus cv2 error

2020-08-30 Thread Stephane Tougard via Python-list
On 2020-08-30, Barry wrote: >* The child process is created with a single thread—the one that > called fork(). The entire virtual address space of the parent is > replicated in the child, including the states of mutexes, > condition variables, and other pthr

Re: Threading plus multiprocessing plus cv2 error

2020-08-30 Thread Barry
> On 30 Aug 2020, at 11:03, Stephane Tougard via Python-list > wrote: > > On 2020-08-30, Chris Angelico wrote: >>> I'm not even that makes sense, how 2 processes can share a thread ? >>> >> They can't. However, they can share a Thread object, which is the >> Python representation of a threa

Re: Threading plus multiprocessing plus cv2 error

2020-08-30 Thread Stephane Tougard via Python-list
On 2020-08-30, Chris Angelico wrote: >> I'm not even that makes sense, how 2 processes can share a thread ? >> > They can't. However, they can share a Thread object, which is the > Python representation of a thread. That can lead to confusion, and > possibly the OP's error (I don't know for sure,

Re: Threading plus multiprocessing plus cv2 error

2020-08-30 Thread Barry Scott
e used instead, nor if two >> Threads are used instead. IOW the error only occurs if a Thread is >> started first, and a Process is started a little later. >> >> Any ideas what might be causing the error? >> > > Under Linux, multiprocessing creates proce

Re: Threading plus multiprocessing plus cv2 error

2020-08-30 Thread John O'Hagan
instead, nor if two > >Threads are used instead. IOW the error only occurs if a Thread is > >started first, and a Process is started a little later. > > > >Any ideas what might be causing the error? > > > > Under Linux, multiprocessing creates proces

Re: Threading plus multiprocessing plus cv2 error

2020-08-30 Thread Karen Shaeffer via Python-list
> On Aug 29, 2020, at 10:12 PM, Stephane Tougard via Python-list > wrote: > > On 2020-08-29, Dennis Lee Bieber wrote: >> Under Linux, multiprocessing creates processes using fork(). That means >> that, for some fraction of time, you have TWO processes sharing th

Re: Threading plus multiprocessing plus cv2 error

2020-08-29 Thread Chris Angelico
On Sun, Aug 30, 2020 at 4:01 PM Stephane Tougard via Python-list wrote: > > On 2020-08-29, Dennis Lee Bieber wrote: > > Under Linux, multiprocessing creates processes using fork(). That > > means > > that, for some fraction of time, you have TWO processes sharing

Re: Threading plus multiprocessing plus cv2 error

2020-08-29 Thread Stephane Tougard via Python-list
On 2020-08-29, Dennis Lee Bieber wrote: > Under Linux, multiprocessing creates processes using fork(). That means > that, for some fraction of time, you have TWO processes sharing the same > thread and all that entails (if it doesn't overlay the forked process with > a new

Threading plus multiprocessing plus cv2 error

2020-08-29 Thread John O'Hagan
Dear list Thanks to this list, I haven't needed to ask a question for a very long time, but this one has me stumped. Here's the minimal 3.8 code, on Debian testing: - from multiprocessing import Process from threading import Thread from time import sleep import cv2 def show
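
Not the poster's fix, but one workaround that is often suggested when threads and processes are mixed: use the 'spawn' start method so the child is a fresh interpreter rather than a fork() of a process that already has live threads (and their lock state). A minimal sketch with a dummy worker in place of the cv2 code:

```
import multiprocessing as mp
import threading
import time

def background():
    while True:
        time.sleep(1)          # stands in for the thread the poster starts first

def worker():
    print("child started without inheriting the parent's thread/lock state")

if __name__ == '__main__':
    threading.Thread(target=background, daemon=True).start()
    time.sleep(1)
    ctx = mp.get_context('spawn')     # fresh interpreter instead of fork()
    p = ctx.Process(target=worker)
    p.start()
    p.join()
```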

Re: Multiprocessing vs. concurrent.futures, Linux vs. Windows

2020-05-05 Thread Terry Reedy
use in the subprocess through args. That would include the Pipe connection. Using multiprocessing in Linux requires the reference names to be global, however the use of args is not required. Finally, Linux does not appear to cause any problems if args are specified. 2. Even if you fix problem

Re: Multiprocessing vs. concurrent.futures, Linux vs. Windows

2020-05-04 Thread John Ladasky
On Monday, May 4, 2020 at 4:09:53 PM UTC-7, Terry Reedy wrote: > On 5/4/2020 3:26 PM, John Ladasky wrote: > > Several years ago I built an application using multiprocessing. It only > > needed to work in Linux. I got it working fine. At the time, > > concurrent.

Re: Multiprocessing vs. concurrent.futures, Linux vs. Windows

2020-05-04 Thread Terry Reedy
On 5/4/2020 3:26 PM, John Ladasky wrote: Several years ago I built an application using multiprocessing. It only needed to work in Linux. I got it working fine. At the time, concurrent.futures did not exist. My current project is an application which includes a PyQt5 GUI, and a live video

Multiprocessing vs. concurrent.futures, Linux vs. Windows

2020-05-04 Thread John Ladasky
Several years ago I built an application using multiprocessing. It only needed to work in Linux. I got it working fine. At the time, concurrent.futures did not exist. My current project is an application which includes a PyQt5 GUI, and a live video feed with some real-time image processing

multiprocessing

2020-04-20 Thread Edward Montague
When using sympy's rubi_integrate on my quad-core computer, I find that the first CPU is being used 100%, whilst the other three are around 1% and 2%. I'm wondering if you have some code to overcome this limitation. -- https://mail.python.org/mailman/listinfo/python-list

Re: Multiprocessing queue sharing and python3.8

2020-04-06 Thread Israel Brewster
forcing it back to using fork does indeed “fix” the issue. Of course, as is noted there, the fork start method should be considered unsafe, so I guess I get to re-architect everything I do using multiprocessing that relies on data-sharing between processes. The Queue example was just a mini

Re: Multiprocessing queue sharing and python3.8

2020-04-06 Thread Israel Brewster
alize the Queue >mp_comm_queue = mp.Queue() > >#Set up a pool to process a bunch of stuff in parallel >pool = mp.Pool(initializer = pool_init, initargs = (mp_comm_queue,)) >... > > Gotcha, thanks. I’ll look more into that initializer argument and see how I can lev
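
A runnable version of the initializer pattern being quoted here, keeping the thread's names (mp_comm_queue, pool_init); the idea is that the queue is handed to each worker once, at worker start-up, instead of being pickled into every task.

```
import multiprocessing as mp

mp_comm_queue = None            # set in each worker by the initializer

def pool_init(queue):
    # Runs once per worker process; the queue arrives via the process
    # start-up machinery, which is allowed for mp.Queue objects.
    global mp_comm_queue
    mp_comm_queue = queue

def task(i):
    mp_comm_queue.put(f"result {i}")

if __name__ == '__main__':
    queue = mp.Queue()
    pool = mp.Pool(processes=2, initializer=pool_init, initargs=(queue,))
    pool.map(task, range(4))
    pool.close()
    pool.join()                 # let workers flush their queue feeder threads
    for _ in range(4):
        print(queue.get())
```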

RE: Multiprocessing queue sharing and python3.8

2020-04-06 Thread David Raymond
= mp.Pool(initializer = pool_init, initargs = (mp_comm_queue,)) ... -Original Message- From: David Raymond Sent: Monday, April 6, 2020 4:19 PM To: python-list@python.org Subject: RE: Multiprocessing queue sharing and python3.8 Attempting reply as much for my own understanding. Are you on

RE: Multiprocessing queue sharing and python3.8

2020-04-06 Thread David Raymond
se the function that the Pool is executing finishes so quickly. Add a little extra info to the print calls (and/or set up logging to stdout with the process name/id included) and you can see some of this. Here's the hacked together changes I did for that. import multiprocessing as mp import

Multiprocessing queue sharing and python3.8

2020-04-06 Thread Israel Brewster
Under python 3.7 (and all previous versions I have used), the following code works properly, and produces the expected output: import multiprocessing as mp mp_comm_queue = None #Will be initalized in the main function mp_comm_queue2=mp.Queue() #Test pre-initalized as well def

Re: Multiprocessing, join(), and crashed processes

2020-02-05 Thread Cameron Simpson
On 05Feb2020 15:48, Israel Brewster wrote: In a number of places I have constructs where I launch several processes using the multiprocessing library, then loop through said processes calling join() on each one to wait until they are all complete. In general, this works well, with the

Multiprocessing, join(), and crashed processes

2020-02-05 Thread Israel Brewster
In a number of places I have constructs where I launch several processes using the multiprocessing library, then loop through said processes calling join() on each one to wait until they are all complete. In general, this works well, with the *apparent* exception of if something causes one of
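
Not the original code, just a minimal way to make the "crashed child" case visible: after join(), Process.exitcode is 0 for a clean exit and negative when the child was killed by a signal (the simulated crash below is POSIX-only).

```
import multiprocessing as mp
import os
import signal

def doomed():
    os.kill(os.getpid(), signal.SIGKILL)    # simulate a hard crash (POSIX)

def fine():
    pass

if __name__ == '__main__':
    procs = [mp.Process(target=doomed), mp.Process(target=fine)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()                            # returns even if the child died
        if p.exitcode != 0:
            print(f"{p.name} did not finish cleanly, exitcode={p.exitcode}")
```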

Re: multiprocessing article on PYMOTW - subclassing with 'def run' and 'logging'

2019-11-16 Thread Veek M
answered here https://www.reddit.com/r/Python/comments/dxhgec/ how_does_multiprocessing_convert_a_methodrun_in/ basically starts two PVMs - the whole fork, check 'pid' trick.. one process continues as the main thread and the other calls 'run' -- https://mail.python.org/mailman/listinfo/python-li

multiprocessing article on PYMOTW - subclassing with 'def run' and 'logging'

2019-11-16 Thread Veek M
https://pymotw.com/2/multiprocessing/basics.html https://pymotw.com/2/threading/ I didn't follow this 1. >The logger can also be configured through the logging configuration file >API, using the name multiprocessing. and 2. >it is also possible to use a custom subc

Re: Random signal capture when using multiprocessing

2019-07-06 Thread José María Mateos
On Sat, Jul 06, 2019 at 04:54:42PM +1000, Chris Angelico wrote: > But if I comment out the signal.signal line, there seem to be no ill > effects. I suspect that what you're seeing here is the multiprocessing > module managing its own subprocesses, telling some of them to shut >

Re: Random signal capture when using multiprocessing

2019-07-05 Thread Chris Angelico
On Sat, Jul 6, 2019 at 12:13 AM José María Mateos wrote: > > Hi, > > This is a minimal proof of concept for something that has been bugging me for > a few days: > > ``` > $ cat signal_multiprocessing_poc.py > > import random > import multiprocessing >

Re: Random signal capture when using multiprocessing

2019-07-05 Thread dieter
xpected signal 15! > ... "multiprocessing.util" may register an exit function which calls "terminate" which signals SIGTERM. There is also an "os.kill(..., SIGTERM)" in "multiprocessing.popen_fork". I would put some print at those places to determine if the SIGTERM comes from "multiprocessing". -- https://mail.python.org/mailman/listinfo/python-list

Random signal capture when using multiprocessing

2019-07-05 Thread José María Mateos
Hi, This is a minimal proof of concept for something that has been bugging me for a few days: ``` $ cat signal_multiprocessing_poc.py import random import multiprocessing import signal import time def signal_handler(signum, frame): raise Exception(f"Unexpected signal {signum}!"

Re: Multiprocessing and memory management

2019-07-04 Thread Thomas Jollans
On 03/07/2019 18.37, Israel Brewster wrote: > I have a script that benefits greatly from multiprocessing (it’s generating a > bunch of images from data). Of course, as expected each process uses a chunk > of memory, and the more processes there are, the more memory used. The amount &

Re: Multiprocessing and memory management

2019-07-03 Thread Peter J. Holzer
On 2019-07-03 08:37:50 -0800, Israel Brewster wrote: > 1) Determine the total amount of RAM in the machine (how?), assume an > average of 10GB per process, and only launch as many processes as > calculated to fit. Easy, but would run the risk of under-utilizing the > processing capabilities and tak
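
On the "how?" in option 1: a hedged sketch of one way to size the pool from total RAM, using os.sysconf (Linux-specific; psutil.virtual_memory().total is the portable third-party alternative). The 10 GB-per-process figure is the thread's own assumption.

```
import multiprocessing as mp
import os

# Total physical memory in bytes (Linux); not available on every platform.
total_ram = os.sysconf('SC_PAGE_SIZE') * os.sysconf('SC_PHYS_PAGES')

PER_PROCESS = 10 * 1024 ** 3                      # assumed ~10 GB per process
n_procs = max(1, min(mp.cpu_count(), total_ram // PER_PROCESS))
print(f"would launch at most {n_procs} worker process(es)")
```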

Re: Multiprocessing and memory management

2019-07-03 Thread Gary Herron
On 7/3/19 9:37 AM, ijbrews...@alaska.edu wrote: I have a script that benefits greatly from multiprocessing (it’s generating a bunch of images from data). Of course, as expected each process uses a chunk of memory, and the more processes there are, the more memory used. The amount used per

Multiprocessing and memory management

2019-07-03 Thread Israel Brewster
I have a script that benefits greatly from multiprocessing (it’s generating a bunch of images from data). Of course, as expected each process uses a chunk of memory, and the more processes there are, the more memory used. The amount used per process can vary from around 3 GB (yes, gigabytes) to

Re: Multiprocessing vs subprocess

2019-03-13 Thread oliver
With multiprocessing you can take advantage of multi-core processing as it launches a separate python interpreter process and communicates with it via shared memory (at least on windows). The big advantage of multiprocessing module is that the interaction between processes is much richer than

RE: Multiprocessing vs subprocess

2019-03-12 Thread Schachner, Joseph
Re: " My understanding (so far) is that the tradeoff of using multiprocessing is that my manager script can not exit until all the work processes it starts finish. If one of the worker scripts locks up, this could be problematic. Is there a way to use multiprocessing where processe

Re: Multiprocessing vs subprocess to run Python scripts in parallel

2019-03-12 Thread Chris Angelico
ipts may end > up running in parallel. There are no dependencies between individual worker > scripts. I'm looking for the pros and cons of using multiprocessing or > subprocess to launch these worker scripts. Looking for a solution that works > across Windows and Linux. Open to usin

Multiprocessing vs subprocess to run Python scripts in parallel

2019-03-12 Thread Malcolm Greene
individual worker scripts. I'm looking for the pros and cons of using multiprocessing or subprocess to launch these worker scripts. Looking for a solution that works across Windows and Linux. Open to using a 3rd party library. Hoping to avoid the use of yet another system component like Cele

Re: Multiprocessing performance question

2019-02-21 Thread DL Neil
y split the list up into chunks and process each chunk in parallel on a separate core. To that end, I created a multiprocessing pool: I recall a similar discussion when folk were being encouraged to move away from monolithic and straight-line processing to modular functions - it is more (CPU-

Re: Multiprocessing performance question

2019-02-21 Thread Israel Brewster
imply split the list up into chunks >> and process each chunk in parallel on a separate core. To that end, I >> created a multiprocessing pool: > > > I recall a similar discussion when folk were being encouraged to move away > from monolithic and straight-line processing t

Re: Multiprocessing performance question

2019-02-20 Thread george trojan
x(). This > > takes about 10 seconds to run. > > > > Looking at this, I am thinking it would lend itself well to > > parallelization. Since the box at each “coordinate" is independent of all > > others, it seems I should be able to simply split the list up in

Re: Multiprocessing performance question

2019-02-20 Thread DL Neil
f all others, it seems I should be able to simply split the list up into chunks and process each chunk in parallel on a separate core. To that end, I created a multiprocessing pool: I recall a similar discussion when folk were being encouraged to move away from monolithic and straight-line proce

Re: Multiprocessing performance question

2019-02-20 Thread george trojan
split the list up into chunks and process each chunk in parallel on a separate core. To that end, I created a multiprocessing pool: pool = multiprocessing.Pool() And then called pool.map() rather than just “map”. Somewhat to my surprise, the execution time was virtually identical. Given the simplici

Re: Multiprocessing performance question

2019-02-18 Thread Israel Brewster
> On Feb 18, 2019, at 6:37 PM, Ben Finney wrote: > > I don't have anything to add regarding your experiments with > multiprocessing, but: > > Israel Brewster writes: > >> Which creates and populates an 800x1000 “grid” (represented as a flat >> list at

Re: Multiprocessing performance question

2019-02-18 Thread Ben Finney
I don't have anything to add regarding your experiments with multiprocessing, but: Israel Brewster writes: > Which creates and populates an 800x1000 “grid” (represented as a flat > list at this point) of “boxes”, where a box is a > shapely.geometry.box(). This takes about 10

Multiprocessing performance question

2019-02-18 Thread Israel Brewster
f all others, it seems I should be able to simply split the list up into chunks and process each chunk in parallel on a separate core. To that end, I created a multiprocessing pool: pool = multiprocessing.Pool() And then called pool.map() rather than just “map”. Somewhat to my surprise
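
A way to reproduce the effect discussed in the replies (the per-task overhead swamping a cheap function), with a trivial stand-in for shapely.geometry.box(); a larger chunksize amortizes some of the pickling/IPC cost but cannot make a near-free operation profitable to parallelize.

```
import multiprocessing as mp
import time

def make_box(i):
    # Cheap stand-in for shapely.geometry.box(); the work per call is tiny,
    # so argument/result pickling dominates the parallel version.
    x, y = i % 800, i // 800
    return (x, y, x + 1, y + 1)

if __name__ == '__main__':
    n = 800 * 1000

    t0 = time.perf_counter()
    list(map(make_box, range(n)))
    print("builtin map:", time.perf_counter() - t0)

    with mp.Pool() as pool:
        t0 = time.perf_counter()
        pool.map(make_box, range(n), chunksize=10_000)
        print("pool.map:   ", time.perf_counter() - t0)
```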

Re: Multiprocessing "Pool" aborts without any error message or return code? (Python 3.6.4, cygwin 32 bit, Windows Server 2012)

2018-08-13 Thread Niels Kristian Jensen
On Friday, 10 August 2018 at 15.35.46 UTC+2, Niels Kristian Jensen wrote: > Please refer to: > (cut) It appears that Python is simply not supported on Cygwin (!): https://bugs.python.org/issue30563 Best regards, Niels Kristian -- https://mail.python.org/mailman/listinfo/python-list

Re: Embedded Python and multiprocessing on Windows?

2018-08-10 Thread applemask
On Friday, August 10, 2018 at 2:28:45 AM UTC-4, Léo El Amri wrote: > That may be something simple: Did you actually protected the entry-point > of your Python script with if __name__ == '__main__': ? That was my first thought too; the script technically doesn't have top-level code, so I figured I
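
For reference, the guard being asked about: under the spawn start method (the default on Windows) the child re-imports the main script, so any process-creating code must sit behind it.

```
import multiprocessing as mp

def work():
    print("hello from the child")

if __name__ == '__main__':          # without this guard, the spawned child
    p = mp.Process(target=work)     # re-runs the module top level and recurses
    p.start()
    p.join()
```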

Multiprocessing "Pool" aborts without any error message or return code? (Python 3.6.4, cygwin 32 bit, Windows Server 2012)

2018-08-10 Thread aenkaa
or not: adminnkj@DTDKCPHAS1060 ~ $ python3 -V Python 3.6.4 adminnkj@DTDKCPHAS1060 ~ $ cat test.py from multiprocessing import Pool def f(x): return x*x if __name__ == '__main__': p = Pool(5) print(p.map(f, [1, 2, 3])) adminnkj@DTDKCPHAS1060 ~ $ python3 test.py --->

Re: Embedded Python and multiprocessing on Windows?

2018-08-09 Thread Léo El Amri via Python-list
On 09/08/2018 19:33, Apple wrote:> So my program runs one script file, and multiprocessing commands from that script file seem to fail to spawn new processes. > > However, if that script file calls a function in a separate script file that > it has imported, and that fu

Re: Multiprocessing on a remote host

2018-06-10 Thread Peter J. Holzer
On 2018-03-21 09:27:37 -0400, Larry Martell wrote: > Yeah, I saw that and I wasn't trying to reinvent the wheel. On this > page https://docs.python.org/2/library/multiprocessing.html it says > this: > > The multiprocessing package offers both local and remote concurrency

RE: logging with multiprocessing

2018-06-08 Thread Schachner, Joseph
Multiprocessing, not multithreading. Different processes. This is pretty easy to do. I have done this from a Python script to run an analysis program on many sets of data, at once. To do it: 1) if there is going to be an output file, each output file must have a distinct name. 2) To use
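
A minimal sketch of point 1) above, giving each process its own distinctly named log file by building the name from the process name; illustrative only, not the poster's setup.

```
import logging
import multiprocessing as mp

def worker(item):
    name = mp.current_process().name          # e.g. 'Process-1'
    logging.basicConfig(
        filename=f"worker-{name}.log",        # one distinct output file per process
        level=logging.INFO,
        format="%(asctime)s %(processName)s %(message)s",
    )
    logging.info("processing %s", item)

if __name__ == '__main__':
    procs = [mp.Process(target=worker, args=(i,)) for i in range(3)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
```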

logging with multiprocessing

2018-06-07 Thread jenil . desai25
Hello, I am new to the logging module. I want to use it with multiprocessing. Can anyone help me understand how I can do it? Any help would be appreciated. Thank you. -- https://mail.python.org/mailman/listinfo/python-list

Re: python notifying calling script that multiprocessing tasks have finished at lower level scripts

2018-04-13 Thread Thomas Jollans
1st sequence (script_1 to 3), master will execute the > 2nd sequence (script_4 to 6). > > Each child script will be calling a multiprocessing function to process a > task. I could ask what motivates this convoluted-sounding structure... > > [snip] > > > I like to know how

python notifying calling script that multiprocessing tasks have finished at lower level scripts

2018-04-13 Thread Daiyue Weng
), master will execute the 2nd sequence (script_4 to 6). Each child script will be calling a multiprocessing function to process a task. Master script is like, for seq in seqs_to_launch: for script in seq: script().execute(data) Each child script is like, import multi_process_update

Re: Multiprocessing on a remote host

2018-03-21 Thread Larry Martell
On Tue, Mar 20, 2018 at 11:15 PM, Steven D'Aprano wrote: > On Wed, 21 Mar 2018 02:20:16 +, Larry Martell wrote: > >> Is there a way to use the multiprocessing lib to run a job on a remote >> host? > > Don't try to re-invent the wheel. This is a solved prob

Re: Multiprocessing on a remote host

2018-03-20 Thread Steven D'Aprano
On Wed, 21 Mar 2018 02:20:16 +, Larry Martell wrote: > Is there a way to use the multiprocessing lib to run a job on a remote > host? Don't try to re-invent the wheel. This is a solved problem. https://stackoverflow.com/questions/1879971/what-is-the-current-choice-for-doing-rp

Multiprocessing on a remote host

2018-03-20 Thread Larry Martell
Is there a way to use the multiprocessing lib to run a job on a remote host? -- https://mail.python.org/mailman/listinfo/python-list

in multiprocessing pool map how to stop one process from executing ahead

2018-03-20 Thread Subramanian P V
I am executing custom shell-like commands on multiple Linux hosts, and if one of the commands fails on one host, I want that process not to proceed. If the remote command throws an error I am logging it, but the process goes on to the next command; but if I terminate the command, the process will t

Re: Import statements and multiprocessing

2018-01-31 Thread Nicholas Cole
On Tue, Jan 30, 2018 at 7:26 PM, Terry Reedy wrote: > On 1/30/2018 10:54 AM, Nicholas Cole wrote: > >> I have a strange problem on python 3.6.1 > > [involving multiprocessing] Interestingly it seems to have been a very subtle circular import problem that was showing up only

Re: Import statements and multiprocessing

2018-01-30 Thread Terry Reedy
On 1/30/2018 10:54 AM, Nicholas Cole wrote: I have a strange problem on python 3.6.1 [involving multiprocessing] I think the first thing you should do is upgrade to 3.6.4 to get all the bugfixes since 3.6.1. I am pretty sure there have been some for multiprocessing itself. *Then* see if you

Re: Import statements and multiprocessing

2018-01-30 Thread Nicholas Cole
not importing a single file but a subpackage __init__ file. That __init__ file does not have its own __all__ statement, but seems to just be relying on importing from different files in the subpackage. Could that be the problem? Even so, I'm unsure why it is showing up only when used in multipro

Re: Import statements and multiprocessing

2018-01-30 Thread Nicholas Cole
*simplified* demonstration? A minimal sample program > which we can run that demonstrates the issue? [snip] I find it extremely odd. File A: the multiprocessing code and the map function. file B: a set of library functions called by the function called in file A. file C: included in file B by us

Re: Import statements and multiprocessing

2018-01-30 Thread Steven D'Aprano
On Tue, 30 Jan 2018 15:54:30 +, Nicholas Cole wrote: [...] > The function I am passing to map calls a function in another file within > the same model. And that file has a > > from .some_file_in_the_package import * > > line at the top. > > However, in each function called in that file, I

Import statements and multiprocessing

2018-01-30 Thread Nicholas Cole
Dear List, I have a strange problem on python 3.6.1 I am using the multiprocessing function to parallelize an expensive operation, using the multiprocessing.Pool() and Pool.map() functions. The function I am passing to map calls a function in another file within the same model. And that file

Re: multiprocessing shows no benefit

2017-10-20 Thread Michele Simionato
multiprocessing will never help, since the operation is too fast with respect to the overhead involved in multiprocessing. In that case just give up and think about ways of changing the original problem. -- https://mail.python.org/mailman/listinfo/python-list
