Hi,

I have multiple processes modifying a hash of hashes of arrays. For
multiprocessing I am using Parallel::ForkManager.

The requirement is that I set the max number of processes to, say, 5. The
script fires off workers so that at most 5 run in parallel, and each one
performs some actions on a complex data structure; for example, after
finishing a task successfully, a process moves the user's entry from the
"pending" hash key to the "completed" hash key.

Now, this task needs to run on around 10K users, and I must not run it for
more than 1 hour, after which I need to store the state to disk so that I
can come back later and resume from the point where I left off.

I am planning to use the Storable module to store the state and retrieve
it later.
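As a sanity check of the save/restore part, something like this (the keys are just illustrative) is what I mean, using Storable's lock_store/lock_retrieve, which take an flock on the file while writing and reading:

```perl
use strict;
use warnings;
use Storable qw(lock_store lock_retrieve);

# Hypothetical state structure (keys made up for illustration)
my %state = (
    pending   => { alice => [ 'task1', 'task2' ] },
    completed => { bob   => [ 'task3' ] },
);

my $file = 'state.storable';

# lock_store holds an exclusive flock while writing, so another
# lock_retrieve never sees a half-written file.
lock_store(\%state, $file);

# Later (or on restart), read it back under a shared lock.
my $restored = lock_retrieve($file);
print scalar(keys %{ $restored->{pending} }), " users still pending\n";
unlink $file;   # clean up the demo file
```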

My question is: do I need to implement locks on write operations, since
multiple processes will be writing to the same complex data structure? Or
will just calling store() in each process take care of simultaneous writes?
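One alternative I am considering, to sidestep simultaneous writes entirely: have each child store() its result to its own PID-named file and let the parent merge them afterwards. A rough sketch (file names and the payload are made up):

```perl
use strict;
use warnings;
use Storable qw(store retrieve);
use File::Temp qw(tempdir);

# Each worker writes to its own file, so no two processes ever touch
# the same file and no locking is needed at all.
my $dir = tempdir(CLEANUP => 1);

for my $i (1 .. 3) {
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {                        # child
        my $result = { user => "user$i", ok => 1 };   # made-up payload
        store($result, "$dir/result.$$");   # file named after child PID
        exit 0;
    }
}
wait() for 1 .. 3;                          # reap all three children

# Parent merges the per-process files into one structure.
my %completed;
for my $f (glob "$dir/result.*") {
    my $r = retrieve($f);
    $completed{ $r->{user} } = $r->{ok};
}
print scalar(keys %completed), " results merged\n";
```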

Any pointers to sample code would be helpful.

Thanks and Regards,
Puneet
