Folks,
 
I need some help with process forking on Solaris. What I'm trying to do is as follows:
 
    I have 60+ directories which I need to copy. Each directory has a different destination, so it is impossible to do the copy with a single command, i.e. I must have one copy command per directory.
 
    I'd also like to copy all 60+ directories simultaneously, i.e. issue all 60+ commands at the same time.
 
    The Perl script must wait until all 60+ copies have completed before generating a log file containing the results.
 
Although it is fairly simple to launch the 60+ copy commands, how do I monitor each of the processes and know when each one has exited?
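 
The closest I've got in Perl is the basic fork()/waitpid() pairing for a single child, something like this (the destination path here is just made up for illustration):
 
my $pid = fork();
die "fork failed: $!" unless defined $pid;
if ($pid == 0) {
    # child: replace ourselves with the copy command
    exec('cp', '-p', '-r', 'backup', '/net/server/destination/somedir')
        or die "exec failed: $!";
}
# parent: block until that one child exits
waitpid($pid, 0);
my $status = $? >> 8;    # cp's exit code
 
But that only handles one child at a time, and I don't see how to juggle 60+ of them and collect all of their exit statuses.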
 
A csh script which someone wrote (I'm trying to convert it to Perl) does exactly that, but I can't work out how to do the same in Perl:
 
set FILERDIRPATH=/net/server/destination

# This launches the 60+ copy processes in the background
# ($LOG is assumed to be set earlier in the script)
foreach DIR ( * )
    cd $DIR
    cp -p -r backup $FILERDIRPATH/$DIR &
    cd ..
    echo "Copying $DIR to $FILERDIRPATH" >> $LOG
end

# busy-wait until all the backgrounded cp jobs are done
while ( 1 )
    jobs > /tmp/jobs.$$
    if ( `wc -l /tmp/jobs.$$ | awk '{print $1}'` == "0" ) break
end

rm /tmp/jobs.$$
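 
For reference, here is my rough, untested attempt at a Perl equivalent (the log path is a placeholder; the real script sets $LOG elsewhere). I'm not at all sure the wait() loop at the end is the right way to reap the children:
 
#!/usr/bin/perl -w
use strict;

my $filerdirpath = '/net/server/destination';
my $log          = '/tmp/copy.log';    # placeholder log path

# one subdirectory of the current directory per copy job
opendir(my $dh, '.') or die "opendir: $!";
my @dirs = grep { -d $_ && $_ !~ /^\.\.?$/ } readdir($dh);
closedir($dh);

my %child;    # pid => directory, so results can be reported per directory
foreach my $dir (@dirs) {
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        # child: run one copy and exit with cp's status
        exec('cp', '-p', '-r', "$dir/backup", "$filerdirpath/$dir")
            or die "exec failed: $!";
    }
    $child{$pid} = $dir;    # parent: remember which pid copies which dir
}

# reap every child as it finishes; wait() returns -1 when none are left
open(my $lfh, '>', $log) or die "open $log: $!";
while ((my $pid = wait()) != -1) {
    my $status = $? >> 8;
    print $lfh "Copied $child{$pid} to $filerdirpath (exit status $status)\n";
}
close($lfh);
 
Does that look like the right approach, or is there a cleaner idiom for this?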
 
Thanks,
Bran.
