I am an inexperienced Perl hacker with a problem. I am writing a script that takes a
long time to execute. Each invocation may take 2+ minutes to process. While the
script is processing, I do not want the user to have to wait.

In short, the script:

1. mirrors a website to a local copy with lwp-rget
2. machine-translates each file into seven languages
3. post-processes links so the files and links work as they did on the original site
4. tars/gzips the result
5. presents this archive to the user to install on their server

Steps 2 and 3 can take a while. If I use fork, I have to wait for all child processes
to end before exiting. If the end user gets bored, they kill the script. I tried the
following with little success:
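(For reference, the pattern I have been told about is: fork, have the parent answer the browser and exit at once, and have the child detach itself with setsid and closed standard handles so the web server is not left waiting on it. A minimal sketch, assuming the POSIX module is available; `spawn_worker` is a made-up name:)

```perl
#!/usr/bin/perl
use strict;
use warnings;
use POSIX qw(setsid);

# Fork off a detached worker so the parent CGI can reply and exit
# immediately. $job is a code ref that does the slow work.
sub spawn_worker {
    my ($job) = @_;
    defined(my $pid = fork) or die "fork failed: $!";
    return $pid if $pid;        # parent: return child pid, keep serving

    # Child: start a new session and drop the std handles so the
    # web server does not wait on us.
    setsid() or die "setsid failed: $!";
    open STDIN,  '<', '/dev/null';
    open STDOUT, '>', '/dev/null';
    open STDERR, '>', '/dev/null';

    $job->();                   # run the long task (translate, tar, ...)
    exit 0;
}
```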

Write and display a temporary HTML file which refreshes every 3 seconds. When the child
process is complete, the temp file is rewritten to refresh to the tar file for
downloading. Problem: the child process dies on the first refresh.

No interim page - a timeout occurs somewhere along the line and the process dies. Besides,
no one will wait this long.

My "debug mode" - all debug lines throughout the script print to STDOUT/the browser. Lots
of info on the screen, and the processing works, albeit only after minutes of debug
messages have filled the screen.

I would like the processing to be nearly instant, but that is not possible since I do
not have a couple hundred thousand lying around and I am working alone. Given this, I would
like to present a "We are working on it" page and let the user move on. When the task
is done, the script will email a link to the tar file for download, presumably
within a few minutes.

The answer is probably simple, but unfortunately my programming skills are limited. I
appreciate any help.

Ray
