On Thu, 5 Oct 2000, Sean D. Cook wrote:

> > On Wed, Oct 04, 2000 at 02:42:50PM -0700, David E. Wheeler wrote:
> > > Yeah, I was thinking something along these lines. Don't know if I need
> > > something as complex as IPC. I was thinking of perhaps a second Apache
> > > server set up just to handle long-term processing. Then the first server
> > > could send a request to the second with the commands it needs to execute
> > > in a header. The second server processes those commands independently of
> > > the first server, which then returns data to the browser.
> > 
> > In a pinch, I'd just use something like a 'queue' directory. In other
> > words, when your mod_perl code gets some info to process, it writes
> > this into a file in a certain directory (name it with a timestamp /
> > cksum to ensure the filename is unique). Every X seconds, have a
> 
> It might be safer to do this in a db rather than the file system.  That
> way there is less chance of collision, and you don't have to worry about
> the daemon coming along and reading a file that is only half written
> while mod_perl/apache is still in the middle of writing it.  Let the DB
> handle the storage and let the daemon do a select to gather the info.

If you don't have a db easily available, I've had good luck using temp
files.  You can avoid partially-written-file errors by exploiting the
atomic nature of moving (renaming) files: write the data to a temp name,
then rename it into place once it's complete, so the reader only ever
sees whole files.  NFS does *not* have this nice behavior, however.
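
For what it's worth, a rough sketch of what I mean (the queue directory
path, the enqueue() name, and the filename scheme are just examples, not
anything standard):

    use strict;

    my $queue_dir = '/var/spool/myqueue';   # must be local, not NFS

    sub enqueue {
        my ($data) = @_;

        # Unique name from timestamp, pid, and a checksum of the payload.
        my $name  = join '.', time(), $$, unpack('%32C*', $data);
        my $tmp   = "$queue_dir/tmp.$name";
        my $final = "$queue_dir/job.$name";

        open my $fh, '>', $tmp or die "open $tmp: $!";
        print $fh $data;
        close $fh            or die "close $tmp: $!";

        # rename() is atomic on a local filesystem, so the daemon only
        # ever sees complete files matching job.*
        rename $tmp, $final  or die "rename $tmp -> $final: $!";
    }

The daemon just globs for job.* files, processes each one, and unlinks it
when done; it never has to care whether Apache is mid-write.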

-Tim 
