I'd like to disagree here. I have built several file-related webapps with virtual filesystems that require special Perl modules to access the files at all; mod_perl takes care of serving the requests. If I need a restart, I can still safely use graceful. Admittedly, something could occasionally get screwed up, but my solution to that is to put a better front-end server with its own buffer in front, so that the back-end can serve the files swiftly and has much more idle time (compared to connecting the remote client directly to the fileserver) for backend restarts when needed.
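For reference, the daily restart Gregory mentions below can be done gracefully from cron. This is just a sketch; the path to apachectl is an assumption and depends on how you built Apache:

```shell
# Hypothetical crontab entry: graceful restart every day at 04:00.
# "graceful" asks each child to exit only after finishing its current
# request, so in-flight mod_perl requests are not cut off mid-response.
0 4 * * * /usr/local/apache/bin/apachectl graceful
```

Note that "graceful" does not protect long-lived connections like streaming downloads, which is exactly the caveat Allen raises in the quoted thread.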
Issac

Per Einar Ellefsen wrote:
> At 23:54 20.05.2002, Allen Day wrote:
>
>> I've noticed that if I restart apache while I'm in the middle of a
>> download (MP3 stream), after the buffer in my MP3 player runs out, it
>> skips to the next track -- presumably because the connection was closed.
>>
>> This might cause a problem for you if your users are downloading big
>> files. They might have to restart from the beginning if they didn't
>> cache the partial download somewhere.
>
> Hmm, if you are serving big files off of mod_perl, memory leaks are
> the least of your problems :) That doesn't apply to Apache::MP3 of
> course, for which it's normal, but in no case should your mod_perl
> server be serving your big files.
>
>> On Mon, 20 May 2002, Matt Sergeant wrote:
>>
>> > On Monday 20 May 2002 9:30 pm, Gregory Matthews wrote:
>> > > I too thought of setting a cron job to restart the server once per
>> > > day in order to keep the memory "fresh".
>> > >
>> > > In a production environment, are there any downsides to doing this,
>> > > i.e., server inaccessibility, etc..?
>> >
>> > It's very rare to have a site that can't cope with just a few seconds
>> > of downtime. Most users won't even notice, save for some slight delay
>> > in getting their request through. Users tend to be pretty used to
>> > trying again in this world of "reliable" computing.