I've read the manual (https://httpd.apache.org/docs/2.4/stopping.html#graceful) and believe I'm doing things "the right way". I know I've seen anecdotes and forum discussions where people complain of long restarts, but I'm confident those are the result of some particular environment issue (such as long-running child processes) or a misconfiguration.

I've also heard that during a deployment of a new MediaWiki version (thousands of PHP files), you can see odd bugs because a single user request may be served a "mixed" set of files (i.e. some from version X and some from version Y). I assume the best way to handle roll-outs is to take a server out of rotation at the load balancer, update it, and then add it back in. But what about deployments where there is only one server? Short of stopping the server, I guess the technique there would be to make all file updates in a shadow directory, and then either repoint a symlink or mv the shadow directory into place.
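For what it's worth, the shadow-directory idea could be sketched like this: stage the new release completely, then swap a "current" symlink with a single rename. The paths here are hypothetical (a temp directory stands in for the real docroot), but the rename itself is what makes the cutover atomic:

```shell
#!/bin/sh
# Sketch of an atomic release swap via symlink rename.
# DEPLOY_ROOT is a stand-in for the real web root in this sketch.
set -e

DEPLOY_ROOT=$(mktemp -d)
mkdir -p "$DEPLOY_ROOT/releases/v1" "$DEPLOY_ROOT/releases/v2"

# Initial state: "current" points at the old release.
ln -s "releases/v1" "$DEPLOY_ROOT/current"

# Deploy: stage v2 fully, then swap the symlink in one rename(2).
# mv -T replaces the symlink itself atomically, so any given request
# resolves entirely to v1 or entirely to v2 -- never a mix.
ln -s "releases/v2" "$DEPLOY_ROOT/current.new"
mv -T "$DEPLOY_ROOT/current.new" "$DEPLOY_ROOT/current"

readlink "$DEPLOY_ROOT/current"   # -> releases/v2
```

One caveat: with mod_php, opcache keys cached scripts on the resolved real path by default, so already-running children can keep serving the old release's files after the swap; a graceful restart (or an opcache reset) after repointing the symlink is still advisable.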
I can look at the scoreboard in server-status during my next deploy to check how things go. Or, even better, I could install https://github.com/humbedooh/server-status to keep an eye on things.

Still, if anyone on the list can confirm their practice for rolling out changes to php.ini, clearing the opcache, and pushing new code to production under Apache and mod_php, that would be appreciated.

Greg Rundlett
https://eQuality-Tech.com
https://freephile.org

On Wed, May 2, 2018 at 3:12 PM, Greg Rundlett (freephile) <g...@freephile.com> wrote:

> If I do an apache2ctl -k graceful on Ubuntu (or service httpd restart in
> CentOS), using mod_php and a max_execution_time = 30 in php.ini, then is
> there any reason why the server would take more than, say, 1 minute to
> serve all requests with the new php.ini + Apache configuration (+ php
> files)?
>
> I know max_execution_time doesn't include system calls, so if a large file
> were being uploaded and simultaneously thumbnailed at various sizes with
> imagemagick or something, then it could take more than 30 seconds.
>
> I'm asking because I'm doing DevOps and I don't want to introduce delay
> into deployments (which already take 15 minutes), but I feel that if I'm
> deploying new PHP files (MediaWiki), then each client request should get a
> consistent set of files, rather than mixed content from two different
> releases, which could happen if I just deploy updates without making a
> simultaneous graceful restart.
>
> Thanks,
>
> Greg
>
> Greg Rundlett
> https://eQuality-Tech.com
> https://freephile.org
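P.S. For anyone wanting to watch the scoreboard during a graceful restart, a minimal mod_status setup might look like the fragment below (the Location path and the localhost-only restriction are my choices, not requirements; it assumes mod_status is loaded, e.g. via a2enmod status on Debian/Ubuntu):

```
# Show per-worker detail in the scoreboard, not just totals.
ExtendedStatus On

<Location "/server-status">
    SetHandler server-status
    # Restrict to the local machine; widen for a monitoring host.
    Require local
</Location>
```

After an apache2ctl -k graceful you should see old children marked "G" (gracefully finishing) in the scoreboard until their in-flight requests complete, which is a direct way to measure how long a restart actually takes to fully drain.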