No problem, I'm glad to answer your questions to ensure I'm providing all
relevant info. I do have five wikis, not just the one used in the previous
explanation. Each of the five wikis has its own Apache vhost and
documentroot directory on each of the four servers, making for 20 copies of
MediaWiki.
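For illustration, syncing all 20 copies from a single staging tree could
look roughly like the sketch below. The wiki names, hostnames, and paths
are placeholders, not the actual setup, and it assumes passwordless SSH
between the hosts:

    #!/bin/sh
    # Sketch only: push each wiki's docroot from a staging host to all
    # four web servers. Names and paths below are made up.
    for wiki in wiki1 wiki2 wiki3 wiki4 wiki5; do
        for host in web1 web2 web3 web4; do
            # --delete mirrors exactly; add --exclude for any files
            # that legitimately differ per server.
            rsync -a --delete "/var/www/$wiki/" "$host:/var/www/$wiki/"
        done
    done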
Another question: if this is a single wiki, why not again rsync that across
the four servers? That way, when a new release of MW comes out and you
update one server, you have them all updated easily via rsync.
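For example, something along these lines could push an upgraded tree from
one server to another (host and path names are placeholders; the -n run
previews the changes before anything is touched):

    # After upgrading MediaWiki on web1, push the tree to each peer.
    rsync -an --delete /var/www/wiki/ web2:/var/www/wiki/  # dry run: preview only
    rsync -a --delete /var/www/wiki/ web2:/var/www/wiki/   # actual sync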
I think it's best I step back on this, as I am no wiki expert at all. I can
provide solutions to certain problems.
I think my explanation was not the clearest it could have been. Let's say
for the moment that I have one wiki. That wiki is served by a load balancer
in front of a server farm consisting of four Apache vhosts, one per
physical server, each with its own copy of MediaWiki, LocalSettings.php,
etc. Thus a single upgrade means updating four separate copies of MediaWiki.
You mention NFS; why not use rsync to replicate to a secondary NFS server,
and set it to run, let's say, every 5 to 10 minutes, or however often you
want to keep the secondary server updated?
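As a sketch, assuming the primary NFS server exports /srv/uploads and the
standby is reachable as nfs2 (both placeholder names, with SSH keys already
set up), a crontab entry on the primary could be:

    # Replicate the upload share to the standby every 10 minutes.
    */10 * * * *  rsync -a --delete /srv/uploads/ nfs2:/srv/uploads/

Note that --delete makes the standby an exact mirror, so deletions
propagate too.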
On Tue, Oct 28, 2014 at 1:13 AM, Justin Lloyd wrote:
Hi all,
Currently I have five wikis, with the largest one being about 35k articles
(109k pages) and pretty heavily trafficked. My basic server architecture is
four web servers behind a load balancer, with a single NFS server that
shares out a directory that contains the upload directory content.
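For context, each web server would mount that share with something like the
line below; the server name, export path, and mount point are illustrative,
not the actual configuration:

    # On each web server: mount the shared upload directory over NFS.
    mount -t nfs nfs1:/export/uploads /var/www/wiki/images

In practice this would live in /etc/fstab or an automounter map rather than
being run by hand.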
On 24 Oct 2014, at 17:31, James Forrester wrote:
>
> How much longer? 1.25 is May 2015; Wikimedia's ZAP -> HAT migration is
> nominally to be finished within a month…
>
Since neither ZAP nor HAT is defined anywhere on wikitech or mediawiki.org,
I created:
https://wikitech.wikimedia.org/wiki/Z