I think said person wants one DB for multiple wiki installs.
On Tue, Oct 22, 2013 at 2:18 AM, pramod.r.anugu wrote:
> If I install the software again I would have to maintain as many
> databases as there are wikis. I am looking for one software
> installation with multiple wikis and multiple uploads.
# recursively mirror the rendered pages (HTML only, not the wikitext source)
cd root-of-destination
wget -r -np http://site.to.be.copied/wiki
But I don't think that's what the OP wants.
I'd just copy the database tables over. But since he's copying to an "upgraded
server," that may not work well.
Maybe copy the database tables over to an install of the same rev, then upgrade.
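Something along these lines, as a rough sketch (database name, credentials, and
paths are guesses; LocalSettings.php and the images/ directory still have to be
copied by hand):

# on the old server
mysqldump -u wikiuser -p wikidb > wikidb.sql
# on the new server, into a MediaWiki install of the same version
mysql -u wikiuser -p wikidb < wikidb.sql
# then migrate the schema to the new version
php maintenance/update.php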
> On Oct 21, 2013, at 7:58 PM, John Foster wrote:
> Thanks.
> I am aware of that solution & in fact it is my preferred method for
> moving a wiki. However, the reason the MediaWiki is slow is a totally
> messed-up MySQL database system, & I don't know how to fix it. I tried
> for over a year, as t
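If the database really is damaged tables, a check/repair pass might be worth
one more try before abandoning it. Just a sketch: the database name is a guess,
and --auto-repair only helps MyISAM tables (InnoDB generally needs a dump and
reload instead):

mysqlcheck -u root -p --check wikidb
mysqlcheck -u root -p --auto-repair wikidb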
I can do a little better than that: I can whip up something that copies
everything based on Special:Allpages. If you want me to do that, drop me an
email off-list and we can work out the details.
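Roughly what I have in mind, as an untested sketch (the URL, the jq dependency,
and ignoring continuation past the first 500 titles are all assumptions; the
exact Special:Export form parameters may differ between versions):

WIKI="http://www.myurl.com/w"   # script path of the slow wiki
# list page titles through the API (first 500 only; a real script would
# follow the continuation parameters)
curl -s "$WIKI/api.php?action=query&list=allpages&aplimit=500&format=json" \
  | jq -r '.query.allpages[].title' > titles.txt
# ask Special:Export for those pages as an XML dump
curl -s --data-urlencode "pages@titles.txt" \
  "$WIKI/index.php?title=Special:Export&action=submit" > pages.xml
# pages.xml can then be loaded on the new wiki with importDump.php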
On Mon, Oct 21, 2013 at 8:19 PM, Wjhonson wrote:
> How about a script that Googles site:www.myur
How about a script that Googles site:www.myurl.com and then walks every page
and copies it ;)
-----Original Message-----
From: John Foster
To: mediawiki-list
Cc: MediaWiki announcements and site admin list
Sent: Mon, Oct 21, 2013 4:58 pm
Subject: Re: [MediaWiki-l] Mediawiki articla
If I install the software again I would have to maintain as many
databases as there are wikis. I am looking for one software
installation with multiple wikis and multiple uploads.
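One common way to get there is a "wiki family" setup: a single code base where
LocalSettings.php loads per-wiki settings by hostname. A loose, untested sketch,
with all paths, hostnames, and file names made up:

cat >> /srv/mediawiki/LocalSettings.php <<'EOF'
switch ( $_SERVER['SERVER_NAME'] ) {
    case 'wiki-a.example.org':
        // own database (or $wgDBprefix) and own $wgUploadDirectory
        require_once __DIR__ . '/LocalSettings_wiki_a.php';
        break;
    case 'wiki-b.example.org':
        require_once __DIR__ . '/LocalSettings_wiki_b.php';
        break;
}
EOF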
On Fri, Oct 18, 2013 at 11:33 AM, Wjhonson wrote:
> Why would that be a problem?
> Could you not install the soft
On Mon, 2013-10-21 at 16:46 -0700, Yan Seiner wrote:
> John W. Foster wrote:
> > Is there any way to export ALL the articles and/or pages from a very slow
> > but working MediaWiki? I want to move them to a much faster, upgraded
> > MediaWiki server.
> > I have tried the dumpBackup script in /maintai
John W. Foster wrote:
Is there any way to export ALL the articles and/or pages from a very slow
but working MediaWiki? I want to move them to a much faster, upgraded
MediaWiki server.
I have tried the dumpBackup script in /maintenance, but that didn't get
all the pages, only some, and I don't know wh
Is there any way to export ALL the articles and/or pages from a very slow
but working MediaWiki? I want to move them to a much faster, upgraded
MediaWiki server.
I have tried the dumpBackup script in /maintenance, but that didn't get
all the pages, only some, and I don't know why. Any tips are appreciated.
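For reference, the dump/import route looks roughly like this (paths are
assumptions; it exports wikitext only, so the images/ directory has to be
moved separately):

# on the slow wiki: dump every page (--full keeps all revisions,
# --current only the latest)
php maintenance/dumpBackup.php --full > all-pages.xml
# on the new wiki: import the dump and refresh recent changes
php maintenance/importDump.php all-pages.xml
php maintenance/rebuildrecentchanges.php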
Thanks for those :) I will look into them :)
On Mon, Oct 21, 2013 at 7:03 PM, OQ wrote:
> https://www.mediawiki.org/wiki/Manual:$wgEnableDnsBlacklist
> https://www.mediawiki.org/wiki/Manual:$wgDnsBlacklistUrls
>
>
> On Mon, Oct 21, 2013 at 11:55 AM, Jonathan Aquilina
> wrote:
>
> > Al i dunno wh
https://www.mediawiki.org/wiki/Manual:$wgEnableDnsBlacklist
https://www.mediawiki.org/wiki/Manual:$wgDnsBlacklistUrls
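In LocalSettings.php those two settings end up looking something like this;
just a sketch, and the choice of blacklists and the path are assumptions:

cat >> /var/www/wiki/LocalSettings.php <<'EOF'
$wgEnableDnsBlacklist = true;
$wgDnsBlacklistUrls = array( 'xbl.spamhaus.org', 'dnsbl.tornevall.org' );
EOF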
On Mon, Oct 21, 2013 at 11:55 AM, Jonathan Aquilina
wrote:
> Al, I dunno why bringing this up struck a note in my mind, but I am thinking:
> if we want to block spam and abuse on on
Al, I dunno why bringing this up struck a note in my mind, but I am thinking:
if we want to block spam and abuse on one's wiki, couldn't this function be
tied into something like an email DNS blacklist of known spam IPs?
On Mon, Oct 21, 2013 at 6:41 PM, Al wrote:
> There is no documentation on this
There is no documentation on this on the entire world wide web. There seems to
be a variable missing for the user's IP address that can be passed in to the
function. I tried user_name, but that didn't work, and I'm not sure about the
range parameter either... I tried "66.187.0.0/16" but that di