Enrique escribió:
> Hello all,
> I run a local Wikipedia mirror on my intranet. I update it from time
> to time, whenever a new pages-articles dump is released: I import the
> dump with mwdumper.jar and then run rebuildall.php. This process takes
> a long time, and the server responds slowly while it is running.
> My question is whether there is a procedure to do all of this
> automatically on a monthly schedule, directly from Wikipedia. I really
> only want to load into my database the changes made on the original
> Wikipedia. For example, if I have imported
> 02-05-2009-eswiki-pages-articles.xml into my database and the official
> 02-06-2009-eswiki-pages-articles.xml now exists, can I bring my local
> wiki up to date without importing the entire XML dump?

There's a commercial feed service for getting the changes, accompanied
by software to integrate it.
The poor man's solution would be to watch the recent changes events.
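As a minimal sketch of the poor man's approach: poll the public MediaWiki API (`api.php` with `list=recentchanges`) for edits newer than your last sync, then re-fetch and update only those pages instead of re-importing the whole dump. The `es.wikipedia.org` endpoint and the helper names below are illustrative assumptions, not part of any official sync tool.

```python
# Sketch: find pages changed on eswiki since a given timestamp, using the
# MediaWiki API's recentchanges list. Assumes network access to the public
# API endpoint; the function names here are hypothetical helpers.
import json
import urllib.parse
import urllib.request

API = "https://es.wikipedia.org/w/api.php"  # assumed endpoint for eswiki

def recentchanges_url(since, limit=500):
    """Build an API query URL for edits newer than `since` (ISO 8601).

    recentchanges returns results newest-first, so `rcend` marks the
    older boundary where the listing stops.
    """
    params = {
        "action": "query",
        "list": "recentchanges",
        "rcend": since,
        "rcprop": "title|ids|timestamp",
        "rctype": "edit|new",
        "rclimit": str(limit),
        "format": "json",
    }
    return API + "?" + urllib.parse.urlencode(params)

def changed_titles(response):
    """Extract the distinct page titles from a decoded API response."""
    changes = response["query"]["recentchanges"]
    return sorted({rc["title"] for rc in changes})

def fetch_changed_titles(since):
    """Fetch and decode one page of recent changes (requires network)."""
    with urllib.request.urlopen(recentchanges_url(since)) as resp:
        return changed_titles(json.load(resp))
```

A monthly cron job could call this with the timestamp of the last sync, then pull each changed page's current wikitext (e.g. via `action=query&prop=revisions`) and update just those rows locally. Note this misses deletions and only covers one page of results; a real sync would follow the API's continuation parameters.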


_______________________________________________
MediaWiki-l mailing list
[email protected]
https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
