The three processes we had going for "largish" wikis had been restarted
from a particular step, since I had to interrupt them earlier for a
kernel upgrade and reboot.  These stop at the end of the run.  Three regular
jobs are now running; these cycle through the list of the ten largish
wikis in the usual way.

While we're on the subject of de wiki, I have been considering starting
to produce smaller output files much as we do for en wikipedia.  100GB
is pretty large for someone to download and process, and it takes a
while to produce as well.  Any thoughts?  CC-ing wikitech-l since some
people on that list may be users of the dump who don't subscribe to
xmldatadumps-l (but they should!)
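For illustration only: the smaller en wikipedia files are produced by splitting the output along page ranges, and generating such ranges could be sketched roughly as below. The function name and chunk size here are hypothetical, not taken from the actual dump scripts.

```python
def page_ranges(total_pages, chunk_size):
    """Yield inclusive (start, end) page-ID ranges covering 1..total_pages.

    Hypothetical helper: each range would become one smaller output file,
    instead of a single 100GB dump.
    """
    for start in range(1, total_pages + 1, chunk_size):
        yield (start, min(start + chunk_size - 1, total_pages))

# e.g. ten pages in chunks of four:
print(list(page_ranges(10, 4)))  # → [(1, 4), (5, 8), (9, 10)]
```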

Ariel

On Sat, 19-11-2011 at 18:03 +0100, Andreas Meier
wrote:
> Hello,
> today the ja-dump finished, so a new de-dump should start.
> 
> Best regards
> Andreas



_______________________________________________
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l