It's completely broken: https://code.google.com/p/wikiteam/issues/detail?id=56 It downloads only a fraction of the wiki, at most 500 pages per namespace.
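For anyone considering picking this up: the 500-page cap comes from fetching titles in a single batch instead of following the MediaWiki API's continuation tokens. Below is a minimal sketch of the pagination logic an API-based exporter would need, assuming a wiki with a JSON api.php endpoint; `fetch` is a hypothetical stand-in for the HTTP call (e.g. a GET to `api.php?action=query&list=allpages&aplimit=500&apnamespace=N&format=json`), injected here so the logic is testable without network access.

```python
def all_page_titles(fetch, namespace=0):
    """Yield every page title in a namespace, following 'continue' tokens
    so enumeration does not stop at the first 500-title batch."""
    params = {"action": "query", "list": "allpages",
              "aplimit": "500", "apnamespace": namespace, "format": "json"}
    while True:
        data = fetch(params)  # hypothetical HTTP call to api.php
        for page in data["query"]["allpages"]:
            yield page["title"]
        if "continue" not in data:  # no more batches: we have every title
            break
        # carry apcontinue (and the continue marker) into the next request
        params.update(data["continue"])
```

Older MediaWiki releases use `query-continue` instead of `continue`, so a real fix would likely need to handle both shapes; the loop structure is the same either way.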

Let me reiterate that https://code.google.com/p/wikiteam/issues/detail?id=44 is a very urgent bug, and we've seen no work on it in many months. We need an actual programmer with some knowledge of Python to fix it and make the script work properly; I know there are several on this list (and elsewhere), so please, please help. The last time I, as a non-coder, tried to fix a bug, I made things worse (https://code.google.com/p/wikiteam/issues/detail?id=26).

Only after API support is implemented/fixed will I be able to re-archive the 4-5 thousand wikis we've recently archived on archive.org (https://archive.org/details/wikiteam), and possibly many more. Many of those dumps contain errors and/or are only partial because of the script's unreliability, and wikis die on a daily basis. (So, quoting emijrp, there IS a deadline.)

Nemo

P.S.: Cc'ing some lists out of desperation; sorry for cross-posting.

_______________________________________________
Pywikipedia-l mailing list
Pywikipedia-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/pywikipedia-l