Hello,

I recently discovered Kiwix and the ZIM file format. It seems like a really
great project for offline use, even if there are still a lot of challenges.

For now, as a traveller, I would like to have a recent & usable dump
of some wikis (Wikipedia, Wikitravel, wiki.couchsurfing, Wikivoyage,
...) and I am finding it really hard to achieve.

For reference, I found:
http://en.wikipedia.org/wiki/Wikipedia:Database_download
http://www.kiwix.org/index.php/Main_Page
http://wikitravel.org/en/Wikitravel:Offline_Reader_Expedition
http://www.wikivoyage.org/tech/Database_dumps

But in most cases, the best I can get is
* a MediaWiki XML archive
* or, failing that, a mirror of the website made with httrack or wget
(making sure to copy only one language edition: only en, fr, de, ...;
see the sketch after this list)
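
To make that concrete, here is roughly the mirroring step I mean, as a
small Python wrapper around wget (the start URL and output directory
are only placeholders; httrack would work similarly):

#!/usr/bin/env python
# mirror_wiki.py -- sketch of the "mirror only one language" step.
# Assumes wget is installed; URL and output directory are placeholders.
import subprocess

def mirror_wiki(start_url, output_dir):
    """Mirror a single-language section of a wiki with wget.

    wget stays on the start host by default, so en.wikipedia.org only
    pulls English pages, and --no-parent keeps it inside a language
    path such as /en/ on wikitravel.org.
    """
    subprocess.check_call([
        "wget",
        "--mirror",              # recursive download with timestamping
        "--no-parent",           # never climb above the start path (e.g. /en/)
        "--convert-links",       # rewrite links for offline browsing
        "--page-requisites",     # also fetch images, CSS, ...
        "--directory-prefix", output_dir,
        start_url,
    ])

if __name__ == "__main__":
    # Placeholder example: the English Wikitravel pages only.
    mirror_wiki("http://wikitravel.org/en/Main_Page", "wikitravel-en")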

In this case, are there any available scripts to build a ZIM file?
I have checked http://www.kiwix.org/index.php/Tools, but as I'm
travelling, I don't have time to run each atomic operation by hand; I
would really prefer a batch script.
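
For the batch part, here is a rough sketch of what I have in mind,
assuming a tool like the openZIM zimwriterfs can turn a mirrored HTML
directory into a ZIM file; the option names are guesses on my side and
probably vary between versions, so please correct me:

#!/usr/bin/env python
# build_zim.py -- sketch of the ZIM-building half of the batch.
# I am guessing at the tool (zimwriterfs) and its options here; please
# check `zimwriterfs --help` on an actual install before trusting them.
import subprocess

def build_zim(html_dir, zim_path, title, language):
    """Pack a mirrored HTML directory (e.g. from wget above) into a ZIM file."""
    subprocess.check_call([
        "zimwriterfs",
        "--welcome", "index.html",    # main page, relative to html_dir
        "--favicon", "favicon.png",   # an icon that must exist in html_dir
        "--language", language,       # e.g. "eng" or "fra"
        "--title", title,
        "--description", title,
        "--creator", "wiki authors",
        "--publisher", "me",
        html_dir,
        zim_path,
    ])

if __name__ == "__main__":
    # Placeholder names: the directory is the one produced by the mirror step.
    build_zim("wikitravel-en", "wikitravel-en.zim", "Wikitravel (en)", "eng")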

In any case, thanks a lot for your work.
I can't wait to have it available on different devices (computer,
smartphone, ...) with all the wikis :)

Cheers

Julien