On 02.10.23 at 10:32, Juliano David Hilario wrote:
If I were to download the entire wiki, except for the discussion files
and the user pages, do you think that would be wise? I would also try to
make it dynamically updatable, if possible, and as offline-friendly as
the Python docs. Thank you for your help; the wiki is very useful.
I don’t know if it’s wise.
It’s a MediaWiki, so there are existing solutions, e.g.:
https://www.mediawiki.org/wiki/Exporting_all_the_files_of_a_wiki
https://github.com/WikiTeam/wikiteam
One is even built-in: https://wiki.contextgarden.net/Special:Export (you
would need a local MediaWiki instance to import the pages into).
See also https://www.mediawiki.org/wiki/Help:Export
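As a rough sketch of the built-in route: Special:Export accepts a POST
request with a newline-separated list of page titles and returns the
pages as XML. The page titles below are just placeholders; swap in the
titles you actually want to export. This only builds the request body,
so you can inspect it before sending it with your HTTP client of choice:

```python
from urllib.parse import urlencode

# Special:Export endpoint of the ConTeXt wiki.
EXPORT_URL = "https://wiki.contextgarden.net/Special:Export"

# Placeholder titles; replace with the pages you want.
pages = ["Main_Page", "Command/framed"]

# 'pages' is a newline-separated title list; 'curonly=1' requests
# only the latest revision of each page instead of the full history.
payload = urlencode({"pages": "\n".join(pages), "curonly": "1"})
print(payload)
```

POSTing that payload to EXPORT_URL yields an XML dump that a local
MediaWiki can import via Special:Import.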
You can also use wget to recursively download all contents of a website
(curl itself has no recursive mode).
Hraban
___________________________________________________________________________________
If your question is of interest to others as well, please add an entry to the
Wiki!
maillist : ntg-context@ntg.nl / https://www.ntg.nl/mailman/listinfo/ntg-context
webpage : https://www.pragma-ade.nl / http://context.aanhet.net
archive : https://bitbucket.org/phg/context-mirror/commits/
wiki : https://contextgarden.net
___________________________________________________________________________________