On Wed, Nov 25, 2020 at 4:41 PM Thad Guidry wrote:
>
> Gerhard,
>
> I'm curious what you mean by "processing" and "comb through".
> Can you describe your processing, and what system or database the output
> gets loaded into?
I'm doing embarrassingly little with the data so far, and there is no
re
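For anyone curious what "processing" a dump like this can look like in practice, here is a minimal sketch of streaming a Wikidata JSON dump entity by entity. It assumes the standard dump layout (a single gzipped JSON array with one entity object per line, each line ending in a comma); the function name `iter_entities` is just illustrative, not from any actual script discussed here.

```python
import gzip
import json

def iter_entities(path):
    """Yield one entity dict per line from a Wikidata JSON dump.

    The dump is one large JSON array, but each entity sits on its own
    line (terminated by a comma), so it can be streamed line by line
    without loading the whole file into memory.
    """
    with gzip.open(path, "rt", encoding="utf-8") as f:
        for line in f:
            line = line.strip().rstrip(",")
            # Skip the array brackets and any blank lines.
            if not line or line in ("[", "]"):
                continue
            yield json.loads(line)
```

A caller can then filter or transform entities one at a time, e.g. `(e["id"] for e in iter_entities("wikidata-all.json.gz"))`, keeping memory usage flat regardless of dump size.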
Gerhard,
I'm curious what you mean by "processing" and "comb through".
Can you describe your processing, and what system or database the output
gets loaded into?
Perhaps you have your scripts publicly available on something like GitHub?
It would be nice to know a bit more about what you also are
On Wed, Nov 25, 2020 at 1:22 PM Daniel Garijo wrote:
>
> Hello,
>
> I am writing this message because I am analyzing the Wikidata JSON dumps
> available in the Internet Archive, and I have found that there are no dumps
> available after Feb 8th, 2019 (see
> https://archive.org/details/wikimediadownloads
Hi Daniel,
I am the one managing the archival process, and indeed, it was around the
end of 2018 that the archival process died (you can see the status here:
https://dumps.wmflabs.org/status.php).
The current status is that the software behind the archival process is
being reworked and will come wit