On 05/28/2015 07:52 PM, Lars Aronsson wrote:
> With proper release management, it should be possible to run the
> old version of the process until the new version has been tested,
> first on some smaller wikis, and gradually on the larger ones.
I understand your frustration; however, release
hear hear
Gerard
On 29 May 2015 at 01:52, Lars Aronsson l...@aronsson.se wrote:
The XML database dumps are missing all through May, apparently
because of a memory leak that is being worked on, as described
here,
https://phabricator.wikimedia.org/T98585
However, that information doesn't reach the person who wants to
download a fresh dump and looks here,
You can create a script that uses Special:Export to export all articles in
the deletion categories just before they are deleted.
Then import them into your Deletionpedia.
2012/5/17 Mike Dupont jamesmikedup...@googlemail.com
Hi,
I am thinking about how to collect articles deleted based on the
Well, I would be happy for items like this:
http://en.wikipedia.org/wiki/Template:Db-a7
Would it be possible to extract them easily?
mike
On Thu, May 17, 2012 at 2:23 PM, Ariel T. Glenn ar...@wikimedia.org wrote:
There's a few other reasons articles get deleted: copyright issues,
personal
Create a script that makes a request to Special:Export using this category
as a feed:
https://en.wikipedia.org/wiki/Category:Candidates_for_speedy_deletion
More info https://www.mediawiki.org/wiki/Manual:Parameters_to_Special:Export
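The suggestion above can be sketched as a small script. This is a minimal, hypothetical sketch (function names are my own, not from the thread): it lists the category's members through the MediaWiki API (`list=categorymembers`), then POSTs the newline-separated titles to Special:Export with `curonly` set, per the parameters manual linked above.

```python
import json
import urllib.parse
import urllib.request

API = "https://en.wikipedia.org/w/api.php"
EXPORT = "https://en.wikipedia.org/wiki/Special:Export"

def category_members(category, limit=500):
    """Yield page titles in a category via the MediaWiki API,
    following 'continue' tokens until the listing is exhausted."""
    params = {
        "action": "query",
        "list": "categorymembers",
        "cmtitle": category,
        "cmlimit": str(limit),
        "format": "json",
    }
    cont = {}
    while True:
        url = API + "?" + urllib.parse.urlencode({**params, **cont})
        with urllib.request.urlopen(url) as resp:
            data = json.load(resp)
        for member in data["query"]["categorymembers"]:
            yield member["title"]
        if "continue" not in data:
            break
        cont = data["continue"]

def export_payload(titles, current_only=True):
    """Build the POST form for Special:Export: newline-separated
    page titles, optionally restricted to the current revision."""
    form = {"pages": "\n".join(titles)}
    if current_only:
        form["curonly"] = "1"
    return form

def export_xml(titles):
    """POST the titles to Special:Export and return the XML dump text."""
    body = urllib.parse.urlencode(export_payload(titles)).encode()
    with urllib.request.urlopen(EXPORT, data=body) as resp:
        return resp.read().decode("utf-8")

# Usage sketch:
# titles = list(category_members("Category:Candidates_for_speedy_deletion"))
# xml = export_xml(titles)  # save this, then import it into the mirror wiki
```

A polling loop around this (run from cron, say) would snapshot candidates before an admin deletes them; the resulting XML can then be imported into the mirror wiki.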
2012/5/21 Mike Dupont jamesmikedup...@googlemail.com
Well I
We now have three mirror sites, yay! The full list is linked to from
http://dumps.wikimedia.org/ and is also available at
http://meta.wikimedia.org/wiki/Mirroring_Wikimedia_project_XML_dumps#Current_Mirrors
Summarizing, we have:
C3L (Brazil) with the last 5 known good dumps,
Masaryk
Good work. We are finally approaching an indestructible corpus of
knowledge.
2012/5/17 Ariel T. Glenn ar...@wikimedia.org
We now have three mirror sites, yay! The full list is linked to from
http://dumps.wikimedia.org/ and is also available at
Hi,
I am thinking about how to collect articles deleted based on the
"not notable" criteria. Is there any way we can extract them from
the mysql binlogs? How are these mirrors working? I would be
interested in setting up a mirror of deleted data, at least that
which is not spam/vandalism based on
There are a few other reasons articles get deleted: copyright issues,
personal identifying data, etc. This makes maintaining the sort of
mirror you propose problematic, although a similar mirror is here:
http://deletionpedia.dbatley.com/w/index.php?title=Main_Page
The dumps contain only data
You can follow the updates here
http://wikitech.wikimedia.org/history/Dataset1
2010/11/21 masti mast...@gmail.com
On 11/10/2010 06:44 AM, Ariel T. Glenn wrote:
We noticed a kernel panic message and stack trace in the logs on the
server that serves XML dumps. The web server that provides access to
these files is temporarily out of commission; we hope to have it back
online in 12 hours or less. Dumps themselves have been suspended while we
investigate.