Also, for the curious, the request for dedicated HTML dumps is tracked in
this Phabricator task: https://phabricator.wikimedia.org/T182351
On Thu, 3 May 2018 at 15:19, Bartosz Dziewoński wrote:
> On 2018-05-03 20:54, Aidan Hogan wrote:
> > I am wondering what is the fastest/best way to get a local dump of
> > English Wikipedia in HTML?
On 2018-05-03 20:54, Aidan Hogan wrote:
> I am wondering what is the fastest/best way to get a local dump of
> English Wikipedia in HTML? We are looking just for the current versions
> (no edit history) of articles for the purposes of a research project.
The Kiwix project provides HTML dumps of Wikipedia
Hey Aidan!
I would suggest checking out RESTBase (
https://www.mediawiki.org/wiki/RESTBase), which offers an API for
retrieving HTML versions of Wikipedia pages. It's maintained by the
Wikimedia Foundation and used by a number of production Wikimedia services,
so you can rely on it.
I don't believe
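For readers who want to try the suggestion above, page HTML can be retrieved through the public Wikimedia REST API (served by RESTBase) at the `/page/html/{title}` route. A minimal sketch in Python using only the standard library; the function names and User-Agent string are illustrative, not part of any official client:

```python
from urllib.parse import quote
from urllib.request import Request, urlopen

# Base URL of the public Wikimedia REST API for English Wikipedia.
REST_BASE = "https://en.wikipedia.org/api/rest_v1"

def html_url(title: str) -> str:
    """Build the /page/html/{title} URL for a page's rendered HTML."""
    # Spaces become underscores per wiki convention; everything else is
    # percent-encoded so titles like "C++" survive.
    return f"{REST_BASE}/page/html/{quote(title.replace(' ', '_'), safe='')}"

def fetch_html(title: str) -> str:
    """Fetch the current rendered HTML of one page (requires network access)."""
    # The User-Agent is a placeholder; Wikimedia asks API clients to
    # identify themselves with a descriptive one.
    req = Request(html_url(title), headers={"User-Agent": "html-dump-sketch/0.1"})
    with urlopen(req) as resp:
        return resp.read().decode("utf-8")

if __name__ == "__main__":
    # Prints the start of the rendered HTML for one article.
    print(fetch_html("Chile")[:200])
```

For a full-corpus research dump this would need rate limiting and batching across millions of titles, which is exactly why the dedicated HTML dumps request (T182351) exists.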
Hi Fae,
On 03-05-2018 16:18, Fæ wrote:
> On 3 May 2018 at 19:54, Aidan Hogan wrote:
> > Hi all,
> > I am wondering what is the fastest/best way to get a local dump of English
> > Wikipedia in HTML? We are looking just for the current versions (no edit
> > history) of articles for the purposes of a research project.
Good luck / בהצלחה!
On Thu, May 3, 2018 at 7:39 PM, Amir E. Aharoni <
amir.ahar...@mail.huji.ac.il> wrote:
> Welcome / ברוכה הבאה!
>
> On Thu, 3 May 2018 at 19:27, Hagar Shilo <
> hagarshi...@mail.tau.ac.il> wrote:
>
> > Hi All,
> >
> > My name is Hagar Shilo. I'm a web developer and a student at Tel Aviv
> > University, Israel.
On 3 May 2018 at 19:54, Aidan Hogan wrote:
> Hi all,
>
> I am wondering what is the fastest/best way to get a local dump of English
> Wikipedia in HTML? We are looking just for the current versions (no edit
> history) of articles for the purposes of a research project.
>
> We have been exploring using bliki [1] to do the conversion of the
> source markup
Hello everyone,
I would like to share my deepest gratitude for everyone who responded to
the Wikimedia Communities and Contributors Survey. The survey has already
closed for this year. The quality of the results has improved because more
people responded. We heard from over 200 people who work in
Hi all,
I am wondering what is the fastest/best way to get a local dump of
English Wikipedia in HTML? We are looking just for the current versions
(no edit history) of articles for the purposes of a research project.
We have been exploring using bliki [1] to do the conversion of the
source markup
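One alternative to converting the source markup locally (with bliki or similar) is to let MediaWiki itself do the rendering: the Action API's `action=parse` returns server-rendered HTML for a page. A minimal sketch in Python with only the standard library; the endpoint and parameters are the standard public Action API, but the function names and User-Agent string are illustrative:

```python
import json
from urllib.parse import urlencode
from urllib.request import Request, urlopen

# Public Action API endpoint for English Wikipedia.
API = "https://en.wikipedia.org/w/api.php"

def parse_params(title: str) -> dict:
    """Query parameters asking action=parse for a page's rendered HTML."""
    return {
        "action": "parse",
        "page": title,
        "prop": "text",        # only the rendered HTML, no link tables etc.
        "formatversion": "2",  # v2 returns parse.text as a plain string
        "format": "json",
    }

def render_page(title: str) -> str:
    """Have the server render one page and return its HTML (requires network)."""
    req = Request(API + "?" + urlencode(parse_params(title)),
                  headers={"User-Agent": "html-dump-sketch/0.1"})
    with urlopen(req) as resp:
        data = json.load(resp)
    return data["parse"]["text"]
```

The trade-off is the same as with any API approach: rendering is faithful to what readers see, but downloading all of English Wikipedia this way is slow and should respect the API's rate-limit etiquette.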
Welcome / ברוכה הבאה!
On Thu, 3 May 2018 at 19:27, Hagar Shilo <
hagarshi...@mail.tau.ac.il> wrote:
> Hi All,
>
> My name is Hagar Shilo. I'm a web developer and a student at Tel Aviv
> University, Israel.
>
> This summer I will be working on a user search menu and user filters for
> Wikipedia's "Recent changes" section.
Hi All,
My name is Hagar Shilo. I'm a web developer and a student at Tel Aviv
University, Israel.
This summer I will be working on a user search menu and user filters for
Wikipedia's "Recent changes" section. Here is the workplan:
https://phabricator.wikimedia.org/T190714
My mentors are Moriel a
Hello all,
can somebody with +2 on Scribunto please review and merge
https://gerrit.wikimedia.org/r/403603 for
https://phabricator.wikimedia.org/T184512?
Thanks!
Martin
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/
Hi,
I'd like to announce the immediate availability of MediaWiki 1.31.0-rc.0,
the first release candidate for 1.31.x. Links at the end of the e-mail. The
tag has been signed and pushed to Git.
This is not a final release and should not be used for production websites.
There are several major outstanding