Re: [Wikitech-l] Technical Writer - Contract - 3 Months (+)

2013-11-14 Thread Matthew Flaschen

On 11/13/2013 01:22 PM, Quim Gil wrote:

Wikimedia's Engineering Community team is responsible for developing
clear documentation for MediaWiki. All of our documentation is written
collaboratively in wiki pages, involving all kinds of profiles, from WMF
professional developers to anonymous users. There are some areas of our
documentation that lack content or are outdated, while others have grown
organically and need pruning and polishing. It is difficult to
recruit volunteers for this type of work.


Some of it is also in repositories. Doxygen and JSDuck are used for API 
documentation.  docs/hooks.txt documents hooks, though hooks.txt has an 
awkward and inefficient relationship with the on-wiki hook documentation 
(this was discussed earlier, and legoktm has an in-progress script).


API documentation may not be within the scope of the position, but it 
may be worth mentioning to avoid "all of our documentation" being 
misleading.


Matt Flaschen


Re: [Wikitech-l] jQuery UI may not be loaded by default on your wiki

2013-11-14 Thread Matthew Flaschen

On 10/31/2013 01:14 AM, Matthew Flaschen wrote:

// Load jquery.ui.button so button styles work in wikitext
mw.loader.using( 'jquery.ui.button' );


Commons is now doing this conditionally, only if the CSS class is 
present on the page (or specifically the HTML from wikipage.content). 
See https://commons.wikimedia.org/wiki/MediaWiki:Common.js .
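
The pattern is roughly this (the class name is illustrative; see the
actual Common.js for the real check):

// Only load jquery.ui.button when the page content actually needs it.
// Sketch only -- the class checked for here is a stand-in.
mw.hook( 'wikipage.content' ).add( function ( $content ) {
    if ( $content.find( '.ui-button' ).length ) {
        mw.loader.using( 'jquery.ui.button' );
    }
} );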


There's also some discussion at the English Wikipedia village pump
(https://en.wikipedia.org/w/index.php?title=Wikipedia:Village_pump_%28technical%29#Dark_blue_over_the_word_.22wikicode.22_to_the_point_where_I_can_hardly_see_it).
For when that gets archived, the permalink is
https://en.wikipedia.org/w/index.php?title=Wikipedia:Village_pump_(technical)&oldid=581603372#Dark_blue_over_the_word_.22wikicode.22_to_the_point_where_I_can_hardly_see_it


Matt Flaschen



Re: [Wikitech-l] Re-implementing PDF support

2013-11-14 Thread Strainu
Thanks Brad,

I'm wondering if it wouldn't make sense to have a dedicated bugday at
the end of the sprint?

Strainu

2013/11/13 Brad Jorsch (Anomie) bjor...@wikimedia.org:
 Note these are my own thoughts and not anything representative of the team.

 On Wed, Nov 13, 2013 at 6:55 AM, Strainu strain...@gmail.com wrote:
 b. If the robots should _not_ be credited, how do we detect them?
 Ideally, there should be an automatic way to do so, but according to
 http://www.mediawiki.org/wiki/Bots, it only works for recent changes.
 Less ideally, only users with "bot" at the end should be removed, in
 order to keep users like
 https://ro.wikipedia.org/wiki/Utilizator:Vitalie_Ciubotaru (who is
 not a robot, but has "bot" in the name) in the contributor list.

 Another way to exclude (most) bots would be to skip any user with the
 "bot" user right. Note though that this would still include edits by
 unflagged bots, or by bots that have since been decommissioned and the
 bot flag removed.
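
 For illustration, that check could look roughly like this against the
 web API (the "names" array is a placeholder for the contributor list):

 // Rough sketch: look up contributors' groups and drop flagged bots.
 var api = new mw.Api();
 api.get( {
     action: 'query',
     list: 'users',
     ususers: names.join( '|' ),
     usprop: 'groups'
 } ).done( function ( data ) {
     var credited = $.grep( data.query.users, function ( u ) {
         // Users with no group data (e.g. IPs) stay credited
         return !u.groups || $.inArray( 'bot', u.groups ) === -1;
     } );
     // "credited" excludes currently flagged bots, but not ex-bots.
 } );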

 Personally, though, I do agree that excluding any user with "bot" in
 the name (or even with a name ending in "bot") is a bad idea even if
 just applied to enwiki, and worse when applied to other wikis that may
 have different naming conventions.

 The idea is to decide if and how to credit:
 a. vandals
 b. reverters
 c. contributors who had their valid contributions rephrased or
 replaced in the article.
 d. contributors with valid contributions but invalid names

 The hard part there is detecting these, particularly case (c). And
 even then, the article may still be based on the original work in a
 copyright sense even if no single word of the original edit remains.

 Then there's also the situation where A makes an edit that is
 partially useful and partially bad, B reverts, then C comes along and
 incorporates parts of A's edit.


 --
 Brad Jorsch (Anomie)
 Software Engineer
 Wikimedia Foundation


Re: [Wikitech-l] Re-implementing PDF support

2013-11-14 Thread C. Scott Ananian
Let's see what sorts of bugs crop up.  In my (limited) experience, the most
common issues are probably article content that renders poorly as a PDF
for some reason.  Those bugs aren't easy to fix in a bug day sprint, since
they tend to crop up slowly over time as people use the service and collect
lists of suboptimal pages.  (And some of these issues might be eventually
traced to Parsoid, and we know from experience that fixing those ends up
being a gradual collaboration between authors and developers to determine
whether the wikitext should be rewritten or the parser extended, etc.)

On the other hand, if our servers are crashing or the UI code is buggy,
etc, then a bug day would probably be useful to squash those sorts of
things.
 --scott


On Thu, Nov 14, 2013 at 9:49 AM, Strainu strain...@gmail.com wrote:

 Thanks Brad,

 I'm wondering if it wouldn't make sense to have a dedicated bugday at
 the end of the sprint?

 Strainu

 2013/11/13 Brad Jorsch (Anomie) bjor...@wikimedia.org:
  Note these are my own thoughts and not anything representative of the
 team.
 
  On Wed, Nov 13, 2013 at 6:55 AM, Strainu strain...@gmail.com wrote:
  b. If the robots should _not_ be credited, how do we detect them?
  Ideally, there should be an automatic way to do so, but according to
  http://www.mediawiki.org/wiki/Bots, it only works for recent changes.
  Less ideally, only users with "bot" at the end should be removed, in
  order to keep users like
  https://ro.wikipedia.org/wiki/Utilizator:Vitalie_Ciubotaru (who is
  not a robot, but has "bot" in the name) in the contributor list.
 
  Another way to exclude (most) bots would be to skip any user with the
  "bot" user right. Note though that this would still include edits by
  unflagged bots, or by bots that have since been decommissioned and the
  bot flag removed.
 
  Personally, though, I do agree that excluding any user with "bot" in
  the name (or even with a name ending in "bot") is a bad idea even if
  just applied to enwiki, and worse when applied to other wikis that may
  have different naming conventions.
 
  The idea is to decide if and how to credit:
  a. vandals
  b. reverters
  c. contributors who had their valid contributions rephrased or
  replaced in the article.
  d. contributors with valid contributions but invalid names
 
  The hard part there is detecting these, particularly case (c). And
  even then, the article may still be based on the original work in a
  copyright sense even if no single word of the original edit remains.
 
  Then there's also the situation where A makes an edit that is
  partially useful and partially bad, B reverts, then C comes along and
  incorporates parts of A's edit.
 
 
  --
  Brad Jorsch (Anomie)
  Software Engineer
  Wikimedia Foundation
 




-- 
(http://cscott.net)

Re: [Wikitech-l] Architectural leadership in Wikimedia's technical community

2013-11-14 Thread C. Scott Ananian
On Mon, Nov 11, 2013 at 1:51 AM, Tim Starling tstarl...@wikimedia.org wrote:

 On 08/11/13 03:40, C. Scott Ananian wrote:
  Basically, every major piece of WP should have a module owner.

[...]

  Certain people 'own' larger collections of modules -- like there are
  subsystem owners in the linux kernel dev world.  For example, ideally
  there should be someone who owns the WMF-deployed MediaWiki who can
  weigh in on changes which affect the configuration and collection of
  modules which actually constitute Wikipedia.  And then there are the
  "big three" (architects) who are really just the top-level module
  owners (the Linus Torvalds, if you will).

 My concern with this kind of maintainer model is that RFC review would
 tend to be narrower -- a consensus of members of a single WMF team
 rather than a consensus of all relevant experts.


To clarify, I wasn't actually suggesting that RFCs would be reviewed by the
minimal set of module owners.  The opposite, actually: I think that
explicitly creating a hierarchy of ownership would allow the review process
to efficiently progress from narrow to broad focus, ensuring that neither
end gets short shrift.  The line down the tree from "big three" to "person
who last touched a particular file" is a way to capture everyone who is
relevant to an RFC, making sure that neither the forest nor the trees are
neglected.  The main thrust is to flesh out the middle levels of the
hierarchy, lieutenants who have a moderately broad focus and who are
trusted to offload some of the work from the top 3 architects.  The top
three would still weigh in, but hopefully they can concentrate on the
broadest scale issues and wouldn't have to do as much of the heavy lifting.

I'm not proposing that RFC review should be done solely by the narrow-focus
people who happened to last touch the affected files, obviously.

That said, this is a relatively minor point; it seems we've reached good
consensus regarding the bigger picture decoupling of architectural
responsibilities and job titles.  Quibbling over the number and scope of
'architects' can be deferred (especially since the big 3 don't seem to be loudly
complaining of overwork at present).
  --scott

-- 
(http://cscott.net)

Re: [Wikitech-l] Operations buy in on Architecture of mwlib Replacement

2013-11-14 Thread C. Scott Ananian
And I'll add that there's another axis: gwicke (and others?) have been
arguing for a broader collection of services architecture for mw.  This
would decouple some of the installability issues.  Even if PDF rendering
(say) was a huge monster, Jimmy MediaWiki might still be able to simply
install the core of the system.  Slow progress making PDF rendering more
friendly wouldn't need to hamper all the Jane MediaWikis who don't need
that feature.

These issues cross-couple.  Making a really super-easy giant VM blob that
contained an entire complicated MediaWiki setup with all bells and whistles
might as a side-effect make it less pressing to decouple the services and
simplify the installation -- so long as the giant blob works, no one needs
to know what darkness lies beneath the hood.  (Is that a good or a bad
thing?)  Conversely, making 'apt-get install mediawiki mediawiki-pdf' Just
Work would make it less relevant whether 'mediawiki-pdf' was a separate
service or a tightly-coupled mediawiki extension.

In practice, what is needed most are people to actually work on making the
process friendly, one way or another.  (I've done my part by aggressively
patching extension READMEs as I come across them to keep them up to date
and accurate.)
 --scott

(http://cscott.net)

Re: [Wikitech-l] Architectural leadership in Wikimedia's technical community

2013-11-14 Thread Tyler Romeo
On Mon, Nov 11, 2013 at 1:51 AM, Tim Starling tstarl...@wikimedia.org wrote:

 My concern with this kind of maintainer model is that RFC review would
 tend to be narrower -- a consensus of members of a single WMF team
 rather than a consensus of all relevant experts.


I'd also like to point out that we can still implement a maintainer system
without changing the RFC process. There's no requirement that RFC reviewers
and maintainers be the same people. This thread is mainly a discussion
about "architects" (or the concept thereof), and how we might want to
change the MediaWiki code review structure.

--
Tyler Romeo
Stevens Institute of Technology, Class of 2016
Major in Computer Science

Re: [Wikitech-l] Bugzilla Weekly Report

2013-11-14 Thread Quim Gil
On 11/13/2013 08:27 AM, Željko Filipin wrote:
 On Fri, Oct 25, 2013 at 8:07 PM, Quim Gil q...@wikimedia.org wrote:
 
 See "Top closers" and "Top openers" of all time, last year and last month
 at http://korma.wmflabs.org/browser/top.html
 PS: yes, this information should appear at
 http://korma.wmflabs.org/browser/its.html - I will file a report.

 
 Did you create the bug? I could not find it in Bugzilla.

Sorry, I created it now:

Bug 57060 - Top Bugzilla contributors should be listed in the Issues page
https://bugzilla.wikimedia.org/show_bug.cgi?id=57060

Thank you for the reminder.

-- 
Quim Gil
Technical Contributor Coordinator @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil


Re: [Wikitech-l] The interwiki table as a whitelist of non-spammy sites

2013-11-14 Thread Quim Gil
On 11/13/2013 08:49 AM, Mark A. Hershberger wrote:
 On 11/13/2013 05:44 AM, Nathan Larson wrote:
 TL;DR: How can we collaboratively put together a list of non-spammy sites
 that wikis may want to add to their interwiki tables for whitelisting
 purposes; and how can we arrange for the list to be efficiently distributed
 and imported?
 
 I like the idea.  Unless I'm mistaken, it seems like most of this idea
 could be implemented and improved on as an extension.
 
 Alternatively, we could work with WikiApiary to tag spammy wikis that
 his bot finds.
 
 Also, since we're talking about spam and MediaWiki, another good site to
 check out would be http://spamwiki.org/mediawiki/.

With my third-party wiki admin hat on, I personally agree with all this.

Another possibility further down in the roadmap:

Imagine a world in which you could transclude into your wiki content from
a subset of this interwiki table of wikis, based on license
compatibility and whatever other filters. To avoid performance problems,
the check against the sources could be done periodically by a cron job, etc.

The most interesting part of this proposal is to start an interwiki
table of friendly and compatible wikis willing to ease the task of
linking and sharing content among them. The attributes of the table and
a decentralized system to curate and maintain the data could open many
possibilities of collaboration between MediaWiki sites.

-- 
Quim Gil
Technical Contributor Coordinator @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil


Re: [Wikitech-l] Facebook Open Academy

2013-11-14 Thread Quim Gil
On 11/13/2013 04:21 PM, Tyler Romeo wrote:
 MediaWiki participates in a number of student competitions and programs as
 an open source mentor (such as GSoC, Code-In, etc.). Today I ran into
 another one: Facebook's Open Academy Program.
 
 https://www.facebook.com/OpenAcademyProgram
 
 I'm not sure how we would get involved in this program, but I'm sure people
 would agree it might be a good thing to become a mentor organization and
 have students contribute to MediaWiki as part of a college credit program.
 
 Any thoughts?

Interesting. Thank you!

After reading
https://www.facebook.com/notes/open-academy/welcome-to-the-open-academy-program/216150918554109
and liking the second comment (in the Facebook sense of the word), I
sent an email to Jamie Lockwood CCing Tyler requesting more information:


Hello Jamie,

I'm the coordinator of mentorship programs at Wikimedia, including
Google Summer of Code and FOSS Outreach Program for Women. I just
learned about Open Academy Program thanks to Tyler, one of our veteran
community contributors, and a mentor as well.

I just read
https://www.facebook.com/notes/open-academy/welcome-to-the-open-academy-program/216150918554109
and I wonder if you think it makes sense for Wikimedia to get involved.

As you know, participating properly in mentorship programs requires
a lot of energy. We are about to start our first Google
Code-in, at the same time as our third FOSS OPW. Still, we want to be
open to any proposals and initiatives, and get involved as much as our
capacity permits.

Thank you very much for promoting free software to new contributors!

Best regards,

-- 
Quim Gil
Technical Contributor Coordinator @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil


Re: [Wikitech-l] The interwiki table as a whitelist of non-spammy sites

2013-11-14 Thread Nathan Larson
On Thu, Nov 14, 2013 at 12:18 PM, Quim Gil q...@wikimedia.org wrote:

 With my third-party wiki admin hat on, I personally agree with all this.

 Another possibility further down in the roadmap:

 Imagine a world in which you could transclude into your wiki content from
 a subset of this interwiki table of wikis, based on license
 compatibility and whatever other filters. To avoid performance problems,
 the check against the sources could be done periodically by a cron job, etc.

 The most interesting part of this proposal is to start an interwiki
 table of friendly and compatible wikis willing to ease the task of
 linking and sharing content among them. The attributes of the table and
 a decentralized system to curate and maintain the data could open many
 possibilities of collaboration between MediaWiki sites.


Is there reason to think that a decentralized system would be likely to
evolve, or that it would be optimal? It seems to me that most stuff in the
wikisphere is centered around WMF; e.g. people usually borrow templates,
the spam blacklist, MediaWiki extensions, and so on, from WMF sites. Most
wikis that attempted to duplicate what WMF does have failed to catch on;
e.g. no encyclopedia that tried to copy Wikipedia's approach (e.g.
allegedly neutral point of view and a serious, rather than humorous, style
of writing) came close to Wikipedia's size and popularity, and no wiki
software caught on as much as MediaWiki. It's just usually more efficient
to have a centralized repository and widely-applied standards so that
people aren't duplicating their labor too much.

But if one were to pursue centralization of interwiki data, what would be
the central repository? Would WMF be likely to be interested? Hardly
anything at https://meta.wikimedia.org/wiki/Proposals_for_new_projects has
been approved for creation, so I'm not sure how one would go about getting
something like this established through WMF.

Some advantages of WMF are that we can be pretty confident its projects
will be around for a while, and none of them are clogged up with the kind of
advertising we see at, say, Wikia. Non-WMF wikis come and go all the time;
one never knows when the owner will get hit by a bus, lose interest, etc.
and then the users are left high and dry. That could be a problem if the
wiki in question is a central repository that thousands of wikis have come
to rely upon.

Perhaps the MediaWiki Foundation could spearhead this? Aside from its
nonexistence, I think that organization could be a pretty good venue for
getting this done. I'll have to bring this up with some of my imaginary
friends who sit on the MWF board of trustees.

Re: [Wikitech-l] Facebook Open Academy

2013-11-14 Thread Mark Holmquist
On Wed, Nov 13, 2013 at 07:21:01PM -0500, Tyler Romeo wrote:
 MediaWiki participates in a number of student competitions and programs as
 an open source mentor (such as GSoC, Code-In, etc.). Today I ran into
 another one: Facebook's Open Academy Program.
 
 https://www.facebook.com/OpenAcademyProgram

Where is this actually being organized? The only thing I see is a page
and a half at that URL, a blog post from Facebook engineering, and one
email address.

The lack of buzz around other official fora makes me worry that the
organizing software will be Facebook itself.

-- 
Mark Holmquist
Software Engineer, Multimedia
Wikimedia Foundation
mtrac...@member.fsf.org
https://wikimediafoundation.org/wiki/User:MHolmquist



Re: [Wikitech-l] The interwiki table as a whitelist of non-spammy sites

2013-11-14 Thread Quim Gil
On 11/14/2013 09:53 AM, Nathan Larson wrote:
 Is there reason to think that a decentralized system would be likely to
 evolve, or that it would be optimal? It seems to me that most stuff in the
 wikisphere is centered around WMF; e.g. people usually borrow templates,
 the spam blacklist, MediaWiki extensions, and so on, from WMF sites. Most
 wikis that attempted to duplicate what WMF does have failed to catch on;

As mentioned by Mark and quoted in my email, http://wikiapiary.com/
could be a good starting point.

Just improvising a hypothetical starting point for a process to maintain
the decentralized interwiki table:

In order to become a candidate, a wiki must have the extension installed
and a quantifiable score based on age, size, license, and lack of
spam reports.

The extension could perhaps check how much a wiki is linked, by how many
wikis and of which characteristics, calculating a popularity index of
sorts. Maybe you could even have a classification of topics, filtering the
language and type of content that matters to your wiki. By default, only
wikis above some popularity index would be included in your local
interwiki table. The admins could fine-tune locally.
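
Purely to make this concrete, a toy score could look like the following
(every field name and weight is invented for illustration):

// Toy scoring sketch for the hypothetical candidate check.
function interwikiScore( wiki ) {
    var score = 0;
    score += Math.min( wiki.ageInYears, 5 );             // maturity, capped
    score += Math.log( 1 + wiki.articleCount );          // size, dampened
    score += wiki.licenseCompatible ? 2 : 0;             // license filter
    score += Math.log( 1 + wiki.inboundInterwikiLinks ); // "popularity"
    score -= 10 * wiki.spamReports;                      // spam weighs heavily
    return score;
}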

The master interwiki table could be hosted on WikiApiary or wherever. It
would be mirrored in some way by the wikis with the extension installed
willing to do so.

The maintenance of the table itself doesn't even look like a big deal,
compared to developing the extension and adding new interwiki features.
It would be based on the userbase of wikis installing the extension.

Whether Wikimedia projects join the interwiki party or not would
depend on the extension being ready for Wikimedia adoption and a
decision to deploy it. But that would be a Wikimedia discussion, not an
Interwiki project discussion.

As said, all of the above is improvised and hypothetical. Sorry in
advance for any planning flaws.  :)

-- 
Quim Gil
Technical Contributor Coordinator @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil


[Wikitech-l] Wikipedia Issue

2013-11-14 Thread Derric Atzrott
Just got this when trying to read an article.  Not sure if anyone is doing
anything, but if you are, you might have broken something.

 

If you report this error to the Wikimedia System Administrators, please include
the details below.

Request: GET http://en.wikipedia.org/wiki/Johnny_the_Homicidal_Maniac, from
10.64.32.106 via cp1067 cp1067 ([10.64.0.104]:3128), Varnish XID 2127871567

Forwarded for: 162.17.205.153, 208.80.154.76, 10.64.32.106

Error: 503, Service Unavailable at Thu, 14 Nov 2013 18:48:33 GMT 

 

Thank you,

Derric Atzrott

Computer Specialist

Alizee Pathology


Re: [Wikitech-l] Wikipedia Issue

2013-11-14 Thread Derric Atzrott
Just got this when trying to read an article.  Not sure if anyone is doing
anything, but if you are, you might have broken something. 

If you report this error to the Wikimedia System Administrators, please include
the details below.

Request: GET http://en.wikipedia.org/wiki/Johnny_the_Homicidal_Maniac, from
10.64.32.106 via cp1067 cp1067 ([10.64.0.104]:3128), Varnish XID 2127871567
Forwarded for: 162.17.205.153, 208.80.154.76, 10.64.32.106
Error: 503, Service Unavailable at Thu, 14 Nov 2013 18:48:33 GMT 

The issue seems to be confined to Wikipedia.  I am not seeing it affect
MediaWiki.org.

Additionally the issue seems to be affecting more than just the English
language Wikipedia.  The Lojban language Wikipedia is also experiencing
difficulties.

Thank you,
Derric Atzrott



Re: [Wikitech-l] Wikipedia Issue

2013-11-14 Thread Nathan Larson
On Thu, Nov 14, 2013 at 2:05 PM, Derric Atzrott 
datzr...@alizeepathology.com wrote:

 The issue seems to be confined to Wikipedia.  I am not seeing it affect
 MediaWiki.org.

 Additionally the issue seems to be affecting more than just the English
 language Wikipedia.  The Lojban language Wikipedia is also experiencing
 difficulties.

 Thank you,
 Derric Atzrott


I'm getting it at MediaWiki.org intermittently too. Just lost an edit, in
fact (my cache didn't save it for some reason; maybe it expired).

Re: [Wikitech-l] Wikipedia Issue

2013-11-14 Thread Nathan
I get an error, and beneath it there is a message that says "Your cache
administrator is: nobody". Looks like there is a bug on it that's 4 or 5
years old ;)


On Thu, Nov 14, 2013 at 2:10 PM, Nathan Larson
nathanlarson3...@gmail.com wrote:

 On Thu, Nov 14, 2013 at 2:05 PM, Derric Atzrott 
 datzr...@alizeepathology.com wrote:

  The issue seems to be confined to Wikipedia.  I am not seeing it affect
  MediaWiki.org.

  Additionally the issue seems to be affecting more than just the English
  language Wikipedia.  The Lojban language Wikipedia is also experiencing
  difficulties.
 
  Thank you,
  Derric Atzrott


 I'm getting it at MediaWiki.org intermittently too. Just lost an edit, in
 fact (my cache didn't save it for some reason; maybe it expired).

Re: [Wikitech-l] Wikipedia Issue

2013-11-14 Thread Andre Klapper
On Thu, 2013-11-14 at 13:50 -0500, Derric Atzrott wrote:
 If you report this error to the Wikimedia System Administrators, please
 include the details below.
 
 Request: GET http://en.wikipedia.org/wiki/Johnny_the_Homicidal_Maniac, from
 10.64.32.106 via cp1067 cp1067 ([10.64.0.104]:3128), Varnish XID 2127871567
 
 Forwarded for: 162.17.205.153, 208.80.154.76, 10.64.32.106

Thanks! The Operations team is currently working on it and there likely
will be an analysis available at
https://wikitech.wikimedia.org/wiki/Incident_documentation afterwards.

andre
-- 
Andre Klapper | Wikimedia Bugwrangler
http://blogs.gnome.org/aklapper/



Re: [Wikitech-l] Wikipedia Issue

2013-11-14 Thread Jeremy Baron
On Thu, Nov 14, 2013 at 7:18 PM, Andre Klapper aklap...@wikimedia.org wrote:
 Thanks! The Operations team is currently working on it and there likely
 will be an analysis available at
 https://wikitech.wikimedia.org/wiki/Incident_documentation afterwards.

See also #wikimedia-tech, #wikimedia-operations, (both of those have
logs linked from /topic so you can catch up) icinga and
https://wikitech.wikimedia.org/wiki/SAL

-Jeremy


Re: [Wikitech-l] Operations buy in on Architecture of mwlib Replacement

2013-11-14 Thread Ryan Lane
On Thu, Nov 14, 2013 at 8:13 AM, C. Scott Ananian canan...@wikimedia.org wrote:

 And I'll add that there's another axis: gwicke (and others?) have been
 arguing for a broader collection of services architecture for mw.  This
 would decouple some of the installability issues.  Even if PDF rendering
 (say) was a huge monster, Jimmy MediaWiki might still be able to simply
 install the core of the system.  Slow progress making PDF rendering more
 friendly wouldn't need to hamper all the Jane MediaWikis who don't need
 that feature.


Definitely "and others".  Apart from decoupling installability issues it also
breaks the application into separately maintainable applications that can
have teams of people working on them separately. The only thing needed to
ensure compatibility with other teams is a stable API, and that's what API
versioning is for. Having multiple services doesn't complicate things much,
unless you're running on a shared host.

- Ryan

Re: [Wikitech-l] Wikipedia Issue

2013-11-14 Thread Erik Moeller
We were dealing with cascading site issues due to excessive database
queries, and are still investigating the root cause, but the site should
be recovered by now.

-- 
Erik Möller
VP of Engineering and Product Development, Wikimedia Foundation


[Wikitech-l] Architecture RFC review meetings

2013-11-14 Thread Quim Gil
On 11/06/2013 03:18 PM, Quim Gil wrote:
 The next RFC review meeting is planned for November 20.

Wednesday, November 20, 2013 at 10:00 PM UTC at #wikimedia-meetbot
https://www.mediawiki.org/wiki/Architecture_meetings/RFC_review_2013-11-20

There are no RFCs scheduled yet. Seeing how the previous iterations
went, perhaps the schedule could be defined by these priorities:

# RFCs previously discussed with actions completed.
# RFCs previously scheduled that were left out because of lack of time.
# New RFCs pushed by their promoters.
# RFCs called by the architects for a resolution.

Up to you. I will be there operating MeetBot.  :)

PS: let's reuse this thread for future meetings' announcements and notes.

-- 
Quim Gil
Technical Contributor Coordinator @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil


[Wikitech-l] MediaWiki Security Release: 1.21.3, 1.20.8 and 1.19.9

2013-11-14 Thread Chris Steipp
I would like to announce the release of MediaWiki 1.21.3, 1.20.8 and
1.19.9. These releases fix two security-related bugs that could affect users
of MediaWiki. Download links are given at the end of this email.

* Kevin Israel (Wikipedia user PleaseStand) identified and reported two
vectors for injecting JavaScript into CSS that bypassed MediaWiki's blacklist
(CVE-2013-4567, CVE-2013-4568).
https://bugzilla.wikimedia.org/show_bug.cgi?id=55332

* Internal review while debugging a site issue discovered that MediaWiki
and the CentralNotice extension were incorrectly setting cache headers when
a user was autocreated, causing the user's session cookies to be cached,
and returned to other users (CVE-2013-4572).
https://bugzilla.wikimedia.org/show_bug.cgi?id=53032


Additionally, the following extensions have been updated to fix security
issues:

* CleanChanges: MediaWiki steward Teles reported that revision-deleted IPs
are not correctly hidden when this extension is used (CVE-2013-4569).
https://bugzilla.wikimedia.org/show_bug.cgi?id=54294

* ZeroRatedMobileAccess: Tomasz Chlebowski reported an XSS vulnerability
(CVE-2013-4573).
https://bugzilla.wikimedia.org/show_bug.cgi?id=55991

* CentralAuth: MediaWiki developer Platonides reported a login CSRF in
CentralAuth (CVE-2012-5394).
https://bugzilla.wikimedia.org/show_bug.cgi?id=40747


Full release notes for 1.21.3:
https://www.mediawiki.org/wiki/Release_notes/1.21

Full release notes for 1.20.8:
https://www.mediawiki.org/wiki/Release_notes/1.20

Full release notes for 1.19.9:
https://www.mediawiki.org/wiki/Release_notes/1.19

For information about how to upgrade, see
https://www.mediawiki.org/wiki/Manual:Upgrading


**
   1.21.3
**
Download:
http://download.wikimedia.org/mediawiki/1.21/mediawiki-1.21.3.tar.gz

Patch to previous version (1.21.2), without interface text:
http://download.wikimedia.org/mediawiki/1.21/mediawiki-1.21.3.patch.gz
Interface text changes:
http://download.wikimedia.org/mediawiki/1.21/mediawiki-i18n-1.21.3.patch.gz

GPG signatures:
http://download.wikimedia.org/mediawiki/1.21/mediawiki-core-1.21.3.tar.gz.sig
http://download.wikimedia.org/mediawiki/1.21/mediawiki-1.21.3.tar.gz.sig
http://download.wikimedia.org/mediawiki/1.21/mediawiki-1.21.3.patch.gz.sig
http://download.wikimedia.org/mediawiki/1.21/mediawiki-i18n-1.21.3.patch.gz.sig

Public keys:
https://www.mediawiki.org/keys/keys.html


**
   1.20.8
**
Download:
http://download.wikimedia.org/mediawiki/1.20/mediawiki-1.20.8.tar.gz

Patch to previous version (1.20.7), without interface text:
http://download.wikimedia.org/mediawiki/1.20/mediawiki-1.20.8.patch.gz
Interface text changes:
http://download.wikimedia.org/mediawiki/1.20/mediawiki-i18n-1.20.8.patch.gz

GPG signatures:
http://download.wikimedia.org/mediawiki/1.20/mediawiki-core-1.20.8.tar.gz.sig
http://download.wikimedia.org/mediawiki/1.20/mediawiki-1.20.8.tar.gz.sig
http://download.wikimedia.org/mediawiki/1.20/mediawiki-1.20.8.patch.gz.sig
http://download.wikimedia.org/mediawiki/1.20/mediawiki-i18n-1.20.8.patch.gz.sig

Public keys:
https://www.mediawiki.org/keys/keys.html


**
   1.19.9
**
Download:
http://download.wikimedia.org/mediawiki/1.19/mediawiki-1.19.9.tar.gz

Patch to previous version (1.19.8), without interface text:
http://download.wikimedia.org/mediawiki/1.19/mediawiki-1.19.9.patch.gz
Interface text changes:
http://download.wikimedia.org/mediawiki/1.19/mediawiki-i18n-1.19.9.patch.gz

GPG signatures:
http://download.wikimedia.org/mediawiki/1.19/mediawiki-core-1.19.9.tar.gz.sig
http://download.wikimedia.org/mediawiki/1.19/mediawiki-1.19.9.tar.gz.sig
http://download.wikimedia.org/mediawiki/1.19/mediawiki-1.19.9.patch.gz.sig
http://download.wikimedia.org/mediawiki/1.19/mediawiki-i18n-1.19.9.patch.gz.sig

Public keys:
https://www.mediawiki.org/keys/keys.html

**
   Extension:CentralAuth
**
Information and Download:
https://www.mediawiki.org/wiki/Extension:CentralAuth

**
   Extension:CentralNotice
**
Information and Download:
https://www.mediawiki.org/wiki/Extension:CentralNotice

**
   Extension:CleanChanges
**
Information and Download:
https://www.mediawiki.org/wiki/Extension:CleanChanges


[Wikitech-l] Comet

2013-11-14 Thread Lee Worden
In the MW extension development I'm doing, I'm thinking of writing some 
operations that use [[Comet_(programming)]] to deliver continuous 
updates to the client, rather than the Ajax pattern of one request, one 
response.


Has anyone messed with this?  Any code I should crib from, or advice or 
cautionary tales?  Also, if it develops into something useful, I could 
split it out for others to use.
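
For concreteness, the long-polling flavor of Comet I have in mind looks
roughly like this on the client (the API action name is a made-up
placeholder):

// Long-poll: the server holds each request open until it has an
// update; the client handles it and immediately reconnects.
function poll() {
    $.get( mw.util.wikiScript( 'api' ), {
        action: 'myext-updates',
        format: 'json'
    } ).done( function ( data ) {
        // ...handle the update..., then reconnect right away
        poll();
    } ).fail( function () {
        setTimeout( poll, 5000 ); // back off on errors
    } );
}
poll();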


Thanks,
LW


Re: [Wikitech-l] Comet

2013-11-14 Thread Aran
Wouldn't WebSocket be the better choice for a full-duplex channel?

On 14/11/13 21:12, Lee Worden wrote:
 In the MW extension development I'm doing, I'm thinking of writing
 some operations that use [[Comet_(programming)]] to deliver continuous
 updates to the client, rather than the Ajax pattern of one request,
 one response.

 Has anyone messed with this?  Any code I should crib from, or advice
 or cautionary tales?  Also, if it develops into something useful, I
 could split it out for others to use.

 Thanks,
 LW


Re: [Wikitech-l] Comet

2013-11-14 Thread Tyler Romeo
On Thu, Nov 14, 2013 at 6:12 PM, Lee Worden worden@gmail.com wrote:

 Has anyone messed with this?  Any code I should crib from, or advice or
 cautionary tales?  Also, if it develops into something useful, I could
 split it out for others to use.


I have not messed with it personally, but I think it is a good idea. You
should also know that the HTML5 standard has standardized the Comet model
into server-sent events (SSE). [1] Mozilla also provides a nice tutorial on
how to use it. [2] However, one big catch is that this is not currently
implemented in Internet Explorer or mobile browsers. [3] So you'd have to
have your own custom pure-JavaScript implementation for IE support.
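
Client-side usage is tiny; a minimal sketch (the endpoint URL is
hypothetical):

// The browser reconnects automatically after transient errors.
var source = new EventSource( '/w/updates.php' );
source.onmessage = function ( event ) {
    console.log( 'update:', event.data );
};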

WebSocket, as another mentioned, is also an approach you could use.
However, WebSockets are meant for full duplex communication, meaning the
client is also talking back to the server, which may or may not be what you
want. Also using WebSockets means the internals of what is sent over the
socket and what it means is left to you to design, rather than being
standardized. Not to mention the fact that you have to implement WebSockets
in PHP or find a reliable library that will do it for you. And even then,
WebSockets are only supported in IE 10 and later, so you're still a bit
screwed in terms of backwards compatibility.

[1] http://www.w3.org/TR/eventsource/
[2]
https://developer.mozilla.org/en-US/docs/Server-sent_events/Using_server-sent_events
[3] http://caniuse.com/#feat=eventsource

--
Tyler Romeo
Stevens Institute of Technology, Class of 2016
Major in Computer Science

Re: [Wikitech-l] Comet

2013-11-14 Thread Daniel Friesen
On 2013-11-14 5:22 PM, Tyler Romeo wrote:
 On Thu, Nov 14, 2013 at 6:12 PM, Lee Worden worden@gmail.com wrote:

 Has anyone messed with this?  Any code I should crib from, or advice or
 cautionary tales?  Also, if it develops into something useful, I could
 split it out for others to use.

 I have not messed with it personally, but I think it is a good idea. You
 should also know that the HTML5 standard has standardized the Comet model
 into server-sent events (SSE). [1] Mozilla also provides a nice tutorial on
 how to use it. [2] However, one big catch is that this is not currently
 implemented in Internet Explorer or mobile browsers. [3] So you'd have to
 have your own custom pure-JavaScript implementation for IE support.

 WebSocket, as another mentioned, is also an approach you could use.
 However, WebSockets are meant for full duplex communication, meaning the
 client is also talking back to the server, which may or may not be what you
 want. Also using WebSockets means the internals of what is sent over the
 socket and what it means is left to you to design, rather than being
 standardized. Not to mention the fact that you have to implement WebSockets
 in PHP or find a reliable library that will do it for you. And even then,
 WebSockets are only supported in IE 10 and later, so you're still a bit
 screwed in terms of backwards compatibility.

 [1] http://www.w3.org/TR/eventsource/
 [2]
 https://developer.mozilla.org/en-US/docs/Server-sent_events/Using_server-sent_events
 [3] http://caniuse.com/#feat=eventsource
Rather than natively using WebSockets you could use a higher-level library.
There are two of these in existence, Socket.IO[1] and SockJS[2].
Socket.IO is the popular implementation.
However when I looked into it I found SockJS to be the superior
implementation IMHO.

These libraries use WebSockets when available and fall back to a variety
of other methods when WebSockets aren't available.
So they support practically every browser.
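
The client side of either library is small; for instance, a minimal
SockJS sketch (endpoint hypothetical; Socket.IO usage looks similar):

var sock = new SockJS( 'http://mywiki.example/updates' );
sock.onopen = function () { sock.send( 'subscribe' ); };
sock.onmessage = function ( e ) { console.log( 'message', e.data ); };
sock.onclose = function () { console.log( 'connection closed' ); };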

However you're probably going to have to give up on PHP.

Neither Socket.IO nor SockJS have a PHP server implementation.
The only thing that shows up for Socket.IO is Elephant.IO[3], which is a
Socket.IO *client*.

If you go down to raw WebSockets there is Ratchet[4]. However that
brings up a number of issues that add up to losing every single
advantage you could possibly have from running it in PHP.
Ratchet needs to spawn its own server. This means it needs a dedicated
port, domain, or a reverse proxy in front of it.
It also means that even though it's written in PHP you can no longer run
it on any form of shared hosting.
PHP is also not designed to run long-running daemons. It can do it, but
you will have to watch this very carefully and be prepared to fix any
bugs that show up.
You'd expect that since it's in PHP you at least have the one remaining
advantage that you can directly interact with MediaWiki.
However all of MediaWiki's code is synchronous. This means that if you
directly use MW, every time it needs to do something with the database,
memcached, filesystem, other storage, or a slow MW code path, your
WebSocket server will seize up for a moment for ALL clients connected to
it. If this happens to be a parser cache miss waiting for a parser run
that takes 15s to complete, your real-time application will be
completely unresponsive for those 15s.
The only way to avoid that would be to run the MW code in processes
separate from the WebSocket server.
But at that point there's absolutely no remaining advantage to the
server being in PHP instead of Node.JS, Python, Ruby, etc...

The situation is different but similarly hopeless[citation needed] for
EventSource / Server-sent events.
SSE's protocol is simple and can be implemented simply in PHP.
However in PHP every SSE connection will have its own open connection
in the webserver and its own dedicated PHP process.
This means that as people connect to your server you will quickly reach
the webserver's limit of maximum open connections (maybe even its RAM,
with all that overhead).
You cannot avoid that fatal flaw without doing a bunch of strange and
hacky things that end up with the same faults as using Ratchet.

-- summary --
Basically in the end your best bet is probably to just use something
other than PHP for this.
The native implementations for both Socket.IO's and SockJS' servers are
in Node.JS.
So that's probably the best bet for doing this.
You can communicate with something in PHP running MW stuff from the
Node.JS server using another protocol, such as STOMP, ZeroMQ, Thrift,
D-Bus, etc., or even simply HTTP calls to the webserver/API or execution
of PHP processes from the Node.JS server.
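
As a rough sketch of that split (module choice, URLs and the polling
interval are all illustrative), a tiny Node.JS process could relay
MediaWiki API data to browsers over SSE:

var http = require( 'http' );

http.createServer( function ( req, res ) {
    res.writeHead( 200, {
        'Content-Type': 'text/event-stream',
        'Cache-Control': 'no-cache'
    } );
    // Poll the MW API over plain HTTP and relay results as SSE events.
    var timer = setInterval( function () {
        http.get( 'http://localhost/w/api.php?action=query&list=recentchanges&format=json',
            function ( apiRes ) {
                var body = '';
                apiRes.on( 'data', function ( chunk ) { body += chunk; } );
                apiRes.on( 'end', function () {
                    res.write( 'data: ' + body + '\n\n' );
                } );
            } );
    }, 5000 );
    req.on( 'close', function () { clearInterval( timer ); } );
} ).listen( 8080 );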


[1] http://socket.io/
[2] http://sockjs.org/
[3] http://elephant.io/
[4] http://socketo.me/

~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]


Re: [Wikitech-l] Comet

2013-11-14 Thread Tyler Romeo
On Thu, Nov 14, 2013 at 10:12 PM, Daniel Friesen
dan...@nadir-seen-fire.com wrote:

 Basically in the end your best bet is probably to just use something
 other than PHP for this.
 The native implementations for both Socket.IO's and SockJS' servers are
 in Node.JS.
 So that's probably the best bet for doing this.
 You can communicate with something in PHP running MW stuff from the
 Node.JS server using another protocol.
 Such as STOMP, ZeroMQ, Thrift, D-Bus, etc... or even simply HTTP calls
 to the webserver/API or execution of PHP processes from the Node.JS server.


Agreed on this. PHP isn't meant for continuous processing, regardless of
whether you use SSE or WebSockets. Also it should be noted that SockJS also
has Tornado and Twisted implementations, if you want to use Python. And, of
course, if your extension needs to be scalable, they also have vert.x,
which works with Java.

--
Tyler Romeo
Stevens Institute of Technology, Class of 2016
Major in Computer Science

[Wikitech-l] FOSS OPW Round 7 Featured Project synopsis:Complete the MediaWiki development course at Codeacademy

2013-11-14 Thread Diwanshi Pandey
Hi Everyone,
   As an applicant of FOSS-OPW Round 7, I took a deep interest in one of
the featured projects, "Complete the MediaWiki development course at
Codeacademy". I have chosen it and am planning to focus on and contribute
to this project.

***Project Synopsis***

The project is about completing the MediaWiki development course on
Codeacademy and also enhancing the course.

   - The primary goal of the project is to develop instructional materials
   to teach students to use the Wikimedia API (see the sketch after this
   list). Currently the course shows only rudimentary API usage. What needs
   to be done is significantly more sections, better explanations and
   tests/walkthroughs for students to complete.
   - The secondary goal is that the tutorial, while it does not need
   to cover all options (there are too many of them), should get students
   started with the basic usage scenarios for most of the common tasks API
   users expect.
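
For flavor, the kind of basic call such a tutorial walks students
through might look like this (a hedged example; the page title is
arbitrary):

// Fetch an article's intro extract from the public API via JSONP.
$.getJSON( 'https://en.wikipedia.org/w/api.php?callback=?', {
    action: 'query',
    prop: 'extracts',
    exintro: 1,
    titles: 'MediaWiki',
    format: 'json'
}, function ( data ) {
    $.each( data.query.pages, function ( id, page ) {
        console.log( page.extract );
    } );
} );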

-Mentor, Yuri Astrakhan


-- 

Regards,
Diwanshi Pandey