[Wikitech-l] Search results on English Wikipedia not updating

2014-06-03 Thread ENWP Pine
Hi,

It appears that suggested search results are not being updated on English 
Wikipedia.

The article [[Veterans Health Administration scandal of 2014]] doesn't appear 
in suggested search results when I type "Veterans Health Administration", and 
when I searched for "VA scandal" earlier today, the suggestions showed its 
previous redirect destination instead of its current one, although actually 
clicking on "VA scandal" redirected to the correct page.

Any idea what's happening?

Thanks,

Pine
  

[Wikitech-l] RFC: Composer managed libraries for use on WMF cluster

2014-06-03 Thread Bryan Davis
I have converted my email on using Composer to manage a set of library
dependencies for MediaWiki core [0] into an RFC [1]. Work is
continuing on the implementation of this project, but some
implementation details are still open to debate, and the RFC process is
meant not only to validate ideas but also to leave behind a record of
the design decisions that were made and the trade-offs that were
considered in the process.

In particular, the current draft RFC omits discussion of the concept
of library "ownership" for long-term updates and security fixes, and
could use more detail around the process of forking, patching, and
subsequently maintaining an external library. I will attempt to fill in
some of these details as I see them over the next day or so, but now
would be a great time for people with strong ideas or opinions on
these aspects to comment on the talk page.

[0]: http://www.gossamer-threads.com/lists/wiki/wikitech/467520?page=last
[1]: 
https://www.mediawiki.org/wiki/Requests_for_comment/Composer_managed_libraries_for_use_on_WMF_cluster

Bryan
-- 
Bryan Davis  Wikimedia Foundation
[[m:User:BDavis_(WMF)]]  Sr Software Engineer  Boise, ID USA
irc: bd808  v: 415.839.6885 x6855


Re: [Wikitech-l] Search results on English Wikipedia not updating

2014-06-03 Thread Andre Klapper
On Tue, 2014-06-03 at 00:59 -0700, ENWP Pine wrote:
> It appears that suggested search results are not being updated on
> English Wikipedia.

See https://bugzilla.wikimedia.org/show_bug.cgi?id=66011

andre
-- 
Andre Klapper | Wikimedia Bugwrangler
http://blogs.gnome.org/aklapper/



Re: [Wikitech-l] Accessing current template information on wiki commons description page

2014-06-03 Thread james harvey
Sorry for the email spam.  Worked through it, I think.  Not too familiar
with wiki internals.  :-)

This particular page doesn't have the content I'm looking for in it.  It
references a template which is used by a few other versions of the same
image, presumably so the data can be stored once and served consistently.
Not being familiar with wiki internals, I thought it wasn't returning the
entire page content... But it is, so I'll have to recognize this situation
and pull in referenced templates when the information I need isn't already
there.


On Tue, Jun 3, 2014 at 2:45 AM, james harvey 
wrote:

> I may have stumbled upon it.  If I change the API call from
> "titles=File:XYZ.jpg" to "titles=Template:XYZ" (note: dropped the .jpg)
> then it *appears* to get me what I need.
>
> Is this correct, or did I run across a case where it appears to work but
> isn't going to be the right way to go?  (Like, I'm not sure if
> "Template:XYZ" directly relates to the Summary information on the
> "File:XYZ.jpg" page, or if it's duplicated data that in this case matches.
>  And, I'm confused why the .jpg gets dropped switching "File:" to
> "Template:")
>
> And, will this always get me the full template information? If someone
> just updates the "Year" portion, would it only return back that part --
> since the revisions seem to return data based on changes from the
> previous revision, rather than the full current state?
>
> On Tue, Jun 3, 2014 at 1:59 AM, james harvey 
> wrote:
>
>> Given a Wikimedia Commons description page URL - such as:
>> https://commons.wikimedia.org/wiki/File:Van_Gogh_-_Starry_Night_-_Google_Art_Project.jpg
>>
>> I would like to be able to programmatically retrieve the information in
>> the "Summary" header.  (Values for "Artist", "Title", "Date", "Medium",
>> "Dimensions", "Current location", etc.)
>>
>> I believe all this information is in "Template:Artwork".  I can't figure
>> out how to get the wikitext/json-looking template data.
>>
>> If I use the API and call:
>> https://commons.wikimedia.org/w/api.php?action=query&format=xml&titles=File:Van%20Gogh%20-%20Starry%20Night%20-%20Google%20Art%20Project.jpg&iilimit=max&iiprop=timestamp|user|comment|url|size|mime&prop=imageinfo|revisions&rvgeneratexml=&rvprop=ids|timestamp|user|comment|content
>> 
>>
>> Then I don't get the information I'm looking for.  This shows the most
>> recent revision, and its changes.  Unless the most recent revision changed
>> this data, it doesn't show up.
>>
>> To see all the information I'm looking for, it seems I'd have to specify
>> rvlimit=max and go through all the past revisions to figure out which is
>> most current.  For example, if I do so and I look at revid 79665032, that
>> includes: "{{Artwork | Artist = {{Creator:Vincent van Gogh}} | . . . | Year
>> = 1889 | Technique = {{Oil on canvas}} | . . ."
>>
>> Isn't there a way to get the current version in whatever format you'd
>> call that - the wikitext/JSON-looking format?
>>
>> In my API call, I can specify rvexpandtemplates which even with only the
>> most recent revision gives me the information I need, but it's largely in
>> HTML tables/divs/etc format rather than wikitext/json/xml/etc.
>>
>
>

Re: [Wikitech-l] Unclear Meaning of $baseRevId in WikiPage::doEditContent

2014-06-03 Thread Adam Wight
It looks like we should leave the existing hook parameter values alone for
the moment, but it would improve the situation if we renamed variables
which seem to be overloaded or unclear, in MediaWiki core and in
FlaggedRevs.  What do you think of the following conventions?

'oldid' (index.php parameter) -- keep this name only to preserve interface
compatibility.  This refers to a historical revision when used in the
action=view case, and to the latest revision ID of the page at the time an
edit session begins.

$oldid -- keep as-is in the action=view codepath, rename to $parentRevId in
action=edit

$parentRevId -- latest available revision ID at the time an edit session
begins.  Used to detect conflicts, and identify the parent revision record
upon save.  This is updated during successful automatic rebase.  I don't
see a good use case for preserving what Daniel calls the "reference
revision," the parentRevId before rebase.

$baseRevId and $baseId -- rename everywhere to $contentsRevId, while
examining the code contexts for signs of confounding with $parentRevId.

$contentsRevId -- revision ID of the source text to copy when performing
undo or rollback.  We will probably want to supplement hooks that only
passed $contentsRevId, such as NewRevisionFromEditComplete, with
$parentRevId as an additional parameter.

A refactor along these lines would keep me from losing already scant
marbles as I attempt to fix related issues in core:
https://gerrit.wikimedia.org/r/#/c/94584/ . I see now that I've already
begun to introduce mistakes caused by how hard it is to interpret the
current variable naming by common sense.

-Adam


On Mon, Jun 2, 2014 at 1:22 AM, Daniel Kinzler  wrote:

> On 30.05.2014 at 15:38, Brad Jorsch (Anomie) wrote:
> > I think you need to look again into how FlaggedRevs uses it, without the
> > preconceptions you're bringing in from the way you first interpreted the
> > name of the variable. The current behavior makes perfect sense for that
> > specific use case. Neither of your proposals would work for FlaggedRevs.
>
> As far as I understand the rather complex FlaggedRevs.hooks.php code, it
> assumes
> that
>
> a) if $newRev === $baseRevId, it's a null edit. As far as I can see, this
> does
> not work, since $baseRevId will be null for a null edit (and all other
> regular
> edits).
>
> b) if $newRev !== $baseRevId but the new rev's hash is the same as the base
> rev's hash, it's a rollback. This works with the current implementation of
> commitRollback(), but not for manual reverts or trivial undos.
>
> So, FlaggedRevs assumes that EditPage resp. WikiPage sets $baseRevId to the
> edit's logical parent (basically, the revision the user loaded when starting
> to edit). That's what I described as option (3) in my earlier mail, except
> for the rollback case; it would be fine with me to use the target rev as the
> base for rollbacks, as is currently done.
>
> FlaggedRevs.hooks.php also injects a baseRevId form field and uses it in
> some
> cases, adding to the confusion.
>
> In order to handle manual reverts and null edits consistently, EditPage
> should
> probably have a base revision as a form field, and pass it on to
> doEditContent.
> As far as I can tell, this would work with the current code in FlaggedRevs.
>
> > As for the EditPage code path, note that it has already done edit
> conflict
> > resolution so "base revision = current revision of the page". Which is
> > probably the intended meaning of false.
>
> Right. If that's the case though, WikiPage::doEditContent should probably
> set
> $baseRevId = $oldid, before passing it to the hooks.
>
> Without changing core, it seems that there is no way to implement a
> late/strict
> conflict check based on the base rev id. That would need an additional
> "anchor
> revision" for checking.
>
> The easiest solution for the current situation is to simply drop the strict
> conflict check in Wikibase and accept a race condition that may cause a
> revision
> to be silently overwritten, as is currently the case in core.
>
> -- daniel
>
>

Re: [Wikitech-l] Accessing current template information on wiki commons description page

2014-06-03 Thread Derk-Jan Hartman
Actually, what you really seem to want is to make use of
iiprop=extmetadata, an API property that makes use of the
https://commons.wikimedia.org/wiki/Commons:Machine-readable_data
markup included in the various templates. The MultimediaViewer project
also uses this API.

https://commons.wikimedia.org/w/api.php?action=query&format=xml&titles=File:Van%20Gogh%20-%20Starry%20Night%20-%20Google%20Art%20Project.jpg&iilimit=max&iiprop=extmetadata|timestamp|user|comment|url|size|mime&prop=imageinfo|revisions&rvgeneratexml=&rvprop=ids|timestamp|user|comment|content

Where this is not accurate, you might have to fix up some templates to
make them more machine-readable. It's all pretty new, and it's
basically a managed web scraper in itself, but it's probably better to
have one web scraper than many.
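
For example, here is a minimal jQuery sketch of reading a couple of fields
out of the extmetadata response (the response shape used here is an
assumption based on the documented API; inspect a live response before
relying on it):

  // Fetch extmetadata for one file and log two of its fields.
  // Assumed shape: query.pages -> imageinfo[0].extmetadata.<Field>.value
  $.getJSON( 'https://commons.wikimedia.org/w/api.php?callback=?', {
      action: 'query',
      format: 'json',
      titles: 'File:Van Gogh - Starry Night - Google Art Project.jpg',
      prop: 'imageinfo',
      iiprop: 'extmetadata'
  } ).done( function ( data ) {
      $.each( data.query.pages, function ( id, page ) {
          var meta = page.imageinfo[ 0 ].extmetadata;
          console.log( meta.Artist.value );           // the "Artist" row, as HTML
          console.log( meta.DateTimeOriginal.value ); // the "Date" row
      } );
  } );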

DJ

On Tue, Jun 3, 2014 at 10:18 AM, james harvey  wrote:
> [...]


[Wikitech-l] learning Ops infrastructure (was: Re: 404 errors)

2014-06-03 Thread Sumana Harihareswara
Hi, Pine.

I, too, am interested in building our understanding of our TechOps
infrastructure. https://www.mediawiki.org/wiki/Presentations has some
explanations of some parts, as does http://wikitech.wikimedia.org/ . I
welcome more links to guides/overviews.

At the recent Zurich hackathon, other developers agreed that it would be
good to have a guide to Wikimedia's digital infrastructure, especially
how MediaWiki is used.
https://www.mediawiki.org/wiki/Overview_of_Wikimedia_infrastructure is
a homepage with approximately nothing on it right now except this
diagram of our server architecture:
https://commons.wikimedia.org/wiki/File:Wikimedia_Server_Architecture_%28simplified%29.svg

You might find the Performance Guidelines illuminating -
https://www.mediawiki.org/wiki/Performance_guidelines - and you might also
like the recent tech talk by Ori Livneh and Aaron Schulz about how we make
Wikipedia fast - see
http://www.youtube.com/watch?v=0PqJuZ1_B6w (I don't know when the video
is going up on Commons).

-- 
Sumana Harihareswara
Senior Technical Writer
Wikimedia Foundation


On 05/30/2014 06:30 PM, ENWP Pine wrote:
> 
> Ori, thanks for following up.
> 
> I think I saw somewhere that there is a list of postmortems for tech ops
> disruptions that includes reports like this one. Do you know where the list
> is? I tried a web search and couldn't find a copy of this report outside of
> this email list.
> 
> I personally find this report interesting and concise, and I am interested in
> understanding more about the tech ops infrastructure. Reports like this one
> are useful in building that understanding. If there's an overview of tech ops
> somewhere I'd be interested in reading that too. The information on English
> Wikipedia about WMF's server configuration appears to be outdated.
> 
> Thanks,
> 
> Pine
> 
> 
>> Date: Thu, 29 May 2014 22:38:10 -0700
>> From: Ori Livneh 
>> To: Wikimedia developers 
>> Subject: Re: [Wikitech-l] 404 errors
>>
>> On Thu, May 29, 2014 at 1:34 PM, ENWP Pine  wrote:
>>
>>> Hi, I'm getting some 404 errors consistently when trying to load some
>>> English Wikipedia articles. Other pages load ok. Did something break?
>>>
>>
>> TL;DR: A package update went badly.
>>
>> Nitty-gritty postmortem:
>>
>> At 20:25 (all times UTC), change Ie5a860eb9[0] ("Remove
>> wikimedia-task-appserver from app servers") was merged. There were two
>> things wrong with it:
>>
>> 1) The appserver package was configured to delete the mwdeploy and apache
>> users upon removal. The apache user was not deleted because it was logged
>> in, but the mwdeploy user was. The mwdeploy account was declared in Puppet,
>> but there was a gap between the removal of the package and the next Puppet
>> run during which the account would not be present.
>>
>> 2) The package included the symlinks /etc/apache2/wmf and
>> /usr/local/apache/common, which were not Puppetized. These symlinks were
>> unlinked when the package was removed.
>>
>> Apache was configured to load configuration files from /etc/apache2/wmf,
>> and these include the files that declare the DocumentRoot and Directory
>> directives for our sites. As a result, users were served with 404s. At
>> 20:40 Faidon Liambotis re-installed wikimedia-task-appserver on all
>> Apaches. Since 404s are cached in Varnish, it took another five minutes for
>> the rate of 4xx responses to return to normal (20:45).[1]
>>
>> [0]: https://gerrit.wikimedia.org/r/#/c/136151/
>> [1]:
>> https://graphite.wikimedia.org/render/?title=HTTP%204xx%20responses%2C%202014-05-29&from=20:00_20140529&until=21:00_20140529&target=reqstats.4xx&hideLegend=true



Re: [Wikitech-l] new auth/debug/Zero RfCs, + adding more office hours

2014-06-03 Thread Brad Jorsch (Anomie)
On Mon, Jun 2, 2014 at 5:58 PM, Sumana Harihareswara 
wrote:

> * I'm going to make more of an effort to invite specific people from
> diverse disciplines to RfC discussions, e.g.,

[...]

> design/product people for user-visible changes


I suggest also inviting actual users for user-visible changes. Not to
disrespect Design or Product, but some things are easy to overlook if
you're not involved in the day-to-day use of the software.


-- 
Brad Jorsch (Anomie)
Software Engineer
Wikimedia Foundation

Re: [Wikitech-l] [ImageMap] Looking for a reviewer: Link directly to media file when [[Media:...]] namespace is used within an imagemap.

2014-06-03 Thread Antoine Musso
On 02/06/2014 17:00, Remco de Boer wrote:
> Hi,
> 
> I'm looking for someone to review and (hopefully) accept a very small (3
> line) patch I wrote for the ImageMap extension. The patch improves the
> behavior of image maps with links to [[Media:...]] files. Instead of
> linking to the image page, these links should go directly to the media file
> (comparable to how regular [[Media:...]] links in wiki text work).
> 
> My patch has been open for over three months now, and there is no
> registered maintainer of the ImageMap extension (
> https://www.mediawiki.org/wiki/Developers/Maintainers#MediaWiki_extensions_deployed_by_WMF).
> I've tried to ping some folks whom I thought might be able to review my
> patch, but none of them has responded.
> 
> The patch is in Gerrit, and can be viewed at
> https://gerrit.wikimedia.org/r/#/c/114439/

Hello,

I have added a basic parser test with
https://gerrit.wikimedia.org/r/#/c/137015/ . You might want to rebase
your change on top of it and add a new test covering the feature your
change introduces.  That will surely help the review.

-- 
Antoine "hashar" Musso



Re: [Wikitech-l] Using Composer to manage libraries for mediawiki/core on Jenkins and Foundation cluster

2014-06-03 Thread Jeroen De Dauw
Hey,

> As for the accusation that the current approach favors the WMF, it's almost
> not worth responding to.
>

It seems like what James is getting at is the core community, not the WMF.
The problem is that several people seem to think that the concerns
raised are "not relevant" and "not worth responding to".

> Composer is simply not meant to be used in the way it was shoe-horned into
> MediaWiki.
>

Strawman. I happen to have discussed the situation and the approach with
the main people behind Composer in person, as well as gone over details
with contributors on IRC. They did not seem to share your opinion.

> I’m not going to re-explain this every time because it is in multiple
> places on Bugzilla and in this mailing list.

Who is asking you to re-explain this? What I want is for you and others to
stop dismissing the concerns raised and to come up with a solution for the
problems caused, rather than repeating the same lines over and over again.

> We are not going to misuse libraries and hack together MediaWiki just so
> extension installation can be *slightly* easier.
>

In other words, you are fine with breaking existing support, and screwing
over both users and developers of extensions such as SMW. In the case of
SMW, the difference is not slight, as it uses libraries itself.

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Evil software architect at Wikimedia Germany
~=[,,_,,]:3

[Wikitech-l] Upcoming changes to GuidedTour extension

2014-06-03 Thread Matthew Flaschen

On 06/02/2014 08:16 PM, Steven Walling wrote:

> GuidedTour's backend has also undergone a major refactor, which is close
> to being merged. This is described in full at the commit, which is just
> waiting on us to update logging: https://gerrit.wikimedia.org/r/#/c/116228/


As Steven noted, you should not have to change your code using the old 
API immediately.  However, we hope you will see benefits from the new 
API in flexibility, new features, and readability.


We are polishing the API and finishing final testing.  The documentation 
for the upcoming API is available at 
http://growthdoc.wmflabs.org/NonLinearGuidedTourPreview/ (this URL is 
temporary), so if you think something could be clarified or tweaked, 
please let us know.


The basic idea of the new API is that you start by constructing a 
TourBuilder.  From that, you construct StepBuilder objects by calling 
firstStep (for the entry point) 
(http://growthdoc.wmflabs.org/NonLinearGuidedTourPreview/#!/api/mw.guidedTour.TourBuilder-method-firstStep) 
or step 
(http://growthdoc.wmflabs.org/NonLinearGuidedTourPreview/#!/api/mw.guidedTour.TourBuilder-method-step) 
for other steps.


Steps 
(http://growthdoc.wmflabs.org/NonLinearGuidedTourPreview/#!/api/mw.guidedTour.StepBuilder) 
can listen for events, which can then trigger a transition to another 
step, hide the tour, or end it.  This is controlled by .transition() 
(http://growthdoc.wmflabs.org/NonLinearGuidedTourPreview/#!/api/mw.guidedTour.StepBuilder-method-transition). 
 .next() sets the next step, which can also be dynamic.
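
For a feel of the builder style, here is a rough sketch (firstStep, step, 
next and transition are the documented methods; the option keys and the 
TransitionAction constant below are assumptions, so check the linked API 
documentation before copying):

  // Hypothetical two-step tour using the new builder API.
  var gt = mw.guidedTour;
  var tour = new gt.TourBuilder( { name: 'exampletour' } );

  tour.firstStep( {
      name: 'intro',
      description: 'Welcome to the example tour.'
  } )
  .next( 'second' ); // static transition to a named step

  tour.step( {
      name: 'second',
      description: 'A later step.'
  } )
  .transition( function ( event ) {
      // Event-driven transition: end the tour in response to an event.
      return gt.TransitionAction.END;
  } );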


The details are covered by the API documentation; I just wanted to give 
a broad overview.


The old API is deprecated, and will not be maintained permanently (we 
will keep you updated on any plans to remove it).


Matt Flaschen


Re: [Wikitech-l] [Wikimedia-l] Quarterly reviews of high priority WMF initiatives

2014-06-03 Thread Tilman Bayer
Minutes and slides from last week's quarterly review of the
Foundation's Mobile Contributions team are now available at
https://meta.wikimedia.org/wiki/Metrics_and_activities_meetings/Quarterly_reviews/Mobile_contributions/May_2014
.

On Wed, Dec 19, 2012 at 6:49 PM, Erik Moeller  wrote:
> Hi folks,
>
> to increase accountability and create more opportunities for course
> corrections and resourcing adjustments as necessary, Sue's asked me
> and Howie Fung to set up a quarterly project evaluation process,
> starting with our highest priority initiatives. These are, according
> to Sue's narrowing focus recommendations which were approved by the
> Board [1]:
>
> - Visual Editor
> - Mobile (mobile contributions + Wikipedia Zero)
> - Editor Engagement (also known as the E2 and E3 teams)
> - Funds Dissemination Committee and expanded grant-making capacity
>
> I'm proposing the following initial schedule:
>
> January:
> - Editor Engagement Experiments
>
> February:
> - Visual Editor
> - Mobile (Contribs + Zero)
>
> March:
> - Editor Engagement Features (Echo, Flow projects)
> - Funds Dissemination Committee
>
> We’ll try doing this on the same day or adjacent to the monthly
> metrics meetings [2], since the team(s) will give a presentation on
> their recent progress, which will help set some context that would
> otherwise need to be covered in the quarterly review itself. This will
> also create open opportunities for feedback and questions.
>
> My goal is to do this in a manner where even though the quarterly
> review meetings themselves are internal, the outcomes are captured as
> meeting minutes and shared publicly, which is why I'm starting this
> discussion on a public list as well. I've created a wiki page here
> which we can use to discuss the concept further:
>
> https://meta.wikimedia.org/wiki/Metrics_and_activities_meetings/Quarterly_reviews
>
> The internal review will, at minimum, include:
>
> Sue Gardner
> myself
> Howie Fung
> Team members and relevant director(s)
> Designated minute-taker
>
> So for example, for Visual Editor, the review team would be the Visual
> Editor / Parsoid teams, Sue, me, Howie, Terry, and a minute-taker.
>
> I imagine the structure of the review roughly as follows, with a
> duration of about 2 1/2 hours divided into 25-30 minute blocks:
>
> - Brief team intro and recap of team's activities through the quarter,
> compared with goals
> - Drill into goals and targets: Did we achieve what we said we would?
> - Review of challenges, blockers and successes
> - Discussion of proposed changes (e.g. resourcing, targets) and other
> action items
> - Buffer time, debriefing
>
> Once again, the primary purpose of these reviews is to create improved
> structures for internal accountability, escalation points in cases
> where serious changes are necessary, and transparency to the world.
>
> In addition to these priority initiatives, my recommendation would be
> to conduct quarterly reviews for any activity that requires more than
> a set amount of resources (people/dollars). These additional reviews
> may however be conducted in a more lightweight manner and internally
> to the departments. We’re slowly getting into that habit in
> engineering.
>
> As we pilot this process, the format of the high priority reviews can
> help inform and support reviews across the organization.
>
> Feedback and questions are appreciated.
>
> All best,
> Erik
>
> [1] https://wikimediafoundation.org/wiki/Vote:Narrowing_Focus
> [2] https://meta.wikimedia.org/wiki/Metrics_and_activities_meetings
> --
> Erik Möller
> VP of Engineering and Product Development, Wikimedia Foundation
>
> Support Free Knowledge: https://wikimediafoundation.org/wiki/Donate
>



-- 
Tilman Bayer
Senior Operations Analyst (Movement Communications)
Wikimedia Foundation
IRC (Freenode): HaeB


[Wikitech-l] HTML templating decision soon?

2014-06-03 Thread Sumana Harihareswara
Hi everyone,

We've been talking about the HTML templating RFC since the Architecture
Summit in January. And in the Mobile quarterly review the other day,
they said - see
https://commons.wikimedia.org/w/index.php?title=File%3AMobile_Web_%26_App_Quarterly_Review_05-2014.pdf&page=62
- that they would love a decision soon. So it seems like we ought to
figure it out so it can get off the RfC docket. :-)


https://www.mediawiki.org/wiki/Requests_for_comment/HTML_templating_library

https://www.mediawiki.org/wiki/Requests_for_comment/HTML_templating_library/Knockoff_-_Tassembly

I think this is what's up: many developers like the work that Matt and
Gabriel have been doing on Knockoff, but we don't yet have all of the
pieces necessary to make a final call. Some folks were testing it
out and need to report back to the list with their verdicts. In the
meantime, some developers (such as the Mobile and Flow teams) have
short-term needs that can't wait for Knockoff to become a complete
solution, and so are working out interim standardizations outside of
this mailing list so that they can move forward while Knockoff work
continues. (I'm not sure what all of them are.)

Is this about right? Should I be saying Knockoff or Knockout? Can I put
the RfC decision meeting on the calendar for next week? :-)

Sumana Harihareswara
Senior Technical Writer
Wikimedia Foundation


Re: [Wikitech-l] Accessing current template information on wiki commons description page

2014-06-03 Thread Krinkle
The link Derk-Jan mentioned is a great resource indeed.

However, consuming that data (mostly HTML structures) isn't very
straightforward in most programming languages.

A tool has been written to expose this information until we have a better
way in the MediaWiki software:

https://commons.wikimedia.org/wiki/Commons:Commons_API

https://tools.wmflabs.org/magnus-toolserver/commonsapi.php

https://tools.wmflabs.org/magnus-toolserver/commonsapi.php?image=Van%20Gogh%20-%20Starry%20Night%20-%20Google%20Art%20Project.jpg&meta

— Krinkle

On 3 Jun 2014, at 14:11, Derk-Jan Hartman  wrote:

> [...]

Re: [Wikitech-l] [Huggle] Huggle 3 released / Mac people needed

2014-06-03 Thread James Alexander
I'm going to be playing around over the next couple of days trying to build
a Mac installer/DMG. If anyone wants to help test it, let me know (I'll post
here when I have a beta, too).

James Alexander
Legal and Community Advocacy
Wikimedia Foundation
(415) 839-6885 x6716 @jamesofur


On Mon, Jun 2, 2014 at 12:03 AM, Bryan Davis  wrote:

> On Sun, Jun 1, 2014 at 2:57 PM, Petr Bena  wrote:
> > Yes there has been an update recently, we released 3.0.0 :)
> >
> > I totally agree. Actually, Ubuntu users can install & run Huggle using
> > one line in the terminal, and a goal is to have it as accessible for
> > everyone as possible (unfortunately there is no apt-get for Windows, nor
> > Mac). If you are able to build it (the latest version) please provide
> > details. Thanks
>
> I updated the build steps at
> https://en.wikipedia.org/wiki/Wikipedia:Huggle/Huggle3_Beta#Mac_OS
> with the steps I needed to build the binary.
>
> Bryan
> --
> Bryan Davis  Wikimedia Foundation
> [[m:User:BDavis_(WMF)]]  Sr Software Engineer  Boise, ID USA
> irc: bd808  v: 415.839.6885 x6855
>

[Wikitech-l] [BREAKING CHANGE] Upcoming jQuery upgrade: Removing jQuery Migrate

2014-06-03 Thread Krinkle
Hey all,

TL;DR:
* We did not make the breaking change last week for Wikimedia; it is postponed.
* MediaWiki 1.24.0 will ship with jQuery Migrate switched off.
* Wikimedia & non-Wikimedia wikis can enable jQuery Migrate if needed.
* When MediaWiki 1.24 is released, we will switch off jQuery Migrate for 
Wikimedia wikis.

As we said last month, we are upgrading MediaWiki's jQuery to version 1.11.x [1],
which removes some long-deprecated functions. Phases one [2] and two [3] of the
upgrade were completed as of last week.

However, we have held off with phase 3 (removing the jQuery Migrate plugin) for
a while to give migration a little more time.

The amount of migration work necessary has been much less than I anticipated.
Very few scripts have actually needed changing, probably because these features
have been deprecated for quite a while now. Some of our newer developers may
never even have known of these APIs.

Having said that, it does take a while for a message like this to spread to
some members of our large community. While those wiki scripts and gadgets which
have active maintainers have been looked into, and (where needed) updated
accordingly, a large number of gadgets and scripts across the Wikimedia wikis
have not yet been dealt with, let alone extensions and wikis outside of
Wikimedia.

While I don't want to set another deadline, I think it makes sense to ship the
jQuery Migrate plugin for one release (with MediaWiki 1.24), and then remove it
in MediaWiki 1.25. This will give all wikis' communities and extension authors
a few more months until the MediaWiki 1.24 branch point (in Autumn 2014) before
they will need to make adjustments to work with MediaWiki master.

MediaWiki 1.24.0 stable will disable support for legacy jQuery APIs by default.
If you find you have scripts, gadgets or extensions still using them, you can
re-enable the compatibility layer with a simple line in LocalSettings.php:

$wgIncludejQueryMigrate = true;

And of course when you find you need to do this, please make an effort to
ensure the maintainers of those items still using these legacy features are
aware of this so they may patch the code accordingly (using the upgrade guide[4]
and my earlier announcement [5]). When MediaWiki 1.24 is released, we will
switch off jQuery Migrate for Wikimedia wikis.
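
Most of the changes are mechanical one-liners. As a sketch, here are two of
the most common cases from the 1.9 upgrade guide (the selector, handler and
$checkbox names are made up):

  // .live() was removed in jQuery 1.9; use a delegated .on() instead.
  // Before:
  $( '.mw-example' ).live( 'click', handler );
  // After:
  $( document ).on( 'click', '.mw-example', handler );

  // .attr( 'checked' ) no longer reflects the live state; use .prop().
  // Before:
  var checked = $checkbox.attr( 'checked' );
  // After:
  var checked = $checkbox.prop( 'checked' );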

— Krinkle

[1] https://bugzilla.wikimedia.org/show_bug.cgi?id=44740
[2] https://gerrit.wikimedia.org/r/#/c/131494/
[3] https://gerrit.wikimedia.org/r/#/c/133477/
[4] http://jquery.com/upgrade-guide/1.9/
[5] http://www.mail-archive.com/wikitech-l@lists.wikimedia.org/msg75735.html

On 7 May 2014, at 18:29, Krinkle  wrote:

> Hey all,
> 
> TL;DR: jQuery will soon be upgraded from v1.8.3 to v1.11.x (the latest). This
> major release removes deprecated functionality. Please migrate away from this
> deprecated functionality as soon as possible.
> 
> It's been a long time coming but we're now finally upgrading the jQuery 
> package
> that ships with MediaWiki.
> 
> We used to regularly upgrade jQuery in the past, but got stuck at v1.8 a 
> couple
> of years ago due to lack of time and concern about disruption. Because of 
> this,
> many developers have needed to work around bugs that were already fixed in 
> later
> versions of jQuery. Thankfully, jQuery v1.9 (and its v2 counterpart) has been
> the first release in jQuery history that needed an upgrade guide[1][2]. It's a
> major release that cleans up deprecated and dubious functionality.
> 
> Migration of existing code in extensions, gadgets, and user & site scripts
> should be trivial (swapping one method for another, maybe with a slight change
> to the parameters passed). This is all documented in the upgrade guide[1][2].
> The upgrade guide may look scary (as it lists many of your favourite methods),
> but the changes are mostly just addressing edge cases.
> 
> == Call to action ==
> 
> This is a call for you, to:
> 
> 1) Get familiar with http://jquery.com/upgrade-guide/1.9/.
> 
> 2) Start migrating your code.
> 
> jQuery v1.9 is about removing deprecated functionality. The new functionality 
> is
> already present in jQuery 1.8 or, in some cases, earlier.
> 
> 3) Look out for deprecation warnings.
> 
> Once instrumentation has begun, using "?debug=true" will log jQuery 
> deprecation
> warnings to the console. Look for ones marked "JQMIGRATE" [7]. You might also
> find deprecation notices from mediawiki.js, for more about those see the mail
> from last October [8].
> 
> == Plan ==
> 
> 1) Instrumentation and logging
> 
> The first phase is to instrument jQuery to work out all the areas which will
> need work. I have started work on loading jQuery Migrate alongside the current
> version of jQuery. I expect that to land in master this week [6], and roll 
> out on
> Wikimedia wikis the week after. This will enable you to detect usage of most
> deprecated functionality through your browser console. Don't forget the 
> upgrade
> guide[1], as Migrate cannot detect everything.
> 
> 2) 

Re: [Wikitech-l] Upcoming jQuery upgrade (breaking change)

2014-06-03 Thread Matthew Flaschen

On 05/14/2014 01:07 PM, Siebrand Mazeland wrote:



> On 14 May 2014 at 14:58, Krinkle wrote:
>
>> I don't think it is possible or worth the effort to scan for these in an
>> automated fashion within Jenkins.
>>
>> Static analysis is virtually impossible due to the method names being too
>> simple, and lots of it relies on details of how methods are called, as well.
>
> At translatewiki.net, we log client-side issues using a script[1]. Might
> something like that be of any benefit?


That's catching actual errors in production, which is definitely
suboptimal compared to catching them during development/maintenance.
However, it's better than not catching them at all until a technical user
notices and reports the bug.


Matt Flaschen



Re: [Wikitech-l] Upcoming jQuery upgrade (breaking change)

2014-06-03 Thread Matthew Flaschen

On 05/22/2014 09:25 PM, James Forrester wrote:

> Possibly, though I would suggest that it is not loaded by default. Frankly
> if an extension's authors have abandoned their extension to the extent that
> after several years' clear warning and a six-month-long notice period they
> still didn't do a relatively trivial set of fixes, then it's reasonable to
> make it necessary for sysadmins to make a (small) effort acknowledging that
> this code is toxic and should only be used if you're willing to wade into
> "here be dragons" territory.


What notice period are you referring to?

Matt Flaschen



Re: [Wikitech-l] Upcoming jQuery upgrade (breaking change)

2014-06-03 Thread James Forrester
On 3 June 2014 17:01, Matthew Flaschen  wrote:

> On 05/22/2014 09:25 PM, James Forrester wrote:
>
>> Possibly, though I would suggest that it is not loaded by default. Frankly
>> if an extension's authors have abandoned their extension to the extent
>> that
>> after several years' clear warning and a six month-long notice period they
>> still didn't do a relatively trivial set of fixes, then it's reasonable to
>> make it necessary for sysadmins to make a (small) effort acknowledging
>> that
>> this code is toxic and should only be used if you're willing to wade into
>> "here be dragons" territory.
>>
>
> What notice period are you referring to?
>

I don't recall; probably the announcement at the architecture summit?

There have been a number of high-profile semi-announcements as well as
comments and code reviews relating to this, most obviously this one:

http://lists.wikimedia.org/pipermail/wikitech-l/2012-November/064280.html

J.
-- 
James D. Forrester
Product Manager, VisualEditor
Wikimedia Foundation, Inc.

jforres...@wikimedia.org | @jdforrester

Re: [Wikitech-l] HTML templating decision soon?

2014-06-03 Thread S Page
On Jun 3, 2014 2:56 PM, "Sumana Harihareswara"  wrote:
> Some folks were testing [Knockoff]
> out and need to report back to the list with their verdicts.

Who? Tell us more!

> In the
> meantime, some developers (such as the Mobile and Flow teams) have
> short-term needs that can't wait up for Knockoff to become a complete
> solution, and so are working out interim standardizations outside of
> this mailing list so that they can move forward while Knockoff work
> continues. (Not sure what all of them are.)

MobileFrontend has been using Hogan JS templating since January 2013.

Flow recently chose Handlebars JS templating because it has a working,
fast PHP re-implementation (lightncandy) to support no-JavaScript
clients.

For client-side templating you need ResourceLoader to supply the
templates to the client. Jon Robson has developed the Mantle
extension[1] that implements
* a ResourceLoaderTemplateModule that does this
* JS functions that abstract getting a template, compiling and caching
it, and rendering it
* specific implementations of these functions for the Handlebars and
Hogan JS libraries.
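
For flavor, the raw engines those wrappers sit on top of look roughly like
this (a minimal sketch of the public Hogan and Handlebars compile/render
APIs; the template strings are made up):

  // Hogan (used by MobileFrontend): compile() returns a template object.
  var greeting = Hogan.compile( 'Hello, {{name}}!' );
  greeting.render( { name: 'world' } ); // "Hello, world!"

  // Handlebars (used by Flow): compile() returns a render function;
  // lightncandy compiles the same template syntax on the PHP side.
  var tpl = Handlebars.compile( 'Hello, {{name}}!' );
  tpl( { name: 'world' } ); // "Hello, world!"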

MobileFrontend and Flow will start using this shared code in
production in the next few weeks or so.

In order for Flow to share templates between front-end JS and server-side
PHP, Flow has had to write helper functions in both JS and PHP[2].
Some, like message i18n, human-friendly timestamps, and escaping, are
more generic than others.

These experiences with generalized JS template support and with developing
helper functions across JS and PHP could inform Knockoff development.
So far the Flow team is doing well with Handlebars/lightncandy
templating, but we're not advocating it over Knockout/Knockoff. The
reactive model-view updating of Knockout (in JavaScript) is an
attractive additional feature missing from Hogan and Handlebars
templating; again, Flow couldn't wait.

> Should I be saying Knockoff or Knockout?
From the RFC page, Gabriel Wicke & Matthew Walker's "Knockoff"
templates are KnockoutJS-compatible. AIUI, GW&MW have a JS compiler
that compiles them into GW&MW's "Knockoff - Tassembly" intermediate
representation, and their goal is to render templates in the latter
format from both PHP and JavaScript. In JavaScript you'd still load
the Knockout JS for its reactive model-view updates.

Hope this helps. No slight intended to any others working on GW&MW MW KO code :)

[1] https://www.mediawiki.org/wiki/Extension:Mantle#Templates
[2] https://www.mediawiki.org/wiki/Flow/Architecture/Templating

--
=S Page  WikiMedia Features engineer
