Re: [Wikitech-l] Disabling JS support in additional browsers

2014-08-24 Thread Krinkle
Opera < 12 has already been dropped by jQuery for a while now[1], and was also 
removed from Grade A support for MediaWiki core[2] since MediaWiki 1.17.

That is to say, we've not tested or prioritised anything pertaining to Opera 11 or 
below for several years now. It wasn't blacklisted yet from the startup module 
because we didn't yet know whether it could pass as Grade X and apparently 
there hasn't been enough traffic/interest from anyone to bother adding the 
blacklist for it. I mean, we don't (and aren't going to) blacklist Netscape 
either.

But I'd say old Opera is significant enough that it's worth blacklisting < 12.
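
(For illustration, a minimal sketch of what such a blacklist check could look like. This is a hypothetical shape, not the actual code in resources/src/startup.js:)

// Hypothetical sketch: map of browser names to the version below which
// we abort the javascript pipeline. Names and fields are illustrative only.
var blacklist = {
    msie: 7,     // blacklist IE < 7 (per the IE6 change referenced below)
    firefox: 3,  // blacklist Firefox < 3 (per the Village Pump thread)
    opera: 12    // blacklist Opera < 12 (as suggested above)
};

function isCompatible( profile ) {
    // profile.name / profile.versionNumber as derived from the user agent
    return !( profile.name in blacklist ) ||
        profile.versionNumber >= blacklist[ profile.name ];
}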

Firefox 3.5 and 3.6 were dropped in MediaWiki 1.22.0 (from Grade A to Grade B; 
we blacklisted Firefox < 4), but we changed that to Firefox < 3 because we 
found that (despite Firefox 3.6 not being officially supported by Mozilla and 
jQuery) its feature set was good enough to just give it everything (Grade X 
instead of Grade B) per a Village Pump thread requesting it.[3]

— Krinkle

[1] https://jquery.com/browser-support/
[2] 
https://www.mediawiki.org/w/index.php?title=Compatibility&oldid=1119439#Browser_support_matrix
[3] 
https://github.com/wikimedia/mediawiki-core/commit/ceaa7ddada7d1426cab2b76b9d6570e2dce4162d

On 6 Aug 2014, at 20:52, Erik Moeller e...@wikimedia.org wrote:

 Following up on disabling JavaScript support for IE6 [1], here is some
 additional research on other browsers. I'd appreciate if people with
 experience testing/developing for/with these browsers would jump in
 with additional observations. I think we should wait with adding other
 browsers to the blacklist until the IE6 change has been rolled out,
 which may expose unanticipated consequences (it already exposed that
 Common.js causes errors in blacklisted browsers, which should be fixed
 once [2] is reviewed and merged).
 
 As a reminder, the current blacklist is in resources/src/startup.js.
 
 As a quick test, I tested basic browsing/editing operation on English
 Wikipedia with various browsers. Negative results don't necessarily
 indicate that we should disable JS support for these browsers, but
 they do indicate the quality of testing that currently occurs for
 those browsers. Based on a combination of test results, unpatched
 vulnerabilities and usage share, an initial recommendation for each
 browser follows.
 
 Note that due to the heavy customization through gadgets/site scripts,
 there are often site-specific issues which may not be uncovered
 through naive testing.
 
 == Microsoft Internet Explorer 7.x ==
 
 Last release in series: April 2009
 
 - Browsing: Most pages work fine (some styling issues), but pages with
 audio files cause JavaScript errors (problem in TMH).
 - Editing: Throws JS error immediately (problem in RefToolbar)
 
 Neither of these errors occurs in IE8.
 
 Security vulnerabilities:
 
 Secunia reports 15 out of 87 vulnerabilities as unpatched, with the
 most serious one being rated as moderately critical (which is the
 same as IE6, while the most serious IE8 vulnerability is rated less
 critical).
 
 Usage: 1%
 
 Recommendation: Add to blacklist
 
 == Opera 8.x ==
 
 Last release in series: September 2005
 
 Browsing/editing: Works fine, but all JS fails due to a script
 execution error (which at least doesn't cause a pop-up).
 
 Security: Secunia reports 0 unpatched vulnerabilities (out of 26).
 
 Usage: 0.25%
 
 Recommendation: Add to blacklist
 
 == Opera 10.x-12.x ==
 
 Last release in series: April 2014
 
 Browsing/editing: Works fine, including advanced features like
 MediaViewer (except for 10.x)
 
 Security: No unpatched vulnerabilities in 12.x series according to
 Secunia, 2 unpatched vulnerabilities in 11.x (less critical) and 1
 unpatched vulnerability in 10.x (moderately critical)
 
 Usage: 1%
 
 Recommendation: Maintain basic JS support, but monitor situation re:
 10.x and add that series to blacklist if maintenance cost too high
 
 == Firefox 3.6.* ==
 
 Last release in series: March 2012
 
 Browsing/editing: Works fine (MediaViewer disables itself)
 
 Security: 0 unpatched vulnerabilities according to Secunia
 
 Recommendation: Maintain basic JS support
 
 == Firefox 3.5.* ==
 
 Last release in series: April 2011
 
 Browsing/editing: Works fine (MediaViewer disables itself)
 
 Security: 0 unpatched vulnerabilities according to Secunia
 
 Recommendation: Maintain basic JS support
 
 == Safari 4.x ==
 
 Last release in series: November 2010
 
 Browsing/editing: Works fine
 
 Security: 1 unpatched highly critical vulnerability according to
 Secunia (exposure of sensitive information)
 
 Recommendation: Maintain basic JS support, but monitor
 
 == Safari 3.x ==
 
 Last release in series: May 2009
 
 Browsing/editing: Completely messed up, looks like CSS doesn't get loaded at 
 all
 
 Security: 2 unpatched vulnerabilities, highly critical
 
 Usage share: Usage reports for Safari in [3] are broken, all Safari
 versions are reported as 0.0. However, [4] suggests

Re: [Wikitech-l] Moving Vector and MonoBook to separate repositories and what it means for you

2014-07-29 Thread Krinkle
@John: Extensions are git repositories. Moving it to an extension involves 
moving them in their own repo, like any other extension. I guess you're mostly 
concerned about it being repositories not under mediawiki/extensions, because 
it'll be a repository either way.

@Bartosz:

I'm inclined to agree. I personally do not see any use in having 
mediawiki/skins/* in Gerrit as separate structure for repositories that are 
extensions in every way. An extension can provide hooks, messages, config 
variables, special pages, skins, api modules, resource modules and more. A skin 
repo would typically provide at least three of these (messages, skin, resource 
modules) and possibly hooks and config variables as well. What's the point in 
having a separate scheme for repos that provide some arbitrary subset of these 
features?

But more important than the naming scheme in Gerrit (which I couldn't care less 
about) is the local development workflow, which affects developer productivity 
and understandability of our ecosystem. Let's aim to keep it as simple as 
possible without compromising on important benefits.

We have a mediawiki/extensions.git meta repository. To avoid conflicts with 
MediaWiki core's extensions directory (which, albeit being almost empty, will 
still conflict in unproductive ways when working with wmf branches), I always 
advocate people set up an extensions directory on their disk elsewhere (e.g. 
next to mediawiki-core, not inside), either as the meta repo clone or as your 
own container directory with git clones of individual extensions inside of it. 
Then simply configure $wgExtensionAssetsPath to point at 
localhost/git/extensions or whatever and require_once in LocalSettings from 
../extensions instead.

That's a one-time setup quite a few developers have already from what I 
understand (Reedy, Chad and Roan all recommended it to me originally, not sure 
who had it first) and from then on you just git clone <path> or 
git submodule update --init <path> for any extension you run into when working in 
different projects, and can add a require_once and it all just works. 

This could be set up for skins as well, but it's tricky. Aside from having two 
systems then, it's still tricky. At least for a while to come (working with 
current wmf branches, and release branches) we can't guarantee skins/ to be 
empty. And unless we introduce a separate core skins path / external skins path 
variable, you can't really set one without breaking the other.

It is possible to get it right but it comes at the cost of several months / up 
to 2-3 years of inconvenience locally for everyone. We haven't committed to 
this new structure yet, and instead of taking this opportunity to create a mess 
for years to come, I'd rather use it to get rid of the mess that 
is mediawiki/skins altogether and just fold it all back into extensions.

To get the ball rolling: What's the downside of going that route? We have quite 
a lot to gain in terms of simplicity and compatibility.

Breaking things can be good, but I haven't seen any short or long term benefits 
so far that would justify it.

— Krinkle

On 28 Jul 2014, at 23:02, John phoenixoverr...@gmail.com wrote:

 I use a standard git checkout. Moving these to their own separate location
 is going to be a pain in the ass. If the skins are moved to the existing
 extension system it causes far fewer problems and does not introduce
 additional steps in upgrading/maintaining a site. When we start having sub
 repos and forking left and right it gets ugly. We already have an existing
 framework for adding modules to mediawiki  (Extensions) let's use that vs
 re-inventing the wheel.
 
 On Monday, July 28, 2014, Bartosz Dziewoński matma@gmail.com wrote:
 
 On Mon, 28 Jul 2014 22:59:40 +0200, John phoenixoverr...@gmail.com
 wrote:
 
 Why not just move them to an extension? Moving them to their own repo begs
 for headaches
 
 
 I don't understand the problem you see here nor the solution you're
 proposing. Elaborate?
 
 --
 Matma Rex
 

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] jQuery UI 1.10.4 and IE6 support

2014-07-28 Thread Krinkle
Support can mean either, depending on the context.

Most of this is uncontroversial, but I found it useful to think through and
sum up.

From a platform perspective, to support a browser means that the user
experience is considered acceptable, tested against, and we're committed to
keeping it that way (which may mean moving a browser or mediawiki feature from
one Grade to another). Don't forget that there's a lot more to our platform
than javascript execution!

Supporting users of browsers for which we provide a javascript-less experience
still requires a lot of things to work, such as:

* Content delivery. (We can't require HTTP features that don't work. E.g. some
sites require every user to be in a session with an ID, and for new visitors
they respond with an empty page that sets session cookies and refreshes
to display the real page; we couldn't do that if we want to support browsers
without cookies or with cookies disabled.)
* Enough styles for the page to be usable. (Let's say background-image were
new in CSS3, then we couldn't exclusively have a design with white text on a
background image without a fallback colour to ensure the text is readable
without that image.)
* HTML implementation. (Say a supported browser doesn't allow relative urls in
a form action attribute, we'd have to make it an absolute url.)
* Character encoding. (If certain unicode literals aren't interpreted
properly, we may have to explicitly encode them.)
* We'd commit to doing our best to keep their stuff secure (e.g. while we may
patch against a browser-specific CSRF exploit for an unsupported browser out
of the goodness of our hearts, we pretty much have to if it affects a
supported browser).

All these areas and more do in fact have problems we account for, but I think
we've been at it long enough that we've got these bases covered in MediaWiki
and in our web servers and caching proxies. However as we keep introducing new
backend code and iterate our infrastructure, we need to ensure we don't miss
anything.

-- Timo

On 27 Jul 2014, at 18:42, Jon Robson jdlrob...@gmail.com wrote:

 A few quick notes:
 
 * we should be killing jQuery ui. not upgrading it :)
 * progressive enhancement != supporting IE6. We should be doing this
 anywhere. Personally I would be fine with giving IE6,7, even 8 and maybe 9
 no JavaScript whatsoever and supporting them simply from simply a css
 perspective. People can edit and read without JavaScript.
 * I think we should be careful when we say support. Does support mean
 any new features we write in JavaScript have to work on these platforms, or
 does it mean they need to be usable? I'd say the latter. It sounds like the
 discussion is around supporting JS..
 On 24 Jul 2014 13:49, Sumana Harihareswara suma...@wikimedia.org wrote:
 
 Replying with a subject line. :) Good luck Thomas.
 
 Sumana Harihareswara
 Senior Technical Writer
 Wikimedia Foundation
 
 
 On Thu, Jul 24, 2014 at 4:24 PM, Thomas Mulhall 
 thomasmulhall...@yahoo.com
 wrote:
 
 Hi should we upgrade jquery ui to version 1.10.4. even though we recently
 upgraded to version 1.9.2 we could upgrade to 1.10.4 in Mediawiki 1.25.
 The
 main difference is it removes internet explorer 6 support which as long
 as
 internet explorer 6 users can edit and view the page it wont matter to
 them. here is the changelog jqueryui.com/changelog/1.10.0/


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Policy on browser support

2014-07-27 Thread Krinkle
On 26 Jul 2014, at 22:32, Steven Walling steven.wall...@gmail.com wrote:

 This seems really reasonable.
 
 Are we still agreed that Grade A means anything over 1% of readership? If
 so, we should reconfirm what our browser share is really like, because last
 time I checked, IE6 was less than 1% of total and thus eligible for
 dropping from Grade A now and forever (he says with great
 antici.pation.)
 

1% was never the figure as far as I know (certainly not for Grade A).

For the longest time most documents and practices used the 0.01% mark (that
is, before MediaWiki 1.17; predating ResourceLoader, jQuery and most
front-end features). During these years it basically meant we didn't
support browsers with less than 0.01% share (e.g. may be broken, insecure,
or otherwise less usable). Support for the rest was cheap considering the
relatively primitive nature of our front-end at the time.

After MediaWiki 1.17 we started to distinguish between support (site is
usable and readable) and feature rich. It was OK to implement features
that would only be available for newer browsers. We disabled resource
delivery for IE 4 and 5 as their engines are not adequate and made it
technically infeasible to implement essentials. As they still had enough
traffic, we didn't want them getting a broken page, and so the
non-javascript mode came to be.

Around this same time the figure seemed to have been changed from 0.01% to
0.1%. From then onwards we also had two levels of support within javascript
mode (Grade A and B); 0.1% applied to Grade B, *not* Grade A. Meaning our
aim was to dedicate resources to support any browser with 0.1% or more
traffic. Thus guaranteeing that 99.9% of our users would have an experience
that we support (whether it be non-javascript, some features, or all
features). Back then, non-javascript mode was actually for browsers with
less than even 0.1% traffic that were neither Grade A nor B. This provided
them a better fallback (though no official support), so that e.g. IE4,
IE5 and ancient Netscape users wouldn't be bothered with loads of scripts
that won't work. Instead they get a relatively lightweight page to try
their best on.

Here minimal support meant 1) Grade B feature support or blacklist in
startup module to trigger non-javascript mode, 2) ensure our basic
stylesheets work in this browser, 3) ensure our backend infrastructure is
compatible with those browsers (url schemas, critical HTTP features,
security, image formats etc.)

Note how then[1], IE6-8 were considered Grade B, not A.

And remember that 0.01% and 0.1% are both pretty big at our scale. Our
efforts aren't driven by how profitable that 0.1% is but by our mission
statement. For a browser to be truly unsupported takes a lot of moral
consideration (e.g. if the server uses a protocol the browser doesn't
support, they might get nothing at all).

After 2010 we kind of started to forget about all this: few updates to the
charts and little enforcement of it.

Regardless of what the page said, we never actually distinguished between
Grade A and B. Both were given the same treatment and care. We never had
any features that only work in Grade A. We did optimise for Grade A, but no
exclusive user facing features (except for VisualEditor and other features
that adopted a different browser support matrix).

While I've personally tried to keep minimal browser support somewhat in
check, the document was simply no longer accurate and focussed too much on
individual browser versions and not on the actual policy. Thus we archived
it last year. Yesterday I've published a more refined version of this
policy at https://www.mediawiki.org/wiki/Compatibility#Browsers documenting
mostly current practices, factual state of the software (we have a startup
module, we do blacklist browsers, our basic layout was made to work in
IE6/FF2 etc.) and status quo (to my knowledge all front-end code in core
supports IE6 and I'm quite certain code not working in IE6 would not be
approved by anyone having mediawiki-core merge authority).

Since Grade B never ended up being recognised in any way by the software,
I've kept that out. And the previously undocumented Grade C represents
browsers we are interested in supporting due to their traffic but only via
the non-javascript mode.

https://www.mediawiki.org/wiki/Compatibility#Browsers

[1] 
https://www.mediawiki.org/w/index.php?title=Compatibility/Software_for_using_MediaWiki&oldid=1054337#Desktop_browsers

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] jQuery UI 1.10.4 and IE6 support

2014-07-26 Thread Krinkle
Before we get all up in an IE6 debate, imho the more important difference
is that jQuery UI 1.10 drops support for the (now deprecated) jQuery UI 1.8
API.[2]

Considering we only recently upgraded jQuery UI from 1.8 to 1.9, and that
that upgrade was the first of its kind since the introduction of the
framework in MediaWiki 1.16, we should give it a fair amount of time to
allow everyone a fair opportunity to know about this breaking change.

Similar to how MediaWiki maintains releases, jQuery UI continues to support
and maintain 1.9.[1] If there are important security or bug fixes, they'll
make a minor release that we can upgrade to without breaking anything. So
while 1.10 is newer than 1.9, it is still totally supported and there's
no pressure to upgrade from that angle. (There's pressure from other angles
but that's another topic.)

For those aware of the fact that jQuery UI 1.10 is not the latest version
(1.11 is, released last month), the 1.11 release is interesting because,
like jQuery UI 1.10, it also dropped support for a major browser (IE7).[3]
So we'll probably want to upgrade to 1.10 first. Aside from browser
support, our code is distributed so we can only upgrade to a version that
drops support for an API after we first upgrade to a version that provides
the new API.[4]

— Krinkle

[1] http://jqueryui.com/download/ and their source repository continues
  to maintain jQuery UI 1.11.x, 1.10.x and 1.9.x
[2] http://blog.jqueryui.com/2013/01/jquery-ui-1-10-0/
[3] http://blog.jqueryui.com/2014/06/jquery-ui-1-11-0/
[4] http://jqueryui.com/upgrade-guide/1.10/

On 24 Jul 2014, at 22:03, Thomas Mulhall thomasmulhall...@yahoo.com wrote:

 Thanks. 
 
 
 On Thursday, 24 July 2014, 21:49, Sumana Harihareswara 
 suma...@wikimedia.org wrote:
 
 
 
 Replying with a subject line. :) Good luck Thomas.
 
 
 
 Sumana Harihareswara
 Senior Technical Writer
 Wikimedia Foundation 
 
 
 On Thu, Jul 24, 2014 at 4:24 PM, Thomas Mulhall thomasmulhall...@yahoo.com 
 wrote:
 
 Hi should we upgrade jquery ui to version 1.10.4. even though we recently 
 upgraded to version 1.9.2 we could upgrade to 1.10.4 in Mediawiki 1.25. The 
 main difference is it removes internet explorer 6 support which as long as 
 internet explorer 6 users can edit and view the page it wont matter to them. 
 here is the changelog jqueryui.com/changelog/1.10.0/

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] jQuery UI 1.10.4 and IE6 support

2014-07-26 Thread Krinkle
As for IE6, that roadmap is quite simple in my opinion.

At this point MediaWiki already degrades gracefully in older browsers in a 
number of different ways. We've put our cut-off point for javascript execution 
in general at IE < 6 and Firefox < 4. And for stylesheets we also support IE6 
for the basic layout (enough for text to be readable in a way that isn't 
distorted or hard to read).

In any browsers where we don't abort the javascript pipeline from the 
startup[1] module, there must be no fatal errors or uncaught exceptions due to 
browser support.

While a library doesn't have to throw an exception in an older browser per se, 
in case of jQuery UI it's quite simple. We can only upgrade to jQuery UI 1.10 
when we drop IE6 support for Grade A. And when we do, IE6 will become 
javascriptless[1] and jQuery UI will no longer be relevant as a problem in IE6.
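
(A minimal sketch of what aborting the pipeline means in practice. This is hypothetical, not the actual startup.js code, and loadMediaWikiStack is a made-up stand-in for the real bootstrap:)

// If the compatibility check fails, nothing else is loaded: the browser
// stays in the javascript-less (Grade C) experience.
if ( isCompatible() ) {
    // Grade A/X: proceed to load jQuery, mediawiki.js and registered modules.
    loadMediaWikiStack();
}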

— Krinkle

[1] https://www.mediawiki.org/wiki/Compatibility#Browsers

On 26 Jul 2014, at 11:03, Krinkle krinklem...@gmail.com wrote:

 Before we get all up in an IE6 debate, imho the more important difference
 is that jQuery UI 1.10 drops support for the (now deprecated) jQuery UI 1.8
 API.[2]
 
 Considering we only recently upgraded jQuery UI from 1.8 to 1.9, and that
 that upgrade was the first of its kind since the introduction of the
 framework in MediaWiki 1.16, we should give it a fair amount of time to
 allow everyone a fair opportunity to know about this breaking change.
 
 Similar to how MediaWiki maintains releases, jQuery UI continues to support
 and maintain 1.9.[1] If there's important security or bug fixes, they'll
 make a minor release that we can upgrade to without breaking anything. So
 while 1.10 is newer than 1.9, it is still totally supported and there's
 no pressure to upgrade from that angle. (There's pressure from other angles
 but that's another topic.)
 
 For those aware of the fact that jQuery UI 1.10 is not the latest version
 (1.11 is, released last month), the 1.11 release is interesting because,
 like jQuery UI 1.10, it also dropped support for a major browser (IE7).[3]
 So we'll probably want to upgrade to 1.10 first. Aside from browser
 support, our code is distributed so we can only upgrade to a version that
 drops support for an API after we first upgrade to a version that provides
 the new API.[4]
 
 — Krinkle
 
 [1] http://jqueryui.com/download/ and their source repository continues
   to maintain jQuery UI 1.11.x, 1.10.x and 1.9.x
 [2] http://blog.jqueryui.com/2013/01/jquery-ui-1-10-0/
 [3] http://blog.jqueryui.com/2014/06/jquery-ui-1-11-0/
 [4] http://jqueryui.com/upgrade-guide/1.10/
 
 On 24 Jul 2014, at 22:03, Thomas Mulhall thomasmulhall...@yahoo.com wrote:
 
 Thanks. 
 
 
 On Thursday, 24 July 2014, 21:49, Sumana Harihareswara 
 suma...@wikimedia.org wrote:
 
 
 
 Replying with a subject line. :) Good luck Thomas.
 
 
 
 Sumana Harihareswara
 Senior Technical Writer
 Wikimedia Foundation 
 
 
 On Thu, Jul 24, 2014 at 4:24 PM, Thomas Mulhall thomasmulhall...@yahoo.com 
 wrote:
 
 Hi should we upgrade jquery ui to version 1.10.4. even though we recently 
 upgraded to version 1.9.2 we could upgrade to 1.10.4 in Mediawiki 1.25. The 
 main difference is it removes internet explorer 6 support which as long as 
 internet explorer 6 users can edit and view the page it wont matter to them. 
 here is the changelog jqueryui.com/changelog/1.10.0/
 

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] logging out on one device logs user out everywhere

2014-07-23 Thread Krinkle
I think generally users' expectation (and imho desirable behaviour in 
general[1]) is that logging out of one session does not affect other sessions.

However I think it's a valid use case to be able to invalidate other sessions 
remotely (e.g. you lost control over the device or it's inconvenient to get 
at), as well as being able to invalidate all other sessions (paranoia, 
convenience, clean slate, or "I can't remember what device that bloke had when 
I needed to check my e-mail and forgot to log out").

Both Gmail and Facebook currently implement systems like this.

On Gmail, you have a footnote "Last account activity: <time> ago" with a 
"details" link providing an overview of all current sessions (basically extracted 
from session data associated with the session cookies set for your account). It 
shows the device type (user agent or, if not cookie based, the protocol, like 
IMAP/SMTP), the location and IP, and when the session was last active. It has 
an option to "Sign out all other sessions".

On Facebook, the Security Settings feature has a section "Where You're Logged 
In" which is similar, though slightly more enhanced in that it also allows 
ending individual sessions.

They also have a section "Trusted Browsers" which is slightly different in that 
it lists sessions that are of the "Remember me" type and also lists 
authenticated devices that won't ask for two-step verification again. And the 
ability to revoke any of them.

— Krinkle

[1] E.g. not merely an expectation based on previous negative experience with other sites.

On 23 Jul 2014, at 16:45, Chris Steipp cste...@wikimedia.org wrote:

 On Tuesday, July 22, 2014, MZMcBride z...@mzmcbride.com wrote:
 
 Chris Steipp wrote:
 I think this should be managed similar to https-- a site preference,
 and users can override the site config with a user preference.
 
 Please no. There's been a dedicated effort in 2014 to reduce the number
 of user preferences. They're costly to maintain and they typically
 indicate a design flaw: software should be sensible by default and a user
 preference should only be a tool of last resort. The general issue of user
 preferences-creep remains particularly acute as global (across a wikifarm)
 user preferences still do not exist. Of course in this specific case,
 given the relationship with CentralAuth, you probably could actually have
 a wikifarm-wide user preference, but that really misses the larger point
 that user preferences should be avoided, if at all possible.
 
 I'll start a new thread about my broader thoughts here.
 
 
 I think we have too many preferences also, no disagreement there.
 
 But like Risker, I too want to always destroy all my sessions when I logout
 (mostly because I log in and out of accounts a lot while testing, and I
 like knowing that applies to all the browsers I have open). So I'm biased
 towards thinking this is preference worthy, but I do think it's one of
 those things that if it doesn't behave as a user expects, they're going to
 think it's a flaw in the software and file a bug to change it.
 
 I'm totally willing to admit the expectations I have are going to be the
 minority opinion. If it's a very, very small number of us, then yeah,
 preference isn't needed, and we can probably get by with a gadget.
 
 Your proposal for account info and session management is good too. I hope
 someone's willing to pick that up.
 
 
 
 
 MZMcBride
 
 
 

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Best practices for loading large, pre-minified JavaScript modules?

2014-07-13 Thread Krinkle
On 13 Jul 2014, at 19:40, Max Semenik maxsem.w...@gmail.com wrote:

 You can bypass minification with raw=true, but it would kill most of RL
 niceties too.
 

This is incorrect. ResourceLoader raw does not, and never has, bypassed 
minification.

raw is what bypasses the client loader framework. So instead of returning a 
combined
script/styles/messages response to the mw.loader.implement callback, it returns 
a raw
response for only scripts, styles or messages – whichever is specified in the 
only=
parameter – without a mw.loader.implement or mw.loader.state call.

This is a mostly undocumented and discouraged hack that can be used to fetch
ResourceLoader modules into a non-ResourceLoader context (e.g. Toolserver, old
versions of MobileFrontend).

It does not bypass minification. That's what debug=true is for.

To fetch the scripts with only concatenation and not minification, you can use 
modules=my.module.name&only=scripts&debug=true. Like all unversioned requests,
this is only cached for 5 to 30 minutes (depending on MediaWiki configuration).
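
For example, a page outside MediaWiki could pull a module's scripts in roughly like this (a sketch; the wiki, the module name and the exact raw parameter value are my assumptions, not confirmed here):

// Load one module's scripts into a non-ResourceLoader page.
var script = document.createElement( 'script' );
script.src = 'https://en.wikipedia.org/w/load.php' +
    '?modules=mediawiki.util&only=scripts&raw=1';
document.head.appendChild( script );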

-- Krinkle


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Making a plain MW core git clone not be installable

2014-06-11 Thread Krinkle
On 11 Jun 2014, at 22:05, Antoine Musso hashar+...@free.fr wrote:

 Le 11/06/2014 15:51, Brad Jorsch (Anomie) a écrit :
 
 - needs to install composer
 
 This is one of my pet peeves. It's one thing to have to install
 dependencies to run MediaWiki, but having to install various dependencies
 just to *install* it? Ugh!
 
 Composer is a first step. [..]
 
 composer is being more and more used in the PHP world and I see it as
 essentially replacing Pear that probably nobody is going to regret. [..]
 
 

True, composer should be easier to work with than PEAR[1].

However, afaik we never depended on any PEAR packages.
We either shipped them (Services_JSON), provided a fallback or made the feature 
optional / in an extension.

-- Krinkle

[1] Easier to install because it requires fewer permissions, since it uses a local 
directory; easier to deploy because of this; and managed with a manifest in 
composer.json instead of by reading a manual of sorts, etc.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] LiquidThreads - how do we kill it?

2014-06-11 Thread Krinkle
On 9 Jun 2014, at 20:58, Bartosz Dziewoński matma@gmail.com wrote:

 On Mon, 09 Jun 2014 20:52:44 +0200, Martijn Hoekstra 
 martijnhoeks...@gmail.com wrote:
 
 In this case, which post are you replying to in flow when you reply to
 multiple people? In mediawiki you sort of work around the issue, and it
 sort of works because you try to create some ad-hoc solution. When the
 software creates a hard dependency between posts, where it is difficult now
 to keep track of these kinds of discussion, it may become even more
 difficult to follow them then. Since we've established that this is
 something that currently does happen, I think even if it is (to be polite
 (?)) completely insane, it's something that should be supported anyway.
 
 When I encounter this issue on mailing lists, I usually just reply to the 
 lowest common ancestor of all the posts I want to reply to at once, or split 
 my reply and respond to each separately.
 
 (And mailing lists are interesting by itself, because most actual e-mail 
 clients display the discussion in a threaded fashion, while most webmails 
 like GMail display a flat list of replies.)
 
 Introducing a structured discussion is hard enough, let's not invent issues 
 where there are none. :)


I have used many different desktop and mobile e-mail applications that aren't 
web based. The last time I've seen mail displayed threaded instead of flattened 
by default was when I installed Microsoft Office on Windows 98 SE, which hid 
Outlook Express and introduced Microsoft Outlook, which in turn exposed me to the 
threaded display of mailing lists[1].

Every other mail client I've used did not do this (not for mailing lists, not 
for regular inbox). Outlook Express, Eudora, Thunderbird, Apple Mail and 
various web-based clients.

These different clients may have carried over my preferences, but they all had 
an option to display it flattened. And I believe it was the default.

While we may or may not know what the default was, I'm pretty sure that "most 
e-mail clients display discussions in a threaded fashion" is patently not 
true.


-- Krinkle

[1] Would've roughly looked like this: http://i.imgur.com/fK1NV2G.png
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] LiquidThreads - how do we kill it?

2014-06-08 Thread Krinkle
On 8 Jun 2014, at 17:22, James Forrester jforres...@wikimedia.org wrote:
— Krinkle

 On Sunday, June 8, 2014, Martijn Hoekstra martijnhoeks...@gmail.com wrote:
 
 On Sun, Jun 8, 2014 at 1:18 AM, Martijn Hoekstra 
 martijnhoeks...@gmail.com javascript:;
 wrote:
 
 Flow stores the comments as a structured tree
 
 That seems a fundamental mistake. A discussion isn't a tree, it's a dag at
 best. It's possible for a single comment in a discussion to refer to zero
 or more earlier comments,
 
 
 Flow stores each discussion as a tree, with a Flow Board being a forest
 of discussions for precisely this reason.
 
 
 and it's also possible for a single comment to
 refer to part of an earlier comment, which means a comment isn't an
 indivisable node.
 
 
 Hmm. I'm not convinced that there has ever been a successful/useful/good
 discussion system that encouraged sub-comment structured replies. In my
 experience they are unusable morrasses of confusion. Instead, a lightweight
 quoting tool achieves the specificity at the least complexity and greatest
 clarity for users.
 
 I could be convinced otherwise, but it'd need to be a fairly stunning
 design concept.
 
 J.
 


Throughout the years I've had to use many different incarnations of 
conversation workflows. Such as:
* Inline comments (such as on StackOverflow).
* Issues trackers (like Bugzila or GitHub Issues).
* Mailing threads (as rendered by Gmail or Apple Mail, both for 1-on-1 threads 
and those from mailing lists).
* Helpdesk ticket systems.
* Disqus.
* Feedback systems (like GetSatisfaction and UserEcho).
* WordPress comments.
* LiquidThreads.
* Your typical 90s-style forum board (like phpBB or vBulletin).
* ..

And I can't really come to any conclusion other than that the user experience 
is significantly worse when any of these used a tree structure (especially 
LiquidThreads and forum boards). It always ends up a mess.

Fortunately, most of these have now either exclusively opted for a linear model 
or have an option to view it as a linear model (I think LiquidThreads is the 
only exception on this list). Some systems, like Disqus and WordPress comments, 
handle it by only allowing a very limited number of nesting levels, though I'm 
not convinced this is useful.

I agree with James and feel that having a good system for citing would 
improve the user experience far more than any tree structure ever would 
(and having a tree of any kind always negatively impacts user experience).

-- Timo

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] enwiki display issues

2014-06-04 Thread Krinkle
Regarding the May 16 issue with the bits.wikimedia.org cluster (which was 
affecting availability of various modules, including gadgets), that is filed as 
bug 65424.

https://bugzilla.wikimedia.org/show_bug.cgi?id=65424.

Server admin log entries during the incident:


https://wikitech.wikimedia.org/w/index.php?title=Server_Admin_Log&diff=113204&oldid=113199

I documented the incident just now:

https://wikitech.wikimedia.org/wiki/Incident_documentation/20140517-bits

— Krinkle

On 1 Jun 2014, at 17:22, Helder . helder.w...@gmail.com wrote:

 On Sat, May 31, 2014 at 1:21 PM, Andre Klapper aklap...@wikimedia.org wrote:
 On Sat, 2014-05-31 at 07:52 -0300, Helder . wrote:
 Shouldn't we have an incident report[1] about this?
 
 https://wikitech.wikimedia.org/wiki/Incident_documentation/20140529-appservers
  (as mentioned in the other thread)
 That link is about a recent problem, not about the gadgets problem from 16 
 may.
 
 Helder
 

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Accessing current template information on wiki commons description page

2014-06-03 Thread Krinkle
The link Derk-Jan mentioned is a great resource indeed.

However, consuming that data (mostly HTML structures) isn't very 
straightforward in most programming languages.

Until we have a better way in the MediaWiki software, a tool has been written 
to expose this information:

https://commons.wikimedia.org/wiki/Commons:Commons_API

https://tools.wmflabs.org/magnus-toolserver/commonsapi.php

https://tools.wmflabs.org/magnus-toolserver/commonsapi.php?image=Van%20Gogh%20-%20Starry%20Night%20-%20Google%20Art%20Project.jpg&meta
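
(For script authors, a sketch of consuming the extmetadata API that Derk-Jan mentions below, via JSONP so it works cross-domain; the file title is just an example:)

// Query Commons for machine-readable metadata of one file.
$.getJSON( 'https://commons.wikimedia.org/w/api.php?callback=?', {
    action: 'query',
    format: 'json',
    prop: 'imageinfo',
    iiprop: 'extmetadata',
    titles: 'File:Example.jpg'
} ).done( function ( data ) {
    // Each page object carries imageinfo[0].extmetadata, with fields
    // such as Artist, DateTime and LicenseShortName (when available).
    console.log( data.query.pages );
} );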

— Krinkle

On 3 Jun 2014, at 14:11, Derk-Jan Hartman d.j.hartman+wmf...@gmail.com wrote:

 Actually, what you really seem to want is to make use of
 iiprop=extmetadata, which is an API that makes use of
 https://commons.wikimedia.org/wiki/Commons:Machine-readable_data
 included in the various templates. The MultimediaViewer project also
 uses this API.
 
 https://commons.wikimedia.org/w/api.php?action=query&format=xml&titles=File:Van%20Gogh%20-%20Starry%20Night%20-%20Google%20Art%20Project.jpg&iilimit=max&iiprop=extmetadata|timestamp|user|comment|url|size|mime&prop=imageinfo|revisions&rvgeneratexml=&rvprop=ids|timestamp|user|comment|content
 
 Where this is not accurate, you might have to fix up some templates to
 make them better machine readable. It's all pretty new, and it's
 basically a managed web scraper in itself, but it's probably better to
 have one web scraper, than multiple.
 
 DJ
 
 On Tue, Jun 3, 2014 at 10:18 AM, james harvey jamespharve...@gmail.com 
 wrote:
 Sorry for the email spam.  Worked through it, I think.  Not too familiar
 with wiki internals.  :-)
 
 This particular page doesn't have the content I'm looking for in it.  It
 references a template which is used by a few other versions of the same
 image, presumably so the data can be stored once and be given consistently.
 Not being familiar with wiki internals, that was looking to me like it
 wasn't returning the entire page content... But it is, so I'll have to
 recognize this situation and pull referenced templates when the information
 I need isn't already there.
 
 
 On Tue, Jun 3, 2014 at 2:45 AM, james harvey jamespharve...@gmail.com
 wrote:
 
 I may have stumbled upon it.  If I change the API call from
 titles=File:XYZ.jpg to titles=Template:XYZ (note: dropped the .jpg)
 then it *appears* to get me what I need.
 
 Is this correct, or did I run across a case where it appears to work but
 isn't going to be the right way to go?  (Like, I'm not sure if
 Template:XYZ directly relates to the Summary information on the
 File:XYZ.jpg page, or if it's duplicated data that in this case matches.
 And, I'm confused why the .jpg gets dropped switching File: to
 Template:)
 
 And, will this always get me the full template information, or if someone
 just updates the Year portion, would it only return back that part --
 since the revisions seem to be returning data as much as they can based on
 changes from the previous revision, rather than the answer ignoring any
 other revisions.
 
 On Tue, Jun 3, 2014 at 1:59 AM, james harvey jamespharve...@gmail.com
 wrote:
 
 Given a Wikimedia Commons description page URL - such as:
 https://commons.wikimedia.org/wiki/File:Van_Gogh_-_Starry_Night_-_Google_Art_Project.jpg
 
 I would like to be able to programmatically retrieve the information in
 the Summary header.  (Values for Artist, Title, Date, Medium,
 Dimensions, Current location, etc.)
 
 I believe all this information is in Template:Artwork.  I can't figure
 out how to get the wikitext/json-looking template data.
 
 If I use the API and call:
 https://commons.wikimedia.org/w/api.php?action=query&format=xml&titles=File:Van%20Gogh%20-%20Starry%20Night%20-%20Google%20Art%20Project.jpg&iilimit=max&iiprop=timestamp|user|comment|url|size|mime&prop=imageinfo|revisions&rvgeneratexml=&rvprop=ids|timestamp|user|comment|content
 
 Then I don't get the information I'm looking for.  This shows the most
 recent revision, and its changes.  Unless the most recent revision changed
 this data, it doesn't show up.
 
 To see all the information I'm looking for, it seems I'd have to specify
 rvlimit=max and go through all the past revisions to figure out which is
 most current.  For example, if I do so and I look at revid 79665032, that
 includes: {{Artwork | Artist = {{Creator:Vincent van Gogh}} | . . . | Year
 = 1889 | Technique = {{Oil on canvas}} | . . .
 
 Isn't there a way to get the current version in whatever format you'd
 call that - the wikitext/json looking format?
 
 In my API call, I can specify rvexpandtemplates which even with only the
 most recent revision gives me the information I need, but it's largely in
 HTML tables/divs/etc format rather than

[Wikitech-l] [BREAKING CHANGE] Upcoming jQuery upgrade: Removing jQuery Migrate

2014-06-03 Thread Krinkle
Hey all,

TL;DR:
* We did not make the breaking change last week for Wikimedia; it is postponed.
* MediaWiki 1.24.0 will ship with jQuery Migrate switched off.
* Wikimedia & non-Wikimedia wikis can enable jQuery Migrate if needed.
* When MediaWiki 1.24 is released, we will switch off jQuery Migrate for 
Wikimedia wikis.

As we said last month, we are upgrading MediaWiki's jQuery to version 1.11.x 
[1],
which removes some long-deprecated functions. Phases one [2] and two [3]
were completed as of last week.

However, we have held off with phase 3 (removing the jQuery Migrate plugin) for
a while to give migration a little more time.

The amount of migration work necessary has been much less than I anticipated.
Very few scripts have actually needed changing, probably because these features
have been deprecated for quite a while now. Some of our newer developers may never
have even known of these APIs.

Having said that, it does take a while for a message like this to spread out to
some members of our large community. While those wiki scripts and gadgets which
have active maintainers have been looked into, and (where needed) updated
accordingly, a large number of gadgets and scripts in the cluster of Wikimedia
wikis have not yet had a chance to get a grip on this, let alone extensions and
wikis outside of Wikimedia.

While I don't want to set another deadline, I think it makes sense to ship the
jQuery Migrate plugin for one release (with MediaWiki 1.24), and then remove it
in MediaWiki 1.25. This will give all wikis' communities and extension authors
a few more months until the MediaWiki 1.24 branch point (in Autumn 2014) before
they will need to make adjustments to work with MediaWiki master.

MediaWiki 1.24.0 stable will disable support for legacy jQuery by default. If
you find you have scripts, gadgets or extensions still using legacy jQuery APIs,
you can enable them with a simple line in LocalSettings.php:

$wgIncludejQueryMigrate = true;

And of course when you find you need to do this, please make an effort to
ensure the maintainers of those items still using these legacy features are
aware of this so they may patch the code accordingly (using the upgrade guide[4]
and my earlier announcement [5]). When MediaWiki 1.24 is released, we will
switch off jQuery Migrate for Wikimedia wikis.

— Krinkle

[1] https://bugzilla.wikimedia.org/show_bug.cgi?id=44740
[2] https://gerrit.wikimedia.org/r/#/c/131494/
[3] https://gerrit.wikimedia.org/r/#/c/133477/
[4] http://jquery.com/upgrade-guide/1.9/
[5] http://www.mail-archive.com/wikitech-l@lists.wikimedia.org/msg75735.html

On 7 May 2014, at 18:29, Krinkle krinklem...@gmail.com wrote:

 Hey all,
 
 TL;DR: jQuery will soon be upgraded from v1.8.3 to v1.11.x (the latest). This
 major release removes deprecated functionality. Please migrate away from this
 deprecated functionality as soon as possible.
 
 It's been a long time coming but we're now finally upgrading the jQuery 
 package
 that ships with MediaWiki.
 
 We used to regularly upgrade jQuery in the past, but got stuck at v1.8 a 
 couple
 of years ago due to lack of time and concern about disruption. Because of 
 this,
 many developers have needed to work around bugs that were already fixed in 
 later
 versions of jQuery. Thankfully, jQuery v1.9 (and its v2 counterpart) has been
 the first release in jQuery history that needed an upgrade guide[1][2]. It's a
 major release that cleans up deprecated and dubious functionality.
 
 Migration of existing code in extensions, gadgets, and user & site scripts
 should be trivial (swapping one method for another, maybe with a slight change
 to the parameters passed). This is all documented in the upgrade guide[1][2].
 The upgrade guide may look scary (as it lists many of your favourite methods),
 but they are mostly just addressing edge cases.
 
 == Call to action ==
 
 This is a call for you, to:
 
 1) Get familiar with http://jquery.com/upgrade-guide/1.9/.
 
 2) Start migrating your code.
 
 jQuery v1.9 is about removing deprecated functionality. The new functionality 
 is
 already present in jQuery 1.8 or, in some cases, earlier.
 
 3) Look out for deprecation warnings.
 
 Once instrumentation has begun, using ?debug=true will log jQuery 
 deprecation
 warnings to the console. Look for ones marked JQMIGRATE [7]. You might also
 find deprecation notices from mediawiki.js, for more about those see the mail
 from last October [8].
 
 == Plan ==
 
 1) Instrumentation and logging
 
 The first phase is to instrument jQuery to work out all the areas which will
 need work. I have started work on loading jQuery Migrate alongside the current
 version of jQuery. I expect that to land in master this week [6], and roll 
 out on
 Wikimedia wikis the week after. This will enable you to detect usage of most
 deprecated functionality through your browser console. Don't forget the 
 upgrade
 guide[1], as Migrate cannot detect everything.
 
 2) Upgrade and Migrate
 
 After

Re: [Wikitech-l] What should be the recommended / supported way to do skins? (A proposal.)

2014-05-25 Thread Krinkle
A skin has (or imho should have) (only) two name like properties:

1) Internal identifier (id):
 - Used for processing at the application level and as public API 
(configuration variables, url parameters, API properties, database values, 
dynamically constructed page names such as for site and user scripts, 
preference keys)
 - Lowercase
 - No non-ASCII characters
 - No spaces
 - Not localised
  - Changing this would be a breaking change

2) Display title (skin name):
 - Typically starts with an uppercase letter, may contain spaces (e.g. Cologne 
Blue)
 - Used for graphical user interface (e.g. anywhere HTML is displayed, whenever 
it is used inside an message)
  - Defined by msg key skinname-{skin}


— Krinkle

On 25 May 2014, at 18:02, Bartosz Dziewoński matma@gmail.com wrote:

 On Sun, 25 May 2014 00:11:28 +0200, Daniel Friesen 
 dan...@nadir-seen-fire.com wrote:
 
 However $wgValidSkins isn't
 something should become case insensitive, attempting that for array keys
 is asking for bugs. Same for putting non lowercase keys in the database
 and trying to make them case insensitive.
 The easiest way to make useskin=, $wgDefaultSkin, and $wgSkipSkins case
 insensitive is to normalize all skin keys from them to lowercase (which
 is actually only a few lines in Skin.php) and then as a breaking
 change say we're forbidding non-lowercase keys to $wgValidSkins (which
 should be rather simple since I haven't seen a single skin yet doing
 that which would actually be broken by such a change).
 
 Hmm. Yeah, this actually sounds very sensible. Let's make it so.
 
 To summarize:
 
 * 'skin file name' (='skin directory name' and 'skin repository name'):
  * pretty (that is, usually CamelCased, unless the skin name would
for some reason be lowercase)
  * may not contain non-ASCII characters
 * 'skin name':
  * a lowercase version of 'skin file name', which would also provide
any future skin loading/installation/management mechanism with a
simple way to map the file/directory/repository name to the 'skin
name'
  * user inputs (useskin, $wgDefaultSkin) are accepted with any
capitalization and normalized to lowercase
 
 The requirements above are technically breaking changes, but are very
 unlikely to actually break anything.
 
 Right?
 
 -- 
 Matma Rex
 

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Upcoming jQuery upgrade (breaking change)

2014-05-23 Thread Krinkle
On 23 May 2014, at 03:11, Gergo Tisza gti...@wikimedia.org wrote:

 On Wed, May 7, 2014 at 9:29 AM, Krinkle krinklem...@gmail.com wrote:
 
 A rough timeline:
 
 * 12 May 2014 (1.24wmf4 [9]): Phase 1 – Instrumentation and logging
 starts. This
  will run for 4 weeks (until June 9).
 
 * 19 May 2014 (1.24wmf5): Phase 2 – Upgrade and Migrate. This will run
 for 3
  weeks (upto June 9). The instrumentation continues during this period.
 
 * 1 June 2014 (1.24wmf7) Finalise upgrade.
 
 (...)
 Q: When will the upgrade happen?
 A: In the next few weeks, once we are happy that the impact is reasonably
 low.
 An update will be sent to wikitech-l just before this is done as a final
 reminder.
 This will be well before the MediaWiki 1.24 branch point for extension
 authors
 looking to maintain compatibility.
 
 
 I'm not sure this decision makes sense. This would mean that 1.23 shipped
 with jQuery 1.8 and 1.24 will ship with jQuery 1.11, without the backwards
 compatibility plugin. I don't see how this helps extension authors, and it
 will be a nuisance for wiki webmasters who will have to deal with the
 breakage of all the not-so-well maintained extensions, without any
 transition period where they could identify and fix/replace them, when they
 do the 1.23 - 1.24 upgrade. There should be a major version which includes
 the migration plugin.
 

I disagree. These jQuery features were deprecated many years ago.

jQuery.browser was deprecated in 1.3.0[1][2]
* 2009-01 (over 5 years ago)
* MediaWiki 1.16 shipped jQuery v1.3.2

jQuery#live was deprecated in 1.7.0[3][4]
* 2011-11 (over 2 years ago)
* MediaWiki 1.19 shipped jQuery 1.7.1

Etc. etc.
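
To make the scale of the change concrete, here is a sketch of the typical before/after per the jQuery upgrade guide (the selector and handler are placeholders of my own):

// Before (APIs removed in jQuery 1.9):
//   if ( $.browser.msie ) { ... }
//   $( '.mw-foo' ).live( 'click', handler );

// After: feature detection (or MediaWiki's jquery.client module)
// instead of UA sniffing, and delegated events instead of .live().
var handler = function () {
    // placeholder click handler
};
$( document ).on( 'click', '.mw-foo', handler );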

They've been deprecated for a long time now (at least one to three major 
releases). So it's not like it all came out of nowhere. These are not new 
deprecations; this is the removal of already-deprecated features.

MediaWiki 1.24 is a major release, and this major (potentially breaking) 
change has appropriate release notes. I don't think there is an obligation 
here to maintain compatibility with extensions for yet another release.

Extensions are branched at the same time as core. There is no assumption that 
an unmaintained extension will remain working with a newer MediaWiki core 
version (practically every release we have a few minor breaking changes that 
require changes to extensions, this is no different).

Also, shipping with the Migrate plugin is essentially just shipping the old 
version of jQuery. We've been doing that already since MediaWiki 1.21 by simply 
not upgrading jQuery. It's been several releases since; that time is over now.

In addition, I think it's a bad idea to release MediaWiki with the Migrate 
plugin enabled by default. It is an odd environment to provide/support (a bit like 
trying to support IE8 in IE7 compat mode; people have a hard enough time 
already with different IE versions, so avoid mutations if possible).

See also my comments on https://gerrit.wikimedia.org/r/#/c/133719/

-- Krinkle

[1] http://blog.jquery.com/2009/01/14/jquery-1-3-released/
[2] http://api.jquery.com/category/deprecated/deprecated-1.3/
[3] http://blog.jquery.com/2011/11/03/jquery-1-7-released/
[4] http://api.jquery.com/category/deprecated/deprecated-1.7/
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Upcoming jQuery upgrade (breaking change)

2014-05-14 Thread Krinkle
I don't think it is possible or worth the effort to scan for these in an 
automated fashion within Jenkins.

Static analysis is virtually impossible because the method names are too 
generic, and much of it depends on how the methods are called as well.

For example, $something.error( .. ) was deprecated. Doing a static search for 
".error" would yield too many false positives. And trying to parse the scripts 
and figure out what is and isn't a jQuery object seems like a task outside the 
scope here (especially considering this is a one-time migration).

Doing it via the method the migrate instrumentation uses is more reliable 
(though it still doesn't cover everything), but that requires execution. You'd 
have to somehow execute and trigger all code paths.

It would imho give a false sense of security. I'm afraid this comes down to 
requiring active maintenance and knowledge of the code base. While migrating is 
simple and should not require knowledge of a module's workings, identifying the 
deprecated usages in the first place can't be done statically. Take time in the 
next few weeks to check in with the various teams and projects you're a part of 
and take a few minutes to ensure there are no deprecation notices being fired 
when using them.

In addition, (to see if maybe you missed any) you could perform a few grep/ack 
searches if you want to be extra sure (manually, so that you can mentally 
exclude false positives). 

— Krinkle

On 7 May 2014, at 19:27, Siebrand Mazeland siebr...@kitano.nl wrote:

 Is there any way we can have a Jenkins job check for the use of deprecated 
 and report it, or have a scan of Gerrit repos done and reports made available 
 somewhere?
 
 Cheers!
 
 --
 Siebrand
 
 On 7 May 2014 at 18:29, Krinkle krinklem...@gmail.com wrote the 
 following:
 
 Hey all,
 
 TL;DR: jQuery will soon be upgraded from v1.8.3 to v1.11.x (the latest). This
 major release removes deprecated functionality. Please migrate away from this
 deprecated functionality as soon as possible.
 
 It's been a long time coming but we're now finally upgrading the jQuery 
 package
 that ships with MediaWiki.
 
 We used to regularly upgrade jQuery in the past, but got stuck at v1.8 a 
 couple
 of years ago due to lack of time and concern about disruption. Because of 
 this,
 many developers have needed to work around bugs that were already fixed in 
 later
 versions of jQuery. Thankfully, jQuery v1.9 (and its v2 counterpart) has been
 the first release in jQuery history that needed an upgrade guide[1][2]. It's 
 a
 major release that cleans up deprecated and dubious functionality.
 
 Migration of existing code in extensions, gadgets, and user & site scripts
 should be trivial (swapping one method for another, maybe with a slight 
 change
 to the parameters passed). This is all documented in the upgrade guide[1][2].
 The upgrade guide may look scary (as it lists many of your favourite 
 methods),
 but they are mostly just addressing edge cases.
 
 == Call to action ==
 
 This is a call for you, to:
 
 1) Get familiar with http://jquery.com/upgrade-guide/1.9/.
 
 2) Start migrating your code.
 
 jQuery v1.9 is about removing deprecated functionality. The new 
 functionality is
 already present in jQuery 1.8 or, in some cases, earlier.
 
 3) Look out for deprecation warnings.
 
 Once instrumentation has begun, using ?debug=true will log jQuery 
 deprecation
 warnings to the console. Look for ones marked JQMIGRATE [7]. You might also
 find deprecation notices from mediawiki.js, for more about those see the mail
 from last October [8].
 
 == Plan ==
 
 1) Instrumentation and logging
 
 The first phase is to instrument jQuery to work out all the areas which will
 need work. I have started work on loading jQuery Migrate alongside the 
 current
 version of jQuery. I expect that to land in master this week [6], and roll 
 out on
 Wikimedia wikis the week after. This will enable you to detect usage of most
 deprecated functionality through your browser console. Don't forget the 
 upgrade
 guide[1], as Migrate cannot detect everything.
 
 2) Upgrade and Migrate
 
 After this, the actual upgrade will take place, whilst Migrate stays. This
 should not break anything since Migrate covers almost all functionality that
 will be removed. The instrumentation and logging will remain during this 
 phase;
 the only effective change at this point is whatever jQuery didn't think was
 worth covering in Migrate or were just one of many bug fixes.
 
 3) Finalise upgrade
 
 Finally, we will remove the migration plugin (both the Migrate compatibility
 layer and its instrumentation). This will bring us to a clean version of 
 latest
 jQuery v1.x without compatibility hacks.
 
 
 A rough timeline:
 
 * 12 May 2014 (1.24wmf4 [9]): Phase 1 – Instrumentation and logging starts. 
 This
 will run for 4 weeks (until June 9).
 
 * 19 May 2014 (1.24wmf5): Phase 2 – Upgrade and Migrate. This will run for 
 3
 weeks (upto June 9). The instrumentation continues

Re: [Wikitech-l] Upcoming jQuery upgrade (breaking change)

2014-05-14 Thread Krinkle
That's an entirely different thing from scanning or catching things in 
development. That's harvesting from clients in production. That is certainly 
possible, and Wikimedia does that, too.

Deprecated properties[1] and features[2] use mw.track[3] to emit an event.

The WikimediaEvents extension forwards these to EventLogging[4] (at a sampled 
rate, of course). The events are then available privately in the analytics 
database, and made available anonymised in Graphite[5].
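For illustration, here's a minimal sketch of that flow in the browser console 
(the 'mw.deprecate' topic is the one mediawiki.js emits; the console.log stands 
in for the real EventLogging forwarder):

mw.trackSubscribe( 'mw.deprecate', function ( topic, key ) {
	// WikimediaEvents forwards these to EventLogging (sampled);
	// here we just log the deprecated key instead.
	console.log( 'Deprecated usage: ' + key );
} );

// This is what an mw.log.deprecate-wrapped property emits when accessed:
mw.track( 'mw.deprecate', 'wgAction' );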

You can set up similar logging for JQMIGRATE. Note though that jQuery Migrate 
doesn't have nice keys; you'll have to make do with the full descriptive 
sentence of the warning (but you're doing that already at TWN).

You could try something like this:

if ( jQuery.migrateWarnings ) {
	// Intercept each warning as jQuery Migrate records it and forward
	// it to the TWN error log.
	jQuery.migrateWarnings.push = function ( msg ) {
		mw.twn.log( '/webfiles/jswarning', {
			msg: '[jquery.migrate]' + msg,
			stack: new Error().stack
		} );
	};
}
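(Overriding push works because jQuery Migrate collects warnings by pushing 
plain strings onto the jQuery.migrateWarnings array; every warning fired after 
this assignment then flows through the custom function instead.)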

I might set up some tracking for it at Wikimedia as well, but I'm not sure if 
that'll work properly.


— Krinkle

[1] 
https://github.com/wikimedia/mediawiki-core/blob/wmf/1.24wmf4/resources/src/mediawiki/mediawiki.js#L567
[2] 
https://github.com/wikimedia/mediawiki-core/blob/wmf/1.24wmf4/resources/src/mediawiki.api/mediawiki.api.js#L189
[3] 
https://github.com/wikimedia/mediawiki-core/blob/wmf/1.24wmf4/resources/src/mediawiki/mediawiki.js#L410-L427
[4] 
https://github.com/wikimedia/mediawiki-extensions-WikimediaEvents/blob/master/modules/ext.wikimediaEvents.deprecate.js#L14-L16
[5] http://codepen.io/Krinkle/full/zyodJ/

On 14 May 2014, at 19:07, Siebrand Mazeland siebr...@kitano.nl wrote:

 
 On 14 May 2014 at 14:58, Krinkle krinklem...@gmail.com wrote:
 
 I don't think it is possible or worth the effort to scan for these in an 
 automated fashion within Jenkins.
 
 Static analysis is virtually impossible due to the method names being too 
 simple, and lots of it relies on details of how methods are called, as well.
 
 At translatewiki.net, we log client-side issues using a script[1]. Might 
 something like that be of any benefit?
 
 [1] 
 http://git.wikimedia.org/blob/translatewiki.git/HEAD/webfiles%2Ftwn.jserrorlog.js
 
 --
 Siebrand

Re: [Wikitech-l] Change the installer to make Project: the default option for meta namespace name

2014-05-13 Thread Krinkle
Note that "Project" and "Project talk" already work on any wiki. They are the 
canonical names for the namespace in question.

Namespaces have:
* a localised/standard name for the local wiki (based on configuration like 
Wikipedia and/or localisation like Usuario).
* a canonical name (that is language and wiki independent, except for extra 
custom namespaces where the localised name becomes the canonical one, but 
project is not an extra custom namespace).
* aliases (alternate translations or legacy names, such as Image).

https://es.wikipedia.org/wiki/Project_talk:X
->
https://es.wikipedia.org/wiki/Wikipedia_discusión:X
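As a quick sketch, you can inspect this in the browser console via 
wgNamespaceIds, which maps every recognised namespace name and alias to its ID:

var ids = mw.config.get( 'wgNamespaceIds' );
// On es.wikipedia both the canonical and the localised name resolve to
// namespace 4:
console.log( ids.project, ids.wikipedia );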

— Krinkle

On 11 May 2014, at 16:15, Matthew Flaschen mflasc...@wikimedia.org wrote:

 On 05/04/2014 05:08 AM, Max Semenik wrote:
 This proposal makes no sense: all these namespaces _are_ dependent on wiki
 language, so even if you force Project down the throats of non-English
 users, project talk would still be e.g. Project_ахцәажәара for Abkhazian
 wikis and so on.
 
 I think you may be misunderstanding the proposal.  I think it's proposing to 
 use standard namespace names for the project namespace, rather than 
 wiki-specific ones like the Wikipedia namespace.
 
 That's not the same as not localizing the standard namespaces.  For instance, 
 Spanish Wikipedia uses the standard namespaces for e.g. User (Usuario) and 
 Category (Categoría) (that means e.g. Usuario:Foo and User:Foo both work).
 
 However, it uses Wikipedia for the meta namespace.  Under this, it seems by 
 default a new Spanish wiki would use Proyecto, with Project also working and 
 referring to the same page.
 
 Matt Flaschen
 
 

[Wikitech-l] Upcoming jQuery upgrade (breaking change)

2014-05-07 Thread Krinkle
Hey all,

TL;DR: jQuery will soon be upgraded from v1.8.3 to v1.11.x (the latest). This
major release removes deprecated functionality. Please migrate away from this
deprecated functionality as soon as possible.

It's been a long time coming but we're now finally upgrading the jQuery package
that ships with MediaWiki.

We used to regularly upgrade jQuery in the past, but got stuck at v1.8 a couple
of years ago due to lack of time and concern about disruption. Because of this,
many developers have needed to work around bugs that were already fixed in later
versions of jQuery. Thankfully, jQuery v1.9 (and its v2 counterpart) is the
first release in jQuery history to need an upgrade guide[1][2]. It's a
major release that cleans up deprecated and dubious functionality.

Migration of existing code in extensions, gadgets, and user & site scripts
should be trivial (swapping one method for another, maybe with a slight change
to the parameters passed). This is all documented in the upgrade guide[1][2].
The upgrade guide may look scary (as it lists many of your favourite methods),
but they are mostly just addressing edge cases.
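To give a flavour of such swaps, here is a minimal sketch of a few common 
migrations (selector and handler names are made up for illustration):

function handler() {
	// ...
}

// Removed in 1.9: .live(). Use delegated .on() instead.
// Before: $( '.mw-foo' ).live( 'click', handler );
$( document ).on( 'click', '.mw-foo', handler );

// Removed in 1.9: $.browser. Prefer feature detection.
// Before: if ( $.browser.msie ) { ... }
if ( !document.addEventListener ) {
	// old IE code path
}

// Deprecated: the .error() shortcut. Bind the event by name instead.
// Before: $( 'img' ).error( handler );
$( 'img' ).on( 'error', handler );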

== Call to action ==

This is a call for you, to:

1) Get familiar with http://jquery.com/upgrade-guide/1.9/.

2) Start migrating your code.

jQuery v1.9 is about removing deprecated functionality. The new functionality is
already present in jQuery 1.8 or, in some cases, earlier.

3) Look out for deprecation warnings.

Once instrumentation has begun, using ?debug=true will log jQuery deprecation
warnings to the console. Look for ones marked JQMIGRATE [7]. You might also
find deprecation notices from mediawiki.js, for more about those see the mail
from last October [8].
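As a sketch, once Migrate is loaded you can also dump the warnings it has 
collected so far straight from the console:

if ( window.jQuery && jQuery.migrateWarnings ) {
	console.log( jQuery.migrateWarnings.join( '\n' ) );
}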

== Plan ==

1) Instrumentation and logging

The first phase is to instrument jQuery to work out all the areas which will
need work. I have started work on loading jQuery Migrate alongside the current
version of jQuery. I expect that to land in master this week [6], and roll out 
on
Wikimedia wikis the week after. This will enable you to detect usage of most
deprecated functionality through your browser console. Don't forget the upgrade
guide[1], as Migrate cannot detect everything.

2) Upgrade and Migrate

After this, the actual upgrade will take place, whilst Migrate stays. This
should not break anything since Migrate covers almost all functionality that
will be removed. The instrumentation and logging will remain during this phase;
the only effective changes at this point are the things jQuery didn't think 
worth covering in Migrate, plus the many plain bug fixes.

3) Finalise upgrade

Finally, we will remove the migration plugin (both the Migrate compatibility
layer and its instrumentation). This will bring us to a clean version of latest
jQuery v1.x without compatibility hacks.


A rough timeline:

* 12 May 2014 (1.24wmf4 [9]): Phase 1 – Instrumentation and logging starts. This
  will run for 4 weeks (until June 9).

* 19 May 2014 (1.24wmf5): Phase 2 – Upgrade and Migrate. This will run for 3
  weeks (up to June 9). The instrumentation continues during this period.

* 1 June 2014 (1.24wmf7): Phase 3 – Finalise upgrade.


== FAQ ==

Q: The upgrade guide is for jQuery v1.9, what about jQuery v1.10 and v1.11?

A: Those are regular updates that only fix bugs and/or introduce non-breaking
enhancements. As with the jQuery v1.7 and v1.8 upgrades, we can adopt those without any
hassle. We'll be fast-forwarding straight from v1.8 to v1.11.


Q: What about the jQuery Migrate plugin?

A: jQuery developed a plugin that adds back some of the removed features (not
all, consult the upgrade guide[2] for details). It also logs usage of these to
the console.


Q: When will the upgrade happen?

A: In the next few weeks, once we are happy that the impact is reasonably low.
An update will be sent to wikitech-l just before this is done as a final 
reminder.
This will be well before the MediaWiki 1.24 branch point for extension authors
looking to maintain compatibility.


Q: When are we moving to jQuery v2.x?

A: We are not currently planning to do this. Despite the name, jQuery v2.x
doesn't contain any new features compared to jQuery v1 [3]. The main difference
is in the reduced support for older browsers and environments; most notably,
jQuery 2.x drops support for Internet Explorer 8 and below, which MediaWiki
still supports, so that is outside the scope of this work.
Both v1 and v2 continue to enjoy simultaneous releases for bug fixes and new
features. For example, jQuery released v1.11 and v2.1 together[4][5].

-- Krinkle

[1] http://blog.jquery.com/2013/01/15/jquery-1-9-final-jquery-2-0-beta-migrate-
final-released/
[2] http://jquery.com/upgrade-guide/1.9/
[3] http://blog.jquery.com/2013/04/18/jquery-2-0-released/
[4] http://blog.jquery.com/2014/01/24/jquery-1-11-and-2-1-released/
[5] http://blog.jquery.com/2013/05/24/jquery-1-10-0-and-2-0-1-released/
[6] https://gerrit.wikimedia.org/r/131494
[7] https://github.com/jquery/jquery

Re: [Wikitech-l] jquery.accessKeyLabel broke mobile

2014-04-27 Thread Krinkle
The module was split out of mediawiki.util; it's nothing new and needs no 
special treatment here.

It should've been given the same target definition as mediawiki.util.

— Krinkle

On 27 Apr 2014, at 19:37, Jon Robson jdlrob...@gmail.com wrote:

 Looks like  jquery.accessKeyLabel now seems to be a required ResourceLoader
 module.
 
 The patch that introduced it is
 https://gerrit.wikimedia.org/r/#/c/125426/ and this has exploded the
 mobile site, with 49 or 50 tests failing since the merge.
 
 I'm not quite sure whether this module is useful on mobile. I'm not judging
 that here - I just want to flag this as a consideration.
 
 Seems we have 2 options:
 1) Enable the module on mobile if it is applicable. If it's a small library
 maybe decide on whether it is important later.
 2) Revert this core change and rethink this
 
 We need to do _one_ of the above before the next deployment train.
 
 The damage can be viewed at
 http://en.m.wikipedia.beta.wmflabs.org/wiki/Forty-seven_Ronin - basically
 JavaScript exception.
 
 Bug at https://bugzilla.wikimedia.org/show_bug.cgi?id=64512
 
 
 On Sat, Apr 26, 2014 at 10:18 PM, jenkins-no-re...@cloudbees.com wrote:
 
  * FAILURE: MobileFrontend-en.m.wikipedia.beta.wmflabs.org-linux-chrome
  Build #409
  https://wmf.ci.cloudbees.com/job/MobileFrontend-en.m.wikipedia.beta.wmflabs.org-linux-chrome/409/
  (Sun, 27 Apr 2014 04:52:29 +0000)
  Test Result: 50 failed, 20 skipped
  [automated report trimmed; the failures cover UI components, KeepGoing
  messages, special page search, and page issues overlay tests]

Re: [Wikitech-l] Callback for mw.toolbar.addButton() executes on edit view load instead of on click

2014-04-23 Thread Krinkle
On 22 Apr 2014, at 02:37, Lego KTM legoktm.wikipe...@gmail.com wrote:

 On Mon, Apr 21, 2014 at 4:09 PM, Justin Folvarcik jfolvar...@gmail.com 
 wrote:
 
 function removeDuplicateLinks(){
 ..
 }
 if (wgAction == 'edit'){
     mw.toolbar.addButton( {
         imageFile: 'http://localhost/wikidev/images/2/20/Button_cite_template.png',
         speedTip: 'Remove duplicate links',
         callback: removeDuplicateLinks(),
 
 Change this line to callback: removeDuplicateLinks.
 
 Your code had removeDuplicateLinks(), which would execute the
 function, and set the return value as the callback, while you wanted
 the actual function. Simply removing the () fixes that.
 

Indeed. That was the reason it ran on page load, because it was being invoked 
directly by your code when creating the button object.

To Justin: Please also change the code to extend the if-statement from `( 
wgAction === 'edit' )` to be `( wgAction === 'edit' && mw.toolbar )`.


On 22 Apr 2014, at 15:42, Erwin Dokter er...@darcoury.nl wrote:

 
 What I *suspect* you are doing wrong is... using the 'callback:' parameter. I 
 think this is the addButton's function callback. It stands to reason that it 
 would be executed once the addButton function has done its work.
 
 From what I have been able to discern from the badly organized documentation 
 (again, I may be totally misguided), you want to use the 'action:' 
 parameter instead.
 


The mw.toolbar interface neither documents nor implements any such function. 
There is no such thing as button.callback or button.action, and for as long as 
I can remember, such a feature has never existed.

As for documentation, this particular method is quite well-documented:

https://doc.wikimedia.org/mediawiki-core/master/js/#!/api/mw.toolbar-method-addButton

If you need a callback, I'd recommend you re-enable the WikiEditor toolbar 
instead of the legacy classic toolbar, and use its API to add a button instead.

The WikiEditor API provides a wide range of features, including a callback.
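For instance, here is a rough sketch of adding a button with a callback 
through WikiEditor (the section/group/tool names are illustrative, not 
prescribed):

$( '#wpTextbox1' ).wikiEditor( 'addToToolbar', {
	section: 'main',
	group: 'insert',
	tools: {
		removeDuplicateLinks: {
			label: 'Remove duplicate links',
			type: 'button',
			icon: 'http://localhost/wikidev/images/2/20/Button_cite_template.png',
			action: {
				type: 'callback',
				execute: function ( context ) {
					// custom behaviour here
				}
			}
		}
	}
} );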


— Krinkle

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Callback for mw.toolbar.addButton() executes on edit view load instead of on click

2014-04-23 Thread Krinkle
Yes, it will ensure the code won't run if that particular toolbar isn't enabled.
It is also a safeguard for backwards compatibility (since the API used to 
have a different interface), and forwards compatibility (it might change).

Usually you wouldn't need such a guard and would instead declare a dependency. 
But here you don't want to trigger a load of the module: you merely want to 
hook into the toolbar if it is there, and declaring a dependency would load 
another toolbar.
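Put together, a minimal sketch of the guarded pattern (button options taken 
from earlier in this thread):

if ( mw.config.get( 'wgAction' ) === 'edit' && mw.toolbar ) {
	// Only runs when the classic toolbar module happens to be loaded.
	mw.toolbar.addButton( {
		imageFile: 'http://localhost/wikidev/images/2/20/Button_cite_template.png',
		speedTip: 'Remove duplicate links'
	} );
}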

— Krinkle

On 23 Apr 2014, at 14:21, Justin Folvarcik jfolvar...@gmail.com wrote:

 Am I correct in assuming that adding a check for mw.toolbar will help
 prevent the code from causing errors when in an edit view without a toolbar?
 
 
 On Wed, Apr 23, 2014 at 7:14 AM, Krinkle krinklem...@gmail.com wrote:
 
  [...]


[Wikitech-l] Continuous integration updates

2014-04-11 Thread Krinkle
Hey all,

Over the last 2 days, various packages have been upgraded in our Jenkins 
environment.

There shouldn't be any noticeable changes (other than bug fixes and minor 
improvements).

Let me know if you observe build failures that seem to be unrelated to the 
change being tested (e.g. master previously passing now failing even though 
nothing changed).

These updates don't affect jobs using npm-install/npm-test as those use the 
local package.json. This mostly affects the plain *-jslint job and 
mediawiki-core-qunit / mwext-*-qunit jobs.

== Updates ==

grunt-contrib-qunit: v0.2.0 to v0.4.0
• 
https://github.com/gruntjs/grunt-contrib-qunit/blob/784597023e/CHANGELOG

phantomjs: v1.8.1 to v1.9.7
• https://github.com/ariya/phantomjs/blob/1.9.7/ChangeLog

grunt-contrib-csslint: v0.1.2 to v0.2.0
• https://github.com/gruntjs/grunt-contrib-csslint/blob/v0.2.0/CHANGELOG

gruntjs: v0.4.0 to v0.4.1
• https://github.com/gruntjs/grunt/blob/v0.4.1/CHANGELOG
• http://gruntjs.com/blog/2013-03-13-grunt-0.4.1-released

grunt-cli: v0.1.6 to v0.1.13
• https://github.com/gruntjs/grunt-cli/compare/v0.1.6...v0.1.13

jshint: v2.1.11 to v2.4.4
• https://github.com/jshint/jshint/releases
• http://jshint.com/blog/new-in-jshint-oct-2013/
• http://jshint.com/blog/new-in-jshint-dec-2013/


— Krinkle


Re: [Wikitech-l] CSS Regressions

2014-03-14 Thread Krinkle
See also:

https://bugzilla.wikimedia.org/show_bug.cgi?id=62633

-- Krinkle


On 10 Mar 2014, at 22:04, Jon Robson jdlrob...@gmail.com wrote:

 I just wondered if anyone doing MediaWiki development had any
 experience in catching CSS regressions?
 
 We have had a few issues recently in mobile land where we've made big
 CSS changes and broken buttons on hidden-away special pages -
 particularly now that we're involved in the development of mediawiki.ui
 and moving mobile towards using it.
 
 My vision of how this might work is we have an automated tool that
 visits a list of given pages on various browsers, takes screenshots of
 how they look and then compares the images with the last known state.
 The tool checks how similar the images are and complains if they are
 not the same - this might be a comment on the Gerrit patch or an
 e-mail saying something user-friendly like "The Special:Nearby page on
 Vector looks different from how it used to. Please check everything is
 okay."
 
 This would catch a host of issues and prevent a lot of CSS regression bugs.
 
 Any experience in catching this sort of thing? Any ideas on how we
 could make this happen?
 

Re: [Wikitech-l] SpecialPage::getTitle deprecated?!

2014-03-03 Thread Krinkle
On 1 Mar 2014, at 22:35, Jeroen De Dauw jeroended...@gmail.com wrote:

 Hey,
 
 Regarding
 https://github.com/wikimedia/mediawiki-core/blob/793f17481ea937275898c91b9ba3c92c5a3e908b/includes/specialpage/SpecialPage.php#L467-488
 
 So now all extensions that want to retain compatibility with MediaWiki 1.22
 and at the same time not have deprecation notices on 1.23 need to do an
 if-else for all calls to this method? That sucks big time. And for what?
 The old method is just forwarding to the new one. All this hassle for a
 rename? How about removing the wfDeprecated call and not annoying all
 extension maintainers?
 
 Cheers
 

You can also use the MediaWiki release branches (e.g. REL1_22) in
the extension repositories. By default the extension distributor on
mediawiki.org uses these already.

Those branches are automatically cut when MediaWiki core is cut, so
they should work fine with the MediaWiki core release they are
associated with, and in the master branch you can move on and react to
changes in latest master.

-- Krinkle




Re: [Wikitech-l] Re-evaluating MediaWiki ResourceLoader's debug mode

2013-12-17 Thread Krinkle
I love how this thread evolved, +1 on pretty much all previous replies.

A few more thoughts though.

On 10 Dec 2013, at 03:29, MZMcBride z...@mzmcbride.com wrote:
 Ori Livneh wrote:
 On Mon, Dec 9, 2013 at 2:58 PM, Ryan Kaldari rkald...@wikimedia.org
 wrote:
 I am somewhat concerned about the implications for JS debugging here.
 Debugging JS problems with the live sites is already pretty complicated:
 1. debug=true won't reproduce some bugs (usually race condition related)
 
 Yeah, debug mode sucks. I think we need to think it over.
 


Indeed, there are various improvements to be made to debug mode, as well as
various bug fixes, such as the execution scope. In debug mode we currently load
javascript without closures, thus resulting in differences when there are scope
bugs. This should make no difference for “good” code (and for this
definition of good, anything passing jshint is good enough), but we still have
extensions with largely unreviewed javascript, and of course gadgets, site
scripts and user scripts which can contain anything imaginable.


On 10 Dec 2013, at 03:29, MZMcBride z...@mzmcbride.com wrote:
 Currently it goes something like this, as I understand it: by default, all
 CSS and JavaScript is concatenated and minified as much as practically
 possible. If you pass a debug=true URL parameter to index.php, you can
 disable this concatenation and minification of CSS and JavaScript (which
 largely, if not exclusively, come through via load.php/ResourceLoader).
 

Yep, barring a few oversimplified details, this is correct.
A more detailed description: 
https://www.mediawiki.org/wiki/ResourceLoader/Features


On 10 Dec 2013, at 03:29, MZMcBride z...@mzmcbride.com wrote:
 * Minification reduces bandwidth usage.
 ** At the cost of making debugging more difficult.
 
 * You can specify debug=true.
 ** Specifying the URL parameter can damage reproducibility.
 ** URL parameter is non-obvious to just about everyone.
 *** Could add an HTML comment at least pointing people in this direction.
 
 * Minification is a form of obfuscation and it harms the open Web.
 
 I'm not sure what the right answer is here. The damage to reproducibility
 and the open Web hurts a lot. The performance hit may hurt more.
 

Don’t forget that concatenation is also an important factor in reducing
bandwidth. Firstly because gzip is far more effective when applied to a larger
package. Secondly, because the number of http requests can sometimes be a more
significant cause of slow-down than file size, especially on mobile where both
matter very much.

I disagree that it makes debugging more difficult. It certainly adds a level of
abstraction, but I don’t think that level adds real difficulty, especially
considering the tools that browsers ship nowadays for debugging. These have
improved a lot over the years. For one, Chromium-based browsers (Chrome,
Opera, ..) have had for several years now, and recent WebKit releases
(Safari 6) also have, a “Pretty print” or “Original formatting” feature that
will expand minified code on-the-fly without reloading, reparsing, or
reproducing anything. That is enough to allow a developer who is reproducing a
bug to understand context and step through code execution.

It might not give an exact file and line number, but that’s inevitable short
of banning converters (e.g. js closure wrapping, and CSSMin @embed) or compiling
(LESS) completely. What matters is the context of the error and what module it
originates from.

And in most cases you don’t even need pretty-print in order to know context.
Variable and function names in the stack trace usually provide more than enough
information when provided by a bug reporter for developers to know where to
look.

On the subject, there are other things I think we should focus on to improve
debugging in modern browsers: Source maps[1]. Having source maps will actually
provide the one thing we can’t provide right now: Getting original file names
and line numbers without sacrificing compilation or conversion. In fact, we
could drop debug mode entirely (short of its effect on debug-modules being
loaded, it wouldn’t affect the way modules are packaged anymore).
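As a sketch of the mechanism: the minified response would simply end with a 
pragma like the one below (the load.php parameters are hypothetical), and 
devtools then map errors back to original files and line numbers on their own:

// ...end of minified module code (hypothetical load.php parameters below)
//# sourceMappingURL=load.php?modules=mediawiki.util&only=scripts&map=1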

As an example, when working on VisualEditor I never ever use debug=true (not
locally, not in production). It is completely unnecessary and would be rather
slow to request nearly 600 raw js and css files from a server (it used to be
more before we switched to using a compiled version of OOjs UI). For me the
slowness of debug=true hasn’t even been a trade-off; it simply didn’t provide
anything I didn’t already have.

— Krinkle

[1] http://www.html5rocks.com/en/tutorials/developertools/sourcemaps/

Re: [Wikitech-l] chrome new version alway tips event.returnValue

2013-12-17 Thread Krinkle
On 11 Dec 2013, at 18:40, Bryan Davis bd...@wikimedia.org wrote:

 On Wed, Dec 11, 2013 at 9:59 AM, Sen kik...@gmail.com wrote:
 when i open the chrome console,i always can get:
 event.returnValue is deprecated. Please use the standard 
 event.preventDefault()  instead.
 
 
 is any plan to fix this?
 
 I don't know if there is currently a plan to fix it, but the warning
 is from jQuery and should be fixed by version 1.11 or greater [0]. As
 noted on the upstream bug this is just a warning and should have no
 effect on functionality.
 
 [0]: http://bugs.jquery.com/ticket/14282
 

In addition to merely being deprecated (and us using a dated version of jQuery),
it is also harmless if support were to be removed entirely by Chrome.

As far as I know, no version of jQuery ever relied on event.returnValue.
It just implemented the fallback from the standard event.preventDefault in an
odd way that triggered a property access, which (now, years later) trips
Chrome’s deprecation system.

Simplified:

* They were using (jQuery v1.9.1):

  e.defaultPrevented || e.returnValue === false

Which means that if the browser supports the standard e.defaultPrevented,
but it is genuinely set to false (it is a boolean property after all), jQuery
would still look at e.returnValue.

* They are now using (jQuery v1.11.0-pre, latest upstream master):

  e.defaultPrevented ||
  ( e.defaultPrevented === undefined && e.returnValue === false )

-- Krinkle



Re: [Wikitech-l] New collapsible elements behavior or more option for mw-collapsible

2013-11-28 Thread Krinkle
On 2013-11-27, at 22:07, Bartosz Dziewoński matma@gmail.com wrote:

 On Wed, 27 Nov 2013 18:12:53 +0100, Erick Guan fantasticfe...@gmail.com 
 wrote:
 
 The zhwiki's collapsible behavior makes the whole header bar clickable and
 shows a helpful text indicating the collapse state. Thus, the Navbox and
 many templates can be easily expanded by clicking the hotspot (header bar)
 instead of finding the show/hide link to click. What do you think of
 making this behavior the default?
 And now, because we want to make use of upstream jQuery.makecollapsible but
 not break the local collapsible behavior, I think the best way to fix this
 is adding some configs for jQuery.makecollapsible.
 
 Can't you do that already? I made a simple mockup right now: 
 https://pl.wikipedia.org/w/index.php?title=Wikipedysta:Matma_Rex/brudnopisoldid=37990024
  - try clicking on the header. I think this could be easily applied to 
 navboxes.
 
 -- 
 Matma Rex
 


Be careful in this case to not cause a user experience regression. It isn't
obvious that the header can be clicked to expand (or collapse) the content.

Making the header clickable as a whole is imho a nice option, and it's great
that makeCollapsible supports this already (by using the header itself as a
custom toggle). However, depending on how your custom toggle is styled, there
should be a separate toggle or other kind of user interface component to
indicate collapsibility.
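A minimal sketch of that pattern, assuming the jquery.makeCollapsible module 
and its $customTogglers option (the selectors are illustrative):

mw.loader.using( 'jquery.makeCollapsible', function () {
	$( '.navbox' ).makeCollapsible( {
		// Use the header row itself as the click target.
		$customTogglers: $( '.navbox .navbox-title' )
	} );
} );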

-- Krinkle



[Wikitech-l] Tip for Sublime Text editors: DocBlockr plugin and conf for JSDuck

2013-11-19 Thread Krinkle
Hi,

If you're using Sublime Text as your text editor[1], I'd recommend checking out
the DocBlockr plugin[2]. It makes it easier to produce documentation comments.

It helps you through various features such as:
* Autocomplete various @-tags
* Auto-create blocks[3]
* Detect function params and property value types
* And lots more..

It avoids mistakes and speeds up the creation of conformant comments so that
Jenkins/JSDuck won't shout at you.
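For instance, a sketch of the kind of conformant comment it helps produce 
(the names are illustrative):

/**
 * Normalise a page title.
 *
 * @param {string} name Page name, with or without namespace prefix
 * @param {number} [ns=0] Namespace ID
 * @return {string|null} Normalised title, or null if invalid
 */
function normaliseTitle( name, ns ) {
	// ...
	return name;
}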

Though it provides all this by default, I recommend you fine-tune it to your 
liking (and to the specifics of JSDuck).

To deal with the variety in how different projects write documentation comments,
it has various configuration options[4] (e.g. @return vs @returns, @var vs 
@property, type {Boolean}, {boolean} or {bool}, etc.). I've published my 
configuration on GitHub; it might be useful to you to get started[5].

You can install DocBlockr in Sublime the easy way using Package Control[6],
or check out the plugin page[2] for manual installation.

-- Krinkle

[1] http://www.sublimetext.com/
[2] https://github.com/spadgos/sublime-jsdocs
[3] Type /** followed by tab to insert the template, then tab through 
placeholders to fill them in
[4] 
https://github.com/spadgos/sublime-jsdocs/blob/2.11.7/Base%20File.sublime-settings
[5] 
https://github.com/Krinkle/dotfiles/blob/b2088da/hosts/KrinkleMac/SublimePreferences.json#L17-L23
[6] https://sublime.wbond.net/



[Wikitech-l] First round of loggable javascript deprecations

2013-10-30 Thread Krinkle
TL;DR: We're starting to remove long-deprecated methods. First, they'll be 
wrapped in mw.log.deprecate (possibly replaced with dummy values) which will 
produce a stacktrace to the console when used. Example: 
http://cl.ly/image/3W0o131K0D3j

Pre-amble:

* As of MediaWiki 1.17 we started actively developing new javascript features.
* Any code from before that point has been tagged legacy and deprecated as of 
v1.17.0 back in 2011.

Problems:

* There is no easy way to see whether a page is using any of these deprecated 
interfaces.
* We still haven't removed any of the legacy code.
* Though we've added new things (jQuery, jQuery UI, mw.Title, mw.Uri etc.), 
we've been reluctant to iterate further on these new things and haven't been 
able to really refactor or improve them.

We've upgraded jQuery and jQuery UI a few times. But only because there were no 
major changes in backwards compatibility. This changed in 1.9 and that's why 
we're still on 1.8.

It would seem we're in a fixed position – unable to move up, only sideways. On 
the server-side of MediaWiki, deprecation has been a natural process enforced 
socially (plugins not maintained for several release cycles will simply have to 
be disabled). We introduce the new, we migrate uses of the old, we deprecate 
the old, we support both for a while to allow others to migrate their 
stuff, then we remove the old.

In the front-end we never completed such a cycle. But we're getting close to 
the completion of our first cycle now. The legacy scripts deprecated as 
of v1.17.0 have all been removed or wrapped in mw.log.deprecate.
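As a sketch of what that wrapping looks like (the addOnloadHook example is 
illustrative):

// Accessing the wrapped property keeps working, but logs a deprecation
// warning with a stack trace to the console.
mw.log.deprecate( window, 'addOnloadHook', function ( fn ) {
	$( fn );
}, 'Use jQuery( document ).ready() instead.' );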

Please go to your favourite wiki in debug=true mode with a modern browser 
(recent Chrome, Firefox or Opera) and look at the console. Look for any 
uncaught exceptions, deprecation notices or other messages and try to address 
them soon (or report them to the maintainers of the gadgets triggering them).


-- Krinkle

[1] 
https://doc.wikimedia.org/mediawiki-core/master/js/#!/api/mw.log-method-deprecate


Re: [Wikitech-l] MediaWiki core code coverage report

2013-10-18 Thread Krinkle
True, but there's two sides of this.

Requiring the explicit annotation keeps out stuff that shouldn't be 
included (I agree on the wfRunHooks example).

However, depending on how it's implemented, it might also remove the very 
essence of line coverage. We wouldn't want to disable automatic coverage 
entirely (e.g. @cover Foo to mark Foo as 100% covered). It's still useful to 
have it provide the line-by-line coverage of an individual method to see what 
hasn't been covered yet.

Anyway, it looks like it's implemented with this in mind. It's not a boolean 
flag to disable the coverage detection. It's more like a filter to keep 
everything else out. Great :)

-- Krinkle

On 2013-10-18, at 18:38, Erik Bernhardson ebernhard...@wikimedia.org wrote:

 different definitions of "test" ;-)  "code touched" seems like a much less
 useful metric than "code specifically tested", but i could be convinced
 otherwise.
 
 
 On Fri, Oct 18, 2013 at 9:31 AM, Brad Jorsch (Anomie) bjor...@wikimedia.org
 wrote:
 
 On Fri, Oct 18, 2013 at 12:13 PM, Erik Bernhardson
 ebernhard...@wikimedia.org wrote:
 On Fri, Oct 18, 2013 at 8:23 AM, Brad Jorsch (Anomie) 
 bjor...@wikimedia.org
 wrote:
 
 On Fri, Oct 18, 2013 at 5:52 AM, Antoine Musso hashar+...@free.fr
 wrote:
 
 
 In June I enforced a PHPUnit feature which force us to mention which
 MediaWiki function is covered by a test method [FORCE COVER].
 
 Why is this desirable?
 
 
 In my experience this is desirable in order to only mark the particular
 code you are testing as covered.  So for example if you are testing
 database functions, you don't want it to mark wfRunHooks and friends as
 covered when it just happened to be called but wasn't specifically
 tested.
 
 On the other hand, you *did* test part of wfRunHooks, even if indirectly.
 
 
 --
 Brad Jorsch (Anomie)
 Software Engineer
 Wikimedia Foundation
 

[Wikitech-l] New! mw.Title reborn

2013-10-07 Thread Krinkle
Hey all,

Last Friday, the mw.Title rewrite landed in master. Here's a brief summary of 
the changes.

TL;DR:
* New static constructor mw.Title.newFromText (returns mw.Title or null).
* New internal title parser (uses wgLegalTitleChars and mirrors most of 
Title::secureAndSplit).
* Bugs like [[.com]] throwing have been fixed.
* Big thanks to David Chan!

New: Static constructor mw.Title.newFromText

Unlike the regular constructor, this one does not throw exceptions on invalid 
input. Instead it returns null. This should make using mw.Title less awkward.

As a reminder, you are still required to account for invalid input. Where you 
previously wrapped `new mw.Title` in a try/catch (which many users forgot), 
you may now use mw.Title.newFromText and check the result for truthiness.

Examples:

```php
$title = Title::newFromText( $input );
if ( $title ) { .. }
```

```js
title = mw.Title.newFromText( input );
if ( title ) { .. }
```

Regular constructor (old pattern):
```js
try {
	title = new mw.Title( input );
} catch ( e ) { .. }

if ( title ) { .. }
```

New: Title parser

Previously mw.Title had a very basic processor for the text input. It was 
designed to be looser than its PHP counterpart so that it is fast and defaults 
to considering something valid. It is indeed more important to not consider 
something invalid when it is in fact valid, than the other way around. 
Clients should never block an action, and the input has to go through the 
server anyway at some point. Though that design is good, it is not what the 
old code actually did.

In practice mw.Title's old processor considered various things invalid that 
were valid. And it had certain side-effects that weren't very intuitive (it 
removed certain invalid characters so that the title would become valid; in 
other cases it would throw an exception).

The new parser uses wgLegalTitleChars [2] and mirrors most of 
Title::secureAndSplit. For this we had to convert the character sequences in 
wgLegalTitleChars from bytes to Unicode. Big thanks to David Chan for making 
this work!

A full list of bugs fixed and patterns now properly recognised as valid and 
invalid can be found in the commit message[1] and by examining the added test 
cases.

Various bugs have been fixed (e.g. titles starting with dots, such as [[.com]], 
throwing an exception).

Documentation: 
https://doc.wikimedia.org/mediawiki-core/master/js/#!/api/mw.Title

-- Timo

[1]
https://gerrit.wikimedia.org/r/#/c/83047/
https://github.com/wikimedia/mediawiki-core/commit/4894793ab60ea0a245372cb472150b4ed79d19f4
[2]
https://gerrit.wikimedia.org/r/#/c/82040/
https://github.com/wikimedia/mediawiki-core/commit/dc9c9ee7fc6d96f957e15b4f56276000cb8e8f06
[3]  https://doc.wikimedia.org/mediawiki-core/master/js/#!/api/mw.Title


Re: [Wikitech-l] RfC update: LESS stylesheet support in core

2013-09-22 Thread Krinkle
On Sep 19, 2013, at 11:19 PM, C. Scott Ananian canan...@wikimedia.org wrote:

 @dan: the particular "less isn't very powerful" issues I'm concerned about
 are the ones solved by compass.  As is well-known, there is no equivalent
 to compass for less, and there is not likely ever to be, since less cannot
 express the transformations required.  Compass uses ruby code to do this w/
 sass.  For example,
 https://github.com/chriseppstein/compass/blob/stable/lib/compass/sass_extensions/functions/gradient_support.rb is
 the code in compass in order to generate clean gradient specifications
 that work with all major browsers (including synthesizing SVG background
 images where required).  (Spec in
 http://compass-style.org/reference/compass/css3/images/ ).  Now, maybe we
 don't actually need all that power.  But the automatic cross-browser
 compatibility it allows sure is nice...
  --scott
 ​

As Ori pointed out, these could be implemented as Less functions in PHP land.

However note that (for me) one of the reasons I prefer Less over Sass with Compass 
is that it allows more fine-grained control over browser support and doesn't 
take the insane approach of trying to support everything and leaving you with 
very minimal options to switch things off.

For example, in the case of older browsers supporting SVG background-image but 
not CSS3 gradients, I'd say just drop that and only provide CSS3 gradient + 
vendor prefixed versions thereof and have it fall back for the rest. You're 
going to have to provide that fallback anyway for browsers that support 
neither CSS3 nor SVG.

Optimise for the future, not for the past (those browsers are getting smaller 
in usage). It's not worth it to generate and serve that SVG to all clients.



Re: [Wikitech-l] ResourceLoader support question: how to construct a value in CSS from PHP

2013-08-25 Thread Krinkle
On Jul 3, 2013, at 4:47 AM, Ori Livneh o...@wikimedia.org wrote:

 On Tue, Jul 2, 2013 at 11:52 PM, Thomas Gries m...@tgries.de wrote:
 
 Question:
 =
 How to contruct the background-image filename from a value in one of the
 OpenID PHP modules ?
 
 
 For a single rule, you can get away with something like:
 
 $wgHooks[ 'BeforePageDisplay' ][ ] = function ( $out, $skin ) {
     $out->addHeadItem( 'oauth-provider', '<style>.oauth { color: red; }</style>' );
     return true;
 };

Please don't. Adding additional script or style tags like this must be avoided.

Even for just one rule this is imho not acceptable for new code under any 
circumstances.

For dynamically generated css, create a RL module that returns the generated 
CSS instead of the contents from a file on disk.

RL modules specifically allow this. In fact, the module base class makes no 
assumptions about being associated with a file; only 
ResourceLoaderFileModule implements this. For example, 
ResourceLoaderUserCSSPrefsModule generates its CSS as part of the module, and 
ResourceLoaderWikiModule loads it from wikipage content.

Though there are more benefits, here are a few that should be convincing enough 
to never use raw style tags again. The first two are the most important, as they 
aren't enhancements but actual bugs that can arise as a result of not using 
ResourceLoader:

* Not being cached as part of the page output.
* Automatically being run through CSSJanus for proper RTL-flipping.

* Keeps the output page smaller, by sharing a common cache (the load.php 
request with 30 days+ cache expiration).
* Automatically being run through CSSMin for compression.
* Better gzip compression for the css selectors (e.g. class name 
mw-oauth-button) and rules (e.g. phrase background-image used in other 
stylesheets as well) by being part of a request that also includes other 
stylesheets and scripts.


-- Krinkle



Re: [Wikitech-l] RFC: LESS support in MediaWiki core

2013-08-22 Thread Krinkle
On Aug 20, 2013, at 2:31 AM, Tyler Romeo tylerro...@gmail.com wrote:

 As long as the change does not inhibit extensions from hooking in and using
 other CSS pre-processors, I don't see any issue with using LESS in core.
 


However, if and when we adopt LESS support in core (which only happens if we
also incorporate it into our MediaWiki coding conventions), complying
extensions will, by convention, not be allowed to use other
pre-processors.

However I agree that ResourceLoader in general should be agnostic and allow
implementation and usage of whatever you want in your own extensions.

From quickly looking at the draft patch set and the existing
extension[1][2] that already implements this, we can conclude that this is
already the case, and I'll hereby vouch for continued support of such
extensibility for other pre-processors as well. However, core should only
(at most) include one.

-- Krinkle

[1] https://www.mediawiki.org/wiki/Extension:Less
[2] https://github.com/wikimedia/mediawiki-extensions-Less


Re: [Wikitech-l] Why isn't hotcat an extension?

2013-07-18 Thread Krinkle
On Jul 17, 2013, at 11:52 PM, Yuvi Panda yuvipa...@gmail.com wrote:

 It's universally liked, is there almost on every wiki, and provides a
 much needed functionality. Why isn't this deployed as an extension, or
 better yet - part of core, than as a gadget? Just a matter of someone
 to do the work?
 
 --
 Yuvi Panda T
 http://yuvi.in/blog
 

https://www.mediawiki.org/wiki/Extension:InlineCategorizer

-- Krinkle


Re: [Wikitech-l] [Wikitech-] ResourceLoader loading extension CSS dynamically?

2013-06-05 Thread Krinkle
On Jun 5, 2013, at 2:43 PM, vita...@yourcmc.ru wrote:

 Hello!
 
 I've got a serious issue with ResourceLoader.
 
 WHY is it made to load extension styles DYNAMICALLY using 
 JavaScript?
 
 It's a very bad idea: it leads to page style flickering during load. I.e. 
 first the page is displayed using only skin CSS and then you see how 
 extension styles are dynamically applied to it. Of course it's still rather 
 fast, but it's definitely noticeable, even in Chrome.
 
 Why didn't you just output <link rel="stylesheet" href="load.php?ALL 
 MODULES" /> ??
 

If you need them in the HTML and they are really important, you can use 
addModuleStyles instead.

However, use it sparingly, as it makes it harder to cache things 
properly and produces slightly more http request overhead.

The client side loader allows all kinds of useful cache features, load 
optimisations, dependency resolution and rapid global cache purging without 
requiring a purge of every single page of the wiki. This is what enables 
Wikipedia to deploy new code in ~ 5 minutes globally without invalidating 
anything in the HTML.

Historically speaking relying on javascript to load styles may seem odd, but 
nowadays it is worth it and is in many cases even faster due to being 
non-blocking (where appropriate).

Loading with javascript also doesn't mean that it will flicker. That only 
happens if the module is loaded from the bottom.
Set 'position' => 'top' in your module and load with addModules() to load it 
from the top. That should fix the flash of unstyled content.

-- Krinkle

[1] http://www.mediawiki.org/wiki/Manual:$wgResourceModules
[2] http://www.mediawiki.org/wiki/ResourceLoader/Features


Re: [Wikitech-l] Coding style: Language construct spacing

2013-05-14 Thread Krinkle
On May 10, 2013, at 3:22 AM, Tim Starling tstarl...@wikimedia.org wrote:

 On 09/05/13 10:26, Krinkle wrote:
 I'm obviously biased, but I think the same goes for require_once
 (and include, require etc.). Right now this is causing quite a few
 warnings in our php-checkstyle report.
 
 include and require return a value, so they are more like functions
 than return or print. See e.g. ResourceLoader.php:
 
 $this->register( include( "$IP/resources/Resources.php" ) );
 
 You could compare them to pseudo-functions like empty and isset. I
 suppose you could write:
 
 $this->register( include "$IP/resources/Resources.php" );
 
 But that looks kind of weird to me.
 
 -- Tim Starling
 
 

It might take some getting used to, but I don't think it looks off.

It is similar to the new keyword, which also returns a value.

$masterdb->add( new SlaveDB() );

-- Krinkle



Re: [Wikitech-l] RFC: New approach to release notes

2013-05-08 Thread Krinkle
On May 7, 2013, at 8:56 PM, Bartosz Dziewoński matma@gmail.com wrote:

 On Tue, 07 May 2013 20:51:07 +0200, Krinkle krinklem...@gmail.com wrote:
 
 It is the duty of repository co-owners to make wise decisions beyond
 just code quality. About what changes go in what release (if at all),
 whether the introduced features are in the best interest of the users
 and that we can maintain them and are willing to support them. And to
 be aware of whether a change is breaking or not, and if so if whether
 the change should still go in the current release or a the next (e.g.
 don't remove/break without deprecation first, if possible).
 
 So in other words, this puts more burden on reviewers, making it harder to 
 get changes merged, especially for new users?
 
 Because that's what this sounds like. Changes are already rotting in gerrit 
 for a year (see the recent watchlist thread), and this certainly will not 
 help.
 
 The current process for release notes is fine; we just need someone to write 
 a custom merge driver for JGit to avoid the merge conflicts. This is a 
 technical issue, not a policy one.
 


How does this make anything harder for new users? If anything, it makes it 
easier by not having to worry about which file to edit, what to put in it etc.

As for more burden on reviewers, I disagree. It might inspire them to give more 
care to proper commit subjects (and edit them liberally as needed) because if 
you leave it in bad shape, it needs to be fixed later in the notes.

And here again, it simplifies things by maintaining release notes centrally and 
away from inside the tight development loop. Some of the same people will be 
doing both, but in the right frame of mind, rather than as an afterthought.

The current process for release notes is not fine. It hasn't been fine for at 
least 2 or 3 releases. It is missing a lot, and what's there is (imho) poor 
quality (my own notes included).

Improving that doesn't require moving the process, but I think this is an 
opportunity to fix a mess the right way at once.

-- Krinkle



[Wikitech-l] Coding style: Language construct spacing

2013-05-08 Thread Krinkle
Hi,

Since there appears to have been a little bit of trivia around fixing
these phpcs warnings, I'll open a thread instead.

Both in javascript and PHP there are various keywords that can be used
as if they are functions. In my opinion this is a misuse of the
language and only causes confusion.

I'm referring to code like this (both javascript and php):

delete( mw.legacy );

new( mw.Title );

typeof( mw );

echo( $foo . $bar );

print( $foo . $bar );

return( $foo . $bar );

… and, wait for it.. 

require_once( $foo . $bar );

I think most experienced javascript developers know by now that using
delete or new like it is a function is just silly and looks like
you don't know what you're doing.

To give a bit of background, here's why these work at all (they aren't
implemented as both keywords and functions, just keywords). Though I'm
sure the implementation details differ between PHP and javascript, the
end result is the same: Keywords are given expressions which are then
evaluated and the result is used as value. Since expressions can be
wrapped in parenthesis for readability (or logic grouping), and since
whitespace is insignificant to the interpreter, it is possible to do
`return(test)`, which really is just `return (test)` and
eventually `return test`.
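A quick illustration of that equivalence (wrapped in functions so it parses 
standalone):

function a() { return ( 'test' ); }
function b() { return('test'); }
function c() { return 'test'; }
// All three return the same string; the parentheses belong to the
// expression, not to the return keyword.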

I'm obviously biased, but I think the same goes for require_once
(and include, require etc.). Right now this is causing quite a few
warnings in our php-checkstyle report.

I didn't disable that rule because it appears (using our code base as
status quo) that we do want this. There are zero warnings I could find in
our code base that violate this, except for when the keyword is
include|require(_once)?

The checkstyle sniffer does not (and imho should not) have a separate
rule per keyword. Either you use constructs like this, or you don't.

But let's not have some weird exception just because someone didn't
understand it[1] and we all copied it and want to keep it for no
rational reason.

Because that would mean we either have to hack the sniffer to exclude
this somehow, or disable the rule entirely, thus not catching the
ones we do care about.

See pending change in gerrit that does a quick pass of (most of) these
in mediawiki/core:

https://gerrit.wikimedia.org/r/62753


-- Krinkle

[1] Or whatever the reason is the author originally wrote it like
this. Perhaps PHP was different back then, or perhaps there was a
different coding style.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] RFC: New approach to release notes

2013-05-07 Thread Krinkle
On May 7, 2013, at 5:52 PM, Brad Jorsch bjor...@wikimedia.org wrote:

 On Mon, May 6, 2013 at 10:09 PM, Krinkle krinklem...@gmail.com wrote:
 On May 3, 2013, at 9:33 PM, Anomie wrote:
 
 Taking a recent example[1], please tell me how to compress the
 following into 62 characters:
 
 (in the New features section)
 
 * (bug 45535) introduced the new 'LanguageLinks' hook for manipulating the
 language links associated with a page before display.
 
 (in the API section)
 
 * BREAKING CHANGE: action=parse no longer returns all langlinks for the page
 with prop=langlinks by default. The new effectivelanglinks parameter will
 request that the LanguageLinks hook be called to determine the effective
 language links.
 * BREAKING CHANGE: list=allpages, list=langbacklinks, and prop=langlinks do not
 apply the new LanguageLinks hook, and thus only consider language links
 stored in the database.
 
 I don't think Add LanguageLinks hook with breaking changes to 4 API
 modules is detailed enough for release notes. And before you try to
 cheat and split it into multiple commits, note that the new hook and
 what it means for how langlinks are stored in the database is what is
 the breaking change in those API modules; the actual changes to the
 API modules are just mitigating or noting it.
 
 The summary actually used for that revision, BTW, was (bug 45535)
 Hook for changing language links.
 
 [1]: https://gerrit.wikimedia.org/r/#/c/59997/
 
 
 Though this is not a typical type of change and I think you already
 know the answer, I'll give you my take on this one.
 
 As commit subject (and thus release notes change log entry) I'd use:
 
 Add hook LanguageLinks for changing langlinks before display
 
 Oh, so not mentioning the breaking API change at all? Definitely not good.
 
 2) I'm not sure why you'd make ApiParse not call the hook by default.
 An option to get the raw langlinks may be useful (I'd be curious as to
 the use cases, but I can imagine), but doing so by default seems odd.
 
 I suggested that too. The Wikidata people disagreed, and I didn't feel
 like arguing over it.
 
 This change is a typical case where extra-awareness notes are in
 order. I personally wouldn't consider these breaking changes, but
 anyway, they are certainly important. So these are the kind of
 changes for which you'd include notes in a separate section.
 
 How do these extra notes get noted wherever you intend for them to
 be noted? That seems to be missing from the proposal.
 
 And this brings me back to my concern that others will incorrectly
 think they know what is considered a breaking change in the API.
 


Extra notes are added as usual, except now on the release notes wiki
page instead of in the file in version control.

Now I hear you thinking: does that mean every patch contributor now
needs to know about this wiki page, wait for the change to be
approved, and then edit the page to add notes? No.

It is the duty of repository co-owners to make wise decisions beyond
just code quality: about what changes go in what release (if at all),
whether the introduced features are in the best interest of the users,
and whether we can maintain them and are willing to support them. And to
be aware of whether a change is breaking or not, and if so, whether
the change should still go in the current release or in the next (e.g.
don't remove/break without deprecation first, if possible).

As such, the co-owners of a repository (not necessarily the patch
contributor) will know these things and should take care of (or delegate)
adding proper release notes, as they deem appropriate for the
users of the component(s) they maintain.

This also means that there's no uncomfortable waiting time between
submission and (if needed for this particular change) the editing of
the wiki page, because you can write the notes right after you approve a
change (or do it later).

-- Krinkle






___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] RFC: New approach to release notes

2013-05-06 Thread Krinkle
On May 3, 2013, at 9:33 PM, Anomie wrote:

 On Fri, May 3, 2013 at 1:02 PM, Krinkle wrote:
 First of all, I think a lot of our commit subjects are poorly written,
 even for a commit message. Having said that, a good commit subject is
 also a good release note (that is, if the change itself is notable for
 release notes). I don't think that these extensive paragraphs of text
 we are known for in release notes are a good habit.
 
 In my opinion, a good commit summary and a good release note are not
 necessarily the same thing at all, otherwise we could just dump the
 git log as the release notes and be done with it. Release notes
 *should* go into sufficient detail to tell the reader what it is they
 should be noting.
 

I believe that a (filtered) list of good summaries is indeed sufficient for the 
release notes. The projects I referenced as examples already prove this. I 
don't think it is realistic to expect a different type of message to be written 
for the release notes of each change.

There are some changes (far from the majority) that require special 
attention, for example when:

* the site admin needs to make changes to their configuration prior to or 
during upgrading
* the site admin needs to update specific extensions at the same time due to 
breaking compatibility
* an extension maintainer should make changes soon due to deprecation of a 
feature
* an extension maintainer needs to ensure changes are made due to removal of a 
feature 
* etc.

However, a more elaborate entry in the list of changes doesn't really stand out 
among the others. In those cases, as I believe we already do for the most part, 
the text in question should be written as a paragraph in the Compatibility, 
Upgrading or similar sections.

 We already have a 62-char limit for the commit subject. That seems to
 be going well. Assuming that we properly summarise changes that way
 already, why would one need more room in the release notes? It is the
 same event.
 
 Taking a recent example[1], please tell me how to compress the
 following into 62 characters:
 
 (in the New features section)
 
 * (bug 45535) introduced the new 'LanguageLinks' hook for manipulating the
  language links associated with a page before display.
 
 (in the API section)
 
 * BREAKING CHANGE: action=parse no longer returns all langlinks for the page
  with prop=langlinks by default. The new effectivelanglinks parameter will
  request that the LanguageLinks hook be called to determine the effective
  language links.
 * BREAKING CHANGE: list=allpages, list=langbacklinks, and prop=langlinks do not
  apply the new LanguageLinks hook, and thus only consider language links
  stored in the database.
 
 I don't think Add LanguageLinks hook with breaking changes to 4 API
 modules is detailed enough for release notes. And before you try to
 cheat and split it into multiple commits, note that the new hook and
 what it means for how langlinks are stored in the database is what is
 the breaking change in those API modules; the actual changes to the
 API modules are just mitigating or noting it.
 
 The summary actually used for that revision, BTW, was (bug 45535)
 Hook for changing language links.
 
 [1]: https://gerrit.wikimedia.org/r/#/c/59997/


Though this is not a typical type of change and I think you already
know the answer, I'll give you my take on this one.

As commit subject (and thus release notes change log entry) I'd use:

Add hook LanguageLinks for changing langlinks before display

Regarding the change itself:

1) I think this hook should be renamed, as it is ambiguous. It could be a
hook for changing langlinks when parsing/saving a page (input) or a
hook to implement a custom source of langlinks when requesting them
(output).

2) I'm not sure why you'd make ApiParse not call the hook by default.
An option to get the raw langlinks may be useful (I'd be curious as to
the use cases, but I can imagine), but doing so by default seems odd.

Regarding the release notes:

This change is a typical case where extra-awareness notes are in
order. I personally wouldn't consider these breaking changes, but
anyway, they are certainly important. So these are the kind of
changes for which you'd include notes in a separate section.

Which brings me to another point.

No doubt these are breaking changes, but in a way almost every change
is potentially a breaking change for something, for someone,
somewhere.

The kind of changes that break things we previously supported should
be noted in those separate sections. The result is a change log that is
skimmable for the curious and contains only summaries of notable changes,
while anything that requires attention (breaking changes for site admins
or extension maintainers, and configuration changes) is clearly separate
and meant to be read first by all.

-- Krinkle


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Extensions History in enwiki

2013-05-04 Thread Krinkle
On May 4, 2013, at 8:41 PM, Lukas Benedix bene...@zedat.fu-berlin.de wrote:

 Hi,
 
 I'm looking for the history of extension usage in enwiki. I asked for that in 
 the IRC and was told that all the information can be found in the two files: 
 CommonSettings.php and InitialiseSettings.php.
 
 One of my main problems is that there are so many ways extensions get 
 included in the CommonSettings.php…
 
 for example:
 include( $IP . '/extensions/Renameuser/Renameuser.php' );
 include( "$IP/extensions/AntiBot/AntiBot.php" );
 include $IP . '/extensions/AntiSpoof/AntiSpoof.php';
 include "$IP/extensions/WikimediaMessages/WikimediaMessages.php";
 require( "$IP/extensions/Oversight/HideRevision.php" );
 require_once( "$IP/extensions/LocalisationUpdate/LocalisationUpdate.php" );
 require "$IP/extensions/UserDailyContribs/UserDailyContribs.php";
 (I don't think this list is complete)
 
 The next problem is that I have to look at the InitialiseSettings.php for a 
 lot of extensions:
 
 CommonSettings:
 if ( $wmgUserDailyContribs ) {
 require "$IP/extensions/UserDailyContribs/UserDailyContribs.php";
 }
 
 InitialiseSettings:
 'wmgUserDailyContribs' => array(
 'default' => true,
 ),
 
 
 The other problem is getting all versions of these two files and correlating 
 them to figure out which extensions were live in enwiki.
 
 
 
 Another idea was to look at the History of Special:Version 
 http://web.archive.org/web/20120126235208*/http://en.wikipedia.org/wiki/Special:Version
  but I don't think the history there is complete (there is a big gap in 2007).
 
 
 Can anyone help me create a list like this for enwiki:
 
 EXTENSION_NAME; DEPLOYMENT_DATE; REMOVAL_DATE;
 
 
 kind regards,
 
 Lukas

I don't think there is such a record readily available.

However it should be relatively straightforward to develop something that 
could generate the data you're looking for.

The program would:

* Browse through the git history of mediawiki-config.git (optionally picking 
one commit per day, or every commit if you like)
* Use a PHP interpreter to run CommonSettings.php with site context 
dbname=enwiki. No need to explicitly have it read InitialiseSettings.php, since 
that is referenced from CommonSettings.php
* Fetch the list of run-time included php files
* Filter for $IP/extensions/*/*.php
* Strip to match only the directory name
* Insert into a database of some sort with the timestamp of the commit

From there you'll be able to determine exactly when and for how long an 
extension was deployed.
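
For the middle steps, a minimal sketch in PHP (assuming a mediawiki-config
checkout; forcing site context by presetting the dbname is an assumption here,
since the real config derives the wiki from the requested hostname, so some
stubbing would likely be needed):

<?php
// probe.php - hypothetical; run from the root of a mediawiki-config checkout
// at a given commit.
$wgDBname = 'enwiki'; // force enwiki context (assumption, see above)
require __DIR__ . '/wmf-config/CommonSettings.php';

$extensions = array();
foreach ( get_included_files() as $file ) {
    // Keep only files matching $IP/extensions/<Name>/<entry>.php,
    // stripped down to the extension directory name.
    if ( preg_match( '%/extensions/([^/]+)/[^/]+\.php$%', $file, $m ) ) {
        $extensions[ $m[1] ] = true;
    }
}
echo implode( "\n", array_keys( $extensions ) ) . "\n";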

-- Krinkle

PS: The public mediawiki-config.git only dates back to 24 Feb 2012. Before that 
date the information was in a non-public svn repository inside the 
wmf-production cluster.




___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] RFC: New approach to release notes

2013-05-03 Thread Krinkle
On May 2, 2013, at 7:41 PM, Brad Jorsch bjor...@wikimedia.org wrote:

 I like the idea of not having to mess with RELEASE-NOTES-#.## merge
 conflicts. But I'm not entirely sure about everything in the proposal.
 
 On Thu, May 2, 2013 at 12:30 PM, Krinkle krinklem...@gmail.com wrote:
 For more details see Proposal 2 on this page:
 
 https://www.mediawiki.org/wiki/Requests_for_comment/Release_notes_automation#Proposal_2:_Proper_commit_message
 
 To avoid wrong entries for bugs that were merely mentioned in the commit
 message but not fixed in that commit, it becomes more important that we
 enforce a consistent format. "bug 123" is a mention. "Bug: 123" in footer
 meta-data indicates the bug was fixed.
 
 So we're changing the semantics of that Bug footer already? Instead
 of being used for searching changesets related to a bug, now it's only
 for searching changesets actually fixing a bug? And what about
 changesets that only partially fix a bug? Or bugs where changes to
 both core and extensions are needed to fix a bug?
 

Fair point. We could use some other detectable way instead. However
I'd prefer to keep things simple, where "bug 123" is a mention and
"Bug: 123" a resolution. Both are linkified by Gerrit, so that's not
an issue.

We could introduce something like "Bug-Fixed: 123", but I'm not sure
more syntax is the answer here. Note we also have people still using
"(bug 123)" in the subject, and even people copying Bugzilla's
format of "Bug 123: title".

Or we could simply omit this detection altogether and build the list
from Bugzilla instead (resolved bugs marked as fixed for the
appropriate milestone). Then we'd only have to make sure periodically
that recently closed bugs without a milestone set get one. I believe
several people perform such a query already and fill in the missing
milestones.

 We can't use the commit message subject because they aren't always the same
 as what you want to write in the release notes.
 
 So are we getting rid of the recommended less-than-50-character limit
 on commit subjects? Or are we assuming that whoever volunteers to
 clean up the subject-dump will expand on it as necessary?
 

No, the commit subject character limit is there for a good reason. We
shouldn't change that for this.

First of all, I think a lot of our commit subjects are poorly written,
even for a commit message. Having said that, a good commit subject is
also a good release note (that is, if the change itself is notable for
release notes). I don't think that these extensive paragraphs of text
we are known for in release notes are a good habit.

We already have a 62-char limit for the commit subject. That seems to
be going well. Assuming that we properly summarise changes that way
already, why would one need more room in the release notes? It is the
same event.

There are cases where it is important to write more elaborate notes:
maintenance notes for extension developers, and migration/upgrading notes
for site administrators.

And yes, when the release notes are processed on the wiki page, things
that need further documentation can be freely written in (for example)
an "Upgrade" or "To extension maintainers" section, instead of being
scattered through the bullet point lists.

For example:

 == Changes ==
 * foo: Upgrade to version 1.20
 * bar: Don't silently ignore invalid values in event handler
 * ..
 
 == To extension maintainers ==

 Various bar plugins are known to call the event handler with an object instead
 of an array containing the object. This was previously silently skipped.
 This failure is now explicit by throwing an invalid arguments exception.
 


 Mark entries with the relevant labels (New, Removed, Breaking etc.)
 
 I worry that whoever is doing this for API changes (if it doesn't wind
 up being me) won't know how we determine Breaking for API changes.
 I.e. anything that's not backwards-compatible for API users.
 

If you don't know what you're doing, don't do it?

I think we can reasonably assume good faith that people won't go and
mislabel things on the wiki page as breaking for a component they
are not knowledgeable in.

Plus, it is a wiki page, so it'll be a lot easier to keep track of and
fix for everyone involved.

This will make that easier, since it basically amounts to doing a
post-merge review of activity to some degree.

-- Krinkle






___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] RFC: New approach to release notes

2013-05-02 Thread Krinkle
Hi all,

Our current release note strategy is clearly not working.

Too often, people omit them: either because they consider the
commit not important enough for release notes, or because it is a pain
due to the continuous merge conflicts. Not to mention that fix-ups
and small clarifications are often annoying to do. It also often
results in inconsistent formatting of the notes, as everybody does it
differently.

For this and other reasons[1], I'd recommend we stop updating release
notes through git commits in a separate file.

For more details see Proposal 2 on this page:

https://www.mediawiki.org/wiki/Requests_for_comment/Release_notes_automation#Proposal_2:_Proper_commit_message

It is important that we settle quickly and perfect later, since this is
already affecting our current release cycle.

In case people worry about the scalability of the approach, I'm
willing to take this on for the near future. However I'm convinced we
have enough people who care and will filter the incoming stream on the
wiki page. Simply look at the git history of release notes files (I
expect at least Reedy and hexmode will naturally be drawn to this).

-- Krinkle

[1] Other reasons include keeping the change atomic and easy to
backport. For what it's worth, Git[2], Node.js[3] and many jQuery
projects such as QUnit[4] already maintain release notes this way.
[2] https://github.com/git/git/commit/v1.8.2.2
[3] https://github.com/joyent/node/commit/v0.10.5
[4] https://github.com/jquery/qunit/commit/v1.11.0


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Reminder about the best way to link to bugs in commits

2013-03-21 Thread Krinkle
On Mar 20, 2013, at 11:59 AM, Niklas Laxström niklas.laxst...@gmail.com wrote:

 On 1 March 2013 23:46, Chad innocentkil...@gmail.com wrote:
 Bug: 1234
 Change-Id: Ia90.
 
 
 So when you do this, you're able to search for bug:1234 via Gerrit.
 By doing this, you're also removing it from the first line (which was
 our old habit, mostly from SVN days), providing you more space to
 be descriptive in that first line.
 
 Few questions:
 
 1) Why is Bug:43778 different from bug:43778 when searching?
 

Because it doesn't literally search for "Bug:123" (even though in our case it 
looks that way, because the footer is also "Bug: 123").

There is a search operator (bug) which is linked to a footer name ("Bug:") and a 
match pattern (\\#?\\d{1,6}) for the value that is to be indexed.
Just like project, owner, branch and topic are search operators linked to 
certain values. The operators are case-sensitive and always lowercase by 
convention.

The footer being clickable is done independently.
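
For illustration, the linkage looks roughly like this in Gerrit's site config.
This is a sketch from memory of the Gerrit 2.x trackingid section, not a copy
of Wikimedia's actual configuration:

[trackingid "bugzilla"]
  footer = Bug:
  match = \\#?\\d{1,6}
  system = Bugzilla

Any footer line whose name and value match these is indexed, which is what
makes the commit findable via the bug: search operator.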

-- Krinkle



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Replacement for tagging in Gerrit

2013-03-13 Thread Krinkle
Can git notes be changed after merge?

Do they show in Gerrit diffs, so we don't accidentally lose them if someone 
else pushed an amend without the tags?

Can they be removed after merge?

Can one query for commits having a certain tag within a repo/branch/author? 
(e.g. fixme commits by me in mw core)

If not displayed in Gerrit, then from the git CLI via a web tool (how does that 
grep perform; is it fast enough?)

Of course, changing tags would still be done from the CLI (not with the web tool).

-- Krinkle

On 14 mrt. 2013, at 00:07, Christian Aistleitner christ...@quelltextlich.at 
wrote:

 Hi,
 
 On Tue, Mar 12, 2013 at 08:28:25PM -0700, Rob Lanphier wrote:
 The Bugzilla-based solution has some of the advantages of the
 MediaWiki-based solution.  We may be able to implement it more quickly
 than something native to Gerrit because we're already working on
 Bugzilla integration, and we get features like queries for free, as
 well as the minor convenience of not having to have a new database
 table or two to manage.
 
 The problem at this point is that the gerrit-plugin interface is
 rather new, and that shows at various ends:
 * It's not possible to add GUI elements from a plugin.
 * Plugins cannot add their own database tables through gerrit.
 [...]
 
 So whatever we could possibly get into upstream gerrit, we should
 really try to get into upstream and not put into the plugin.
 
 But thinking about whether gerrit or bugzilla would be the correct
 place to store those tags ...
 It seems to me that the tags are not really tied to issues or changes,
 but rather to commits...
 Wouldn't git notes be a good match [1]?
 
 They're basically annotations attached to commits. You can add/remove
 them to any commit without changing the actual commit. And support
 comes directly with git, as for example in
 
  git log --show-notes
 
 Queries are just a grep away, and setting them can be done through git
 as well, until we integrate that into gerrit.
 
 Best regards,
 Christian
 
 P.S.: We already use git notes. They contain links to code
 review. But we could add lines like
 Tag: fixme
 and have them automatically show when reading the logs.
 
 
 -- 
  quelltextlich e.U.  \\  Christian Aistleitner 
   Companies' registry: 360296y in Linz
 Christian Aistleitner
 Gruendbergstrasze 65aEmail:  christ...@quelltextlich.at
 4040 Linz, Austria   Phone:  +43 732 / 26 95 63
 Fax:+43 732 / 26 95 63
 Homepage: http://quelltextlich.at/
 ---
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Extensions meta repo problem with MaintenanceShell

2013-03-05 Thread Krinkle
On Mar 3, 2013, at 7:04 AM, Chad innocentkil...@gmail.com wrote:

 On Sat, Mar 2, 2013 at 9:56 PM, Jeremy Baron jer...@tuxmachine.com wrote:
 On Sun, Mar 3, 2013 at 5:50 AM, Brion Vibber br...@pobox.com wrote:
 Is anybody else seeing this when running 'git submodule update' in a
 checkout of the extensions repo?
 
  fatal: reference is not a tree: beead919cac17528f335d9409dfcada12e606ebd
  Unable to checkout 'beead919cac17528f335d9409dfcada12e606ebd' in
 submodule path 'MaintenanceShell'
 
 Seems like the submodule's gotten broken somehow?
 https://gerrit.wikimedia.org/r/51887 attempts to fix it manually...
 
 Well it does exist:
 
 https://gerrit.wikimedia.org/r/gitweb?p=mediawiki%2Fextensions%2FMaintenanceShell.git;a=commit;h=beead919cac17528f335d9409dfcada12e606ebd
 
 But that's not in the log of the current master. Must have had a force
 push bypassing review. (which makes sense if you look at the history)
 Maybe not updating the parent repo is a gerrit bug.
 
 
 The auto-updating submodule magic only works if you're pushing
 through Gerrit. Skip Gerrit, and you don't get the benefits of the
 magic submodules.
 
 -Chad

Even if, after a force push, changes are merged by Gerrit the normal way?

I did a one-time import of the original history, replacing the empty 
repository.

After that I merged 3 changes via Gerrit[1] and there have been no forced
pushes since.

-- Krinkle

[1] as indicated by the pink ref/changes labels:
https://gerrit.wikimedia.org/r/gitweb?p=mediawiki/extensions/MaintenanceShell.git;a=summary
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] QUnit testing in Jenkins

2013-03-05 Thread Krinkle
On Mar 5, 2013, at 3:49 PM, Dan Andreescu dandree...@wikimedia.org wrote:

 From console[1]:
 
 02:50:21 Testing
 
 http://localhost:9412/mediawiki-core-28a705a9f648da310ff2a4fca9d013bf147f3d1f/index.php?title=Special:JavaScriptTest/qunitExceptionthrown
 by test.module1: expected
 02:50:21 Error: expected
 02:50:24 OK
 02:50:24  832 assertions passed (5350ms)
 02:50:24
 02:50:24 Done, without errors.
 
 This is strange. The console says there was a problem: Exception thrown
 by
 test.module1: expected and Error: expected
 But then it says everything is fine: Done, without errors.
 
 Željko
 --
 [1]
 https://integration.mediawiki.org/ci/job/mediawiki-core-qunit/3/console
 
 I'm not sure how to navigate to the source that defines that test but I
 suspect it's just using an expected exception test:
 http://docs.jquery.com/QUnit/raises

No, it is neither.

Remember you're looking at the console log, which in this case is
being written to from several sources:

* jenkins - stdout
* grunt/qunit - stdout
* phantomjs - console.log()

The part cited here is mostly qunit's output (the dotted progress
line), but anything logged with console.log from within the javascript is
forwarded to the jenkins console as well.

A console log is harmless and no reason for alarm. If it were an
actual exception, it wouldn't be tolerated.

In this case it is coming from the unit test that asserts that
mw.loader sets a module's state to "error" if its javascript bundle
throws an exception. mw.loader executes the bundle in a try/catch,
logs any exception to the console, and then sets state=error and
triggers the error callbacks.

Right now grunt/qunit forwards the phantomjs console straight to the
main output. It would be prettier if it displayed that in a way that
more clearly says "phantomjs console". I requested this 3 months ago:
https://github.com/gruntjs/grunt-contrib-qunit/pull/6

Execute the tests in your own browser and you'll see the same data in
your browser's console (e.g. Chrome Dev Tools).

-- Krinkle


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Seemingly proprietary Javascript

2013-03-05 Thread Krinkle
would be there always and you'd be loading the actual
content over a separate request). You could also consider an individual page
of a book to be a separate request; again, see the "Media in print" section
above.

* We certainly aren't going to embed the GFDL legal text in every http
request…

So given all that, whilst not having a clue whether all that is legal – I'm
assuming so since that's practically how every website in the world operates
(both free and non-free websites) – I think it is acceptable for our program
code to follow similar guidelines as multimedia and text (since code is text).
So it ought to be legal for our software to deliver individual bits and pieces
to the browser that are not a complete package with license and all (like
pages in a book).

Instead one is expected to know about the colophon page. If you are in a
position where you're legally required to have permission to do what you're
about to do (e.g. copy our javascript), you go back up the chain and access
the complete package. Find the "Powered by MediaWiki" button at the bottom of
the page the code was bundled with (the colophon). Then, after looking up
MediaWiki's license, go and find that code again in the original MediaWiki
"book", and find helloWorld.js in all its glory on page 42.

Not sure how well that analogy flies,

-- Krinkle

[1] 
https://upload.wikimedia.org/wikipedia/commons/thumb/e/eb/Baantjegracht_Dokkum_2010.jpg/160px-Baantjegracht_Dokkum_2010.jpg
[2] 
https://upload.wikimedia.org/wikipedia/commons/e/eb/Baantjegracht_Dokkum_2010.jpg
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Github/Gerrit mirroring

2013-03-05 Thread Krinkle
On Mar 5, 2013, at 6:39 PM, Jon Robson jdlrob...@gmail.com wrote:

 I was wondering what the latest on this was (I can't seem to find any
 recent updates in my mailing list). The MobileFrontend project was
 reassured to see a github user commenting on our commits in github.
 It's made me more excited about a universe where pull requests made in
 github show up in gerrit and can be merged. How close to this dream
 are we?
 

I'm not sure to what extent we should make it show up in Gerrit.

But there is https://bugzilla.wikimedia.org/show_bug.cgi?id=35497

Where it is explained that it is trivial to take a PR and submit it to Gerrit.

Though one could write a tool to simplify it, there isn't much to simplify.

Someone with access to Gerrit (anyone with a labs account) only has to:
* Check it out locally.
* Squash it[1] and amend with no modifications (just `git commit --amend -C
  HEAD`, which will trigger your git-review hook to add a Change-Id).
* Push to Gerrit.
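
Concretely, that might look like this (a sketch; the PR number 123 and the
branch/remote names are placeholders):

git fetch https://github.com/wikimedia/mediawiki-core.git refs/pull/123/head
git checkout -b pr-123 FETCH_HEAD
git rebase -i origin/master            # squash the PR into a single commit
git commit --amend -C HEAD             # commit-msg hook adds the Change-Id
git push origin HEAD:refs/for/master   # submit to Gerrit for review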

If it is submitted as a pull request on GitHub, the communication with the
author and revisions of the patch should be on GitHub. We only submit it to
Gerrit once it is pretty much finalised.

Otherwise the user is going to be unable to answer and act on the feedback.

I assume the reason we are not disabling pull requests (which is possible) is
because we want this. If all we do is immediately copy the PR, submit it to
Gerrit and close the PR saying "Please create a WMFLabs account, learn all of
fucking Gerrit, and then continue on Gerrit to finalise the patch", then we
should just kill PRs now.

Instead, we are going to need some people who participate in review on
GitHub, which, fortunately, is very open and much like Gerrit.

Anyone with a GitHub account can participate in review; anyone can take a PR
and submit it to Gerrit. The only minor detail is closing the PR: when that
happens, and who does it. The "who" is clear: someone with write access to the
Wikimedia GitHub account. The "when" could be when it is taken to Gerrit, or
when it lands in master.

-- Krinkle

[1] Squash, because on GitHub it is common to add commits and squash later
(some projects don't even squash; it depends on whether they have a policy
that every commit in the master history should be good. Either way, we do, so
when a PR gets a commit added to it that fixes a syntax error, we should
squash it in the process of preparing for Gerrit).


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Global user CSS and JS

2013-03-05 Thread Krinkle
On Mar 6, 2013, at 2:43 AM, Matthew Flaschen mflasc...@wikimedia.org wrote:

 On 03/05/2013 09:27 AM, James Forrester wrote:
 You can of course always counter-over-ride your global JS/CSS locally - the
 composite rule would presumably be changed to:
 
 1. file,
 2. site
 3. skin,
 *. global-user
 4. local-user
 
 However, it's trickier to override JS than to override CSS. For example,
 you can't remove a single event listener unless you have a reference to
 the original function.
 
 Matt Flaschen
 

Considering the global aspect, it may be more useful (and flexible) to enforce 
this from the global script instead of from local preferences, which are rather 
annoying to maintain imho.

if ( mw.config.get( 'wgDBname' ) === 'wikidatawiki' || .. ) {
    return;
}

-- Krinkle


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] QUnit testing in Jenkins

2013-03-04 Thread Krinkle
Hey all,

As of today, we automatically run our QUnit test suite[1] in MediaWiki
core from Jenkins.

Example:
* https://gerrit.wikimedia.org/r/52177
* https://integration.mediawiki.org/ci/job/mediawiki-core-qunit/3/
* https://integration.mediawiki.org/ci/job/mediawiki-core-qunit/3/console


Today I sprinted to pick up QUnit testing in Jenkins and get it
stabilised and deployed.

It is run by using PhantomJS[2] and we're using
grunt-contrib-qunit[3][4] to abstract the logic:

* starting phantomjs
* pointing it to a url
* hooking into javascript engine to register callbacks to QUnit events
* progress indicator in cli
* return the proper exit code

I won't go into detail about what PhantomJS is, but in short:

It is a headless WebKit browser. Meaning, it doesn't render pixels
to a screen on the server side, but it does behave like a fully valid
browser environment as if it were rendering to a screen (CSS is
parsed, the DOM is there, stylesheets are active, computed styles can
be retrieved, ajax requests can be made, etc.).
For more information, see [2].

Just to point out the obvious, this doesn't catch issues specific to
certain browsers (e.g. syntax only breaking in older EcmaScript 3
engines, or code that incorrectly relies on HTML5 DOM APIs that exist
in latest WebKit but not in Internet Explorer or Firefox). Those we
will catch once we also run this on multiple operating systems and in
more than 1 browser (which is the next chapter in implementing the
continuous integration workflow[5]).

This will catch basically everything else: any runtime error
that we can't detect with static analysis but that will fail no matter what
browser you're in, such as:

* misspelled identifiers or syntax errors
* issues with ResourceLoader (mw.loader)
* issues with AJAX
* any code failures that result in exceptions
* the obvious (catching failures/regressions in our QUnit tests)

Even code that doesn't have unit tests: merely executing the code can
result in an uncaught exception, which we can now get our hands on
since we actually execute the javascript in a real browser. This
includes anything related to ResourceLoader, since we don't just
execute the unit tests in a browser; we load them from MediaWiki's
core/index.php entry point (Special:JavaScriptTest, to be specific).

Like the php-checkstyle job we have currently, the QUnit job is in
non-voting mode. However, unlike php-checkstyle, our QUnit tests are
actually passing; we're just not letting the job vote yet, so we can see
how it behaves over the next 24 hours. If it is stable, we'll make it
voting (like phplint, jshint and phpunit already are).

So, next time you read the jenkins-job comment, look for the QUnit job.


Happy testing,

-- Krinkle

[1] https://www.mediawiki.org/wiki/Manual:JavaScript_unit_testing

[2] PhantomJS:
http://phantomjs.org/

[3] node-phantomjs (npm wrapper with npm-install hook)
https://github.com/Obvious/phantomjs

[4] grunt-contrib-qunit
https://github.com/gruntjs/grunt-contrib-qunit

[5] https://www.mediawiki.org/wiki/Continuous_integration/Workflow_specification
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] LQT and MediaWiki

2013-02-24 Thread Krinkle
On Sun, Feb 24, 2013 at 3:51 PM, David Gerard dger...@gmail.com wrote:

 On 24 February 2013 14:44, Platonides platoni...@gmail.com wrote:
  On 23/02/13 23:58, Mark A. Hershberger wrote:

  That is, I think it is safe to say LQT will remain usable in its current
  state on any coming MW versions for the foreseeable future.
  Right now, though, all I'm looking for is a confirmation that it will
  remain usable.  I imagine one of the first things that we would need to
  do is include it in some testing plans.

  It is used by some of WMF wikis, so it has to remain usable so as not to
  break them.


 Although I don't expect it - is anyone maintaining it against 1.19 LTS?

 If it isn't being maintained against 1.19, then is there an exit
 strategy? Is there a way to remove LQT while preserving the content
 usably?


I'm not sure what kind of exit strategy you'd need.

If you're using MediaWiki 1.19, you install the REL1_19 version of the
extension, right?

Just because MediaWiki marked version 1.19 as LTS doesn't mean every
extension author has to support 1.19 in the latest version of their
extension. Extensions and their authors have always been independent.

Extensions that are actively maintained by people who stay up to date with
announcements (which should include authors involved with the Wikimedia
Foundation) may decide to support 1.19 and/or maintain a REL branch, but
that's at their own discretion.

There are lots of extensions that implement new features that are only for
1.20 or 1.21 and will likely break horribly on 1.19 or earlier.

As crazy as that may sound (and I don't necessarily agree with this
practice), it's how it's always been. We've been doing mass updates to all
extensions for breaking changes in core from time to time; we have to, in
order to support the latest stable version.

If you're using MediaWiki 1.19, you install the REL1_19 version of the
extension.

Unless the extension documentation explicitly says it supports an old
version, installing a newer version of an extension on an older version of
MediaWiki is at your own risk.

This is the reason we branch extensions after every release and provide
them as options in the extension distributor.

https://www.mediawiki.org/wiki/Special:ExtensionDistributor/LiquidThreads
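
Or, if you're fetching from git instead of using the extension distributor,
check out the matching release branch (a sketch; the anonymous clone URL form
is an assumption):

git clone https://gerrit.wikimedia.org/r/p/mediawiki/extensions/LiquidThreads.git
cd LiquidThreads
git checkout REL1_19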

-- Krinkle
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Caching Discussion: Dealing with old (deleted) wmf branches

2013-02-22 Thread Krinkle
On Feb 22, 2013, at 6:33 PM, Greg Grossmeier g...@wikimedia.org wrote:

 Hello all,
 
 Background and longer/more detailed discussion on this issue is in bug
 44570:
 https://bugzilla.wikimedia.org/show_bug.cgi?id=44570
 
 Summary:
 As we delete old -wmfX branches there appear to be cached pages that
 reference old branch URLs, eg:
 https://bits.wikimedia.org/static-1.21wmf1/skins/common/images/poweredby_mediawiki_88x31.png
 
 (that 404s because 1.21wmf1 is long gone)
 
 
 If you want to see my bad ASCII art representation of our current caching
 layers, see this page:
 http://wikitech.wikimedia.org/view/Caching_overview
 
 
 So possible ways forward
 
 
 option 1:
 * reduce parsercache timeout to size of deployment window (~28 days) [0]
 * Tim may have knowledge why that shouldn't happen [1]
 

Well, the obvious thing to do, and imho what we should do, like, *right now*,
is extend the lifetime of the old branch to the timeout of the cache.

Simply not deleting a directory is very, very easy.

As far as I'm concerned we can agree right now not to delete any old branch from
the servers until further notice (until we've figured out the max cache age, 
then implement the guard in multiversion/deleteMediawiki, and then remove them 
when possible).


 option 2:
 * change away from version numbers in URLs [2]
 ** maybe use slots or something else
 ** skins?
 

My bugzilla comment doesn't suggest moving away from using these version 
numbers. It suggests not using these urls directly in any code that makes it 
into the main html output.

 
 [0] https://bugzilla.wikimedia.org/show_bug.cgi?id=44570#c14
 [1] https://bugzilla.wikimedia.org/show_bug.cgi?id=44570#c12
 [2] https://bugzilla.wikimedia.org/show_bug.cgi?id=44570#c15
 


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Who is responsible for accepting backported patch sets for maintained versions?

2013-02-21 Thread Krinkle
On Feb 21, 2013, at 4:51 PM, bawolff bawolff...@gmail.com wrote:

 
 * Work to get bugfixes backported to 1.19.  I don't have Gerrit
  rights to commit to the REL1_19 branch, but that will keep me from
  fixing bugs by fiat.
 
 I think we should give you such rights.
 
 --bawolff


Though it is probably obvious, just to point out (and to be corrected if I 
misunderstood):

You wouldn't single-handedly fix bugs. The bug fix is first submitted in master 
(where submission and review are done by two different people: Mark could be 
the writer of a patch or the reviewer/merger, but never both).

And in release branches (as far as I know) we only tolerate self-merges if the 
change is a cherry-pick of an already-merged change from master. If the change 
is e.g. a bug fix for something that occurred due to a combination of backports 
in the release branch, it should be reviewed and written by two different 
people like any other change.

And for others looking to help out in back porting fixes:

One can submit a backport without merge access. It is the same process 
(cherry-pick to another branch, submit back to Gerrit with the same change id), 
except that you'd ask someone to approve the merge instead of self-merging 
right away.
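
A sketch of what that looks like on the command line (the commit hash abc1234
and the remote name origin are placeholders):

git checkout -b backport origin/REL1_19
git cherry-pick -x abc1234              # message keeps the original Change-Id
git push origin HEAD:refs/for/REL1_19   # submit for review, don't self-merge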

-- Krinkle


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Code Review Dashboards of other users in gerrit

2013-02-19 Thread Krinkle
On Feb 19, 2013, at 12:13 AM, Krenair kren...@gmail.com wrote:
 
 On 18/02/13 23:08, hoo wrote:
 Hello,
 
 after the last gerrit update I'm no longer able to visit the Code Review
 Dashboards of other gerrit users in case I don't know their user ids. If
 I do it's fine (eg. https://gerrit.wikimedia.org/r/#/dashboard/50 is
 mine).
 Is there a way to get to these dashboards or at least get to know the
 user id of an user? Those dashboards gave a rather good overview of what
 a user is currently doing and I want them back...
 
 Cheers,
 
 Marius Hoch (hoo)
 
 
 Inspect your browser's calls to 
 https://gerrit.wikimedia.org/r/gerrit_ui/rpc/ChangeDetailService. It returns 
 loads of info in JSON format, including the IDs of the users on the page.
 
 Alex Monk
 


I think he means whether there is a usable way (from the GUI) to get to a 
user's dashboard.

Previously this was achieved by simply clicking the username on any page where 
it was mentioned.

-- Krinkle


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Mediawiki's access points and mw-config

2013-02-18 Thread Krinkle
On Feb 18, 2013, at 5:54 PM, Waldir Pimenta wal...@email.com wrote:

 On 15/02/13 09:16, Waldir Pimenta wrote:
 
 should all access points be on the root directory of the wiki, for
 consistency? currently mw-config/index.php is the only one not in the root.
 
 
 On Mon, Feb 18, 2013 at 3:56 PM, Platonides platoni...@gmail.com wrote:
 
 Well, every bit of the installer used to be in the config folder, until
 the rewrite, which moved the classes to includes/installer
 Now you mention it, you're probably right in that it could be moved to
 eg. /installer.php
 
 
 OK, so I'll ask here on the list: is there any reason we shouldn't move
 [root]/mw-config/index.php to [root]/installer.php?
 

Well, Platonides already gave a reason:

On Feb 15, 2013, at 12:58 PM, Platonides platoni...@gmail.com wrote:

 On 15/02/13 09:16, Waldir Pimenta wrote:
 should all access points be on the root directory of the wiki, for
 consistency?
 
 No. The installer is on its on folder on purpose, so that you can delete
 that folder once you have installed the wiki.

It is also arguably easier for non-tech users to navigate to the mw-config
directory than to an installer.php file (though that's a minor difference,
if any).

There's also an overrides.php file and an mw-config/index.php5 file.

If it were just installer.php, it'd be as easy to remove as the directory. But 
there are three files, one of which depends on another: if you'd just remove 
installer.php, you'd be left in an unclean state where hitting installer.php5 
causes an internal server error, clogging the php error log.

… which brings up the question of how much longer we should keep that *.php5 stuff around…

But before more bike shedding (have we had enough these last 2 months yet?), is 
there a problem with having a directory?

-- Krinkle


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] When to automerge (Re: Revoking +2 (Re: who can merge into core/master?))

2013-02-15 Thread Krinkle
On Feb 15, 2013, at 3:32 PM, Platonides platoni...@gmail.com wrote:

 I'm not convinced that backporting should be automatically merged, though.
 Even if the code at REL-old is the same as master (ie. the backport
 doesn't needs any code change), approving something from master is
 different than agreeing that it should be merged to REL-old (unless
 explicitly stated in the previous change). I'm not too firm on that for
 changes that it's obvious should be backported, such as a XSS fix*, but
 I would completely oppose to automerge a minor feature because it was
 merged into master.
 Note that we are not alone opinating about what it's worth backporting,
 since downstream distros will also call into question if our new release
 is “just bugfixes” before they agree into accepting it as-is.
 

I don't know where you pull auto-merging from but it certainly isn't from my 
e-mail, which was about revoking merge access and about when self-merging may 
or may not be tolerated.

Auto-merging would imply some random dude can take a change from master merged 
by someone else *for master*, and submit it to any branch and have it be 
auto-merged.

What I was talking about is that a code reviewer with merge access can submit 
an approved change from master to another branch and self-merge it.

Just because one can, however, doesn't mean one should.

When our random dude pushes a change for review to an old branch that backports 
a feature from master, the assigned reviewer should (as you explain) not 
approve it.

And for the same reason, when that reviewer backports himself, he wouldn't 
self-merge. Or rather, he wouldn't draft such a change in the first place.

-- Krinkle


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Revoking +2 (Re: who can merge into core/master?)

2013-02-14 Thread Krinkle
On Feb 15, 2013, at 1:11 AM, Tyler Romeo tylerro...@gmail.com wrote:

 I know this is pretty obvious, but self-merging pretty much any change
 should be grounds for removal (or at the very least no second chance).
 
 *--*
 *Tyler Romeo*
 Stevens Institute of Technology, Class of 2015
 Major in Computer Science
 www.whizkidztech.com | tylerro...@gmail.com
 


Certain repositories in Gerrit have their own policies (such as 
operations, which appears to use Git mostly as a log: they self-merge 
continuously, maybe even after deployment; I don't know if that's the case).

But I agree that for MediaWiki core, self-merging needs to be frowned upon 
more. I don't know if revoking access should be a directly correlated 
consequence though; that may be too extreme.

Note though that there are at least 2 commonly accepted exceptions:

* Backporting a change approved by someone else in master to another branch, 
and self-merging that backport

(e.g. User A pushes for review to master, User B approves and merges it, then 
User A or User C pushes the same change to another branch, such as a REL or 
WMF branch, and self-merges it)

* Reverting

Due to the way Git and Gerrit interact, a revert (unlike a merge) is a two-step 
process: Gerrit considers the revert a separate commit (which it is, of course). 
So clicking "Revert" only drafts a commit; it still needs to be merged. This is 
generally done straight away if the revert comes from someone with merge 
rights.


These 2 cases (and maybe more, such as i18n-bot) were the reasons that a few 
weeks/months back we didn't make the change in Gerrit to make self-merging no 
longer an option (e.g. reject merge attempts by the author of the change).

-- Krinkle


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] URLs for autogenerated documentation

2013-02-08 Thread Krinkle
On Feb 8, 2013, at 7:18 AM, Antoine Musso hashar+...@free.fr wrote:

 A) http://doc.wikimedia.org/mediawiki-core/branch|tag/php
 
 Would bring:
 
  doc.wikimedia.org/mediawiki-core/master/php
  doc.wikimedia.org/mediawiki-core/master/js
  doc.wikimedia.org/mediawiki-core/1.20.2/php
  doc.wikimedia.org/mediawiki-core/REL1_20/js
  doc.wikimedia.org/mediawiki-AbuseFilter/1.20.2/php
 
 
 B) doc.mediawiki.org/project/type/version/
 
 Would bring:
 
  doc.mediawiki.org/core/php/master
  doc.mediawiki.org/core/js/master
  doc.mediawiki.org/core/php/1.20.2
  doc.mediawiki.org/core/js/REL1_20
  doc.mediawiki.org/AbuseFilter/php/REL1_20
 
 I would prefer having the MediaWiki doc hosted on the mediawiki.org
 domain.  As for the ordering I guess we can bikeshed for a long time but
 most probably some ordering will seem natural for most people there :-]
 
 Thanks!
 



Presenting them as A and B seems flawed as they are separate things:

Let's decouple them into a matrix of more tangible decisions:

A) Do we want to maintain two domain names for our documentation?

No:
* We already have doc.wikimedia.org; doc.mediawiki.org would be a new one.
* If we maintain separate domain names, when is something mediawiki and when 
is it wikimedia?
For example, documentation for jQuery plugins, VisualEditor etc. is 
standalone, most certainly not MediaWiki-specific.
There is no reason to force ourselves into this ambiguity. One domain is all we 
need, with some redirects perhaps.


B) Hierarchy of the directories project, branch and category of code (usually a 
programming language). There are 6 possible combinations of these 3.
* The one I've proposed before and rationalised in the comment thread is 
project > branch > code-category.
* This is because not all projects have the same code categories. A tree is 
usually structured towards more specificity down the tree.
* With the separator of time as high up as possible, so that you don't duplicate 
versions in multiple locations.
* Easier to maintain if we rename the code categories later on.
* Easier to generate, since everything goes into one target directory and not 
into separate directories in different locations.


I appreciate you putting thought into these minor details, but I think this 
discussion is pointless because you're not providing any rational input 
yourself (all I see is "I prefer", which is undeniably a useless argument to 
argue against). I can talk for hours, but it won't help get the documentation 
deployed if you don't communicate.

The thread would have been slightly more useful if you had actually provided 
some arguments of your own. I've explained the reasons for my version; then you 
reverted it with no explanation, linking[1] only to this post on wikitech-l, 
where you merely present A and B once again with no explanation as to why you 
disagree in the first place.

I'd love to discuss the advantage of your version or disadvantage of mine, 
preferably as a simple response to my question in Gerrit, but here is fine too. 
Let's keep in mind the bigger picture and do what's best for the users.


On Feb 8, 2013, at 7:41 AM, bawolff bawolff...@gmail.com wrote:

 Whichever way we chose, could we have http redirects from the old
 svn.wikimedia.org? There's a lot of urls that link there.
 

Unrelated.

Yes, of course. When we move it, the old location will become a redirect.

 I prefer doc.mediawiki.org/project/version/master (aka
 doc.mediawiki.org/core/master/php ) as in my mind, the hierarchy makes
 more sense like that, as the type of code is something more
 fine-grained than what version, and also something that belongs to the
 version number in a sense. I also like keeping the names MediaWiki and
 Wikimedia separate. At the end of the day it doesn't really matter
 which way though.
 

I agree.

 It would also be cool if puppet docs were on doc.wikimedia.org, but if
 you had doc.mediawiki.org in the url, things auto redirected (and vice
 versa: if you went to doc.wikimedia.org/core/master/php things
 redirected to doc.mediawiki.org/core/master/php )
 

I'm not sure we should be maintaining two domains. We can have
doc.mediawiki.org redirect to doc.wikimedia.org/mediawiki-core, but to maintain
both would be confusing, decentralising and depending on the implementation,
it would encourage using multiple urls for the same thing. Might as well stick
with one canonical url.



Best,
-- Krinkle

[1] https://gerrit.wikimedia.org/r/39212


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] varying ResourceLoader module dependencies

2013-01-28 Thread Krinkle
On Sun, Jan 27, 2013 at 6:42 PM, Matthew Flaschen
mflasc...@wikimedia.orgwrote:

 On 01/27/2013 07:58 PM, S Page wrote:
  How can an extension have a soft requirement on a module, so that it
  only adds that module to its ResourceLoader dependencies if the
  extension providing that module is loaded?
 
  For example, there's no sense in depending on ext.eventLog if the wiki
  doesn't load Extension:EventLogging, but if it's available then you
  want the module sent to the browser along with your code. (Obviously
  your JS would check to see if the module's functions are available
  before invoking them.)

 That's one design decision, but I don't think it's the obvious one.  In
 fact, it's rather uncommon.

 The way most loggers are used, you can always call the log function
 (without checking, or defining your own shim), and it's up to the
 logging library to be a no-op or actually log.

 Matt Flaschen


I agree, but if it is unknown whether the wiki has this logging library
installed, that approach doesn't help.

However, to get back to S's question, I'd recommend changing the rules,
because the game is no fun this way.

Variable dependencies mean you can't rely on something; that's an
unacceptable design. Whatever it is that insists on the need for a variable
dependency should be changed instead.

I think the picture you present is misleading. I'd say: just depend on
it. But for some reason you don't want to.

Requiring that the wiki install it seems like a trivial thing. That can't
be the real reason.

I presume that when you say it is unknown whether the wiki has it
installed, the real issue is that it is unknown whether the wiki has it
enabled at this time.

Being able to disable logging whenever you desire is a valid use case.

However, in that case the problem lies with the logging library: it should come
up with a sane way to be enabled/disabled that does not involve commenting out
the require_once line of the extension's installation, but uses, for example,
something like $wgEventLoggingEnable.

Then the logging library would disable the backend API when the extension
is disabled and also swap the front-end client for a no-op.
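
A minimal sketch of that pattern, assuming a hypothetical $wgEventLoggingEnable
setting and hypothetical file names (this is not existing EventLogging code):

<?php
// In the extension's setup file. Wikis may set $wgEventLoggingEnable = false
// in LocalSettings.php after including the extension.
$wgEventLoggingEnable = true;

$wgExtensionFunctions[] = function () {
    global $wgEventLoggingEnable, $wgResourceModules;
    // The module name is always registered so dependent modules can
    // unconditionally list it; only the implementation behind it varies.
    $wgResourceModules['ext.eventLog'] = array(
        'scripts' => $wgEventLoggingEnable
            ? 'modules/ext.eventLog.js'       // the real client
            : 'modules/ext.eventLog.noop.js', // same API, no-op methods
        'localBasePath' => __DIR__,
        'remoteExtPath' => 'EventLogging',
    );
};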

-- Krinkle
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Disable module mediawiki.searchSuggest in MW 1.20+

2013-01-18 Thread Krinkle

On Jan 17, 2013, at 10:05 AM, Robert Vogel vo...@hallowelt.biz wrote:

 @Krinkle: I've written a custom API module that returns categorized results. 
 I wanted to have a categorized output like in this demo: 
 http://jqueryui.com/autocomplete/#categories
 My first approach was to override some of the logic in 
 mediawiki.searchSuggest on the client side. But this didn't work as I 
 expected. Then I saw that jquery.ui.autocomplete was part of MW framework 
 and so I tried to use this according to the demo I mentioned before. But 
 mediawiki.searchSuggest always occupied the input element so I couldn't 
 apply jquery.ui.autocomplete to it.
 But now with the hint from Matt it works pretty good :)
 

mediawiki.searchSuggest does NOT use jquery.ui.autocomplete.


On Jan 17, 2013, at 10:05 AM, Robert Vogel vo...@hallowelt.biz wrote:

 Are there any future plans to make the removal of already added modules 
 available in ResourceLoader?
 

Taking the question literally, I'd say: no, not ever, and for good reason.

Features and modules should not be confused. Removing a module would do nothing 
but make things crash and fail. If a module is referenced somewhere, it is 
expected to exist and provide a certain API; removing it would lead to a 
missing-dependency exception.

Now, one could create a different module in its place with the same name, but 
that would violate the naming conventions and is generally a bad idea as it 
defeats the modular design we have. Module names are unique identifiers for a 
piece of code, not an idea or concept. Loading a module brings the expectation 
that that exact module is going to be loaded and not some duck-punched 
impersonator in its place that may implement a similar feature.

Instead you'd register your own module with its own name and make sure it is 
loaded instead of some other module. The method used can differ based on the 
feature at hand, but handling this at the module level is not the way to go.

Let's take editors for example. There is a legacy editor, the WikiEditor 
extension and the VisualEditor extension and many others. They all have their 
own coding structure, API modules and ResourceLoader modules.

The EditPage provides a hook to change which modules are queued. That's where 
you hook in, not in the module itself.

I think in the case of SearchSuggest it should probably be loaded by the Skin, 
not by OutputPage. The only use case I can think of where server-side hooks are 
not enough (PrefixIndex, API modules etc.), is if the visual presentation (as 
opposed to the suggested items themselves) needs to be different.

-- Krinkle




Re: [Wikitech-l] Disable module mediawiki.searchSuggest in MW 1.20+

2013-01-16 Thread Krinkle

On Jan 15, 2013, at 9:18 AM, Robert Vogel vo...@hallowelt.biz wrote:

 Hi everybody!
 
 I'm writing an extension that needs to replace/disable the built-in 
 suggestion feature of the search box. 
 
 

In case there may be a better approach to solve your problem, what is it you're 
looking to add to this feature?

Most of this module is pretty trivial and only wraps around other features that 
have elaborate hooks and toggles (such as PrefixIndex and the API).

-- Krinkle




Re: [Wikitech-l] Rewriting ExtensionDistributor

2013-01-07 Thread Krinkle
On Jan 7, 2013, at 8:03 PM, Chad innocentkil...@gmail.com wrote:

 I'm thinking we configure $wgExtDistBranches as follows:
 
 $wgExtDistBranches = array(
  'master',
  'SNAPSHOT-for-1.21',
  'SNAPSHOT-for-1.20',
  ...
 );
 
 We can go ahead and create tags for the existing released versions
 to approximate snapshots of each extension as of the corresponding
 MediaWiki release. Extension authors can update the tag (or maybe
 use a branch, if they want) using the same name that we have
 configured.
 
 I'm willing to bikeshed (a little) on the SNAPSHOT-for-1.21 tag
 

Why not use what we already use: REL1_20 etc.

-- Krinkle




Re: [Wikitech-l] Krenair for core

2013-01-06 Thread Krinkle
On Jan 6, 2013, at 3:39 AM, Ori Livneh o...@wikimedia.org wrote:

 
 On Saturday, January 5, 2013 at 2:08 PM, Alex Monk wrote:
 
 Okay then, so what query do you use to get a history of +1 and -1 reviews a
 user has made?
 
 
 
 The best I could come up with is:
 
 reviewer:krenair project:mediawiki/core -owner:krenair (label:CodeReview=1 OR 
 label:CodeReview=-1)
 
 https://gerrit.wikimedia.org/r/#/q/reviewer:krenair+project:mediawiki/core+-owner:krenair+(label:CodeReview%253D1+OR+label:CodeReview%253D-1),n,z
 
 It seems more useful to cast a wider net, though:
 
 reviewer:krenair -owner:krenair
 
 https://gerrit.wikimedia.org/r/#/q/reviewer:krenair+-owner:krenair,n,z
 
 I'll add them to the request on MediaWiki.org.
 

Just pointing out the obvious but this returns all commits where *someone* did 
CR+1/-1 and where Krenair is in the reviewer list.

Such as https://gerrit.wikimedia.org/r/#/c/38258/:
where Bsitu did CR+1 and Alex did no review on the final version but is in the 
reviewer/CC list.

-- Krinkle



Re: [Wikitech-l] Generating documentation from JavaScript doc comments

2013-01-04 Thread Krinkle
On Dec 28, 2012, at 5:05 AM, Matthew Flaschen mflasc...@wikimedia.org wrote:

 We have all these JavaScript documentation comments, but we're not
 actually generating docs.  This has been talked about before, e.g.
 http://www.gossamer-threads.com/lists/wiki/wikitech/208357?do=post_view_threaded,
 https://bugzilla.wikimedia.org/show_bug.cgi?id=40143,
 https://www.mediawiki.org/wiki/Requests_for_comment/Documentation_overhaul#Implementation_ideas
 .
 
 I don't think Doxygen is the best choice, though.  JavaScript is really
 put forth as a sort-of afterthought.
 
 I suggest JSDoc (http://usejsdoc.org/), simply because it's a standard
 library and has been put forward in the past, with good rationale.
 
 I know there are other good ones too.
 
 What do you think?
 
 Matt Flaschen

Doxygen is indeed not meant for JavaScript. With some hacks it can be tricked 
into reading comment blocks from javascript files, but that won't scale for our 
code base, nor will it be enough to make a useful structure given the dynamic 
way JavaScript works.

JSDoc is pretty solid, though there are some concerns:
* The syntax is somewhat foreign compared to what we're doing right now
* Development is unclear (v2 on google-code has been discontinued, v3 on github 
is a rewrite still being worked on)
* Written in JavaScript, but doesn't run on Node; it requires Java.
* Features appear to cover the general cross-language cases, but are too
limited when trying to document more complex JavaScript solutions (e.g.
VisualEditor's code base).

I've recently looked into a documentation generator for VisualEditor and though 
I haven't stopped looking yet, I'm currently pausing rather long at JSDuck. It 
is very well engineered and especially optimised for modern JavaScript 
(inheritance, mixins, event emitters, override/overload methods from another 
module, modules, etc.).

It is also easy to extend when needing to implement custom @tags.

I've set up a vanilla install for VisualEditor's code base here:

http://integration.wmflabs.org/mwext-VisualEditor-docs/

Right now, like MediaWiki core, VisualEditor is just documenting code loosely, 
not following any particular doc-syntax, so we're bound to require a few 
tweaks[1] no matter which framework we choose. Our current syntax is just 
something we came up with loosely based on what we're used to with Doxygen.

Right now the demo on labs only uses the Doc app of JSDuck, but it also supports 
Guides, Demos, interactive live-editable Examples and more.

A few random things I like in particular about JSDuck are:
* Support documenting parameters of callback functions
* Support documenting events emitted by a class/module
* Option to show/hide inherited methods and other characteristics
* Support to completely document objects for @param and @return (like @param 
{Object} foo, @param {number} foo.bar)
* Live search and permalinks
* Markdown all the way + duck extension for doc specific syntax (e.g. @link and 
#method)
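
To illustrate two of those (callback parameters and fully documented object
parameters), here's a sketch of such a doc comment; the function and its
parameters are made up for the example:

/**
 * Fetch suggestions from the API.
 *
 * @param {Object} options
 * @param {string} options.query Text to complete
 * @param {number} [options.limit=10] Maximum number of results
 * @param {Function} callback Called when the request completes
 * @param {Object} callback.data Response data
 * @return {jqXHR}
 */
function fetchSuggestions( options, callback ) {}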

If it works out, I think we can get this going for MediaWiki core as well.

Regardless of the framework we choose, we should set it up to be generated for 
branches and update on merge from jenkins's post-merge hooks. Come to think of 
it, we should probably do that for the PHP/Doxygen as well (is that still 
running from the cronjob on svn.wikimedia.org?).

-- Krinkle

[1] Perfectionist alert, this commit does more than just the necessary 
tweaks: https://gerrit.wikimedia.org/r/42221/




Re: [Wikitech-l] Gerrit code review guidelines

2012-12-31 Thread Krinkle
On Dec 27, 2012, at 7:18 PM, Juliusz Gonera jgon...@wikimedia.org wrote:

 Hi,
 
 I'm a bit confused when it comes to various options I have in gerrit and it 
 seems the docs are not up to date on that.
 
 * What is the difference between Verified and Code Review? When would I put 
 +1 in one of them but -1 in the other?
 * What is the difference between +1 and +2, especially in Verified?
 * Why do we even have +2? +1 means that someone else must approve. What does 
 +2 mean? No one else has to approve but I'm not merging anyway, why?
 
 It seems the docs (http://www.mediawiki.org/wiki/Code_review_guide) do not 
 explain it.
 
 Juliusz

== Verified ==

Verified is for linting and executing unit tests.
* Verified +1 means "Checked for lint issues"
* Verified +2 means "Tested by executing the Jenkins tests"

If you are a human, do not use Verified (most people don't have user 
permissions to set this, anyway).

If you tested the change (either by checking it out locally and using the 
wiki and/or by running phpunit locally) and you have the ability to set 
Verified, still do not set it.

It does not mean the same thing, because the Jenkins tests are much more 
elaborate than the testing you may do locally (e.g. different database 
backends, and soon also different browsers and test suites: jshint, phplint, 
phpunit, qunit, mysql, sqlite etc.).

We might rename this field to Automated Testing for clarification (Verified 
is a generic name that is the default in Gerrit) and (where not already) it 
will eventually be restricted to bots only[2].

== Code Review ==

Code Review is for human review of the code.

A positive value means you believe it is perfect and may be merged as-is (under 
the condition that the Jenkins test will pass[1]). If you have merge rights 
then you'd give +2. Others would give +1.

A negative value means there are issues.

[1] So if you set CR+2 you're saying "The code looks great, let Jenkins run it, 
and if it passes, merge it."
[2] Except for wmf-deployment probably, to be able to override it if Jenkins is 
having issues and certain commits need emergency deployment.




Re: [Wikitech-l] Adding support for Twitter Cards

2012-12-31 Thread Krinkle
On Dec 26, 2012, at 11:09 PM, Harsh Kothari harshkothari...@gmail.com wrote:

 On 12/26/2012 12:25 PM, Sébastien Santoro wrote:
 
 Do you still work on http://www.mediawiki.org/wiki/Extension:OEmbedProvider ?
 
 Could this make a good project for next outreach projects like GSoC or OPW?
 
 What is Twitter card? and what this extension do?
 


https://www.google.com/search?q=twitter+cards&btnI=1

-- Krinkle



[Wikitech-l] Clarification on unit tests requiring CR+2

2012-12-19 Thread Krinkle
Hello,

We would like to clarify the reason we changed Jenkins to no longer run unit
tests on patch submission.

We had to defer code execution to after CR+2 for security reasons. If unit tests
were run on submission, that meant anyone with a labs account could effectively
get shell access on the server.

Because LDAP accounts are now up for open registration (aka free Labs accounts,
and by extension permission to submit patches to Gerrit), that also meant the whole
world would be able to get shell access on the server (via PHP/Nodejs/ant/bash
to infinity and beyond).

This issue will be definitely solved by isolating tests in dedicated virtual
machines for each run. We are investigating Vagrant.

Restricting unit tests is simpler and faster to implement than all the Vagrant
engineering. So running tests after CR+2 is a temporary measure until the
implementation of Vagrant sandboxes in Jenkins builds is ready.

So, in conclusion: Unit tests will be run again on patch submission once we have
finished integrating Vagrant in Jenkins.

-- The CI team
Antoine & Timo




Re: [Wikitech-l] Small tweak to Gerrit Verified category - tomorrow

2012-12-19 Thread Krinkle
On Dec 18, 2012, at 6:01 AM, Liangent liang...@gmail.com wrote:

 On Tue, Dec 18, 2012 at 12:45 PM, Chad innocentkil...@gmail.com wrote:
 
 Linting jobs will receive Verified ±1 votes. Unit tests jobs
 (triggered after someone votes CR+2, as it currently is) will
 receive Verified ±2 votes.
 
 
 Actually I prefer running unit tests on every new patchset as what we
 did in the past. Some issues in my code were caught by it.
 


Afaik everybody prefers tests to be run on submission, including yours truly. 
However this thread does not discuss that change. This change was made a few 
weeks back.

Now that everything is in place we can elaborate on this. Antoine and I sent 
out a mail to wikitech-l just now:
http://lists.wikimedia.org/pipermail/wikitech-l/2012-December/065202.html

-- Krinkle



Re: [Wikitech-l] Bugzilla upgrade [was: bugzilla.wikimedia.org downtime: Now.]

2012-12-14 Thread Krinkle
On Dec 11, 2012, at 9:53 PM, K. Peachey p858sn...@gmail.com wrote:

 
 That is contrary [..]
 

No, it isn't contradictory.

The bugzilla account for wikibug...@lists.wikimedia.org is either nonexistent 
or unused.
The *e-mail address* exists, but that's irrelevant.

Mails to this list are sent through the global configuration setting 
globalwatchers – not through the fact that it may or may not be on CC of a 
bug.

At least that's how it is supposed to work, and should work again if not.

This global configuration setting is afaik a core feature; what exactly does 
this hack entail?


On Dec 11, 2012, at 8:56 PM, Andre Klapper aklap...@wikimedia.org wrote:

 [..] virtual accounts, even with non-working email addresses. They
 are only created so you can add such accounts to your User Watching
 list. 
 


That is an interesting technique, but that isn't what we're doing here afaik, 
because if someone wants those mails, they can simply subscribe to the list. 
Moreover, subscribing to the list is semantically more correct, since then 
you receive all activity through the globalwatchers setting, as opposed to only 
the activity on bugs the Nobody account is CCed on (which is often removed 
from bugs when someone assigns a bug or otherwise ends up removing it for some 
reason).

-- Krinkle




Re: [Wikitech-l] Bugzilla: Waiting for merge status when patch is in Gerrit?

2012-12-12 Thread Krinkle
On Dec 13, 2012, at 1:25 AM, Matthew Flaschen mflasc...@wikimedia.org wrote:

 On 12/12/2012 04:15 PM, Sébastien Santoro wrote:
 Currently there is a patch-in-gerrit keyword in Bugzilla. When a bug
 report ends up as RESOLVED FIXED there usually had been a codefix in
 Gerrit that got merged. Hence patch in gerrit could be considered
 another state on the journey of a bug from reporting to fixing.
 And Bugzilla allows adding stati (stuff like NEW or RESOLVED).
 
 ASSIGNED seems perfect for me. It's ASSIGNED, this mean there are work
 going to be done, or done.
 
 That doesn't make sense to me, since I can be assigned something, and
 actively working on it, but not have submitted a Gerrit at all yet (let
 alone one almost ready to be merged).
 
 Matt Flaschen

I agree with Sébastien. ASSIGNED is enough.

I don't see the significance of whether there is a Gerrit change yet?

If there is no Gerrit change, it doesn't mean nobody is working on it.
And if there is a change, it may not be a good one and/or one written by 
someone else (e.g. someone else can give it a try, send the change-id to 
bugzilla, but the assignee hasn't reviewed it yet and/or abandoned it).

Then we'd have to keep that in sync (back from this PENDING to ASSIGNED after 
the change is rejected?).

Only more maintenance and bureaucracy for imho no obvious gain or purpose.

The queryable state is ASSIGNED (and maybe, though I personally don't find it 
useful, the keyword patch-in-gerrit). And for any further details just open 
the bug and read it.

-- Krinkle




Re: [Wikitech-l] Bugzilla upgrade [was: bugzilla.wikimedia.org downtime: Now.]

2012-12-11 Thread Krinkle
On Dec 10, 2012, at 2:09 PM, Andre Klapper aklap...@wikimedia.org wrote:

 On Sat, 2012-12-08 at 05:50 +1000, K. Peachey wrote:
 Can you propose a better option for this than defaulting the cc'er as
 wikibugs?
 
 I doubt there are that many people that want the wikibugs-l list
 to receive a copy of every single change (Eg: cc changes) to the bug,
 But that is probably a discussion for another thread.
 
 There are two people watching wikibugs-l@: Quim Gil and I.
 So I'll likely set up an account like all-bugm...@wikimedia.bugs and
 set it as globalwatcher tomorrow. Then Quim and I can subscribe to it
 to drown in bugmail, and you can use wikitech-l@ as usual.
 

I am also subscribed to wikibugs-l, and I believe Roan is as well (or was 
anyway).

I agree with Chad, it should continue to work as expected. From a globalwatcher 
point of view, whatever change is made to a bug, it must not affect the fact 
that the globalwatcher address (wikibugs-l in this case) is notified. The 
default CC status of Nobody wikibug...@lists.wikimedia.org is afaik just 
for legacy reasons and as placeholder. wikibugs-l as Bugzilla account isn't 
actually used afaik. In the Bugzilla configuration, wikibugs-l is configured as 
globalwatcher.

The proposal Andre makes here sounds confusing. How is that different from the 
current situation? What problem is it supposed to address?

-- Krinkle




Re: [Wikitech-l] Really Fast Merges

2012-12-05 Thread Krinkle
On Dec 4, 2012, at 9:46 PM, Daniel Friesen dan...@nadir-seen-fire.com wrote:

 On Tue, 04 Dec 2012 12:37:02 -0800, Chad innocentkil...@gmail.com wrote:
 
 On Tue, Dec 4, 2012 at 3:27 PM, Chad innocentkil...@gmail.com wrote:
 On Tue, Dec 4, 2012 at 3:24 PM, Tyler Romeo tylerro...@gmail.com wrote:
 Don't we have some sort of policy about an individual merging commits that
 he/she uploaded?
 
 
 Yes. We've been over this a dozen times--if you're on a repository
 that has multiple maintainers (ie: you're not the only one, so you're
 always self-merging), you should almost never merge your own
 code unless you're fixing an immediate problem (site outage, syntax
 errors).
 
 
 In fact, I'm tired of repeating this problem, so I started a change to
 actually enforce this policy[0]. We'll probably need to tweak it further
 to allow for the exceptions we actually want. Review welcome.
 
 -Chad
 
 [0] https://gerrit.wikimedia.org/r/#/c/36815/
 
 Doesn't TWN's bot self-review? Might need to add an exception for that before 
 merging.

I'm not sure in which part of the flow rules.pl is applied but maybe it can be 
enforced the other way around?

Instead of restricting Submit, restrict CR scores. Submission in turn only has 
to be restricted to CR+2.

But yeah, we need to either whitelist L10n-bot from this restriction or make 
those commits auto-merge in a different way.

-- Krinkle




Re: [Wikitech-l] Really Fast Merges

2012-12-05 Thread Krinkle
On Dec 5, 2012, at 7:13 PM, Patrick Reilly prei...@wikimedia.org wrote:

 There were 132 days for anybody to review and comment on the technical
 approach in the UID class.
 
 — Patrick
 

Even if all people involved had seen it a hundred times, self-merging is a social 
rule separate from that. That was the reason it was brought up, and the reason 
it was subsequently reverted again. Nothing personal and not (directly) related 
to the contents of the commit itself.

-- Krinkle




Re: [Wikitech-l] WMF's webpage

2012-11-28 Thread Krinkle
On Nov 28, 2012, at 11:29 PM, Samat sama...@gmail.com wrote:

 Hi,
 
 I am the only one, who see this (in attachment) on the top of WMF's Main
 Page (https://wikimediafoundation.org/wiki/Home)?
 

Looks like that one got purged in the meantime. I currently see it on:

https://wikimediafoundation.org/wiki/FAQ/en

This is caused by the recent change to the headings in the Vector skin.
They were changed from h4/h5; however, the CSS used those tag names to identify 
them (instead of using CSS classes). Which means, as expected, that the page 
layout breaks for up to 30 days.

Page cache is controlled by the wiki page content. Unless the page is modified, 
the cache is kept for up to 30 days for anonymous users.

Resource modules, however, are served by ResourceLoader, which has its own much 
more efficient and deployable cache mechanism. This means that the resources 
for the skin are deployed globally and site-wide within 5 minutes, 
whereas the HTML isn't for another 2 weeks.

This is why client resources must always be backwards compatible.

-- Krinkle



Re: [Wikitech-l] Standardizing highest priority in Bugzilla

2012-11-27 Thread Krinkle
On Nov 27, 2012, at 5:39 PM, Andre Klapper aklap...@wikimedia.org wrote:

 On Mon, 2012-11-26 at 17:36 -0800, James Forrester wrote:
 On 26 November 2012 17:25, Rob Lanphier ro...@wikimedia.org wrote:
 Timeframes seem like a pretty good proxy for priority.  If something
 is highest priority, and yet is not on track to be completed for
 several months, then.wait, what?
 
 I disagree. In 1962, NASA's highest (most-high?) priority was to put
 a human on the Moon; that doesn't mean they achieved it before 1969.
 High priority and soonness-to-be-done are orthogonal.
 
 I've been made aware (off-list) of some concerns of this proposal, and
 your comment provides the same sentiment.
 
 The term highest priority has some ambiguity in human language.
 It's perfectly fine to state that a bunch of bug reports are highest
 priority: Issues that a team is working on currently, or should work on
 as the very next task.
 My initial proposal was to make highest priority mean really urgent
 or immediate. Consequently, this should also be reflected by its name.
 Still there should be a way to express what's highest priority for a
 team.
 
 == Reworked proposal: New Immediately priority ==
 
 I propose adding a *new* priority called Immediate which should only
 be used to mark really urgent stuff to fix. This priority would be added
 above the existing Highest priority.
 
 
 [I'm going to respond to the wider priority vs. severity vs. target
 milestones vs. does this all make sense together discussion in a
 separate email.]
 


I don't think adding more fields/values is the solution. Perhaps use milestone
for immediate?

So both "Get man on the moon (tracking)" and "[Regression] Bike shed should not
be on fire" have highest priority. But one is a regression milestoned for the
current release, and the other is on track for the N+2 release or maybe "Future"
release.

Besides, an immediate bug without a milestone doesn't make sense to start with.
If that is possible, there is a missing milestone I guess.

We should make more use of being able to combine and query different fields to
express clarity, instead of adding more options that represent a combination of
values in other fields which then also need to be set separately (Commons
categories come to mind, like Category:Blue objects made of recycled glass
hanging upside-down in Amsterdam, Netherlands).

-- Krinkle







Re: [Wikitech-l] Standardizing highest priority in Bugzilla

2012-11-26 Thread Krinkle
On Nov 27, 2012, at 2:36 AM, James Forrester jforres...@wikimedia.org wrote:

 On 26 November 2012 17:25, Rob Lanphier ro...@wikimedia.org wrote:
 Timeframes seem like a pretty good proxy for priority.  If something
 is highest priority, and yet is not on track to be completed for
 several months, then.wait, what?
 
 I disagree. In 1962, NASA's highest (most-high?) priority was to put
 a human on the Moon; that doesn't mean they achieved it before 1969.
 High priority and soonness-to-be-done are orthogonal. More
 prosaically, VE's most-high priority task in July was to deploy a test
 version of the VisualEditor to the English Wikipedia in December; it's
 now only a few weeks away, but for the past five months it's remained
 our most-high priority, whilst we've worked on things to support that.
 


I agree with Rob, there is a strong correlation between priority and
milestone. However, I don't believe they are linked enough to be able
to merge them.

Last year I proposed to replace Priority, Severity and Milestone with
just Milestones. However I now believe that Priority and Timeframe
(milestone) should stay separate.

On Nov 27, 2012, at 2:25 AM, Rob Lanphier ro...@wikimedia.org wrote:

 Your definitions of priority strike me as redefining the field as a
 severity field, which makes it redundant and also not terribly
 useful as a prioritization tool. 
 


It seems Severity (or Priority) is redundant in practice. Severity may
be useful for statistical purposes afterwards, but given that it is
rarely useful for anything other than enhancement, we might as well
drop it and just have a tag for enhancement (like for
regression).

Highest priority is the next primary feature in a component and/or a
critical bug that needs fixing. Both are very important, but one is
long-term the other short-term (as James demonstrates well with the
NASA example).


On Tue, 2012-11-20 at 02:33 +0100, Andre Klapper wrote:
 == Proposal ==
 
 Proposing the following definitions for Priority:
 * highest: Needs to be fixed as soon as possible, a week at the
  most. A human assignee should be set in the Assigned to field.

I'd recommend we also require a Milestone to be set for Highest
priority tickets.

That way it is clear to both the assigned developer and the community
what the expectations are.

-- Krinkle






Re: [Wikitech-l] Jenkins now lints javascript!

2012-11-22 Thread Krinkle
On Nov 21, 2012, at 7:23 PM, S Page sp...@wikimedia.org wrote:

 I have vim set up [..]


That's great. If you're using a different editor, here's a list of all kinds of 
platforms and editors and how to use jshint in them:

http://jshint.com/platforms/


On Nov 21, 2012, at 7:23 PM, S Page sp...@wikimedia.org wrote:

 Should we be using .jshintrc per directory or lines like
 /*jshint multistr:true */
 at the top of files?  Or both?


Neither. Place .jshintrc in the root directory of the repository. JSHint 
travels up until it finds it, falling back to ~/.jshintrc in your home directory 
(e.g. for files not in a repository but just loose somewhere, opening them up 
in your editor would use that).
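
For example, a minimal .jshintrc could look like this (the options shown are
illustrative, not necessarily the exact set MediaWiki core ships):

{
    "curly": true,
    "eqeqeq": true,
    "undef": true,
    "camelcase": true,
    "multistr": true,
    "browser": true,
    "predef": [ "mediaWiki", "jQuery" ]
}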

Having it in your editor is nice, but for massive (recursive) runs against a 
repository you can install jshint (npm install -g jshint). Use it like 
$ jshint . (recursively lints the current directory), or pass it the path to a 
directory or file.

JSHint has had many versions and the behaviour of some rules has changed over 
the years. So make sure you run against the same (latest) version we use on the 
cluster (i.e. the version that matters for Jenkins; currently 0.9.1).

Some editors have their own (possibly outdated) copy of jshint (I know 
SublimeLinter had an old version for a while). If you have node-jshint 
installed, I'd recommend configuring the linter plugin of your editor to shell 
out to your version of jshint (if it supports that).

-- Krinkle




Re: [Wikitech-l] Jenkins now lints javascript!

2012-11-21 Thread Krinkle
We're not planning to create jobs specifically for individual extensions that 
just lint JavaScript.

However, what we are doing is this: The javascript lint check is currently the 
first component in the new Grunt build tasks system.

Once we've migrated all crucial tasks from Ant build.xml to Gruntjs and have 
migrated to Zuul (instead of Gerrit Trigger), we can easily create a catch-all 
job for any repository that doesn't have a dedicated job, which would simply 
execute grunt lint (aka the Universal Linter) to run phplint, 
jshint, csslint, puppetlint etc.

-- Krinkle


On Nov 21, 2012, at 7:17 AM, Santhosh Thottingal 
santhosh.thottin...@gmail.com wrote:

 Is there a plan to enabled this for MediaWiki extensions? For some of
 our extensions, we do a jshint check manually on local machines. It would
 be great if they also can be configured with Jenkins to run jshint.
 
 
 Thanks
 Santhosh
 




[Wikitech-l] Jenkins now lints javascript!

2012-11-20 Thread Krinkle
Hey all,

For a while now we have .jshintrc rules in the repository and are able
to run node-jshint locally.

TL;DR: jshint is now running from Jenkins on mediawiki/core
(joining the linting sequence for php and puppet files).


I cleaned up the last old lint failures in the repo yesterday in
preparation to enable it from Jenkins (like we already do for PHP and
Puppet files). After some quick testing in a sandbox job on Jenkins to
confirm it passes/fails accordingly, this has now been enabled in the
main Jenkins job for mediawiki/core.

Right now only master and REL1_20 pass (REL1_19 and wmf branches do
not, the next wmf branch will however pass).

Therefore it has only been enabled on the master branch for now.

Example success:
* https://gerrit.wikimedia.org/r/#/c/24249/
* https://integration.mediawiki.org/ci/job/MediaWiki-GIT-Fetching/7730/console

22:16:41 Running jshint task
22:16:48 OK
22:16:48
22:16:48 Done, without errors.

Example failure:
* https://gerrit.wikimedia.org/r/#/c/34433/
* https://integration.mediawiki.org/ci/job/MediaWiki-GIT-Fetching/7732/console

22:24:01 Running jshint task
22:24:08 >> resources/mediawiki/mediawiki.js: line 5, col 5, Identifier 
'bla_bla' is not in camel case.
22:24:08 >> resources/mediawiki/mediawiki.js: line 5, col 12, 'bla_bla' is 
defined but never used.
22:24:08 >>
22:24:08 >> 2 errors
22:24:08 Warning: Task jshint failed.

So if your commit is marked as failure, just like with failures from
phplint, puppetlint or phpunit: Click the link from jenkins-bot and
follow the trail.

-- Timo Tijhof




Re: [Wikitech-l] MW 1.20 backwards compatibility in extensions

2012-11-19 Thread Krinkle
On Nov 8, 2012, at 11:08 PM, Tim Starling tstarl...@wikimedia.org wrote:

 All extension branches were removed during the migration to Git. Very
 few extensions have branches for MW core major version support.
 There's no longer a simple way to branch all extensions when a core
 release is updated, and nobody has volunteered to write a script.
 

Such a script is easy to write, and Chad wrote one just now.

But why would we want to do the auto-branching again? This was a
natural side-effect of our directory structure and use of Subversion.
Now that we have a choice, I don't think we should be auto-branching
all extensions on a MediaWiki core release.

Extension maintainers should be able to decide when and where to
branch. So that they don't have to backport changes just because
someone at the foundation decided to branch all extensions.

Given that ExtensionDistributor will support Git branches soon[1] (by
listing the branch names of that extension's git repo in the drop-down
menu), I think everything is done.


 Given this, I think code reviewers should insist on backwards
 compatibility with MW 1.20 for commits to the master branch of
 extensions that are commonly used outside Wikimedia, at least until
 the release management issue is solved.
 


That makes perfect sense. The master branch should always be
compatible with releases between it and the latest branch made.
So when an extension has a REL1_19 and REL1_20 branch, and
MediaWiki core is on 1.22-alpha, then the git master branch should
support 1.21 and 1.22-alpha (unless and until the extension maintainer
decides that compatibility needs to be broken and makes a
REL1_21 branch).

-- Krinkle

[1] https://bugzilla.wikimedia.org/show_bug.cgi?id=37946



Re: [Wikitech-l] MW 1.20 backwards compatibility in extensions

2012-11-19 Thread Krinkle
On Nov 19, 2012, at 5:12 PM, Daniel Kinzler dan...@brightbyte.de wrote:

 On 19.11.2012 17:08, Jeroen De Dauw wrote:
 +1. Having a script run that makes pretty arbitrary tags in between actual
 releases for extensions that have real releases and associated
 tags/branches does not seem helpful at all to me.
 
 It is, however, extremely helpful for those extensions that don't have their 
 own
 release system.
 
 Maybe the script can just skip any extension that has a VERSION or 
 RELEASE-NOTES
 file.


Or we don't run it and just create REL branches when we have to and so will 
others.

Extensions that aren't maintained, aren't maintained.

As a start, I created a REL1_20 branch for extensions that were bundled with 
1.20.0.

-- Krinkle




Re: [Wikitech-l] Wikimedia URL shortener

2012-11-18 Thread Krinkle
On Nov 18, 2012, at 1:43 AM, Erik Moeller e...@wikimedia.org wrote:

 On Sat, Nov 17, 2012 at 4:40 PM, Mono monom...@gmail.com wrote:
 
 More than that, they are using Wikipedia's logo stolen from Wikimedia
 Commons without attribution and infringing on the WMF's trademark.
 
 I've notified our legal team of this infringing use.
 
 Erik


Also, http://www.wi.ki/ violates the terms of the creative commons license:

https://commons.wikimedia.org/wiki/File:Wikipedia_mini_globe_handheld.jpg
https://creativecommons.org/licenses/by-sa/3.0/deed.en

-- Krinkle



Re: [Wikitech-l] Creating custom skin based on Vector in MediaWiki 1.20

2012-11-17 Thread Krinkle
On Nov 17, 2012, at 6:08 PM, Dmitriy Sintsov ques...@rambler.ru wrote:

 Right. I can probably make a local stylesheet with references to google cdn, 
 however I am not sure it will not violate IE security or something.
 So I did:
   $out->addLink( array( 'rel' => 'stylesheet', 'href' => 
 'http://fonts.googleapis.com/css?family=PT+Sans+Narrow&subset=latin,cyrillic,latin-ext'
 ) );
   $out->addLink( array( 'rel' => 'stylesheet', 'href' => 
 'http://fonts.googleapis.com/css?family=Open+Sans:400,600&subset=latin,cyrillic,latin-ext,cyrillic-ext,greek,greek-ext' ) );
 

No IE security issues, unless your website is served over HTTPS, in which case 
Chrome, IE and possibly other browsers will block those requests (which is 
good).

The Google Font APIs support HTTPS natively:
* http://fonts.googleapis.com/css?family=PT+Sans+Narrow&subset=latin,cyrillic,latin-ext
* https://fonts.googleapis.com/css?family=PT+Sans+Narrow&subset=latin,cyrillic,latin-ext

So in that case I'd recommend loading it with a protocol-relative URL so that it 
always works (only do this for URLs that you know support both, such as the 
Google Font APIs):

'href' =>
'//fonts.googleapis.com/css?family=PT+Sans+Narrow&subset=latin,cyrillic,latin-ext'

More about protocol-relative: 
http://paulirish.com/2010/the-protocol-relative-url/

-- Krinkle




Re: [Wikitech-l] jQuery 1.9 will remove $.browser (deprecated since jQuery 1.3 - January 2009)

2012-11-16 Thread Krinkle
On Nov 9, 2012, at 2:17 AM, Tim Starling tstarl...@wikimedia.org wrote:

 I can understand the rationale behind removing jQuery.browser:
 apparently most developers are too stupid to be trusted with it. Maybe
 the idea is to use per-project reimplementation of jQuery.browser as
 an intelligence test. The trouble is, I think even the stupidest
 developers are able to copy and paste.
 
 -- Tim Starling
 

jQuery will publish a jquery-compat plugin as they always do when removing 
major features. However I would recommend strongly against using it (at least 
inside MediaWiki) because:

To use it, you'll have to add it to your dependencies. In which case you might 
as well keep your editor open for another minute and use jquery.client instead.

And, the old jQuery.browser was pretty basic anyway:
https://github.com/jquery/jquery/blob/1.8.3/src/deprecated.js#L12

But if you're working outside MediaWiki and it is good enough for you, 
you'd copy and paste those dozen or so lines. Or (after jQuery 1.9 is released) 
simply load the jquery-compat version on your web page.

-- Krinkle




Re: [Wikitech-l] git, gerrit and git review

2012-11-15 Thread Krinkle
On Nov 15, 2012, at 12:35 PM, dan entous d_ent...@yahoo.com wrote:

 dear all,
 
 after struggling for quite some time on how to best work with the wikimedia 
 gerrit site, i have created a page that i’m hoping will add some clarity for 
 myself and others. while the documentation that already exists is helpful, i 
 still found it difficult to understand, so i have done my best to summarize 
 what i discovered and write the documentation from the point of view of 
 someone who is more familiar with a git workflow.
 
 if you have a moment, please take a look and edit or contribute to the page 
 where necessary as i am definitely not a gerrit expert.
 
 http://www.mediawiki.org/wiki/Git/Gerrit
 
 
 with kind regards,
 dan
 

Gets the basics right; however, I worry:

* https://www.mediawiki.org/wiki/Git
* https://www.mediawiki.org/wiki/Git/TLDR
* https://www.mediawiki.org/wiki/Git/Tutorial
* https://www.mediawiki.org/wiki/Git/Workflow
* https://www.mediawiki.org/wiki/Git/Tips
* https://www.mediawiki.org/wiki/Gerrit
* https://www.mediawiki.org/wiki/Gerrit/Navigation

+ https://www.mediawiki.org/wiki/Git/Gerrit

-- Krinkle



Re: [Wikitech-l] About RESOLVED LATER

2012-11-06 Thread Krinkle
On Nov 6, 2012, at 2:44 PM, Andre Klapper aklap...@wikimedia.org wrote:

 On Tue, 2012-11-06 at 02:24 +0100, Krinkle wrote:
 We have the following indications that should be used instead:
 * blockers / dependencies
 * status ASSIGNED
 * keyword upstream
 * priority low or lowest
 * severity minor
 * or, resolved-wontfix if that is the case.
 
 Using LATER in any of these cases only causes the bug to be lost and
 omitted from searches and lists.
 
 I don't think it depends on the project, it depends on the user making
 the query not knowing how to utilise the above factors.
 
 If one wants to generate a list to work off of that doesn't contain
 any of these, exclude them in the search parameters as desired. Don't
 stamp LATER on the bugs to make it fit the lazy search that only omits
 RESOLVED/**.
 
 [..]
  * Meaning of depends on UPSTREAM. We won't fix it ourselves but
wait for an upstream fix. We won't backport the upstream fix but
wait until we upgrade servers to an entire new Ubuntu version
which provides a newer package that includes the fix. The Mer
project uses RESOLVED WAITING_FOR_UPSTREAM for this.
 
 If I miss meanings, please provide them.
 
 Currently I'm in favor of killing RESOLVED LATER (which requires
 retriaging these tickets) and introducing a RESOLVED
 WAITING_FOR_UPSTREAM status for the latter case.
 
 andre
 

I'm not sure the extra resolved status makes sense. The issue is clearly not 
resolved and upstream is already indicated with the upstream keyword.

As for making complex queries, this may take a minute the first time, but 
doesn't have to be repeated.

The VisualEditor team, for example, has about half a dozen useful queries that 
are shared within the team (any bz user can enable it in their preferences) 
that the product manager maintains for us. We just click them when we need to 
know what's up.

And then there is the product-independent query that is useful for developers: 
Bugs assigned to me (that is, with status=ASSIGNED, not just any bug that was 
by default assigned to you).

Also, why would we want to exclude Waiting for upstream bugs from triage lists? 
During a weekly triage I suppose it makes sense to evaluate these as well. Just 
like one would check up on a patch or the contributor, one would check up on 
the upstream ticket and maybe poke someone there.

-- Krinkle


[Wikitech-l] jQuery 1.9 will remove $.browser (deprecated since jQuery 1.3 - January 2009)

2012-11-06 Thread Krinkle
Hey all,

Just a reminder that jQuery will (after almost 4 years of deprecation!) drop
$.browser [1] in jQuery 1.9.

Please check your scripts and make sure you are no longer using $.browser
(or jQuery.browser). From the first MediaWiki deployment after jQuery 1.9 is
released, $.browser will be undefined, and old code trying to access a property
of it will throw a TypeError for accessing a property of undefined.

Don't be alarmed, this has been a long time coming. It's been almost 4 years,
and by the time this is deployed it will probably have been 4 years.

For those who just realised their script is still using it, or if you read this
later to fix someone else's script that just broke (hello future, did the world
end in 2012?), I'll briefly describe two migration paths you can take from
here:

== Feature detection

In most (if not all) cases of people using $.browser it is because they want
different behaviour for browsers that don't support a certain something. Please
take a minute to look at the code and find out what it is you are special-casing
for that apparently doesn't work in a certain browser.

Research on the internet and look for a way to detect this properly (examples
below). Browser detection (instead of feature detection) is not reliable, nor is
it very effective. For example, Internet Explorer has changed a lot since IE6.
Blindly doing A for IE and B for non-IE isn't very useful anymore as most (if
not all) of the new features will work fine in IE8, IE9 or IE10.

The opposite is also true. If you do something cool for Chrome, you're missing
other WebKit-based browsers that should get the same awesomeness (Safari,
Chromium, iPhone/iPod/iPad, possibly Android, Flock, etc.) these all share the
exact same engine that backs Chrome). And what if Firefox and IE also start to
support this new awesome feature?

There are many ways to do feature detection. jQuery comes with various detectors
built in, in the object jQuery.support[2]. This contains for example
support.ajax, support.opacity and many more. You can also easily make your
own feature detector:

* var supportPlaceholder = 'placeholder' in document.createElement('input');
* var supportJSON = !!window.JSON;
etc.
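
And a quick sketch of acting on such a detector instead of on a browser name
(the fallback shown is hypothetical):

var supportPlaceholder = 'placeholder' in document.createElement('input');

if ( !supportPlaceholder ) {
    // Hypothetical fallback: show the placeholder text as the initial
    // value in browsers without native support.
    $( 'input[placeholder]' ).val( function () {
        return this.value || this.getAttribute( 'placeholder' );
    } );
}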

If you need any help with feature detection, I'd recommend asking in one
of the following channels on irc.freenode.net:

* ##javascript (recommended)
* #jquery
* #wikimedia-dev

== jQuery.client [3]

If you can't figure out how to detect what you really want to switch for, there
is an elaborate plugin in MediaWiki that does the same thing that jQuery.browser
used to do (and more). This can be used as an alternative migration path. To
give an impression:

jQuery.browser:
{
    chrome: true,
    version: "22.0.1229.94",
    webkit: true
}

$.client.profile():
{
    name: "chrome",
    layout: "webkit",
    layoutVersion: 537,
    platform: "mac",
    version: "22.0.1229.94",
    versionBase: "22",
    versionNumber: 22
}

For example:

if ( $.browser.chrome ) {}

Would become:

++ dependency: jquery.client
var profile = $.client.profile();

if ( profile.name === 'chrome' ) {}

But:

if ( $.browser.msie ) {
  // IE doesn't support opacity
  el.style.filter = 'alpha(opacity=50)';
} else { .. }

Should become:

if ( !$.support.opacity ) {
  // Only use the IE filter where opacity is unsupported
  el.style.filter = 'alpha(opacity=50)';
} else { .. }

Or better yet, since this is supported by jQuery core for a while now, like
this:

$(el).css('opacity', 0.5);

Which will do the right thing for newer browsers and old IE respectively.

-- Krinkle 


[1] http://api.jquery.com/jQuery.browser/
[2] http://api.jquery.com/jQuery.support/
[3] https://www.mediawiki.org/wiki/RL/DM#jQuery.client




Re: [Wikitech-l] About RESOLVED LATER

2012-11-05 Thread Krinkle
On Nov 5, 2012, at 11:54 PM, Platonides platoni...@gmail.com wrote:

 How many of them depend on action from somebody else?
 (eg. upstream fixing its tool)
 
 Of course, if we are waiting for upstream, it should list the upstream
 bug id, have upstream keyword, someone actually noticing when it's
 fixed, etc.  but those are form issues, not the status.
 (and yes, 'resolved' is a misnomer)



On Nov 6, 2012, at 12:02 AM, Nabil Maynard nadr...@gmail.com wrote:

 LATER = We acknowledge this is a valid bug, and we agree that it should be
 fixed, but don't have time right now.  We will be voluntarily revisiting
 this soon.
 
 While we could arguably just dump both situations into WONTFIX, I think
 that would muddy the triage process pretty heavily, making it more
 difficult to track bugs we intend to revisit.
 
 That said, it IS easy for LATER to become a procrastination paradise, where
 it gets resolved and then never thought of again.  Would it be worthwhile
 to set up an automated notice when a LATER bug ages past a certain date
 (say, a month without being touched), instead of axing the resolution?
 
 

I agree we should drop this status.

We have the following indications that should be used instead:

* blockers / dependencies
* status ASSIGNED
* keyword upstream
* priority low or lowest
* severity minor
* or, resolved-wontfix if that is the case.

Using LATER in any of these cases only causes the bug to be lost and omitted 
from searches and lists.

I don't think it depends on the project, it depends on the user making the 
query not knowing how to utilise the above factors.

If one wants to generate a list to work off of that doesn't contain any of 
these, exclude them in the search parameters as desired. Don't stamp LATER on 
the bugs to make it fit the lazy search that only omits RESOLVED/**.

@AndreKlapper: What do you think?

-- Krinkle





Re: [Wikitech-l] MW 1.20 release tomorrow

2012-11-05 Thread Krinkle
On Nov 5, 2012, at 8:46 PM, Mark A. Hershberger m...@everybody.org wrote:

 I said last week that I would be releasing 1.20 today.  Due to some
 hiccups, I won't be able to do that.  I'll work on the release tonight
 and prep it for tomorrow.
 
 Thank you for your patience,
 
 Mark.


Open regressions introduced during 1.20 development:
https://bugzilla.wikimedia.org/buglist.cgi?resolution=---&keywords=code-update-regression&keywords_type=anywords&query_format=advanced&product=MediaWiki&version=1.20&list_id=157826

12 bugs found.

Open tickets milestoned 1.20.0:
https://bugzilla.wikimedia.org/buglist.cgi?resolution=---&query_format=advanced&product=MediaWiki&target_milestone=1.20.0%20release&list_id=157827

Zarro Boogs found.

Nice :)


-- Krinkle


[1] https://toolserver.org/~krinkle/wmfBugZillaPortal/ (version 1.20 open regr. 
milestone 1.20.0 open)


[Wikitech-l] Clean up prototype.wikimedia.org

2012-10-29 Thread Krinkle
Hey all,

I thought I'd put together a list of stuff I found running on 
prototype.wikimedia.org.

I have no control over terminating the machine itself, but I intend to clean up 
stuff that is no longer used so that when it is eventually terminated, ideally 
nothing runs on it anymore.

As you should know by now this server is a ticking time bomb. Since it is 
fairly unorganised (no central registry of what is and isn't used or by whom), 
I'll dump it here. Please reply with what you know is or isn't being used 
anymore.

* IdeaTorrent http://prototype.wikimedia.org/ideas/
** en-idea/
** en-idea-6.14/
** en-idea-6.16/

Wikis http://prototype.wikimedia.org/wikis/
* Release-candidate (rc-ar, rc-de, rc-en, rc-pl)
* Deployment (d-ar, d-de, d-en, d-es, d-fr, d-it, d-ja, d-ko, d-nl, d-pl, d-ru, 
d-si, d-sr, d-zh)
* Article Creation Branch
* FlaggedRevs (de, en)
* iwtrans (1, 2, 3)
* mwe-gadget, mwe-gadget-testing
* TimedMediaHandler (tmh)
* Semantic (smw-1)
* WM-DE (wmde-b, wmde-s-1)

Wiki release-candidate infrastructure: 
* Backups http://prototype.wikimedia.org/backups/
* Exports http://prototype.wikimedia.org/exports/
* Maintenance http://prototype.wikimedia.org/maintenance/

Items on this list for which no reason to do otherwise is given before Sunday 11 
November 2012 (in 2 weeks) will be `rm -rf`-ed from the server on Monday 12 
November[1].

If you own any of these (or ones not listed here) and have no problem with 
their removal, feel free to ssh in and remove them yourself whenever you feel 
like.
-- Krinkle

[1] I intend to remove them separately, not all at once. So if support for a 
few is shown, I'll still remove the rest. Please reply completely (e.g. "I use X" 
will mean I keep X; it won't mean I leave Y and Z as well).


Re: [Wikitech-l] Should JS/CSS pages be parsed?

2012-10-18 Thread Krinkle
On Oct 18, 2012, at 5:04 AM, Daniel Kinzler dan...@brightbyte.de wrote:

 Hi!
 
 When designing the ContentHandler, I asked around about whether JS and CSS 
 pages
 should be parsed as wikitext, so categories etc would work. The gist of the
 responses I got was naw, lets get rid of that. So I did (though PST is still
 applied - Tim asked for that at the Berlin Hackathon).
 
 Sure enough, people are complaining now, see
 https://bugzilla.wikimedia.org/show_bug.cgi?id=41155. Also note that an 
 older
 request for disablingt parsing of script pages was closed as WONTFIX:
 https://bugzilla.wikimedia.org/show_bug.cgi?id=32858.
 
 I'm inclined to (at least optionally) enable the parsing of script pages, but
 I'd like to get some feedback first.
 
 -- daniel


Yeah, as more elaborately put on the bug[1], it was disabled in ContentHandler 
without dedicated discussion because it was thought of as a minor oddity that 
should be removed as a bug.

We know now that (though it might have been a bug originally) it is a major 
feature that, unless replaced, must not be removed.

-- Krinkle

[1] https://bugzilla.wikimedia.org/41155




Re: [Wikitech-l] Prefs removal

2012-10-09 Thread Krinkle
Also note that, as explained in more detail on the ticket[1], gadgetizing would 
be a bad idea. We might as well not remove it at all. A preference is a lot 
better than a gadget because (at least for now) its label will be localised a 
lot better, and more importantly: The option will show up in the relevant 
preferences section.

For reasons explained elsewhere, shipping gadgets by default with the Gadgets 
extension is not an option and generally counter-productive and bound to cause 
unwanted/unmaintained scripts to rot in wikis after being injected upon 
installation.

That's not how the Gadgets extension works, and imho it should stay that way.

Instead of gadgetization, a better option would be to move them into an 
extension (not the Gadgets extension). Which means the extension can:
* add it to the preferences section where it used to be
* use the same key as the old preference in core, so that users don't have to 
reset their preferences (as opposed to gadget-foo).

Especially the latter is in my opinion a must-have for the smooth migration of 
any wiki that decides to install this LegacyPreferences extension after 
upgrading their MediaWiki-install.

Besides, most of the preferences discussed here aren't worth a gadget (I guess 
most communities will not want them in the preferences section after voting to 
have them disabled/removed from core). They don't add any significant features, 
only silly visual options that someone afraid of change insisted on in the past.

That is, of course, referring to preferences we would end up removing after 
some kind of consensus. I'm not talking about all preferences or any preference 
in particular.

In other words: if the story is "remove them from core, add as 
default-available[2] gadget", then please, *please*, keep them in core, because 
that wouldn't be an improvement over the current situation.

If we have good grounds to remove them (because they're silly options that we 
don't want end-users to be thinking about), then we remove them. Don't move 
them from one junkyard to another.

-- Krinkle

[1] https://bugzilla.wikimedia.org/40346
[2] default-available != default-enabled



Re: [Wikitech-l] Please fix the PHP unit tests

2012-10-05 Thread Krinkle
On Oct 5, 2012, at 9:17 PM, Antoine Musso hashar+...@free.fr wrote:

 Le 05/10/12 19:11, bawolff a écrit :
 It would be nice if jenkins also did a run with some non-default
 options. Obviously we have a lot of options, but doing at least 1 run
 with some of the common variations (say $wgCapitalLinks) would help
 quite a bit I imagine.
 
 The Jenkins job setup a default MediaWiki install which mean it is
 mostly using DefaultSettings.php.  If we want to test wgCapitalLinks, we
 should write tests for that feature and adapt existing tests to take in
 account that global variable.
 
 -- 
 Antoine hashar Musso
 

I agree. Tests should make sure their expected results are based on the
configuration they set in the setup/teardown (as opposed to relying on the
default configuration). And if there are any significant differences possible in
the output based on configuration options, then the test should provide tests
for that (as opposed to re-running all test suites for one configuration option,
which is a wasteful approach that doesn't scale).

So the Html test cases should just set $wgHtml5 = true and $wgWellFormedXml = true
and be done with it. That it will add " /" everywhere is obvious (it should have
one or two assertions to make sure the other behaviour is also tested); no need
to duplicate the entire test suite 2 or 3 times for unimportant details.

-- Krinkle




Re: [Wikitech-l] Call to eliminate sajax

2012-10-03 Thread Krinkle
On Oct 3, 2012, at 7:18 PM, Daniel Friesen dan...@nadir-seen-fire.com wrote:

 The real problem however is extensions. For some reason it appears that we 
 STILL have extensions depending on sajax. And I'm not talking about ancient 
 extensions on the wiki or in svn. I only did an ack through stuff that's 
 currently in git.
 
 So I welcome anyone who is interested in going through extension code and 
 eliminating the use of sajax in favor of jQuery.ajax and RL.
 
 

Also note that in various cases these are not just frontend legacy problems, 
but backend ones as well, namely AjaxDispatcher.

Invoked through index.php?action=ajax&rs=efFooBar&rsargs[]=param&rsargs[]=param.

Though blindly replacing sajax would allow us to remove it from core, it would 
be very much worth it to give these extensions a good look and update them in 
general (to make them use API modules and ResourceLoader modules, and follow 
current conventions for front-end code with mw and jQuery).
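
To illustrate the front-end half of such an update (the API module name and
parameters below are hypothetical):

// Old (sajax):
// sajax_do_call( 'efFooBar', [ 'param1', 'param2' ], callbackFn );

// New: a plain jQuery.ajax call against a proper API module.
$.ajax( {
    url: mw.util.wikiScript( 'api' ),
    dataType: 'json',
    data: {
        action: 'foobar', // hypothetical API module
        format: 'json',
        param: 'value'
    }
} ).done( function ( data ) {
    // Handle the response.
} );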

-- Krinkle




Re: [Wikitech-l] Welcome Željko Filipin, QA Engineer

2012-10-02 Thread Krinkle
On Oct 2, 2012, at 4:25 PM, Chris McMahon cmcma...@wikimedia.org wrote:

 I am pleased to announce that Željko Filipin joins WMF this week as QA
 Engineer.

Welcome Željko!

For the last 1.5 years, hashar and I have set up the current integration 
environment. I'm also in CET (Krinkle on freenode).

Hashar did most of the backend with PHPUnit and Jenkins; I'm occupied with 
browsers and unit testing in them (QUnit/TestSwarm/BrowserStack/..).

Looking forward to working with you!

-- 
Timo Krinkle Tijhof




Re: [Wikitech-l] HTMLMultiSelectField as <select multiple="multiple"/>

2012-09-30 Thread Krinkle
Nobody (afaik) is saying that jQuery Chosen isn't better than a bunch of
checkboxes.

The point is that <select multiple> (without JavaScript enhancement) is horrible
and (imho) unacceptable. And no, it is not because we found situations in which
it is not a good idea. It is because it is never a good idea, never.

The solution I'd propose (and Daniel also mentioned this various times here and 
elsewhere) is to output checkboxes as a fallback and an interface like jQuery 
Chosen as an enhancement.
as enhancement.

-- Krinkle



