[Wikitech-l] Mediawiki's access points and mw-config

2013-02-15 Thread Waldir Pimenta
While trying to add some more information to
https://www.mediawiki.org/wiki/Manual:Code, I came across a slightly
peculiar issue regarding the entry points for MediaWiki:

Right now, among all the entry points that I know of (those are listed in
Manual:Code), only mw-config/index.php doesn't sit in the root folder.
Furthermore, it's related to the installer at includes/installer/, but that
is not clear at all from the code organization, specifically the directory
names (and the lack of documentation both in the file and on mediawiki.org).

I have two questions, then:
1) should all access points be on the root directory of the wiki, for
consistency?
2) should the name "mw-config" be changed to something that more clearly
indicates its relationship with the installer?

Note that these aren't merely nitpicking: a consistent structure and
intuitive names for files and directories play an important role in the
self-documenting nature of the code, and make the learning curve smoother
for new developers (e.g. yours truly :-)).

Also, I used Tim Starling's suggestion on IRC to make sure the list of
entry point scripts listed in Manual:Code was complete: git grep -l
/includes/WebStart.php
I am not sure that exhausts the list, however, since thumb_handler.php
doesn't show up on its results. Any pointers regarding potential entry
points currently omitted from that list are most welcome.

--Waldir

ps - while investigating this subject I came across some inconsistencies
in the way the access points include the WebStart.php file, including an
incorrect use (according to Tim) of require_once() instead of require(). I
submitted a change to Gerrit harmonizing them; if this interests you,
please review the commit: https://gerrit.wikimedia.org/r/#/c/49208/
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] WikiEditor caching (??)

2013-02-15 Thread Federico Leva (Nemo)

Thank god when it even loads completely. :)
https://bugzilla.wikimedia.org/show_bug.cgi?id=44188

Nemo


Re: [Wikitech-l] Corporate needs are different (RE: How can we help Corporations use MW?)

2013-02-15 Thread Maria Miteva
Hi,

There are so many extensions useful to the enterprise, but probably also so
many which are not useful at all or not maintained, and if I wanted to start
a corporate wiki right now I would probably be very lost about what to look
at and how people do things, so it seemed like a good idea to list the
extensions that ARE actually used. Also, I guess one team solved a certain
problem one way, while another solved it differently, using a different
extension or set of extensions, so writing this out might help everybody
get new ideas / avoid reinventing the wheel. But I guess I either asked on
the wrong list or there is not much interest at all.

Mariya

On Thu, Feb 14, 2013 at 9:06 PM, Dmitriy Sintsov  wrote:

> On 14.02.2013 21:14, vita...@yourcmc.ru wrote:
>
>> I guess this would not directly solve any of the problems listed, but
>>> would
>>> it be helpful to bring back to life
>>> https://www.mediawiki.org/wiki/Enterprise_hub?
>>>  It was started by somebody
>>> a year or two ago but seems to have been abandoned at a draft stage. I
>>> am
>>> thinking if everybody adds some information about extensions/pages they
>>> find particularly useful in the enterprise world, it will help future
>>> users
>>> but also help current enterprise wikis exchange experience. Does this
>>> seem
>>> worthwhile?
>>>
>>
>> IMHO there are so many useful extensions that I think it could be a little
>> much for that page.
>>
>> For example if I edited that article I would put almost all extensions
>> from our distribution there... so I'm documenting them on
>> http://wiki.4intra.net/Category:Mediawiki4Intranet_extensions :-)
>>
>>  The question is, how stable and secure they are. MediaWiki is
> high-quality software that should not be impaired by low-quality extensions.
> Also, when an extension is unmaintained, its stability and security become
> questionable as well.
> Also, I remember that for major MW extensions scalability is the big problem:
> efficient SQL queries, using the APC / Memcached cache, not invalidating the
> parser cache too often. For example, my own Extension:QPoll does not scale
> well and requires some major rewrites. That applies to many other extensions
> as well.
> Dmitriy
>
>
>
>

Re: [Wikitech-l] Corporate needs are different (RE: How can we help Corporations use MW?)

2013-02-15 Thread vitalif
There are so many extensions useful to the enterprise but probably also so
many which are not useful at all or not maintained and if I wanted to start
a corporate wiki right now I would probably be very lost what to look at
and how people do things, so it seemed like a good idea to list the
extensions that ARE actually used. Also, I guess one team solved a certain
problem one way, while another solved it differently, using a different
extension or set of extensions, so writing this out might help everybody
get new ideas / avoid reinventing the wheel. But I guess I either asked on
the wrong list or there is not much interest at all.


So, you're talking about some "basic set" of extensions that are thought
to be definitely useful for ALL people?

It may be useful, but I think it would anyway require testing of a
complete distribution (MW version X + all these extensions) before
recommending it to companies... And this brings us back to the idea of a
"pre-built" distribution like ours :-))



Re: [Wikitech-l] Mediawiki's access points and mw-config

2013-02-15 Thread Platonides
On 15/02/13 09:16, Waldir Pimenta wrote:
> While trying to add some more information to
> https://www.mediawiki.org/wiki/Manual:Code, I came across a slightly
> peculiar issue regarding the entry points for MediaWiki:
> 
> Right now, among all the entry points that I know of (those are listed in
> Manual:Code), only mw-config/index.php doesn't sit in the root folder.
> Furthermore, it's related to the installer at includes/installer/, but that
> is not clear at all from the code organization, specifically the directory
> names (and the lack of documentation both in the file and on mediawiki.org).
> 
> I have two questions, then:
> 1) should all access points be on the root directory of the wiki, for
> consistency?

No. The installer is in its own folder on purpose, so that you can delete
that folder once you have installed the wiki.


> 2) should the name "mw-config" be changed to something that more clearly
> indicates its relationship with the installer?
> 
> Note that these aren't merely nitpicking: a consistent structure and
> intuitive names for files and directories play an important role in the
> self-documenting nature of the code, and make the learning curve smoother
> for new developers (e.g. yours truly :-)).

It was originally named "config". The name came from the link that sent you
there: "You need to configure your wiki first". Then someone had
problems with another program, installed site-wide on his host,
appropriating the /config/ folder, so it was renamed to mw-config.


> Also, I used Tim Starling's suggestion on IRC to make sure the list of
> entry point scripts listed in Manual:Code was complete: git grep -l
> /includes/WebStart.php
> I am not sure that exhausts the list, however, since thumb_handler.php
> doesn't show up on its results. Any pointers regarding potential entry
> points currently omitted from that list are most welcome.

That's probably because thumb_handler.php doesn't include WebStart.php
directly (it includes thumb.php, which is the one that includes WebStart.php).

Take a look at tools/code-utils/find-entries.php. I have updated it to
add a few new rules in https://gerrit.wikimedia.org/r/49230

It will give you about 100 files to check, most of them CLI scripts,
although there are a few web-enabled ones, such as
tests/qunit/data/styleTest.css.php

Use -d to see why a file was considered an entry point. As you'll
see, it is very strict (with reason) in what it considers safe.
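As an aside, why a plain `git grep -l /includes/WebStart.php` misses indirect entry points can be sketched with a toy scanner. The file names mirror MediaWiki's, but the file contents and helper functions here are illustrative assumptions, not the actual find-entries.php logic:

```python
# Sketch: a direct grep only finds files that mention WebStart.php
# themselves; thumb_handler.php reaches it through thumb.php, so a
# transitive scan is needed. Toy data, not MediaWiki's real sources.

# toy "repository": file name -> file contents
repo = {
    "index.php": "<?php require __DIR__ . '/includes/WebStart.php';",
    "thumb.php": "<?php require __DIR__ . '/includes/WebStart.php';",
    "thumb_handler.php": "<?php require __DIR__ . '/thumb.php';",
}

def direct_includers(target):
    """Files that mention the target directly (what 'git grep -l' finds)."""
    return {name for name, src in repo.items() if target in src}

def transitive_includers(target):
    """Follow include chains until no new includers are found."""
    found = direct_includers(target)
    while True:
        more = {name for name, src in repo.items()
                if any("/" + f in src for f in found)}
        if more <= found:
            return found
        found |= more

print(sorted(direct_includers("WebStart.php")))      # thumb_handler.php absent
print(sorted(transitive_includers("WebStart.php")))  # all three entry points
```

The fixed-point loop is what distinguishes this from the one-shot grep: it keeps adding files that include an already-known includer until nothing new appears.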




[Wikitech-l] When to automerge (Re: Revoking +2 (Re: who can merge into core/master?))

2013-02-15 Thread Platonides
On 15/02/13 01:38, Brad Jorsch wrote:
> I'd propose one more:
> 
> * Someone else gives +2, but Jenkins rejects it because it needs a
> rebase that is not quite trivial enough for it to do it automatically.
> For example, something in RELEASE-NOTES-1.21.

Seems a better example.
I'm not convinced that backporting should be automatically merged, though.
Even if the code at REL-old is the same as master (i.e. the backport
doesn't need any code change), approving something for master is
different from agreeing that it should be merged into REL-old (unless
explicitly stated in the previous change). I'm not too firm on that for
changes that obviously should be backported, such as an XSS fix*, but
I would completely oppose automerging a minor feature just because it
was merged into master.
Note that we are not alone in having opinions about what is worth
backporting, since downstream distros will also question whether our new
release is “just bugfixes” before they agree to accept it as-is.


* Still, we could be writing a completely new class in master while just
stripping the vulnerable piece in the old release.



Re: [Wikitech-l] When to automerge (Re: Revoking +2 (Re: who can merge into core/master?))

2013-02-15 Thread Krinkle
On Feb 15, 2013, at 3:32 PM, Platonides  wrote:

> I'm not convinced that backporting should be automatically merged, though.
> Even if the code at REL-old is the same as master (ie. the backport
> doesn't needs any code change), approving something from master is
> different than agreeing that it should be merged to REL-old (unless
> explicitly stated in the previous change). I'm not too firm on that for
> changes that it's obvious should be backported, such as a XSS fix*, but
> I would completely oppose to automerge a minor feature because it was
> merged into master.
> Note that we are not alone opinating about what it's worth backporting,
> since downstream distros will also call into question if our new release
> is “just bugfixes” before they agree into accepting it as-is.
> 

I don't know where you got "auto-merging" from, but it certainly wasn't from my
e-mail, which was about revoking merge access and about when self-merging may
or may not be tolerated.

Auto-merging would imply that some random dude can take a change from master,
merged by someone else *for master*, submit it to any branch, and have it be
auto-merged.

What I was talking about is that a code reviewer with merge access can submit 
an approved change from master to another branch and self-merge it.

Just because one can, however, doesn't mean one should.

When our random dude pushes a change for review to an old branch that backports 
a feature from master, the assigned reviewer should (as you explain) not 
approve it.

And for the same reason, when that reviewer backports something himself, he
wouldn't self-merge. Or rather, he wouldn't draft such a change in the first place.

-- Krinkle



Re: [Wikitech-l] Mediawiki's access points and mw-config

2013-02-15 Thread Waldir Pimenta
On Fri, Feb 15, 2013 at 11:58 AM, Platonides  wrote:

> On 15/02/13 09:16, Waldir Pimenta wrote:
> > 1) should all access points be on the root directory of the wiki, for
> > consistency?
>
> No. The installer is on its on folder on purpose, so that you can delete
> that folder once you have installed the wiki.
>

Sorry if I wasn't clear. I meant that mw-config/index.php should be in the
root, not that the installer files should. Unless I'm misunderstanding you,
and by "the installer" you mean the contents of mw-config/ rather than
/includes/installer (which would prove my point about unclear nomenclature).

> Also, I used Tim Starling's suggestion on IRC to make sure the list of
> > entry point scripts listed in Manual:Code was complete: git grep -l
> > /includes/WebStart.php
> > I am not sure that exhausts the list, however, since thumb_handler.php
> > doesn't show up on its results. Any pointers regarding potential entry
> > points currently omitted from that list are most welcome.
>
> That's probably because it doesn't include WebStart (it included
> thumb.php, which is the one including WebStart).
>
> Take a look at tools/code-utils/find-entries.php I have updated it to
> add a few new rules in https://gerrit.wikimedia.org/r/49230
>
> It will give you about 100 files to check, most of them cli scripts.
> Although there are a few web-enabled ones, such as
> tests/qunit/data/styleTest.css.php
>
> Use -d to see why that file was considered an entry point. As you'll
> see, it is very strict -with reason- in what it considers safe.
>

Thanks, that sounds quite useful, but I don't seem to be able to run it
properly (I get a few PHP warnings, and "0/0" as output). I placed it in
the root dir of my local wiki. Am I missing anything?

Re: [Wikitech-l] When to automerge (Re: Revoking +2 (Re: who can merge into core/master?))

2013-02-15 Thread Chris Steipp
On Fri, Feb 15, 2013 at 6:32 AM, Platonides  wrote:
> On 15/02/13 01:38, Brad Jorsch wrote:
>> I'd propose one more:
>>
>> * Someone else gives +2, but Jenkins rejects it because it needs a
>> rebase that is not quite trivial enough for it to do it automatically.
>> For example, something in RELEASE-NOTES-1.21.
>
> Seems a better example.
> I'm not convinced that backporting should be automatically merged, though.
> Even if the code at REL-old is the same as master (ie. the backport
> doesn't needs any code change), approving something from master is
> different than agreeing that it should be merged to REL-old (unless
> explicitly stated in the previous change). I'm not too firm on that for
> changes that it's obvious should be backported, such as a XSS fix*, but
> I would completely oppose to automerge a minor feature because it was
> merged into master.

We should probably reset the subject again, since this is something of
a tangent from revoking +2. But since it was brought up, let me
clarify the process behind why gerrit reports that someone is doing a
self-merge for security fixes (and I welcome input on ways to improve
the process!).

When we get a report of an XSS (assuming it's not yet "public" or
being actively exploited), we file a Bugzilla ticket in the security
category. Usually I or someone else in that group adds a patch to the
bug. Someone else gives a "yes, this looks OK to deploy" comment on
the bug. The patch is put into production and backported to the supported
release versions, then we release tarballs / patches. When the
tarballs are released, I typically submit and merge the patches into
master and the supported branches, since we have a number of users who
pull from git to update their systems.

This process is painful (no one likes reviewing patches in Bugzilla),
and the wrangling to get the right people to review patches in
Bugzilla is slowing down our security releases. It would be much
better if we had a way to submit the patches in gerrit, go through the
normal review process by a trusted group of developers ending in a
+2, and then have the final merge be just a single click when we release
the tarballs. But we haven't been able to get gerrit to do that yet
(although if any Java developers want to work on that, I would be very
excited).


> Note that we are not alone opinating about what it's worth backporting,
> since downstream distros will also call into question if our new release
> is “just bugfixes” before they agree into accepting it as-is.
>
>
> * Still, we could be making a complete new class in master but just
> stripping the vulnerable piece in the old release.
>
>


Re: [Wikitech-l] [Engineering] RFC: Introducing two new HTTP headers to track mobile pageviews

2013-02-15 Thread Asher Feldman
Just to tie this thread up: the issue of how to count ajax-driven
pageviews loaded from the API, and of how to differentiate those requests
from secondary API page requests, has been resolved without the need for
code or logging changes.
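For illustration, the referrer-matching heuristic discussed in the quoted thread below (treat an action=mobileview request as secondary when its page parameter matches the referring article) could be sketched roughly as follows. The URL shapes follow the examples in the thread, but the title normalization and function names are assumptions, not actual analysis code:

```python
# Rough sketch: an action=mobileview API request whose "page" parameter
# matches the referring article is a secondary (lazy-load) request, not a
# pageview. Normalization here is a guessed approximation.
from urllib.parse import urlparse, parse_qs, unquote

def normalize(title):
    """Percent-decode and treat '+' and '_' as spaces, as page titles do."""
    return unquote(title).replace("+", " ").replace("_", " ")

def is_secondary(api_url, referrer):
    query = parse_qs(urlparse(api_url).query)
    if query.get("action") != ["mobileview"]:
        return False
    page = normalize(query.get("page", [""])[0])
    ref_path = urlparse(referrer).path
    if not ref_path.startswith("/wiki/"):
        return False
    return normalize(ref_path[len("/wiki/"):]) == page

api = ("http://en.m.wikipedia.org/w/api.php?format=json&action=mobileview"
       "&page=Liverpool+F.C.+in+European+football&sections=all")
# Case 1: referrer is the same article -> lazy-load of remaining sections
print(is_secondary(api, "http://en.m.wikipedia.org/wiki/Liverpool_F.C._in_European_football"))
# Case 2: referrer is a different page (e.g. search) -> count as a pageview
print(is_secondary(api, "http://en.m.wikipedia.org/wiki/Special:Search"))
```

As the thread notes, this breaks down when the referrer omits the query string or is missing entirely, which is why a logging header was chosen instead.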

Tagging of the mobile beta site will be accomplished via a new generic
MediaWiki HTTP response header, dedicated to logging, containing key-value
pairs.

-Asher

On Tue, Feb 12, 2013 at 9:56 AM, Asher Feldman wrote:

> On Tuesday, February 12, 2013, Diederik van Liere wrote:
>
>> > It does still seem to me that the data to determine secondary api
>> requests
>> > should already be present in the existing log line. If the value of the
>> > page param in an action=mobileview api request matches the page in the
>> > referrer (perhaps with normalization), it's a secondary request as per
>> case
>> > 1 below.  Otherwise, it's a pageview as per case 2.  Difficult or
>> expensive
>> > to reconcile?  Not when you're doing distributed log analysis via
>> hadoop.
>> >
>> So I did look into this prior to writing the RFC and the issue is that a
>> lot of API referrers don't contain the querystring. I don't know what
>> triggers this so if we can fix this then we can definitely derive the
>> secondary pageview request from the referrer field.
>> D
>
>
> If you can point me to some examples, I'll see if I can find any insights
> into the behavior.
>
>
>>
>> > On Mon, Feb 11, 2013 at 7:11 PM, Arthur Richards <
>> aricha...@wikimedia.org
>> > >wrote:
>> >
>> > > Thanks, Jon. To try and clarify a bit more about the API requests...
>> they
>> > > are not made on a per-section basis. As I mentioned earlier, there are
>> > two
>> > > cases in which article content gets loaded by the API:
>> > >
>> > > 1) Going directly to a page (eg clicking a link from a Google search)
>> > will
>> > > result in the backend serving a page with ONLY summary section content
>> > and
>> > > section headers. The rest of the page is lazily loaded via API request
>> > once
>> > > the JS for the page gets loaded. The idea is to increase
>> responsiveness
>> > by
>> > > reducing the delay for an article to load (further details in the
>> article
>> > > Jon previously linked to). The API request looks like:
>> > >
>> > >
>> >
>> http://en.m.wikipedia.org/w/api.php?format=json&action=mobileview&page=Liverpool+F.C.+in+European+football&variant=en&redirects=yes&prop=sections%7Ctext&noheadings=yes&sectionprop=level%7Cline%7Canchor&sections=all
>> > >
>> > > 2) Loading an article entirely via Javascript - like when a link is
>> > clicked
>> > > in an article to another article, or an article is loaded via search.
>> > This
>> > > will make ONE call to the API to load article content. API request
>> looks
>> > > like:
>> > >
>> > >
>> >
>> http://en.m.wikipedia.org/w/api.php?format=json&action=mobileview&page=Liverpool+F.C.+in+European+football&variant=en&redirects=yes&prop=sections%7Ctext&noheadings=yes&sectionprop=level%7Cline%7Canchor&sections=all
>> > >
>> > > These API requests are identical, but only #2 should be counted as a
>> > > 'pageview' - #1 is a secondary API request and should not be counted
>> as a
>> > > 'pageview'. You could make the argument that we just count all of
>> these
>> > API
>> > > requests as pageviews, but there are cases when we can't load article
>> > > content from the API (like devices that do not support JS), so we
>> need to
>> > > be able to count the traditional page request as a pageview - thus we
>> > need
>> > > a way to differentiate the types of API requests being made when they
>> > > otherwise share the same URL.
>> > >
>> > >
>> > >
>> > > On Mon, Feb 11, 2013 at 6:42 PM, Jon Robson 
>> wrote:
>> > >
>> > > > I'm a bit worried that now we are asking why pages are lazy loaded
>> > > > rather than focusing on the fact that they currently __are doing
>> > > > this___ and how we can log these (if we want to discuss this further
>> > > > let's start another thread as I'm getting extremely confused doing
>> so
>> > > > on this one).
>> > > >
>> > > > Lazy loading sections
>> > > > 
>> > > > For motivation behind moving MobileFrontend into the direction of
>> lazy
>> > > > loading section content and subsequent pages can be found here [1],
>> I
>> > > > just gave it a refresh as it was a little out of date.
>> > > >
>> > > > In summary the reason is to
>> > > > 1) make the app feel more responsive by simply loading content
>> rather
>> > > > than reloading the entire interface
>> > > > 2) reducing the payload sent to a device.
>> > > >
>> > > > Session Tracking
>> > > > 
>> > > >
>> > > > Going back to the discussion of tracking mobile page views, it
>> sounds
>> > > > like a header stating whether a page is being viewed in alpha, beta
>> or
>> > > > stable works fine for standard page views.
>> > > >
>> > > > As for the situations where an entire page is loaded via the api it
>> > > > makes no dif
>
>

Re: [Wikitech-l] Merge Vector extension into core

2013-02-15 Thread Bartosz Dziewoński

There seems to be consensus about this, so I created 
https://bugzilla.wikimedia.org/show_bug.cgi?id=45051 for dealing with the 
details of the operation.

--
Matma Rex


[Wikitech-l] Lua rollout to en.wikipedia.org and a few others

2013-02-15 Thread Rob Lanphier
Hi everyone,

We're planning to deploy Lua to a long list of wikis on Monday,
February 18, 23:00-01:00 UTC (stretching into Tuesday UTC), including
English Wikipedia.

Details here:
http://meta.wikimedia.org/wiki/Lua

Jan Kučera (User:Kozuch) has placed notifications on many of the
wikis.  Those notifications and general communications are listed here:
http://en.wikipedia.org/wiki/User:Kozuch/Lua

This is a really exciting deployment for the projects.  We're really
looking forward to seeing the great things that people do with this,
and looking forward to making editing and previewing more responsive
for template-heavy pages.

Rob


Re: [Wikitech-l] Lua rollout to en.wikipedia.org and a few others

2013-02-15 Thread Bináris
Hi Rob,

this is really great and exciting and we have long been waiting for this day.
Let's rock! :-)

-- 
Bináris

Re: [Wikitech-l] [Wikitech-ambassadors] Lua rollout to en.wikipedia.org and a few others

2013-02-15 Thread Steven Walling
On Fri, Feb 15, 2013 at 12:33 PM, Rob Lanphier  wrote:

> Hi everyone,
>
> We're planning to deploy Lua to a long list of wikis on Monday,
> February 18, 23:00-01:00 UTC (stretching into Tuesday UTC), including
> English Wikipedia.
>
> Details here:
> http://meta.wikimedia.org/wiki/Lua
>
> Jan Kučera (User:Kozuch) has placed notifications on many of the
> wikis.  Those notifications and general communications listed here:
> http://en.wikipedia.org/wiki/User:Kozuch/Lua


I didn't see it in the docs above, so thought I'd ask... Is this going to
include rollout of the CodeEditor extension, or will that be done
separately?


> This is a really exciting deployment for the projects.  We're really
> looking forward to seeing the great things that people do with this,
> and looking forward to making editing and previewing more responsive
> for template-heavy pages.
>
> Rob
>

This is exciting! Do we have plans for further measurement of Lua's impact
on page load times, or for publishing any results so far? In addition
to the general benefit of not having to program using wikitext/parser
functions, I seem to remember the performance improvements being the big
selling point of Scribunto.

Best of luck on the launch,

-- 
Steven Walling
https://wikimediafoundation.org/

Re: [Wikitech-l] [Engineering] RFC: Introducing two new HTTP headers to track mobile pageviews

2013-02-15 Thread Diederik van Liere
Thanks Asher for tying this up! I was about to write a similar email :)
One final question, just to make sure we are all on the same page: is the
X-CS field becoming a generic key/value pair for tracking purposes?

D
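
For illustration, a generic key/value logging header of the kind discussed above might carry semicolon-separated `key=value` pairs. The header syntax and the sample keys below are assumptions for the sketch, not the format that was actually deployed:

```python
# Illustrative parser for a generic key/value logging response header
# (e.g. "mf-m=b; zero=503-01"). The pair syntax and keys are assumed.
def parse_logging_header(value):
    pairs = {}
    for item in value.split(";"):
        if "=" in item:
            key, _, val = item.partition("=")
            pairs[key.strip()] = val.strip()
    return pairs

print(parse_logging_header("mf-m=b; zero=503-01"))
```

A format like this lets new tags (beta membership, carrier, etc.) be added later without changing the log pipeline, which seems to be the motivation for a generic header.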


On Fri, Feb 15, 2013 at 11:16 AM, Asher Feldman wrote:

> Just to tie this thread up - the issue of how to count ajax driven
> pageviews loaded from the api and of how to differentiate those requests
> from secondary api page requests has been resolved without the need for
> code or logging changes.
>
> Tagging of the mobile beta site will be accomplished via a new generic
> mediawiki http response header dedicated to logging containing key value
> pairs.
>
> -Asher
>
> On Tue, Feb 12, 2013 at 9:56 AM, Asher Feldman  >wrote:
>
> > On Tuesday, February 12, 2013, Diederik van Liere wrote:
> >
> >> > It does still seem to me that the data to determine secondary api
> >> requests
> >> > should already be present in the existing log line. If the value of
> the
> >> > page param in an action=mobileview api request matches the page in the
> >> > referrer (perhaps with normalization), it's a secondary request as per
> >> case
> >> > 1 below.  Otherwise, it's a pageview as per case 2.  Difficult or
> >> expensive
> >> > to reconcile?  Not when you're doing distributed log analysis via
> >> hadoop.
> >> >
> >> So I did look into this prior to writing the RFC and the issue is that a
> >> lot of API referrers don't contain the querystring. I don't know what
> >> triggers this so if we can fix this then we can definitely derive the
> >> secondary pageview request from the referrer field.
> >> D
> >
> >
> > If you can point me to some examples, I'll see if I can find any insights
> > into the behavior.
> >
> >
> >>
> >> > On Mon, Feb 11, 2013 at 7:11 PM, Arthur Richards <
> >> aricha...@wikimedia.org
> >> > >wrote:
> >> >
> >> > > Thanks, Jon. To try and clarify a bit more about the API requests...
> >> they
> >> > > are not made on a per-section basis. As I mentioned earlier, there
> are
> >> > two
> >> > > cases in which article content gets loaded by the API:
> >> > >
> >> > > 1) Going directly to a page (eg clicking a link from a Google
> search)
> >> > will
> >> > > result in the backend serving a page with ONLY summary section
> content
> >> > and
> >> > > section headers. The rest of the page is lazily loaded via API
> request
> >> > once
> >> > > the JS for the page gets loaded. The idea is to increase
> >> responsiveness
> >> > by
> >> > > reducing the delay for an article to load (further details in the
> >> article
> >> > > Jon previously linked to). The API request looks like:
> >> > >
> >> > >
> >> >
> >>
> http://en.m.wikipedia.org/w/api.php?format=json&action=mobileview&page=Liverpool+F.C.+in+European+football&variant=en&redirects=yes&prop=sections%7Ctext&noheadings=yes&sectionprop=level%7Cline%7Canchor&sections=all
> >> > >
> >> > > 2) Loading an article entirely via Javascript - like when a link is
> >> > clicked
> >> > > in an article to another article, or an article is loaded via
> search.
> >> > This
> >> > > will make ONE call to the API to load article content. API request
> >> looks
> >> > > like:
> >> > >
> >> > >
> >> >
> >>
> http://en.m.wikipedia.org/w/api.php?format=json&action=mobileview&page=Liverpool+F.C.+in+European+football&variant=en&redirects=yes&prop=sections%7Ctext&noheadings=yes&sectionprop=level%7Cline%7Canchor&sections=all
> >> > >
> >> > > These API requests are identical, but only #2 should be counted as a
> >> > > 'pageview' - #1 is a secondary API request and should not be counted
> >> as a
> >> > > 'pageview'. You could make the argument that we just count all of
> >> these
> >> > API
> >> > > requests as pageviews, but there are cases when we can't load
> article
> >> > > content from the API (like devices that do not support JS), so we
> >> need to
> >> > > be able to count the traditional page request as a pageview - thus
> we
> >> > need
> >> > > a way to differentiate the types of API requests being made when
> they
> >> > > otherwise share the same URL.
> >> > >
> >> > >
> >> > >
> >> > > On Mon, Feb 11, 2013 at 6:42 PM, Jon Robson 
> >> wrote:
> >> > >
> >> > > > I'm a bit worried that now we are asking why pages are lazy loaded
> >> > > > rather than focusing on the fact that they currently __are doing
> >> > > > this___ and how we can log these (if we want to discuss this
> further
> >> > > > let's start another thread as I'm getting extremely confused doing
> >> so
> >> > > > on this one).
> >> > > >
> >> > > > Lazy loading sections
> >> > > > 
> >> > > > For motivation behind moving MobileFrontend into the direction of
> >> lazy
> >> > > > loading section content and subsequent pages can be found here
> [1],
> >> I
> >> > > > just gave it a refresh as it was a little out of date.
> >> > > >
> >> > > > In summary the reason is to
> >> > > > 1) make the app feel more responsive by simply loading content
> >> rather
> >> > > > than reloading the entire

Re: [Wikitech-l] When to automerge (Re: Revoking +2 (Re: who can merge into core/master?))

2013-02-15 Thread Platonides
On 15/02/13 18:51, Chris Steipp wrote:
> This process is painful (no one like reviewing patches in bugzilla),
> and the wrangling to get the right people to review patches in
> bugzilla is slowing down our security releases. It would be much
> better if we had a way to submit the patches in gerrit, go through the
> normal review process by a trusted group of developers ending in a
> +2's, and then the final merge is just a single click when we release
> the tarballs. But we haven't been able to get gerrit to do that yet
> (although if any java developers want to work on that, I would be very
> excited).

Gerrit drafts?
Although those are not as private as we would like. Another option would
be to email them among the +2 reviewers, although willingness to review
through email perhaps won't be any greater than in Bugzilla.





Re: [Wikitech-l] [Wikitech-ambassadors] Lua rollout to en.wikipedia.org and a few others

2013-02-15 Thread Rob Lanphier
Hi Steven,

Thanks for the encouragement!  Comments inline:

On Fri, Feb 15, 2013 at 12:55 PM, Steven Walling  wrote:
> I didn't see it in the docs above, so thought I'd ask... Is this going to
> include rollout of the CodeEditor extension, or will that be done
> separately?

That's a good question.  My initial instinct is to say "just
Scribunto/Lua", but follow that closely with a CodeEditor deployment
very soon.  Tim may have something else in mind for this, though, so
he may correct me.  Is there any reason you're aware of not to deploy
CodeEditor as well?  Any reason why we should avoid deploying
Scribunto without CodeEditor?

> This is exciting! Do we have plans for further measurement when it comes to
> Lua's impact on page load times/publishing any results so far? In addition
> to the general benefit of not having to program using wikitext/parser
> functions, I seem to remember the performance improvements being the big
> selling point of Scribunto.

Tim did some initial benchmarks which showed a pretty marked
improvement with the Cite template:
http://www.mediawiki.org/wiki/Lua_scripting/Benchmarking

The cluster impact is going to be pretty modest (possibly unmeasurable
at this point) but will have a positive impact once templates are
converted to Lua.

Rob


Re: [Wikitech-l] [Wikitech-ambassadors] Lua rollout to en.wikipedia.org and a few others

2013-02-15 Thread Sîrbu Nicolae-Cezar
Hello,

What is Lua?

Thanks,
Sirbu Nicolae-Cezar


On Sat, Feb 16, 2013 at 12:42 AM, Rob Lanphier  wrote:

> Hi Steven,
>
> Thanks for the encouragement!  Comments inline:
>
> On Fri, Feb 15, 2013 at 12:55 PM, Steven Walling 
> wrote:
> > I didn't see it in the docs above, so thought I'd ask... Is this going to
> > include rollout of the CodeEditor extension, or will that be done
> > separately?
>
> That's a good question.  My initial instinct is to say "just
> Scribunto/Lua", but follow that closely with a CodeEditor deployment
> very soon.  Tim may have something else in mind for this, though, so
> he may correct me..  Is there any reason you're aware of not to deploy
> CodeEditor as well?  Any reason why we should avoid deploying
> Scribunto without CodeEditor?
>
> > This is exciting! Do we have plans for further measurement when it comes
> to
> > Lua's impact on page load times/publishing any results so far? In
> addition
> > to the general benefit of not having to program using wikitext/parser
> > functions, I seem to remember the performance improvements being the big
> > selling point of Scribunto.
>
> Tim did some initial benchmarks which showed a pretty marked
> improvement with the Cite template:
> http://www.mediawiki.org/wiki/Lua_scripting/Benchmarking
>
> The cluster impact is going to be pretty modest (possibly unmeasurable
> at this point) but will have a positive impact once templates are
> converted to Lua.
>
> Rob
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>

Re: [Wikitech-l] [Wikitech-ambassadors] Lua rollout to en.wikipedia.org and a few others

2013-02-15 Thread Matthew Flaschen
On 02/15/2013 06:03 PM, Sîrbu Nicolae-Cezar wrote:
> Hello,
> 
> What is Lua?
> 
> Thanks,
> Sirbu Nicolae-Cezar

It is a programming language used for embedded scripting.  Scribunto is
a MediaWiki extension that allows you to write templates in Lua (other
languages possibly coming later).  Done right, this can be clearer, more
powerful, and more efficient than the current ParserFunction templates.
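
For readers who haven't seen one, a Scribunto module is an ordinary Lua
table of exported functions. A minimal sketch (the module name, function
name, and parameter here are purely illustrative, not from any real wiki):

```lua
-- Sketch of a minimal Scribunto module; it would be invoked from
-- wikitext with something like {{#invoke:Example|hello|name=World}}.
local p = {}

function p.hello(frame)
    -- frame.args holds the parameters passed from the #invoke call
    local name = frame.args.name or "world"
    return "Hello, " .. name .. "!"
end

return p
```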

See https://www.mediawiki.org/wiki/Lua for more information.

Matt Flaschen


Re: [Wikitech-l] Lua rollout to en.wikipedia.org and a few others

2013-02-15 Thread Matthew Flaschen
On 02/15/2013 03:33 PM, Rob Lanphier wrote:
> Hi everyone,
> 
> We're planning to deploy Lua to a long list of wikis on Monday,
> February 18, 23:00-01:00 UTC (stretching into Tuesday UTC), including
> English Wikipedia.
> 
> Details here:
> http://meta.wikimedia.org/wiki/Lua

This is A Big Deal.  Congratulations to the whole team!

Matt Flaschen


Re: [Wikitech-l] [Wikitech-ambassadors] Lua rollout to en.wikipedia.org and a few others

2013-02-15 Thread Amir E. Aharoni
Is there any way to handle Unicode strings in the version that is
going to be deployed? For example, things like getting the length of
the string "François" as 8 rather than 9?

If not, is there any plan to have this ability any time soon?
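
For concreteness, the distinction sketched in Python (Lua 5.1's string
functions are byte-oriented, which is where the 9 comes from):

```python
# "François" is 8 Unicode code points, but "ç" (U+00E7) occupies two
# bytes in UTF-8, so a byte-oriented length function reports 9.
s = "François"
print(len(s))                  # 8 code points
print(len(s.encode("utf-8")))  # 9 bytes
```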

--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬


2013/2/16 Rob Lanphier :
> Hi everyone,
>
> We're planning to deploy Lua to a long list of wikis on Monday,
> February 18, 23:00-01:00 UTC (stretching into Tuesday UTC), including
> English Wikipedia.
>
> Details here:
> http://meta.wikimedia.org/wiki/Lua
>
> Jan Kučera (User:Kozuch) has placed notifications on many of the
> wikis.  Those notifications and general communications listed here:
> http://en.wikipedia.org/wiki/User:Kozuch/Lua
>
> This is a really exciting deployment for the projects.  We're really
> looking forward to seeing the great things that people do with this,
> and looking forward to making editing and previewing more responsive
> for template-heavy pages.
>
> Rob
>
> ___
> Wikitech-ambassadors mailing list
> wikitech-ambassad...@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-ambassadors


[Wikitech-l] Fwd: [Wikitech-ambassadors] Lua rollout to en.wikipedia.org and a few others

2013-02-15 Thread Jeremy Baron
Looping wikitech-ambassadors back in.

If someone had to ask at all, and they asked on wikitech-ambassadors,
then the answer should go to wikitech-ambassadors too.

Anyway, the gist is:
Existing pages/templates will continue to work as they are. A new
(optional) way of writing templates will now be available, which will
improve server performance (and give you faster responses when you
save some pages) and will also make some things possible that weren't
before, or at least much simpler to implement. Over time, editors at
the various wikis will convert some templates to use the new format.

-Jeremy

-- Forwarded message --
From: Matthew Flaschen 
Date: Sat, Feb 16, 2013 at 2:31 AM
Subject: Re: [Wikitech-l] [Wikitech-ambassadors] Lua rollout to
en.wikipedia.org and a few others
To: wikitech-l@lists.wikimedia.org


On 02/15/2013 06:03 PM, Sîrbu Nicolae-Cezar wrote:
> Hello,
>
> What is Lua?
>
> Thanks,
> Sirbu Nicolae-Cezar

It is a programming language used for embedded scripting.  Scribunto is
a MediaWiki extension that allows you to write templates in Lua (other
languages possibly coming later).  Done right, this can be clearer, more
powerful, and more efficient than the current ParserFunction templates.

See https://www.mediawiki.org/wiki/Lua for more information.

Matt Flaschen


Re: [Wikitech-l] [Wikitech-ambassadors] Lua rollout to en.wikipedia.org and a few others

2013-02-15 Thread Thomas PT
+1 to Amir. Unicode support is very important.

About the list of first wikis: why isn't fr.wikisource in the list? We
requested some months ago to be one of the first wikis to test the
extension, and the answer seemed positive.
https://bugzilla.wikimedia.org/show_bug.cgi?id=39744

Thomas

"Amir E. Aharoni"  wrote:

Is there any way to handle Unicode strings in the version that is
going to be deployed? For example, things like getting the length of
the string "François" as 8 rather than 9?

If not, is there any plan to have this ability any time soon?

--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬


2013/2/16 Rob Lanphier :
> Hi everyone,
>
> We're planning to deploy Lua to a long list of wikis on Monday,
> February 18, 23:00-01:00 UTC (stretching into Tuesday UTC), including
> English Wikipedia.
>
> Details here:
> http://meta.wikimedia.org/wiki/Lua
>
> Jan Kučera (User:Kozuch) has placed notifications on many of the
> wikis.  Those notifications and general communications listed here:
> http://en.wikipedia.org/wiki/User:Kozuch/Lua
>
> This is a really exciting deployment for the projects.  We're really
> looking forward to seeing the great things that people do with this,
> and looking forward to making editing and previewing more responsive
> for template-heavy pages.
>
> Rob
>
> ___
> Wikitech-ambassadors mailing list
> wikitech-ambassad...@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-ambassadors


Re: [Wikitech-l] Gerrit reviewer bot update

2013-02-15 Thread Waldir Pimenta
Update: A new version has been deployed that supports the "match_all_files"
parameter to make "file_regexp" apply to all files in the patchset, rather
than any of the files. Example use:

{{Gerrit-reviewer|User 1|file_regexp=\.txt|match_all_files}}

Still lacking is support for a "commit_msg_regexp" parameter, which should
work similarly to the existing "file_regexp".
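
In effect the new parameter swaps any() for all() over the files in the
patchset; a rough Python sketch (function and parameter names are
illustrative, not the bot's actual code):

```python
import re

def file_filter_matches(file_regexp, files, match_all_files=False):
    # Default (old) behaviour: one matching file in the patchset suffices.
    # With match_all_files, every changed file must match the regexp.
    check = all if match_all_files else any
    return check(re.search(file_regexp, f) for f in files)

changed = ["docs/readme.txt", "styles/main.css"]
print(file_filter_matches(r"\.txt$", changed))                        # True
print(file_filter_matches(r"\.txt$", changed, match_all_files=True))  # False
```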

--Waldir

On Fri, Feb 15, 2013 at 7:42 AM, Waldir Pimenta  wrote:

> It's not critical to support diff content if commit messages can be used.
> In fact diffs would just be a "nice to have" feature, but the commit
> message ought to be much more informative anyway.
>
> As for a pull request, I'll give it a shot.
>
> --Waldir
>
>
> On Thu, Feb 14, 2013 at 8:31 PM, Merlijn van Deen wrote:
>
>> Hi Waldir,
>>
>> On 14 February 2013 02:21, Waldir Pimenta  wrote:
>> > Any chance the reviewer-bot can support additional triggers? For
>> example,
>> > diff content, commit-message content, etc.
>>
>> Anything that is available from the changes REST api (see [1]) can be
>> added with relative ease. This includes the commit message, but not
>> the diff content (but it might be available in a newer Gerrit
>> release).
>>
>> It would be possible to get the data from either the JSON-RPC api
>> (with a high risk of breakage on new Gerrit deployments) or via git,
>> but this would be a considerable effort.
>>
>> > Also, it would be nice to specify whether the currently supported filters
>> > should apply to ANY of the files in the change (the current behavior) or
>> > ALL of the changed files.
>>
>> There is no fundamental reason why this would be impossible, but the
>> syntax might quickly become complicated. It would require only a few
>> lines of code and an extra parameter ('file_regexp_all'). The use case
>> you described on IRC ('new reviewers who only want to review changes
>> that only contain .css files') makes sense.
>>
>
>> I won't have time in the coming weeks to implement either one of
>> those, though. Feel free to implement it yourself and submit a pull
>> request. I have just added some functionality to easily test
>> suggested reviewers from the command line. For more details, please
>> see [2]. In any case, it's on my to-do list, and I'll get to it when I
>> get to it ;-)
>
>
>> Best,
>> Merlijn
>>
>>
>> [1]
>> https://gerrit.wikimedia.org/r/Documentation/rest-api-changes.html#_get_changes_query_changes
>> [2] https://github.com/valhallasw/gerrit-reviewer-bot
>>
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>>
>
>

[Wikitech-l] Engineering Announcement: Marc-Andre Pelletier joins TechOps

2013-02-15 Thread Ct Woo
All,

I am pleased to announce that Marc-Andre will be joining us as a
Technical Operations Engineer (contractor) starting February 25, 2013.
His primary focus will be to build up the Wikimedia Labs infrastructure
and to assist community developers in migrating their tools to that
infrastructure, especially those residing on the Toolserver today.

Marc-Andre is an active Wikipedian, better known as 'Coren' on English
Wikipedia, where he has been a volunteer editor since 2006 and has
served as administrator and arbitrator. He has also always kept himself
involved with the technical and procedural aspects of automated editing
(so-called bots), having written and operated a copyright-violation
detection bot for several years.

Marc has been a Unix system administrator and occasional computer science
instructor for 20+ years, in fields ranging from telecommunication to game
development. He studied IT at École de Technologie Supérieure (Canada).

Please join me in welcoming him. You can find him on IRC (freenode.net)
under the nick 'Coren'.

Thanks.

CT Woo