Re: [Wikitech-l] Peer-to-peer sharing of the content of Wikipedia through WebRTC

2015-11-28 Thread Pine W
Thanks for this initiative.

I think the immediate concerns would be in the domains of privacy,
security, lack of WMF analytics instrumentation, and WMF fundraising
limitations.

That said, looking in the longer term, a number of us in the community are
interested in decreasing our dependencies on the Wikimedia Foundation as
insurance against possible catastrophes and as a backup plan in case of
another significant WMF dispute with the community. It might be worth
exploring the options for setting up Wikipedia on infrastructure outside of
WMF. I would be interested in discussing this further with you;
please let me know if you have time for a Hangout session in early to mid
December.

Thank you for your interest!
Pine
On Nov 27, 2015 10:50 PM, "Yeongjin Jang" wrote:

>
> Hi,
>
> I am Yeongjin Jang, a Ph.D. student at Georgia Tech.
>
> In our lab (SSLab, https://sslab.gtisc.gatech.edu/),
> we are working on a project called B2BWiki,
> which enables users to share the contents of Wikipedia through WebRTC
> (peer-to-peer sharing).
>
> The website is here: http://b2bwiki.cc.gatech.edu/
>
> The project aims to help Wikipedia by bringing in computing resources
> donated by the community; users can donate their traffic (via P2P
> communication) and storage (IndexedDB) to reduce the load on
> Wikipedia's servers. Larger organizations, e.g. schools or companies
> with many local users, can donate a mirror server, similar to GNU FTP
> mirrors, which can bootstrap peer sharing.
>
>
> The potential benefits we foresee are the following:
> 1) Users can easily donate their resources to the community,
> just by visiting the website.
>
> 2) Users get a performance benefit when a page is loaded from
> multiple local peers or a local mirror (faster page loads!).
>
> 3) Wikipedia can reduce its server workload, network traffic, etc.
>
> 4) Local network operators can reduce transit traffic
> (e.g. the cost of delivering traffic to outside networks).
>
>
> While we work on enhancing the implementation,
> we would like to ask the opinions of actual Wikipedia developers.
> For example, we want to know whether our direction is correct
> (will it actually reduce the load?), and whether there are other
> concerns we missed that could prevent this system from working as
> intended. We really want to do meaningful work that actually helps
> run Wikipedia!
>
> Please feel free to give us any suggestions, comments, etc.
> If you want to express your opinion privately,
> please contact ss...@cc.gatech.edu.
>
> Thanks,
>
> --- Appendix ---
>
> I have added some detailed information about B2BWiki below.
>
> # Accessing data
> When accessing a page on B2BWiki, the browser queries peers first.
> 1) If there are peers that hold the content, a peer-to-peer download
> happens.
> 2) Otherwise, if there is no peer, the client downloads the content
> from the mirror server.
> 3) If the mirror server does not have the content, it downloads it
> from the Wikipedia servers (one access for the first download, plus
> updates).
>
>
> # Peer lookup
> To enable content lookup among peers,
> we run a lookup server that holds a page_name-to-peer map.
> A client (a user's browser) can query the list of peers that
> currently hold the content, and select a peer by freshness
> (the lookup server holds a hash/timestamp of the content, plus the
> top two octets of each peer's IP address for figuring out whether
> a peer is local), etc.
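>
> As a rough illustration (not our actual code; all names here are
> hypothetical), the client-side peer selection and fallback described
> above might look like this in Python:
>
>     from dataclasses import dataclass
>     from typing import List, Optional
>
>     @dataclass
>     class Peer:
>         ip_prefix: str  # top two octets of the peer's IP, e.g. "143.215"
>
>     def webrtc_fetch(peer: Peer, page: str) -> Optional[bytes]:
>         """Placeholder for a WebRTC data-channel transfer."""
>         return None
>
>     def mirror_fetch(page: str) -> bytes:
>         """Placeholder for an HTTP request to the mirror server,
>         which itself falls back to Wikipedia on a miss."""
>         return b"<html>...</html>"
>
>     def fetch_page(page: str, peers: List[Peer], my_prefix: str) -> bytes:
>         # Prefer peers whose top two IP octets match ours (likely local).
>         for peer in sorted(peers, key=lambda p: p.ip_prefix != my_prefix):
>             data = webrtc_fetch(peer, page)
>             if data is not None:
>                 return data  # 1) peer-to-peer download succeeded
>         return mirror_fetch(page)  # 2)-3) fall back to mirror / Wikipedia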
>
>
> # Update, and integrity check
> The mirror server updates its content once per day
> (configurable, e.g. hourly).
> The update check uses the If-Modified-Since header against the
> Wikipedia servers.
> On retrieving content from Wikipedia, the mirror server stamps it
> with a timestamp and SHA-1 checksum, to ensure the freshness and
> integrity of the data.
> When a client looks up and downloads content from peers,
> it compares the SHA-1 checksum of the data
> with the checksum from the lookup server.
>
> In this setting, users may get older data
> (they can configure how much staleness to tolerate,
> e.g. one day, three days, or one week), while
> integrity is guaranteed by the mirror/lookup servers.
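>
> A minimal sketch of the mirror-side conditional update and the
> client-side checksum verification (assuming plain HTTP and the Python
> standard library; the actual implementation may differ):
>
>     import hashlib
>     import time
>     import urllib.error
>     import urllib.request
>
>     def refresh_mirror_copy(url: str, last_modified: str):
>         """Re-fetch a page only if Wikipedia reports it changed, then
>         stamp it with a timestamp and SHA-1 checksum."""
>         req = urllib.request.Request(
>             url, headers={"If-Modified-Since": last_modified})
>         try:
>             body = urllib.request.urlopen(req).read()
>         except urllib.error.HTTPError as err:
>             if err.code == 304:  # not modified: keep the cached copy
>                 return None
>             raise
>         return {"content": body,
>                 "timestamp": time.time(),
>                 "sha1": hashlib.sha1(body).hexdigest()}
>
>     def verify(content: bytes, expected_sha1: str) -> bool:
>         """Client-side check of peer-delivered content against the
>         checksum obtained from the lookup server."""
>         return hashlib.sha1(content).hexdigest() == expected_sha1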
>
>
> More detailed information can be obtained from the following website.
>
> http://goo.gl/pSNrjR
> (URL redirects to SSLab@gatech website)
>
> Please feel free to give us any suggestions, comments, etc.
>
> Thanks,
> --
> Yeongjin Jang
>
>

Re: [Wikitech-l] Introducing WikiToLearn to developers

2015-11-28 Thread Federico Leva (Nemo)
Your code modifications for http://wikitolearn.org/ are interesting. I'm 
pretty sure that KDE policies don't force you to fork MediaWiki 
extensions locally, so your patches are definitely welcome upstream.


I'm not sure what you mean by your point about Math being rejected 
by the community; perhaps you refer to some performance decision made by 
WMF. If your modifications to Math are incompatible with some decision 
of the maintainers, you can ask for a different repository on Gerrit or 
another branch on the same repository, so that non-WMF users can use 
your code.


As for your comments on chapters and drafts, I don't see anything 
incompatible with how Wikibooks and Wikiversity work. If you have a 
solution for what we call "book management" i.e. 
https://phabricator.wikimedia.org/T17071 (worked on by Raylton and 
others with 
https://meta.wikimedia.org/wiki/Category:GSoC_Mediawiki_Book_Experience 
), that's especially interesting.


To reach the Wikibooks and Wikiversity community, the best way is to use 
a medium that can involve their active editors, such as their mailing 
lists (cc'ed here) or wikis.


Nemo



Re: [Wikitech-l] Peer-to-peer sharing of the content of Wikipedia through WebRTC

2015-11-28 Thread Brian Wolff
On 11/28/15, Yeongjin Jang  wrote:
> [B2BWiki announcement quoted in full; trimmed here. See the first
> message in this digest.]

Hi,

This is some interesting stuff, and I think research along these lines
(that is, leveraging WebRTC to deliver content in a P2P manner via web
browsers) will really change the face of the internet in the years
to come.

As for Wikipedia specifically (This is all just my personal opinion.
Others may disagree with me):

*Wikipedia is a fairly stable/mature site. I think we're past the
point where it's a good idea to experiment with unproven
technologies (although mirrors of Wikipedia are a good place to do
that). We need stability and proven technology.

*Bandwidth makes up a very small portion of WMF's expenses (I'm not
sure how small; someone once told me that donation processing
takes up more money than raw bandwidth costs. I don't know if that's
true, but bandwidth is certainly not the biggest expense).

Your scheme primarily serves to offload the bandwidth of cached content
onto other people. But serving cached content (by which I mean anonymous
users getting results from Varnish) is probably the cheapest (in terms
of computational resources) part of our setup. The hard 

Re: [Wikitech-l] Vision for Wikipedia

2015-11-28 Thread Brian Wolff
On 11/26/15, Yuri Astrakhan  wrote:
> I would like to share my vision for Wikipedia's future, as well as some
> steps that could help us get there. I hope you find it useful. Please share
> your feedback and ideas.
>
> Vision: https://meta.wikimedia.org/wiki/User:Yurik/I_Dream_of_Content
> Implementation:
> https://meta.wikimedia.org/wiki/User:Yurik/From_Dream_to_Reality
>
> Thanks!
>
> P.S. The first link was previously shared on a different mailing list.

Some people in the past (I think) have suggested allowing scripted
SVGs (perhaps with restrictions on the scripts), embedded on pages
using sandboxed <iframe>s, and coming from an entirely different
domain [I'm not sure if that's sufficient for security or not. It
might be], with the idea of using them for interactive demonstrations.
Have you given any thought to that approach?

At the moment, the closest thing to what you're suggesting is the
experiment at eswiki using dedicated site JS to do things with
canvas (see https://es.wikipedia.org/wiki/Juego_de_la_vida and
https://es.wikipedia.org/wiki/Hormiga_de_Langton ).


I personally find Vega syntax almost impossible to understand, and I'm
significantly more technical than the average user. I think the
opacity of the syntax has made the graph functionality significantly
underfulfil its potential. As it stands, after quite a while of it
being deployed, there are only 12 main-namespace articles on enwiki
with graphs on them, the majority of which use the {{graphChart}}
wrapper to make a rather boring line graph. The most interesting is
probably on [[Expansion_timeline_of_the_Moscow_Metro]], which is still
a run-of-the-mill chart. These are not the exciting visualizations
that I was hoping the community would come up with.
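
For context, the wrapper collapses a full Vega specification into
something roughly like the following wikitext (parameter names here are
illustrative; I haven't checked them against the live template):

    {{graphChart
    | width = 400
    | height = 150
    | type = line
    | x = 2010, 2011, 2012, 2013
    | y = 10, 12, 9, 14
    }}

whereas the raw Vega spec behind even a chart this plain runs to dozens
of lines of nested JSON (data, scales, axes, marks).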

Thus I think it's important that we give the community tools that not
only make it possible to add cool things, but also make it easy for
users to do so.

--
-bawolff


Re: [Wikitech-l] Peer-to-peer sharing of the content of Wikipedia through WebRTC

2015-11-28 Thread Yeongjin Jang
Thank you for your comments!


On Sat, Nov 28, 2015 at 2:33 PM, Brian Wolff  wrote:

> On 11/28/15, Yeongjin Jang  wrote:
> > [B2BWiki announcement quoted in full; trimmed here. See the first
> > message in this digest.]
>
> Hi,
>
> This is some interesting stuff, and I think research along these lines
> (that is, leveraging WebRTC to deliver content in a P2P manner via web
> browsers) will really change the face of the internet in the years
> to come.
>
> As for Wikipedia specifically (This is all just my personal opinion.
> Others may disagree with me):
>
> *Wikipedia is a fairly stable/mature site. I think we're past the
> point where it's a good idea to experiment with unproven
> technologies (although mirrors of Wikipedia are a good place to do
> that). We need stability and proven technology.
>


That's true. Our current prototype works well in testing,
but we are not yet sure about its robustness in the wild.
We want to make it more stable.


>
> *Bandwidth makes up a very 

[Wikitech-l] Flushing cached parser output after preview

2015-11-28 Thread Yuri Astrakhan
https://phabricator.wikimedia.org/T119779
The Graph extension generates different HTML output depending on the isPreview
parser option, but if a user previews a page and saves it right afterwards
without any changes, the parser reuses the previous output. Is there a way to
force the parser to regenerate on save? Thanks!

Re: [Wikitech-l] Peer-to-peer sharing of the content of Wikipedia through WebRTC

2015-11-28 Thread Yeongjin Jang
Thank you for your attention! We would love to talk with you.

Regarding a Hangout meeting, we are available Monday through Friday next
week and in the following weeks. Please note that we are in the Eastern
US time zone, so times between 10am and 8pm EST work best for us.

On Sat, Nov 28, 2015 at 3:50 AM, Pine W  wrote:

> [Pine W's reply and the quoted B2BWiki announcement trimmed here; see
> the first messages in this digest.]
Re: [Wikitech-l] RFC meeting: Minimum PHP version

2015-11-28 Thread Marcin Cieslak
On 2015-11-24, Rob Lanphier  wrote:
> Hi folks,
>
> This week's RFC review meeting is scheduled for Wednesday, November 25
> at 2pm PST (22:00 UTC).  Event particulars can be found at
>
>
> The main task this week is to plan out what we will define the minimum
> PHP version to be for MediaWiki 1.27 (the next LTS version).  The
> viable choices seem to be:
> *  PHP 5.3 (the status quo) - this version is no longer supported
> upstream, and doesn't have widespread support even in conservatively
> updated Linux distros.
> *  PHP 5.4 - this version is no longer supported by The PHP Group, but
> is still part of older supported Linux distros (e.g. Debian Wheezy)
> *  PHP 5.5 - this is the lowest version with reliable LTS support in

In one enterprise environment I am familiar with, one is stuck
on SLES 11.3/11.4 with their own enterprise repository, and the
newest PHP I've seen there was 5.3.something.

Saper




Re: [Wikitech-l] RFC meeting: Minimum PHP version

2015-11-28 Thread Rob Lanphier
On Sat, Nov 28, 2015 at 1:35 PM, Marcin Cieslak  wrote:
> On 2015-11-24, Rob Lanphier  wrote:
> > [regarding discussion about ]
> > The main task this week is to plan out what we will define the minimum
> > PHP version to be for MediaWiki 1.27 (the next LTS version).  The
> > viable choices seem to be:
> > *  PHP 5.3 (the status quo) - this version is no longer supported
> > upstream, and doesn't have widespread support even in conservatively
> > updated Linux distros.
> > *  PHP 5.4 - this version is no longer supported by The PHP Group, but
> > is still part of older supported Linux distros (e.g. Debian Wheezy)
> > *  PHP 5.5 - this is the lowest version with reliable LTS support in
>
> In one enterprise environment I am familiar with, one is stuck
> on SLES 11.3/11.4 with their own enterprise repository, and the
> newest PHP I've seen there was 5.3.something.

As of Wednesday's meeting, PHP 5.5 is the new minimum for MediaWiki
1.27.  The resulting conversation on T118932 shows there is
frustration about how the decision was arrived at.

We need to figure out not only which Linux distros are supported, but
what "support" means and how we accomplish it.  If it turns out that
SLES 11.3 is very widely used, that points to the need for someone to
support that version.

That leaves a few questions:
*  What does "support" mean?  Where does this belong on the continuum
stretching from "continuing to offer security patches" to "require all
master commits to MediaWiki to interoperate with PHP 5.3"?
*  Who should provide support?  Is this something WMF needs to finance and lead?
*  How much analysis is a prerequisite to a PHP version jump?

I believe the conclusion on T118932 is a good one; it's time for us to
move to PHP 5.5 for MediaWiki 1.27+.  That said, we (the larger
community, not just WMF) should clarify our LTS strategy, since it's not
entirely clear to me who is committed to supporting MediaWiki 1.23 all
the way until May 2017.  Furthermore, there seems to be some
skepticism as to whether 1.27 should even be offered as the next "LTS"
version of MediaWiki.  Could someone clarify this for me?

Rob
