Re: [Wikitech-l] Composer use in MediaWiki (II)

2013-02-11 Thread Antoine Musso
On 05/02/13 16:02, Mark A. Hershberger wrote:
> On 02/05/2013 09:06 AM, Antoine Musso wrote:
>> Long story short, we have MediaWiki core published at packagist.org now:
> 
> This is awesome!
> 
> How do we get extensions into packagist?

There is no real documentation yet, but one can reuse an existing
composer.json from another extension, for example EducationProgram [1].


Note that, by convention, Packagist requires namespace and project
names to be all lowercase and separated by dashes. Hence the 'FooBar'
extension should be named 'mediawiki/foo-bar'.  The MediaWiki installer
included in Composer takes care of translating that back to
CamelCase and installing the extension under /extensions/

Once the composer.json has been reviewed and merged, submit the git URL at
https://packagist.org/packages/submit (log in using your GitHub account).

Done.
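The conventions above fit in a short composer.json. This is only a sketch
for a hypothetical FooBar extension; the `mediawiki-extension` type and the
installers requirement follow common Composer practice, but check them
against the EducationProgram example [1]:

```json
{
    "name": "mediawiki/foo-bar",
    "description": "FooBar extension for MediaWiki",
    "type": "mediawiki-extension",
    "license": "GPL-2.0+",
    "require": {
        "php": ">=5.3.0",
        "composer/installers": ">=1.0.1"
    }
}
```

The "type" field is what lets the installer plugin place the package under
/extensions/ instead of the default vendor/ directory.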



[1]
https://github.com/wikimedia/mediawiki-extensions-EducationProgram/blob/master/composer.json



-- 
Antoine "hashar" Musso


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Stable PHP API for MediaWiki ?

2013-02-11 Thread Mariya Nedelcheva Miteva
Hi all,

I have been talking to many third parties as part of my WMF internship in
the last few weeks, and one of the main concerns they express is the lack of
stability in the PHP classes MediaWiki exposes from version to version. The
frequent changes in what I would call the PHP API make extension
development and maintenance much more difficult, with compatibility from
version to version becoming a problem. Solving the problem would probably
result in the development of more extensions, easier maintenance, fewer
hacks to core, and more users upgrading to the latest MediaWiki version. I
am sure WMF developers are facing similar issues, especially with projects
like Wikidata going on.

My question is: given the technical heritage that MediaWiki carries,
is it possible to have a (relatively) stable set of PHP classes defined,
with a pledge not to change them in the next X releases, or at least with
a longer deprecation time? What would maintaining such a PHP API entail?
How difficult is it, given the vast number of dependencies in MediaWiki
code? Does it require restructuring a lot of the current core code? Do you
think it should be a definite goal for WMF?

Thank you.

Mariya

Re: [Wikitech-l] Stable PHP API for MediaWiki ?

2013-02-11 Thread Yuri Astrakhan
Mariya,

Could you be more specific? What types of changes caused extensions to
break? I might be mistaken, but the vast majority of the API framework
classes were established over five years ago, with very few breaking
changes since. Most changes were related to adding new functionality (new
actions, new query submodules, new parameters), but that shouldn't have
significantly affected extension development.

I do plan to introduce a few breaking changes (the majority of extensions
should not be affected) in 1.21, such as the introduction of versioning,
further modularization to allow pageset reuse, etc.
See http://www.mediawiki.org/wiki/Requests_for_comment/API_Future

Please note that in rare cases some features might be purposely removed
due to security or scalability issues.

--Yuri


On Mon, Feb 11, 2013 at 7:58 AM, Mariya Nedelcheva Miteva <
mariya.mit...@gmail.com> wrote:


Re: [Wikitech-l] Stable PHP API for MediaWiki ?

2013-02-11 Thread Chad
I believe Mariya is talking about the internal APIs available to extensions
(e.g. User, Title, EditPage, and so forth), not the public API.

Yay, acronyms!

-Chad

On Mon, Feb 11, 2013 at 8:14 AM, Yuri Astrakhan  wrote:


Re: [Wikitech-l] Stable PHP API for MediaWiki ?

2013-02-11 Thread bawolff
I don't think she means what we call the API, but rather the methods that
random extensions are likely to use.

We could start documenting certain stable methods with a special code
comment (say @stable), which would mean something like: this method will not
be removed, and if the need arises we'll remove the @stable tag and wait two
versions before removing the method. Key candidates for this would include
Title::newFromText, Parser::recursiveTagParse, etc. Ideally one would wait
for a method to stand the test of time before tagging it.
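A sketch of what such an annotation might look like on one of those
candidates (the @stable tag and its wording are the proposal here, not an
existing MediaWiki convention, and the signature is simplified):

```php
class Title {
	/**
	 * Create a new Title from text, such as what one would find in a link.
	 *
	 * @stable Part of the de facto extension interface: will not be
	 *   removed. If removal ever becomes necessary, this tag is dropped
	 *   first and the method is kept for at least two further releases.
	 *
	 * @param string $text the link text
	 * @return Title|null the new object, or null on malformed input
	 */
	public static function newFromText( $text ) {
		// ... existing implementation unchanged ...
	}
}
```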

> I am sure WMF developers are facing similar issues especially

I don't think that's the case. It used to be the responsibility of the
person making a breaking change to fix all callers in the Wikimedia
extension repo. I'm not sure if that's still the case, but nonetheless I do
not feel this is a significant problem for deployed extensions. (I'm sure
someone will correct me if I'm wrong.)

-bawolff
On 2013-02-11 9:14 AM, "Yuri Astrakhan"  wrote:


Re: [Wikitech-l] Stable PHP API for MediaWiki ?

2013-02-11 Thread Yuri Astrakhan
I have a blasphemous proposal: extensions should only use the public API,
via FauxRequest objects. The public API has been very stable, and it gives
MW core devs much more flexibility to change the innards of the system...
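A minimal sketch of that pattern (see API:Calling_internally on
mediawiki.org for the authoritative version; exact constructor arguments
may differ between releases):

```php
// Fake an action=query request without going through HTTP.
$request = new FauxRequest( array(
	'action' => 'query',
	'prop'   => 'info',
	'titles' => 'Main Page',
) );

// Run it through the regular API entry point.
$api = new ApiMain( $request );
$api->execute();

// Same data structure the web API would have serialized.
$data = $api->getResult()->getData();
```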


On Mon, Feb 11, 2013 at 8:36 AM, bawolff  wrote:


[Wikitech-l] deployment of the first phase of Wikidata on enwp

2013-02-11 Thread Lydia Pintscher
Hey :)

Guillaume just reminded me that I have not yet posted this here but
only on the village pump and Signpost. Sorry. I'll fix that now with
this email.

Later today (evening UTC) we'll deploy the first phase of Wikidata on
the English language Wikipedia.  We've already deployed the first
phase on the Hungarian, Hebrew and Italian Wikipedias and things there
went rather smoothly. We hope this is the case here too.

What is going to happen exactly?
* Language links in the sidebar will come from Wikidata if they exist there.
* Existing language links in the wikitext will continue to work and
override links from Wikidata.
* For individual articles, language links from Wikidata can be
suppressed completely with the noexternallanglinks magic word.
* Changes on Wikidata that relate to articles on this Wikipedia will
show up in Recent Changes if the option is enabled by the user.
* At the bottom of the language links list you will see a link to edit
the language links that leads you to the corresponding page on
Wikidata.
* You can see an example of how it works at
http://it.wikipedia.org/wiki/Marie_Curie
* The second phase of Wikidata (which is about infoboxes) was started
on Wikidata but can't yet be used on any Wikipedia. This will follow
later.
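For editors who want to suppress the Wikidata links on a particular
article, the magic word mentioned above goes directly in the wikitext
(the parameterless form shown here suppresses all of them):

```
{{noexternallanglinks}}
```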

Please let me know if you have any questions.


Cheers
Lydia

--
Lydia Pintscher - http://about.me/lydia.pintscher
Community Communications for Wikidata

Wikimedia Deutschland e.V.
Obentrautstr. 72
10963 Berlin
www.wikimedia.de

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.

Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
unter der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das
Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.


Re: [Wikitech-l] Stable PHP API for MediaWiki ?

2013-02-11 Thread Maria Miteva
Hi Yuri,

As I haven't created an extension myself (which I realize is a big
limitation when discussing this kind of problem), is this feasible for more
complex extensions? If yes, why are people not doing it? Is it just
because of lack of information?

Mariya

On Mon, Feb 11, 2013 at 3:47 PM, Yuri Astrakhan wrote:


Re: [Wikitech-l] Stable PHP API for MediaWiki ?

2013-02-11 Thread Brian Wolff
I disagree with "only". Extensions don't just query the DB, and sometimes
extensions need to do things we can't put in a web API. But it would be cool
if using the API internally was encouraged.

(Keep in mind that action=parse and action=edit are unsafe to call
internally at some points in the code.)

-bawolff
On 2013-02-11 9:47 AM, "Yuri Astrakhan"  wrote:


Re: [Wikitech-l] Stable PHP API for MediaWiki ?

2013-02-11 Thread Maria Miteva
Yes, precisely. Is there an accepted way to refer to it, so I don't confuse
people every time I talk about it?

Yuri, I am working on getting some concrete examples.

Mariya

On Mon, Feb 11, 2013 at 3:24 PM, Chad  wrote:


Re: [Wikitech-l] [Engineering] RFC: Introducing two new HTTP headers to track mobile pageviews

2013-02-11 Thread Mark Bergsma

On Feb 9, 2013, at 11:21 PM, Asher Feldman  wrote:
> For this particular case, the API requests are for getting specific
> sections of an article, as opposed to the whole thing or the first
> section as part of an initial pageview.  I might not have grokked the
> original RFC email well, but I don't understand why this was being
> discussed as a logging challenge or as necessitating a request header.  A
> mobile API request to get just section 3 of the article on otters should
> already use a query param denoting that section 3 is being fetched, and
> is already clearly not a "primary" request.

Yes, that part remains a bit unclear to me as well - some more details would be 
welcome.

> Whether or not it makes sense for mobile to move in the direction of
> splitting up article views into many api requests is something I'd love to
> see backed up by data.  I'm skeptical for multiple reasons.

What is the main motivation here? Reducing article sizes/transfers at the
expense of more latency?

-- 
Mark Bergsma 
Lead Operations Architect
Wikimedia Foundation





___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Stable PHP API for MediaWiki ?

2013-02-11 Thread Brian Wolff
Mariya: see http://www.mediawiki.org/wiki/API:Calling_internally for info
on calling the API internally.

However, thinking about it, the interface is not really well designed for
internal calls. One would expect a high-level internal API to return Title
objects, and one would not expect such an API to require fake request
objects. There is no documentation about which API methods are safe to call
from a parser hook extension, and there are several that are not.
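For reference, the internal-call pattern that page describes looks roughly like this (a sketch only; the exact query parameters are illustrative):

```php
// Build a fake request carrying ordinary API parameters,
// as if they had arrived over HTTP.
$request = new FauxRequest( array(
	'action' => 'query',
	'prop'   => 'info',
	'titles' => 'Main Page',
) );

// Run the API machinery in-process, without an HTTP round trip.
$api = new ApiMain( $request );
$api->execute();

// The result comes back as nested arrays, the same shape as the
// public API's output, not as Title or Page objects - which is
// exactly the awkwardness described above.
$data = $api->getResultData();
```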

Last of all, it doesn't really solve the problem Mariya brought up. The API
called internally can really only act as a high-level way to query the DB.
The DB-related methods are some of the most stable methods in all of
MediaWiki. Using the API might isolate you from certain DB schema changes,
but realistically, non-backwards-compatible DB changes are exceedingly rare.
-bawolff

On 2013-02-11 10:01 AM, "Maria Miteva"  wrote:
>
> Yes, precisely. Is there an accepted way to call it, so I don't confuse
> people every time I talk about it ?
>
> Yuri, I am working on getting some concrete examples.
>
> Mariya
>
> On Mon, Feb 11, 2013 at 3:24 PM, Chad  wrote:
>
> > I believe Mariya is talking about the internal APIs available to
> > extensions,
> > (eg: User, Title, EditPage, so forth), not the public API.
> >
> > Yay, acronyms!
> >
> > -Chad
> >
> > On Mon, Feb 11, 2013 at 8:14 AM, Yuri Astrakhan 
> > wrote:
> > > Mariya,
> > >
> > > Could you be more specific? What types of changes caused extensions to
> > > break? I might be mistaken but the vast majority of the API framework
> > > classes have been established over 5 years ago, with very few breaking
> > > changes since. Most changes were related to adding new functionality
(new
> > > actions, new query submodules, new parameters), but that shouldn't
have
> > > significantly affected extension development.
> > >
> > > I do plan to introduce a few breaking changes (majority of the
extensions
> > > should not be affected) in 1.21, such as the introduction of
versioning,
> > > further modularization to allow pageset reuse, etc.
> > > See http://www.mediawiki.org/wiki/Requests_for_comment/API_Future
> > >
> > > Please note that in a rare case some features might be purposefully
> > removed
> > > due to the security or scalability issues.
> > >
> > > --Yuri
> > >
> > >
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Wikisource & ProofreadPage roadmaps

2013-02-11 Thread Sumana Harihareswara
https://meta.wikimedia.org/wiki/Wikisource_roadmap

https://www.mediawiki.org/wiki/Extension:Proofread_Page/Roadmap

Some key items:
* making VisualEditor work with ProofreadPage
* documenting ProofreadPage better (maybe a sprint sometime soon using
Etherpads?)
* developing Wikisource ideas to go on
https://www.mediawiki.org/wiki/Mentorship_programs/Possible_projects
* engaging with DjVu developers & encouraging maintainership of DjVuLibre
* better automation tools
* microtasks

Anyone who wants to pitch in or comment probably ought to drop in on the
Wikisource mailing list
https://lists.wikimedia.org/mailman/listinfo/wikisource-l .
-- 
Sumana Harihareswara
Engineering Community Manager
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Stable PHP API for MediaWiki ?

2013-02-11 Thread Denny Vrandečić
I don't see how something like parser functions, wiki syntax extensions,
skins, and many other extensions could feasibly be done using the Web-API.
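To make the parser-function case concrete: the hook has to hand wikitext back to the parser synchronously, in the middle of the parse, which only the internal PHP classes allow. A minimal sketch (hook and function names here are hypothetical, and the magic-word i18n registration is omitted):

```php
$wgHooks['ParserFirstCallInit'][] = 'wfExampleParserInit';

function wfExampleParserInit( Parser $parser ) {
	// Registers {{#example:...}}. The callback must return wikitext
	// right here, inside the render request - there is no room for
	// a Web-API round trip.
	$parser->setFunctionHook( 'example', 'wfExampleRender' );
	return true;
}

function wfExampleRender( Parser $parser, $param = '' ) {
	return "''Example output for'' " . wfEscapeWikiText( $param );
}
```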


2013/2/11 Yuri Astrakhan 

> I have a blasphemous proposal... extensions should only use the public API
> via FauxRequest objects. The public API interface has been very stable, and
> it gives MW core devs much more flexibility with changing the innards of
> the system...
>
>
> On Mon, Feb 11, 2013 at 8:36 AM, bawolff  wrote:
>
> > I don't think she means what we call the api - but more methods random
> > extensions are likely to use.
> >
> > We could start documenting certain stable methods with a special code
> > comment ( say @stable) which would mean something like this method will
> not
> > be removed, and if the need arises we'll remove the @stable and wait 2
> > versions before removing the method. Key candidates for this would
> include
> > title::newFromText, parser::recursiveTagParse, etc. Ideally one would
> wait
> > for the method to stand the test of time before tagging.
> >
> > >I
> > > am sure WMF developers are facing >similar issues especially
> >
> > I don't think that's the case. It used to be the responsibility of the
> > person making the breaking change to fix all callers in the wikimedia
> > extension repo. Im not sure if that's still the case but nonetheless I do
> > not feel this is a significant problem for deployed extensions.(im sure
> > someone will correct me if im wrong)
> >
> > -bawolff
> > On 2013-02-11 9:14 AM, "Yuri Astrakhan"  wrote:
> >
> > > Mariya,
> > >
> > > Could you be more specific? What types of changes caused extensions to
> > > break? I might be mistaken but the vast majority of the API framework
> > > classes have been established over 5 years ago, with very few breaking
> > > changes since. Most changes were related to adding new functionality
> (new
> > > actions, new query submodules, new parameters), but that shouldn't have
> > > significantly affected extension development.
> > >
> > > I do plan to introduce a few breaking changes (majority of the
> extensions
> > > should not be affected) in 1.21, such as the introduction of
> versioning,
> > > further modularization to allow pageset reuse, etc.
> > > See http://www.mediawiki.org/wiki/Requests_for_comment/API_Future
> > >
> > > Please note that in a rare case some features might be purposefully
> > removed
> > > due to the security or scalability issues.
> > >
> > > --Yuri
> > >
> > >
>



-- 
Project director Wikidata
Wikimedia Deutschland e.V. | Obentrautstr. 72 | 10963 Berlin
Tel. +49-30-219 158 26-0 | http://wikimedia.de

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V.
Eingetragen im Vereinsregister

[Wikitech-l] Password requirements RFC

2013-02-11 Thread Matthew Flaschen
I've started an RFC about making password requirements stronger, in a
workable way that is configurable by group.

See
https://www.mediawiki.org/wiki/Requests_for_comment/Password_requirements .

Matt Flaschen

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Password requirements RFC

2013-02-11 Thread Sumana Harihareswara
On 02/08/2013 02:31 PM, Matthew Flaschen wrote:
> I've started an RFC about making password requirements stronger, in a
> workable way that is configurable by group.
> 
> See
> https://www.mediawiki.org/wiki/Requests_for_comment/Password_requirements .
> 
> Matt Flaschen
> 

Thanks, Matthew.  This strikes me as something the enterprise community
might also want to chime in on, so I'm cc'ing the mediawiki-enterprise
list.  Please discuss onwiki.  Thanks.

-- 
Sumana Harihareswara
Engineering Community Manager
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Corporate needs are different (RE: How can we help Corporations use MW?)

2013-02-11 Thread Daniel Barrett
Platonides (and others) offered comments like:
>You could do [Excel to HTML] with openoffice.org/libreoffice,
>although I agree that getting all the dependencies right for running in the 
>server is a bit tedious.
>You can also use Excel itself for that (eg. COM automation), as suggested by 
>vitalif...

My team investigated several Excel-to-HTML solutions, including openoffice.org, 
Excel itself, a free converter on Google Code, etc. The clear winner was Aspose 
(www.aspose.com), running under Mono, for ease of automation with MediaWiki and 
quality of results. Relatively expensive but it works great.

DanB
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Stable PHP API for MediaWiki ?

2013-02-11 Thread Mariya Nedelcheva Miteva
Hi all,

From what I've understood from conversations so far, the Web (HTTP) API is
not enough for extension development, and the variability of the exposed
internal classes is often responsible for the incompatibility of extensions
with certain MediaWiki versions. Correct me if I am wrong.


> > We could start documenting certain stable methods with a special code
> > comment ( say @stable) which would mean something like this method will
> not
> > be removed, and if the need arises we'll remove the @stable and wait 2
> > versions before removing the method. Key candidates for this would
> include
> > title::newFromText, parser::recursiveTagParse, etc. Ideally one would
> wait
> > for the method to stand the test of time before tagging.

So would this be a feasible improvement? Are there any other ideas about what
can be done?
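For concreteness, the @stable idea quoted above would amount to little more than a documentation marker plus a policy, e.g. (hypothetical annotation, not an existing MediaWiki convention):

```php
/**
 * Create a new Title from text, such as what one would find in a link.
 *
 * @stable Part of the stable extension interface: this method will not
 *         be removed or change signature until the @stable tag has been
 *         dropped and two further releases have passed.
 */
public static function newFromText( $text, $defaultNamespace = NS_MAIN ) {
	// ...
}
```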

I will hopefully send an email with more specific examples tomorrow. Also,
would it be useful to have some sort of IRC chat or other form of
conversation about this?

Mariya



On Mon, Feb 11, 2013 at 5:21 PM, Denny Vrandečić <
denny.vrande...@wikimedia.de> wrote:

> I don't see how something like parser functions, wiki syntax extensions,
> skins, and many other extensions could feasibly be done using the Web-API.
>
>
> 2013/2/11 Yuri Astrakhan 
>
> > I have a blasphemous proposal... extensions should only use the public
> API
> > via FauxRequest objects. The public API interface has been very stable,
> and
> > it gives MW core devs much more flexibility with changing the innards of
> > the system...
> >
> >
> > On Mon, Feb 11, 2013 at 8:36 AM, bawolff  wrote:
> >
> > > I don't think she means what we call the api - but more methods random
> > > extensions are likely to use.
> > >
> > > We could start documenting certain stable methods with a special code
> > > comment ( say @stable) which would mean something like this method will
> > not
> > > be removed, and if the need arises we'll remove the @stable and wait 2
> > > versions before removing the method. Key candidates for this would
> > include
> > > title::newFromText, parser::recursiveTagParse, etc. Ideally one would
> > wait
> > > for the method to stand the test of time before tagging.
> > >
> > > >I
> > > > am sure WMF developers are facing >similar issues especially
> > >
> > > I don't think that's the case. It used to be the responsibility of the
> > > person making the breaking change to fix all callers in the wikimedia
> > > extension repo. Im not sure if that's still the case but nonetheless I
> do
> > > not feel this is a significant problem for deployed extensions.(im sure
> > > someone will correct me if im wrong)
> > >
> > > -bawolff
> > > On 2013-02-11 9:14 AM, "Yuri Astrakhan" 
> wrote:
> > >
> > > > Mariya,
> > > >
> > > > Could you be more specific? What types of changes caused extensions
> to
> > > > break? I might be mistaken but the vast majority of the API framework
> > > > classes have been established over 5 years ago, with very few
> breaking
> > > > changes since. Most changes were related to adding new functionality
> > (new
> > > > actions, new query submodules, new parameters), but that shouldn't
> have
> > > > significantly affected extension development.
> > > >
> > > > I do plan to introduce a few breaking changes (majority of the
> > extensions
> > > > should not be affected) in 1.21, such as the introduction of
> > versioning,
> > > > further modularization to allow pageset reuse, etc.
> > > > See http://www.mediawiki.org/wiki/Requests_for_comment/API_Future
> > > >
> > > > Please note that in a rare case some features might be purposefully
> > > removed
> > > > due to the security or scalability issues.
> > > >
> > > > --Yuri
> > > >
> > > >

Re: [Wikitech-l] Corporate needs are different (RE: How can we help Corporations use MW?)

2013-02-11 Thread Daniel Barrett
>> Each department winds up writing its own wiki page about the same
>> topic (say, Topic X), and they're all different.

>So it means most of your departments work on something very similar?

Not exactly. Each team treats its wiki as "The" wiki, and they create 
general-purpose articles like "How to request a day off from your manager" and 
"Where is the company cafeteria" that apply to the whole company. Individual 
writers do not think about the big picture: that general-purpose articles might 
belong in a different, general-purpose wiki. They just want to get their job done 
quickly.  So these kinds of articles get written in the wrong wiki, or in 
several wikis at once, and they drift out of sync.  With a single, central 
wiki, this is much less likely.

Imagine if Wikipedia had a separate wiki for every city in the world. The same 
problem would result.

>Do you have any extensions or modifications that you would like to make public 
>& free & open source?
>Or maybe you even already did it with something? :-)

Indeed, we are working out a way to open-source some of our extensions.

DanB

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Corporate needs are different (RE: How can we help Corporations use MW?)

2013-02-11 Thread .
The bad thing about corporate users is that they have special needs; the
good thing is that they are willing to pay for a service. Maybe somebody
should start selling that service (in the form of a "package",
a "MediaWiki distro", or some other model).

My company uses Google Docs, but we are developers, so we don't have
the mindset of people who share photos inside .doc files.
Apparently you can embed Google Docs in HTML:
http://en.support.wordpress.com/google-docs/

It would be natural for us to find a way to embed Google Docs in a
wiki... but we don't need to, because a simple link is enough.


-- 
--
ℱin del ℳensaje.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Stable PHP API for MediaWiki ?

2013-02-11 Thread Dmitriy Sintsov

On 11.02.2013 20:19, Mariya Nedelcheva Miteva wrote:

Hi all,

From what I've understood from conversations so far, the Web (HTTP) API is
not enough for extension development, and the variability of the exposed
internal classes is often responsible for the incompatibility of extensions
with certain MediaWiki versions. Correct me if I am wrong.


A stable internal API means a MediaWiki LTS version. The alternative is
to keep updating extensions so they stay compatible with internal API
changes and new features.

There are quite a lot of outdated and abandoned extensions at mediawiki.org.
Sometimes they are abandoned because the author lost interest in
developing for MediaWiki.
But in other cases the customer funding was small or short-lived,
and so the extension's maintainer does not have enough time
to support it.
WMF extensions are updated, that's true, but there are many more
extensions in use on third-party wikis - not of the same top quality as the
major WMF extensions, though.
If there were corporate donations for non-WMF extensions, things could
improve.
But usually third-party site owners want quality for free while not being
willing to fund long-term support. That's life.

Developing free software is great, until living difficulties come.
Dmitriy


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Engineering] RFC: Introducing two new HTTP headers to track mobile pageviews

2013-02-11 Thread Asher Feldman
On Monday, February 11, 2013, Mark Bergsma wrote:

>
> On Feb 9, 2013, at 11:21 PM, Asher Feldman 
> >
> wrote:
>
> > Whether or not it makes sense for mobile to move in the direction of
> > splitting up article views into many api requests is something I'd love
> to
> > see backed up by data.  I'm skeptical for multiple reasons.
>
> What is the main motivation used here? Reducing article sizes/transfers at
> the expense of more latency?
>

In cases where most sections (probably not even all) are loaded, I'd expect
it to increase the amount of data transferred beyond just the overhead of
the additional requests. gzip might take a 30k article down to 4k but
will be less efficient on individual sections. Text compresses really well,
and round-trip latency is high on many cell networks.
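The compression point is easy to check empirically; a rough sketch of the comparison (illustrative text only - real articles and section splits will vary):

```php
// Compare gzip output size for one large text versus the same text
// split into ten pieces compressed independently.
$section = str_repeat( "Otters are carnivorous mammals in the subfamily Lutrinae. ", 100 );
$article = str_repeat( $section, 10 );

$whole = strlen( gzencode( $article ) );

$split = 0;
for ( $i = 0; $i < 10; $i++ ) {
	$split += strlen( gzencode( $section ) );
}

// Per-section compression loses the cross-section redundancy, so
// $split comes out larger than $whole; add per-request header and
// round-trip overhead on top of that.
echo "whole: $whole bytes, split: $split bytes\n";
```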

And then I'd wonder about the server-side implementation. How will frontend
cache invalidation work? Are we going to need to purge every individual
article section relative to /w/api.php on edit? Article HTML in memcached
(parser cache), mobile-processed HTML in memcached... now individual
sections in memcached? If so, should we calculate memcached space needs for
article text as 3x the current parser cache utilization? More memcached
usage is great; I'm not asking to dissuade its use, but it's better to
capacity-plan than to react.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Stable PHP API for MediaWiki ?

2013-02-11 Thread vitalif

Hello Mariya,

I'm not a WMF developer, but in my opinion MediaWiki is already a
fairly stable product. I personally have had no problems with any PHP API
changes :-) and I think it would be good if you provided some examples:
what changes were a problem, and for whom?



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Stable PHP API for MediaWiki ?

2013-02-11 Thread Antoine Musso
Le 11/02/13 13:58, Mariya Nedelcheva Miteva a écrit :
> I have been talking to many third-party yas part of my WMF internship in
> the last few weeks and one the main concerns they express is the lack of
> stability in the PHP classes MediaWiki exposes from version to version.


That has been kind of a recurring issue over the last few years.
VistaPrint have some nice and smart people, some of them are subscribed
to our lists.  Back in January 2012, "danb" posted a long email
regarding the upgrade of their internal installation from 1.18 to 1.19.
Do read it:

http://www.gossamer-threads.com/lists/wiki/wikitech/267106

He lists nine examples of breakage that VistaPrint encountered in MediaWiki
extensions:

1) removal of global $action
2) removal of Xml::hidden()
3) broken Output::add() (had to migrate to ResourceLoader)
4) various parser tag bugs
5) removal of MessageCache::addMessage()
6) removal of ts_makeSortable() (JavaScript)
7) breakage of their WikiEditor adaptation
8) MediaWiki:Common.js no longer loaded by default (security)
9) addHandler() JavaScript broken in IE8


Then the awesome Rob Lanphier (disclaimer: he is the director signing my
contracts within the WMF) replied about us "breaking backwards
compatibility" at http://www.gossamer-threads.com/lists/wiki/wikitech/267180

Since then, Rob has constantly stressed to the MediaWiki core team that we
retain backward compatibility as much as possible.  I think that nowadays we
mostly leave behind a backward-compatibility shim flagged with
wfDeprecated() and remove the function only after a few releases.  I guess
that will prevent a lot of breakage in the future.
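The wfDeprecated() pattern looks like this in practice - a sketch using Xml::hidden() from the list above (its replacement is Html::hidden(); the version number is illustrative):

```php
class Xml {
	/**
	 * @deprecated since 1.18, use Html::hidden() instead
	 */
	public static function hidden( $name, $value, $attribs = array() ) {
		// Log a deprecation warning for callers, but keep old
		// extensions working for a few releases before removal.
		wfDeprecated( __METHOD__, '1.18' );
		return Html::hidden( $name, $value, $attribs );
	}
}
```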

The JavaScript issues are probably related to us not having the nice JS
infrastructure we have today, nor the awesome QUnit test suite Krinkle
bootstrapped. I guess some of those issues would be caught nowadays.

Same for IE8 compatibility: we have virtual machines running
Selenium tests against tons and tons of different browsers.  That raises
our likelihood of detecting such issues.



Anyway, what we could do is start writing PHP interfaces for our legacy
code and have our classes defined as implementing them:

 http://php.net/manual/en/language.oop5.interfaces.php

That would force us to actually document all of our legacy code and would
present a nice list of what we expose to the public.
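Sketching the idea (the interface and class names here are hypothetical, not existing MediaWiki code):

```php
// The stable, documented contract that extensions code against...
interface TitleLookup {
	/**
	 * @param string $text
	 * @return Title|null
	 */
	public function newFromText( $text );
}

// ...while the class behind it remains free to be refactored,
// renamed or replaced without breaking extensions.
class DatabaseTitleLookup implements TitleLookup {
	public function newFromText( $text ) {
		return Title::newFromText( $text );
	}
}
```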

cheers,

-- 
Antoine "hashar" Musso


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] deployment of the first phase of Wikidata on enwp

2013-02-11 Thread Antoine Musso
Le 11/02/13 14:53, Lydia Pintscher a écrit :
> * You can see an example of how wikidata works at
> http://it.wikipedia.org/wiki/Marie_Curie

For those not familiar with WikiData, scroll down to the very bottom and
look at the bottom left. There is a link "Modifica Link" which points to
the Wikidata dataset:
 http://www.wikidata.org/wiki/Q7186

From there our awesome community can fill in information about "Marie
Curie", such as the interwiki links but also her place of birth, spouse
or image ...

That project is going to rock'n'roll once it is deployed everywhere. I
can't wait for the English community to start using it :-]



-- 
Antoine "hashar" Musso


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Stable PHP API for MediaWiki ?

2013-02-11 Thread vitalif

1) removal of global $action
2) removal of Xml::hidden()
3) broken Output::add() (had to migrate to resource loader)
4) various parser tag bugs
5) removal of MessageCache::addMessage()
6) removal of ts_makeSortable() (javascript)
7) brokage of WikiEditor adaptation
8) MediaWiki:common.js no more loading by default (security)
9) addHandler() javascript broken in IE8


Most of these were deprecations, am I correct?

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Stable PHP API for MediaWiki ?

2013-02-11 Thread Brian Wolff
>
> 1) removal of global $action
> 2) removal of Xml::hidden()
> 3) broken Output::add() (had to migrate to resource loader)
> 4) various parser tag bugs
> 5) removal of MessageCache::addMessage()
> 6) removal of ts_makeSortable() (javascript)
> 7) brokage of WikiEditor adaptation
> 8) MediaWiki:common.js no more loading by default (security)
> 9) addHandler() javascript broken in IE8
>
>

Documentation seems to be part of the answer here. $action being available at
any point was an accident, afaik. Regardless of any precautions we take, the
breakage of $action would probably not have been prevented. (That said, most
of the other issues mentioned should not have happened.)
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Stable PHP API for MediaWiki ?

2013-02-11 Thread Daniel Barrett
vita...@yourcmc.ru writes:
> I think it would be good if you provide some examples - what changes were a 
> problem for someone?

See extensive examples in my post about 1.18 update difficulties:
http://lists.wikimedia.org/pipermail/wikitech-l/2012-January/057486.html

DanB
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Help us test the latest Article Feedback Tool features

2013-02-11 Thread Chris McMahon
Links and details, along with updates as they happen, are available on the
testing page at
http://www.mediawiki.org/wiki/Article_feedback/Version_5/Testing_Feb_2013



The WMF Editor Engagement team has been updating the Article Feedback tool,
improving both the back-end architecture and the user experience in
preparation for further deployments of AFTv5 to the French and German
Wikipedias.

For the week of 11 February 2013, the Editor Engagement team welcomes any
reports of issues or problems with AFTv5. We have arranged a dedicated test
environment for the latest version of AFTv5 and we can fully support any
testers from the community who care to contribute their time and expertise
to examining the new features.

The latest AFTv5 is fully implemented on the host at
http://ee-prototype.wmflabs.org/wiki/Main_Page, in particular the page at
http://ee-prototype.wmflabs.org/wiki/Special:ArticleFeedbackv5/Golden-crowned_Sparrow

Detailed descriptions of the new features are, as always, available in the
AFTv5 documentation, but a brief overview of the features to be tested
includes:

*Link for editors to see feedback on their articles*

   - Provides an easy way to see current feedback for each article

*Simplified moderation tools*

   - New page elements to make article moderation more intuitive

*More filters for the feedback page*

   - Allow more configurable views of article feedback

*Streamlined editor tool set*

   - Remove reader tools from the editor's view of article feedback

*More abuse filters*

   - Increase signal and reduce noise in feedback with abuse filters

*Satisfaction rating*

   - Present an overall view of aggregate feedback per article

*Post without comment*

   - Support simple approval/disapproval without requiring text input

Some of these new features may be tested by the casual user, but some
features require special accounts or privileges in order to exercise them.

Testers who require special privileges to examine the more sophisticated
aspects of AFTv5 are encouraged to reply to this message, update the Talk
page for the test plan, or contact the Editor Engagement team directly for
help.
Reporting Issues

   - Use this link to access an input screen for Bugzilla with the
     appropriate first fields filled automatically:
      - Product: MediaWiki extensions
      - Component: 'ArticleFeedbackv5'
      - Keywords: 'aftv5-1.5'
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] GSoC 2013 - we'll apply

2013-02-11 Thread Sumana Harihareswara
Google's announced that they will run Google Summer of Code again this
year, and we plan to apply.  I've updated
https://www.mediawiki.org/wiki/Summer_of_Code_2013 and I'd love for
potential mentors to add their ideas to
https://www.mediawiki.org/wiki/Mentorship_programs/Possible_projects .

Note that this year students are expected to work on their projects June
17-September 23, so mentors would also be needed during that period, and
somewhat less so April 8-June 17 for talking with potential candidates
and helping with the community bonding period.
-- 
Sumana Harihareswara
Engineering Community Manager
Wikimedia Foundation


Re: [Wikitech-l] [Engineering] RFC: Introducing two new HTTP headers to track mobile pageviews

2013-02-11 Thread Max Semenik
On 11.02.2013, 22:11 Asher wrote:

> And then I'd wonder about the server side implementation. How will frontend
> cache invalidation work? Are we going to need to purge every individual
> article section relative to /w/api.php on edit?

Since the API doesn't require pretty URLs, we could simply append the
current revision ID to the mobileview URLs.
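A minimal sketch of that revision-keyed URL idea, assuming a hypothetical "rev" query parameter (the real mobileview API parameters may differ):

```python
# Rough sketch of the cache-invalidation idea above: key each mobileview
# API URL to a revision ID, so every edit yields a brand-new URL and stale
# cache entries simply stop being requested. The "rev" parameter name and
# base URL are illustrative assumptions, not the real MobileFrontend API.
from urllib.parse import urlencode

def mobileview_url(page, rev_id, base="https://en.m.wikipedia.org/w/api.php"):
    """Build a mobileview API URL that changes whenever the page is edited."""
    params = {
        "action": "mobileview",
        "format": "json",
        "page": page,
        "rev": rev_id,  # cache-busting: new revision -> new URL
    }
    return base + "?" + urlencode(params)

print(mobileview_url("San Francisco", 537000123))
```

With revision-keyed URLs no explicit frontend purge is needed; old cache entries age out on their own once nothing links to them.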

> Article HTML in memcached
> (parser cache), mobile processed HTML in memcached.. Now individual
> sections in memcached? If so, should we calculate memcached space needs for
> article text as 3x the current parser cache utilization? More memcached
> usage is great, not asking to dissuade its use but because its better to
> capacity plan than to react.

action=mobileview caches pages only in full and serves
only sections requested, so no changes in request patterns will result
in increased memcached usage.

-- 
Best regards,
  Max Semenik ([[User:MaxSem]])



Re: [Wikitech-l] [Engineering] RFC: Introducing two new HTTP headers to track mobile pageviews

2013-02-11 Thread Asher Feldman
Max - good answers re: caching concerns.  That leaves studying if the bytes
transferred on average mobile article view increases or decreases with lazy
section loading.  If it increases, I'd say this isn't a positive direction
to go in and stop there.  If it decreases, then we should look at the
effect on total latency, number of requests required per pageview, and the
impact on backend apache utilization which I'd expect to be > 0.

Does the mobile team have specific goals that this project aims to
accomplish?  If so, we can use those as the measure against which to
compare an impact analysis.

On Mon, Feb 11, 2013 at 12:21 PM, Max Semenik  wrote:

> On 11.02.2013, 22:11 Asher wrote:
>
> > And then I'd wonder about the server side implementation. How will
> frontend
> > cache invalidation work? Are we going to need to purge every individual
> > article section relative to /w/api.php on edit?
>
> Since the API doesn't require pretty URLs, we could simply append the
> current revision ID to the mobileview URLs.
>
> > Article HTML in memcached
> > (parser cache), mobile processed HTML in memcached.. Now individual
> > sections in memcached? If so, should we calculate memcached space needs
> for
> > article text as 3x the current parser cache utilization? More memcached
> > usage is great, not asking to dissuade its use but because its better to
> > capacity plan than to react.
>
> action=mobileview caches pages only in full and serves
> only sections requested, so no changes in request patterns will result
> in increased memcached usage.
>
> --
> Best regards,
>   Max Semenik ([[User:MaxSem]])
>
>

Re: [Wikitech-l] deployment of the first phase of Wikidata on enwp

2013-02-11 Thread S Page
On Mon, Feb 11, 2013 at 10:49 AM, Antoine Musso  wrote:

>  http://www.wikidata.org/wiki/Q7186
>
> From there our awesome community can fill information about "Marie
> Curie" such as the interwiki links but also her place of birth, spouse
> or image ...

I assume the "[ add source ]" is the equivalent of {{citation needed}}
for facts. I couldn't find any help or guidance on filling it in,
though the FAQ says "Some pieces are not yet working, however,
including the sources interface."

> That project is going to rock'n roll once it is deployed everywhere. I
> cant wait for the English community to start using it :-]

Yes, so many facts!

--
=S Page  software engineer on E3


Re: [Wikitech-l] deployment of the first phase of Wikidata on enwp

2013-02-11 Thread Lydia Pintscher
Heya :)

We tried to deploy phase 1 on enwp today but ran into issues. We'll
have to reschedule. Currently it looks like we'll do this on
Wednesday.
Sorry folks.


Cheers
Lydia

--
Lydia Pintscher - http://about.me/lydia.pintscher
Community Communications for Wikidata

Wikimedia Deutschland e.V.
Obentrautstr. 72
10963 Berlin
www.wikimedia.de

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.

Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
unter der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das
Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.


[Wikitech-l] Fwd: Watchlists are coming to mobile

2013-02-11 Thread Jon Robson
-- Forwarded message --
From: Jon Robson 
Date: Mon, Feb 11, 2013 at 2:44 PM
Subject: Watchlists are coming to mobile
To: mobile-l , Wikimedia Mailing List



After a few months of development the mobile team is launching
watchlists to the mobile site tomorrow. To start with, the feature will
give a simplified recent changes view, a la the desktop site, as well as
a reading list view for users who simply want to keep track of articles
they would like to read.

This change also brings in login and account creation functionality to
all Wiki* projects.

From my perspective this is one of the most exciting mobile
developments so far, as it makes the watchlist star a much more
prominent element of the user interface and will hopefully encourage
new users to create accounts to use it, many of whom may be unaware
that Wikipedia can be edited. The hope is that these new users will
find the feature useful and can be lured into becoming contributors
to our projects via mobile.

Provided we don't run into any issues during deployment this should be
deployed tomorrow.

Please feel free to comment on this mail but if you experience any
bugs in the aftermath please raise them here:
https://bugzilla.wikimedia.org/enter_bug.cgi?product=MediaWiki%20extensions&component=MobileFrontend

You can also try out the new feature a little early by opting into the
beta here:
https://en.m.wikipedia.org/w/index.php?title=Special:MobileOptions

Looking forward to all your feedback.

Screenshot:
http://imgur.com/jnl5XA4

-- 
Jon Robson
http://jonrobson.me.uk
@rakugojon


Re: [Wikitech-l] [Wikimedia-l] Fwd: Watchlists are coming to mobile

2013-02-11 Thread Philippe Beaudette
Srsly.  Cool.  Really.

___
Philippe Beaudette
Director, Community Advocacy
Wikimedia Foundation, Inc.

415-839-6885, x 6643

phili...@wikimedia.org


On Mon, Feb 11, 2013 at 2:52 PM, Jon Robson  wrote:

> -- Forwarded message --
> From: Jon Robson 
> Date: Mon, Feb 11, 2013 at 2:44 PM
> Subject: Watchlists are coming to mobile
> To: mobile-l , Wikimedia Mailing List
> 
>
>
> After a few months of development the mobile team is launching
> watchlists to the mobile site tomorrow. The feature to start with will
> give a simplified recent changes view a la the desktop site as well as
> a reading list view for users who are simply interested in keeping
> track of articles they are interested in reading.
>
> This change also brings in login and account creation functionality to
> all Wiki* projects.
>
> This from my perspective is one of the most exciting mobile
> developments so far as it makes the watchlist star a much more
> prominent figure in the user interface and will hopefully encourage
> new users to create accounts to use it, many of whom may be unaware
> that Wikipedia can be edited. The hope is these new users will find
> the feature useful and can be lured into the realm of becoming a
> contributor to our projects via mobile.
>
> Provided we don't run into any issues during deployment this should be
> deployed tomorrow.
>
> Please feel free to comment on this mail but if you experience any
> bugs in the aftermath please raise them here:
>
> https://bugzilla.wikimedia.org/enter_bug.cgi?product=MediaWiki%20extensions&component=MobileFrontend
>
> You can also try out the new feature a little early by opting into the
> beta here:
> https://en.m.wikipedia.org/w/index.php?title=Special:MobileOptions
>
> Looking forward to all your feedback.
>
> Screenshot:
> http://imgur.com/jnl5XA4
>
> --
> Jon Robson
> http://jonrobson.me.uk
> @rakugojon
>

Re: [Wikitech-l] Gerrit 2.6 - coming to a server near you

2013-02-11 Thread Chad
On Fri, Feb 8, 2013 at 8:52 AM, Chad  wrote:
> Rest assured--we will still be getting the latest and greatest. And we're 
> still
> on target for late Monday/early Tuesday.
>

Friendly reminder that Gerrit will be coming down in about an hour
for the upgrade. During the upgrade, you may be able to hit Gerrit
intermittently as it'll restart several times. Once the dust has
settled, I'll be sure to let everyone know.

-Chad


Re: [Wikitech-l] [Engineering] RFC: Introducing two new HTTP headers to track mobile pageviews

2013-02-11 Thread Jon Robson
I'm a bit worried that we are now asking why pages are lazy loaded
rather than focusing on the fact that they currently *are* doing this
and how we can log it (if we want to discuss this further, let's start
another thread, as I'm getting extremely confused doing so on this one).

Lazy loading sections

For motivation behind moving MobileFrontend into the direction of lazy
loading section content and subsequent pages can be found here [1], I
just gave it a refresh as it was a little out of date.

In summary, the reasons are to:
1) make the app feel more responsive by simply loading content rather
than reloading the entire interface, and
2) reduce the payload sent to a device.

Session Tracking


Going back to the discussion of tracking mobile page views, it sounds
like a header stating whether a page is being viewed in alpha, beta or
stable works fine for standard page views.

As for the situations where an entire page is loaded via the api it
makes no difference to us to whether we
1) send the same header (set via javascript) or
2) add a query string parameter.

The only advantage I can see of using a header is that an initial page
load of the article San Francisco currently uses the same api url as a
page load of the article San Francisco via javascript (e.g. I click a
link to 'San Francisco' on the California article).

In this new method they would use different urls (as the data sent is
different). I'm not sure how that would affect caching.

Let us know which method is preferred. From my perspective
implementation of either is easy.

[1] http://www.mediawiki.org/wiki/MobileFrontend/Dynamic_Sections

On Mon, Feb 11, 2013 at 12:50 PM, Asher Feldman  wrote:
> Max - good answers re: caching concerns.  That leaves studying if the bytes
> transferred on average mobile article view increases or decreases with lazy
> section loading.  If it increases, I'd say this isn't a positive direction
> to go in and stop there.  If it decreases, then we should look at the
> effect on total latency, number of requests required per pageview, and the
> impact on backend apache utilization which I'd expect to be > 0.
>
> Does the mobile team have specific goals that this project aims to
> accomplish?  If so, we can use those as the measure against which to
> compare an impact analysis.
>
> On Mon, Feb 11, 2013 at 12:21 PM, Max Semenik  wrote:
>
>> On 11.02.2013, 22:11 Asher wrote:
>>
>> > And then I'd wonder about the server side implementation. How will
>> frontend
>> > cache invalidation work? Are we going to need to purge every individual
>> > article section relative to /w/api.php on edit?
>>
>> Since the API doesn't require pretty URLs, we could simply append the
>> current revision ID to the mobileview URLs.
>>
>> > Article HTML in memcached
>> > (parser cache), mobile processed HTML in memcached.. Now individual
>> > sections in memcached? If so, should we calculate memcached space needs
>> for
>> > article text as 3x the current parser cache utilization? More memcached
>> > usage is great, not asking to dissuade its use but because its better to
>> > capacity plan than to react.
>>
>> action=mobileview caches pages only in full and serves
>> only sections requested, so no changes in request patterns will result
>> in increased memcached usage.
>>
>> --
>> Best regards,
>>   Max Semenik ([[User:MaxSem]])
>>
>>



-- 
Jon Robson
http://jonrobson.me.uk
@rakugojon


[Wikitech-l] Announcement: Ed Sanders joins Wikimedia as Visual Editor Software Engineer

2013-02-11 Thread Terry Chay
Everyone,

I'm delighted to announce that Ed Sanders has joined the VisualEditor[0] team 
as a software engineer from today. He will be focussed on the data structures 
and APIs inside the VisualEditor, and in particular its "data model" component. 
Ed will be working remotely from the UK.

Ed has worked as a Web front-end software engineer for several years, most 
recently working for TripAdvisor. He received his MA in Computer Science from 
Clare College at the University of Cambridge.

Ed  is a long-term contributor to Commons and the English Wikipedia, where he 
has been a sysop since 2004.[1] In his spare time, Ed enjoys photography, 
watching his local football team (real football, not handegg), Arsenal,[2] and 
sometimes doing both at the same time.

He'll be in the SF office for the next few weeks so be sure to stop by and say 
hi!

Please join me in welcoming Ed!

terry

[0] - https://www.mediawiki.org/wiki/VisualEditor
[1] - https://commons.wikimedia.org/wiki/User:Ed_g2s
[2] - https://en.wikipedia.org/wiki/Arsenal

terry chay  최태리
Director of Features Engineering
Wikimedia Foundation
“Imagine a world in which every single human being can freely share in the sum 
of all knowledge. That's our commitment.”

p: +1 (415) 839-6885 x6832
m: +1 (408) 480-8902
e: tc...@wikimedia.org
i: http://terrychay.com/
w: http://meta.wikimedia.org/wiki/User:Tychay
aim: terrychay


Re: [Wikitech-l] Stable PHP API for MediaWiki ?

2013-02-11 Thread Matthew Flaschen
On 02/11/2013 09:50 AM, Brian Wolff wrote:
> Maria: see  http://www.mediawiki.org/wiki/API:Calling_internally for info
> on calling the api internally.

I think you're misunderstanding what she's saying.  She's talking about
ordinary PHP classes, methods, and hooks: basically the way an extension
normally talks to core.

Matt Flaschen


Re: [Wikitech-l] Gerrit 2.6 - coming to a server near you

2013-02-11 Thread Chad
On Mon, Feb 11, 2013 at 6:49 PM, Chad  wrote:
> On Fri, Feb 8, 2013 at 8:52 AM, Chad  wrote:
>> Rest assured--we will still be getting the latest and greatest. And we're 
>> still
>> on target for late Monday/early Tuesday.
>>
>
> Friendly reminder that Gerrit will be coming down in about an hour
> for the upgrade. During the upgrade, you may be able to hit Gerrit
> intermittently as it'll restart several times. Once the dust has
> settled, I'll be sure to let everyone know.
>

Took a few minutes longer than expected, but we're back up and
everything's live. We had to deploy a newer version to grab one
last fix we spotted during the upgrade. Our deployed version is
now 2.5.1-1266-gcc231e1.

There might be a few problems left over with the IRC notifications,
I'll tackle those tomorrow (making sure replication is working properly
now). If you spot any other problems, please let me know.

-Chad


Re: [Wikitech-l] [Engineering] RFC: Introducing two new HTTP headers to track mobile pageviews

2013-02-11 Thread Arthur Richards
Thanks, Jon. To try and clarify a bit more about the API requests... they
are not made on a per-section basis. As I mentioned earlier, there are two
cases in which article content gets loaded by the API:

1) Going directly to a page (eg clicking a link from a Google search) will
result in the backend serving a page with ONLY summary section content and
section headers. The rest of the page is lazily loaded via API request once
the JS for the page gets loaded. The idea is to increase responsiveness by
reducing the delay for an article to load (further details in the article
Jon previously linked to). The API request looks like:
http://en.m.wikipedia.org/w/api.php?format=json&action=mobileview&page=Liverpool+F.C.+in+European+football&variant=en&redirects=yes&prop=sections%7Ctext&noheadings=yes&sectionprop=level%7Cline%7Canchor&sections=all

2) Loading an article entirely via Javascript - like when a link is clicked
in an article to another article, or an article is loaded via search. This
will make ONE call to the API to load article content. API request looks
like:
http://en.m.wikipedia.org/w/api.php?format=json&action=mobileview&page=Liverpool+F.C.+in+European+football&variant=en&redirects=yes&prop=sections%7Ctext&noheadings=yes&sectionprop=level%7Cline%7Canchor&sections=all

These API requests are identical, but only #2 should be counted as a
'pageview' - #1 is a secondary API request and should not be counted as a
'pageview'. You could make the argument that we just count all of these API
requests as pageviews, but there are cases when we can't load article
content from the API (like devices that do not support JS), so we need to
be able to count the traditional page request as a pageview - thus we need
a way to differentiate the types of API requests being made when they
otherwise share the same URL.



On Mon, Feb 11, 2013 at 6:42 PM, Jon Robson  wrote:

> I'm a bit worried that now we are asking why pages are lazy loaded
> rather than focusing on the fact that they currently __are doing
> this___ and how we can log these (if we want to discuss this further
> let's start another thread as I'm getting extremely confused doing so
> on this one).
>
> Lazy loading sections
> 
> For motivation behind moving MobileFrontend into the direction of lazy
> loading section content and subsequent pages can be found here [1], I
> just gave it a refresh as it was a little out of date.
>
> In summary the reason is to
> 1) make the app feel more responsive by simply loading content rather
> than reloading the entire interface
> 2) reducing the payload sent to a device.
>
> Session Tracking
> 
>
> Going back to the discussion of tracking mobile page views, it sounds
> like a header stating whether a page is being viewed in alpha, beta or
> stable works fine for standard page views.
>
> As for the situations where an entire page is loaded via the api it
> makes no difference to us to whether we
> 1) send the same header (set via javascript) or
> 2) add a query string parameter.
>
> The only advantage I can see of using a header is that an initial page
> load of the article San Francisco currently uses the same api url as a
> page load of the article San Francisco via javascript (e.g. I click a
> link to 'San Francisco' on the California article).
>
> In this new method they would use different urls (as the data sent is
> different). I'm not sure how that would effect caching.
>
> Let us know which method is preferred. From my perspective
> implementation of either is easy.
>
> [1] http://www.mediawiki.org/wiki/MobileFrontend/Dynamic_Sections
>
> On Mon, Feb 11, 2013 at 12:50 PM, Asher Feldman 
> wrote:
> > Max - good answers re: caching concerns.  That leaves studying if the
> bytes
> > transferred on average mobile article view increases or decreases with
> lazy
> > section loading.  If it increases, I'd say this isn't a positive
> direction
> > to go in and stop there.  If it decreases, then we should look at the
> > effect on total latency, number of requests required per pageview, and
> the
> > impact on backend apache utilization which I'd expect to be > 0.
> >
> > Does the mobile team have specific goals that this project aims to
> > accomplish?  If so, we can use those as the measure against which to
> > compare an impact analysis.
> >
> > On Mon, Feb 11, 2013 at 12:21 PM, Max Semenik 
> wrote:
> >
> >> On 11.02.2013, 22:11 Asher wrote:
> >>
> >> > And then I'd wonder about the server side implementation. How will
> >> frontend
> >> > cache invalidation work? Are we going to need to purge every
> individual
> >> > article section relative to /w/api.php on edit?
> >>
> >> Since the API doesn't require pretty URLs, we could simply append the
> >> current revision ID to the mobileview URLs.
> >>
> >> > Article HTML in memcached
> >> > (parser cache), mobile processed HTML in memcached.. Now individual
> >> > sections in memcached? If so, should we calculate memcached space
> needs
> >> for
> >> > arti

Re: [Wikitech-l] [Engineering] RFC: Introducing two new HTTP headers to track mobile pageviews

2013-02-11 Thread Asher Feldman
Thanks for the clarification Arthur, that clears up some misconceptions I
had.  I saw a demo around the allstaff where individual sections were lazy
loaded, so I think I had that in my head.

It does still seem to me that the data needed to identify secondary API
requests should already be present in the existing log line. If the value
of the page param in an action=mobileview API request matches the page in
the referrer (perhaps with normalization), it's a secondary request as per
case 1 below.  Otherwise, it's a pageview as per case 2.  Difficult or
expensive to reconcile?  Not when you're doing distributed log analysis
via Hadoop.
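A rough sketch of that referrer-matching heuristic; the URL shapes and normalization rules here are assumptions for illustration, not the real Wikimedia log pipeline:

```python
# Sketch of the heuristic described above: an action=mobileview request
# whose "page" parameter matches the article in the referrer is a lazy-load
# (secondary) request, not a new pageview. Normalization is deliberately
# simplistic (underscores vs spaces, percent-decoding).
from urllib.parse import urlparse, parse_qs, unquote

def normalize(title):
    return unquote(title).replace("_", " ").strip()

def is_secondary_request(api_url, referrer):
    """Return True for case 1 (lazy-load of the current article),
    False for case 2 (a fresh pageview) or non-mobileview requests."""
    qs = parse_qs(urlparse(api_url).query)
    if qs.get("action") != ["mobileview"]:
        return False
    page = normalize(qs.get("page", [""])[0])
    ref_path = urlparse(referrer).path
    if not ref_path.startswith("/wiki/"):
        return False
    return normalize(ref_path[len("/wiki/"):]) == page

# Lazy-loading the rest of the article the reader is already on:
print(is_secondary_request(
    "http://en.m.wikipedia.org/w/api.php?action=mobileview&page=Liverpool+F.C.",
    "http://en.m.wikipedia.org/wiki/Liverpool_F.C."))
# Navigating from a different article is a pageview:
print(is_secondary_request(
    "http://en.m.wikipedia.org/w/api.php?action=mobileview&page=San+Francisco",
    "http://en.m.wikipedia.org/wiki/California"))
```

In a Hadoop job this comparison would run per log line, so the only real cost is the title normalization.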

On Mon, Feb 11, 2013 at 7:11 PM, Arthur Richards wrote:

> Thanks, Jon. To try and clarify a bit more about the API requests... they
> are not made on a per-section basis. As I mentioned earlier, there are two
> cases in which article content gets loaded by the API:
>
> 1) Going directly to a page (eg clicking a link from a Google search) will
> result in the backend serving a page with ONLY summary section content and
> section headers. The rest of the page is lazily loaded via API request once
> the JS for the page gets loaded. The idea is to increase responsiveness by
> reducing the delay for an article to load (further details in the article
> Jon previously linked to). The API request looks like:
>
> http://en.m.wikipedia.org/w/api.php?format=json&action=mobileview&page=Liverpool+F.C.+in+European+football&variant=en&redirects=yes&prop=sections%7Ctext&noheadings=yes&sectionprop=level%7Cline%7Canchor&sections=all
>
> 2) Loading an article entirely via Javascript - like when a link is clicked
> in an article to another article, or an article is loaded via search. This
> will make ONE call to the API to load article content. API request looks
> like:
>
> http://en.m.wikipedia.org/w/api.php?format=json&action=mobileview&page=Liverpool+F.C.+in+European+football&variant=en&redirects=yes&prop=sections%7Ctext&noheadings=yes&sectionprop=level%7Cline%7Canchor&sections=all
>
> These API requests are identical, but only #2 should be counted as a
> 'pageview' - #1 is a secondary API request and should not be counted as a
> 'pageview'. You could make the argument that we just count all of these API
> requests as pageviews, but there are cases when we can't load article
> content from the API (like devices that do not support JS), so we need to
> be able to count the traditional page request as a pageview - thus we need
> a way to differentiate the types of API requests being made when they
> otherwise share the same URL.
>
>
>
> On Mon, Feb 11, 2013 at 6:42 PM, Jon Robson  wrote:
>
> > I'm a bit worried that now we are asking why pages are lazy loaded
> > rather than focusing on the fact that they currently __are doing
> > this___ and how we can log these (if we want to discuss this further
> > let's start another thread as I'm getting extremely confused doing so
> > on this one).
> >
> > Lazy loading sections
> > 
> > For motivation behind moving MobileFrontend into the direction of lazy
> > loading section content and subsequent pages can be found here [1], I
> > just gave it a refresh as it was a little out of date.
> >
> > In summary the reason is to
> > 1) make the app feel more responsive by simply loading content rather
> > than reloading the entire interface
> > 2) reducing the payload sent to a device.
> >
> > Session Tracking
> > 
> >
> > Going back to the discussion of tracking mobile page views, it sounds
> > like a header stating whether a page is being viewed in alpha, beta or
> > stable works fine for standard page views.
> >
> > As for the situations where an entire page is loaded via the api it
> > makes no difference to us to whether we
> > 1) send the same header (set via javascript) or
> > 2) add a query string parameter.
> >
> > The only advantage I can see of using a header is that an initial page
> > load of the article San Francisco currently uses the same api url as a
> > page load of the article San Francisco via javascript (e.g. I click a
> > link to 'San Francisco' on the California article).
> >
> > In this new method they would use different urls (as the data sent is
> > different). I'm not sure how that would effect caching.
> >
> > Let us know which method is preferred. From my perspective
> > implementation of either is easy.
> >
> > [1] http://www.mediawiki.org/wiki/MobileFrontend/Dynamic_Sections
> >
> > On Mon, Feb 11, 2013 at 12:50 PM, Asher Feldman 
> > wrote:
> > > Max - good answers re: caching concerns.  That leaves studying if the
> > bytes
> > > transferred on average mobile article view increases or decreases with
> > lazy
> > > section loading.  If it increases, I'd say this isn't a positive
> > direction
> > > to go in and stop there.  If it decreases, then we should look at the
> > > effect on total latency, number of requests required per pageview, and
> > the
> > > impact on backend apache utilization which I'd expect to be > 0

Re: [Wikitech-l] Gerrit 2.6 - coming to a server near you

2013-02-11 Thread Ori Livneh


On Monday, February 11, 2013 at 6:33 PM, Chad wrote:

> Took a few minutes longer than expected, but we're back up and
> everything's live. We had to deploy a newer version to grab one
> last fix we spotted during the upgrade. Our deployed version is
> now 2.5.1-1266-gcc231e1.
> 
> There might be a few problems left over with the IRC notifications,
> I'll tackle those tomorrow (making sure replication is working properly
> now). If you spot any other problems, please let me know.
> 
> -Chad

Cool, well done. The UI has become more polished, it seems, and considerably 
more responsive. It feels a lot faster than before. LGTM!


--
Ori Livneh





Re: [Wikitech-l] Corporate needs are different (RE: How can we help Corporations use MW?)

2013-02-11 Thread S Page
On Thu, Feb 7, 2013 at 1:31 PM, Daniel Barrett  wrote:
> ...
> 1. A desire for a department to have "their own space" on the wiki.

I assume you looked at enabling subpages in the main namespace?[1]
That way Human Resources/Payroll/Show_me_the_money gets a nice
breadcrumb up to Payroll and Human Resources landing pages.  You can
encourage people to create subpages rather than making yet another
top-level page by putting [Create page] forms on landing pages  that
use a local template[2] and prepend the local hierarchy.

> I'm not talking about access control, but (1) customized look & feel, and (2) 
> ability to narrow searches to find articles only within that space.

(1) Code could infer subpage hierarchy and apply CSS from a
corresponding CSS hierarchy.

(2) Add prefix: to searches to limit them to subpages; you can make a
form for it[3].  Also Special:PrefixIndex can be helpful, e.g. just
listing all subpages of the current landing page:
  == Subpages of {{FULLPAGENAME}}==
  {{Special:PrefixIndex/{{FULLPAGENAME}}/}}


Cheers,

[1] http://www.mediawiki.org/wiki/Manual:$wgNamespacesWithSubpages

[2] something like

<inputbox>
type=create
preload=Template:Human Resources meeting
buttonlabel=Create a new page for a Human Resources meeting
default=Human Resources/Meetings/{{CURRENTYEAR}}-{{CURRENTMONTH}}-{{CURRENTDAY}}
width=40
bgcolor=#f0f0ff
</inputbox>


[3] http://en.wikipedia.org/wiki/Template:Search_box and similar.

--
=S Page  software engineer on Editor Engagement Experiments


Re: [Wikitech-l] Gerrit 2.6 - coming to a server near you

2013-02-11 Thread Matthew Flaschen
On 02/11/2013 09:33 PM, Chad wrote:
> There might be a few problems left over with the IRC notifications,
> I'll tackle those tomorrow (making sure replication is working properly
> now). If you spot any other problems, please let me know.

I'm getting Internal Server Error intermittently in the add a reviewer box.

Matt Flaschen

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Gerrit 2.6 - coming to a server near you

2013-02-11 Thread Chad
On Mon, Feb 11, 2013 at 11:57 PM, Matthew Flaschen
 wrote:
> On 02/11/2013 09:33 PM, Chad wrote:
>> There might be a few problems left over with the IRC notifications,
>> I'll tackle those tomorrow (making sure replication is working properly
>> now). If you spot any other problems, please let me know.
>
> I'm getting Internal Server Error intermittently in the add a reviewer box.
>

Roan was hitting this earlier, but I wasn't able to replicate at the
time. It's on my todo list to check tomorrow morning.

-Chad

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Corporate needs are different (RE: How can we help Corporations use MW?)

2013-02-11 Thread Brian Wolff
On 2013-02-12 12:55 AM, "S Page"  wrote:
>
> On Thu, Feb 7, 2013 at 1:31 PM, Daniel Barrett 
wrote:
> > ...
> > 1. A desire for a department to have "their own space" on the wiki.
>
> I assume you looked at enabling subpages in the main namespace?[1]
> That way Human Resources/Payroll/Show_me_the_money gets a nice
> breadcrumb up to Payroll and Human Resources landing pages.  You can
> encourage people to create subpages rather than making yet another
> top-level page by putting [Create page] forms on landing pages  that
> use a local template[2] and prepend the local hierarchy.
>
> > I'm not talking about access control, but (1) customized look & feel,
and (2) ability to narrow searches to find articles only within that space.
>
> (1) Code could infer subpage hierarchy and apply CSS from a
> corresponding CSS hierarchy.
>
> (2) Add prefix: to the searches to search subpages, you can make a
> form for it[3].

It should be noted that that doesn't work out of the box but needs the
Lucene/MWSearch extension.

For subpages to really fill this use case I think the page title would have
to show only (or primarily emphasize) the subpage name instead of the full
page name.

Also it sounds like in such a use case, one would want links to be relative
to the current path first. If on page a/b/c you would want [[foo]] to link
to a/b/foo if it exists and link to just foo if that page does not exist.
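A sketch of the subpage-relative link resolution described above, under the assumption of a simple page-existence check (MediaWiki has no such behaviour today; the resolver and its fallback rule are hypothetical):

```python
# Hypothetical resolver for the idea above: on page A/B/C, [[Foo]] would
# resolve to A/B/Foo when that sibling subpage exists, otherwise fall back
# to the plain title Foo.
def resolve_link(current_page, target, page_exists):
    """Resolve a wiki link relative to the current page's subpage path first."""
    parent = current_page.rsplit("/", 1)[0] if "/" in current_page else ""
    if parent:
        candidate = parent + "/" + target
        if page_exists(candidate):  # sibling subpage wins when it exists
            return candidate
    return target

existing = {"HR/Payroll/Bonuses", "Bonuses"}
print(resolve_link("HR/Payroll/Overview", "Bonuses", existing.__contains__))
print(resolve_link("HR/Payroll/Overview", "Holidays", existing.__contains__))
```

The first call resolves to the sibling subpage, the second falls back to the bare title, which matches the behaviour described for [[foo]] on page a/b/c.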

I think a good takeaway from this thread is that MediaWiki has a lot of
features that almost fit the bill but don't quite fully.

-bawolff

Re: [Wikitech-l] Password requirements RFC

2013-02-11 Thread Matthew Flaschen
On 02/11/2013 10:27 AM, Sumana Harihareswara wrote:
> On 02/08/2013 02:31 PM, Matthew Flaschen wrote:
>> I've started an RFC about making password requirements stronger, in a
>> workable way that is configurable by group.
>>
>> See
>> https://www.mediawiki.org/wiki/Requests_for_comment/Password_requirements .

I've posted at
https://en.wikipedia.org/wiki/Wikipedia:Village_pump_%28technical%29#Password_requirements_discussion_regarding_MediaWiki
.
I encourage people to publicize the RFC on other wikis.

Matt Flaschen


Re: [Wikitech-l] Password requirements RFC

2013-02-11 Thread OQ
I really don't see any drawbacks to making the settings more configurable,
as long as they default to a wiki's currently configured settings. As long
as what triggers a forced password change can be controlled by the end user
of the software, I can't think of anything that should stop development.