[Wikitech-l] Language team celebrates 10 years of MediaWiki Language Extension Bundle

2023-02-08 Thread Niklas Laxström
Slightly over ten years ago, the Language team released the first MediaWiki
Language Extension Bundle, or MLEB for short. To date, we have made 64
releases supporting MediaWiki versions from 1.19 to 1.40alpha, and PHP
versions from 5.2 to 8.

Some things are still the same:
* MediaWiki version compatibility policy
* tarball releases
* the goal of providing top-notch language support in a convenient package

Some things have changed:
* we went from monthly releases to quarterly releases
* LocalizationUpdate is no longer included in the bundle because it was
made obsolete by automatic translation backports

We don't know much about the users of the MLEB, but according to our
statistics, each tarball release is downloaded on average approximately 300
times. Some people use MLEB releases via git.

As the initiator of MLEB, I am happy to see how far we have come. Thanks
go to Abijeet Patro and Kartik Mistry, who together have done the actual
work of making the releases in recent years.

Fun fact: the tool used to make MLEB releases is called melange. I don't
remember for sure, but I think the name was chosen because it's short for
"MEdiawiki LANGuagE" and it had a quite appropriate meaning: "A mixture of
different things; a disordered mixture" (English Wiktionary).

The first release announcement is quoted below.

  -Niklas

On Wed, 28 Nov 2012 at 12:17, Niklas Laxström (niklas.laxst...@gmail.com)
wrote:

> The Wikimedia Language Engineering team is pleased to announce the
> first release of the MediaWiki Language Extension Bundle. The bundle
> is a collection of selected MediaWiki extensions needed by any wiki
> which desires to be multilingual.
>
> This first bundle release (2012.11) is compatible with MediaWiki 1.19,
> 1.20 and 1.21alpha.
> Get it from https://www.mediawiki.org/wiki/MLEB
>
> The Universal Language Selector is a must-have, because it provides
> essential functionality for any user regardless of the number of
> languages they speak: language selection, font support for
> displaying scripts poorly supported by operating systems, and input
> methods for typing in languages that don't use the Latin (a-z) alphabet.
>
> Maintaining multilingual content in a wiki is a mess without the
> Translate extension, which is used by Wikimedia, KDE and
> translatewiki.net, where hundreds of pieces of documentation and
> interface translations are updated every day; with Localisation Update
> your users will always have the latest translations fresh out of the
> oven. The Clean Changes extension keeps your recent changes page
> uncluttered by translation activity and other distractions.
>
> Don't miss the chance to practice your rusty language skills and use
> the Babel extension to mark the languages you speak and to find other
> speakers of the same language in your wiki. And finally, the cldr
> extension provides a database of language and country name translations.
>
> We are aiming to make new releases every month, so that you can easily
> stay on the cutting edge with the constantly improving language
> support. The bundle comes with clear installation and upgrade
> instructions. The bundle is tested against MediaWiki release
> versions, so you can avoid most of the temporary breaks that would
> happen if you were using the latest development versions instead.
>
> Because this is our first release, there may be some rough edges.
> Please give us plenty of feedback so that we can improve for the
> next release.
>
>   -Niklas
>
> --
> Niklas Laxström
>

[Wikitech-l] Re: Feedback wanted: PHPCS in a static types world

2022-10-28 Thread Niklas Laxström
On Fri, 28 Oct 2022 at 17:04, Lucas Werkmeister
(lucas.werkmeis...@wikimedia.de) wrote:

> In my opinion, MediaWiki’s PHPCS ruleset feels largely rooted in an older
> version of PHP, where static type declarations (formerly known as “type
> hints”) did not exist. As we move towards more modern code, I think some
> rules should be relaxed, and others adjusted. More specifically, I’d like
> to know if most people agree with the following propositions and conclusion:
>

I support relaxing the PHPCS rules by default. We have already disabled
some of these rules for new code in the Translate extension (ref), with
the same reasoning you gave.
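
For illustration only (not a specific sniff from the proposal), this is the
kind of redundancy that native type declarations create:

/**
 * @param string $code
 * @return bool
 */
public function isSupported( string $code ): bool {
	// The doc block above merely repeats the native type declarations;
	// the body here is an invented example.
	return isset( $this->languages[$code] );
}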

  -Niklas

[Wikitech-l] Re: Requesting feedback about the future of the LocalisationUpdate extension

2022-04-28 Thread Niklas Laxström
The git-based approach is in my opinion superior because it works out
of the box without any extra effort: neither MediaWiki releases nor
Wikimedia deployments need to do anything special to receive
translation updates.

The challenge has been and still is to some extent the frequency of
the releases/deployments. Here is a summary:
* Wikimedia sites: weekly
** used to be months before LU and the current deployment trains; was
daily when LU was enabled; not affected by translation backports
* Other sites following the git master branch: (almost) daily
* Other sites following the git release branches: weekly
** for years they did not receive translation updates, fixed by
translation backports
* Other sites following the release tarballs: at least quarterly
** for years they did not receive translation updates, fixed by
translation backports
* Other sites using LocalisationUpdate: depends on the configuration,
up to (almost) daily

  -Niklas

On Wed, 27 Apr 2022 at 14:40, Adam Wight (adam.wi...@wikimedia.de) wrote:
>
> I wonder if this would be a good candidate for event-based replication?  One 
> drawback is that current streams keep at most one month of data [1], but that 
> might be extended for translations depending on the volume.  Another 
> workaround might be to combine regular releases with a streaming update, for 
> example if a language bundle were released once per month.
>
> This approach might also work well for Wikimedia sites; I wasn't sure from
> the final question in the email whether or not this is an outstanding
> technical gap.
>
> Regards,
> [[mw:User:Adamw]]
>
> [1] 
> https://wikitech.wikimedia.org/wiki/Event_Platform/EventStreams#Historical_Consumption
>
> On 4/27/22 1:22 PM, Niklas Laxström wrote:
>
> Since the beginning of the year, the Wikimedia Language team has enabled
> translation backports for MediaWiki core, extensions and skins hosted on
> Gerrit. On a weekly schedule, compatible translations from the master
> branch are backported to all supported release branches. Currently
> supported branches are 1.35–1.38.
>
> Translation backports partially replace the purpose of the LocalisationUpdate 
> extension. Wikimedia sites no longer use the extension, and to our knowledge 
> only a few other users of the extension exist, because it needs manual setup 
> to use.
>
> We, the Language team, think that maintaining the LocalisationUpdate 
> extension is no longer a good use of our time. We are asking for your 
> feedback about the future of this extension.
>
> We are planning to:
> * Remove LocalisationUpdate from the MediaWiki Language Extension Bundle 
> starting from version 2022.07
> * Remove us as maintainers of the extension
>
> Additionally, based on the feedback, we are planning to either mark the 
> extension as unmaintained, transfer maintenance to a new maintainer, or 
> request the extension to be archived and removed from the list of extensions 
> bundled with MediaWiki core if there is no indication that anyone uses this 
> extension.
>
> We request your feedback and welcome discussion on 
> https://phabricator.wikimedia.org/T300498. Please let us know if you are 
> using this extension and whether you would be interested in maintaining it.
>
> Anticipated questions
> Q: What about Wikimedia sites: does this mean they will not get translation
> updates as frequently as they used to?
>
> A: We still think this is important, but we do not think the previous 
> solution can be restored. We would like to collaborate on new solutions. One 
> solution could be more frequent deployments.
>
>   -Niklas
>

[Wikitech-l] Re: Requesting feedback about the future of the LocalisationUpdate extension

2022-04-28 Thread Niklas Laxström
On Thu, 28 Apr 2022 at 0:12, Tyler Cipriani (tcipri...@wikimedia.org) wrote:
> On Wed, Apr 27, 2022 at 5:22 AM Niklas Laxström
>  wrote:
> > Since the beginning of the year, the Wikimedia Language team has enabled
> > translation backports for MediaWiki core, extensions and skins hosted on
> > Gerrit. On a weekly schedule, compatible translations from the master
> > branch are backported to all supported release branches. Currently
> > supported branches are 1.35–1.38.
>
> What does that mean for maintenance for the extension for the released
> versions? Version 1.35 is an LTS that will go end-of-life in September
> 2023.

I am not sure what your question is, but I'll provide two answers:
1) Backports are an automatic service for extensions, as long as the
extension is not archived and is enabled on translatewiki.net.
2) The LocalisationUpdate extension, which is bundled with the
tarballs, will continue to receive minimal support (security fixes) in
the release branches unless there is a new maintainer who wants to do
more than that.

> > Anticipated questions
> > Q: What about Wikimedia sites: does this mean they will not get
> > translation updates as frequently as they used to?
> > A: We still think this is important, but we do not think the previous 
> > solution can be restored. We would like to collaborate on new solutions. 
> > One solution could be more frequent deployments.
>
> I want to be sure I understand this. Is it correct to say: new l10n
> messages will continue to be merged into the mainline branches of
> MediaWiki and extensions and will go live with the weekly train?

This is correct. Backports are an additional service on top of the
regular translation updates. MediaWiki core, extensions and skins in
Gerrit will continue to receive almost daily translation updates. See
for example 
https://gerrit.wikimedia.org/r/q/owner:L10n-bot+branch:master+project:mediawiki/core

  -Niklas

[Wikitech-l] Requesting feedback about the future of the LocalisationUpdate extension

2022-04-27 Thread Niklas Laxström
Since the beginning of the year, the Wikimedia Language team has enabled
translation backports for MediaWiki core, extensions and skins hosted on
Gerrit. On a weekly schedule, compatible translations from the master
branch are backported to all supported release branches. Currently
supported branches are 1.35–1.38.

Translation backports partially replace the purpose of the
LocalisationUpdate extension. Wikimedia sites no longer use the extension,
and to our knowledge only a few other users of the extension exist, because
it needs manual setup to use.

We, the Language team, think that maintaining the LocalisationUpdate
extension is no longer a good use of our time. We are asking for your
feedback about the future of this extension.

We are planning to:
* Remove LocalisationUpdate from the MediaWiki Language Extension Bundle
starting from version 2022.07
* Remove us as maintainers of the extension

Additionally, based on the feedback, we are planning to either mark the
extension as unmaintained, transfer maintenance to a new maintainer, or
request the extension to be archived and removed from the list of
extensions bundled with MediaWiki core if there is no indication that
anyone uses this extension.

We request your feedback and welcome discussion on
https://phabricator.wikimedia.org/T300498. Please let us know if you are
using this extension and whether you would be interested in maintaining it.

*Anticipated questions*
Q: What about Wikimedia sites: does this mean they will not get translation
updates as frequently as they used to?

A: We still think this is important, but we do not think the previous
solution can be restored. We would like to collaborate on new solutions.
One solution could be more frequent deployments.

  -Niklas

[Wikitech-l] Re: Enabling translating RCFeed log entry messages (IRC log entry messages)

2021-09-23 Thread Niklas Laxström
Messages like "1movedto2" are preserved for compatibility reasons and
should not be used. A long time ago they were used on Special:Log (and
the non-special pages before them). The problem with them is that they
are not full sentences, so translating them correctly into other
languages is not possible. Ten years ago I rewrote the log formatting
code to allow using full sentences. It uses messages of the format
logentry-*-*. However, changing the messages in the IRC output would
have been a breaking change for all bots parsing that output, so the
feed system was intentionally changed to use the legacy messages, and
new translations for the legacy messages were disabled.
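
To illustrate the difference, in the style of the old MessagesEn.php (the
legacy string is quoted in the message below; the logentry one is from
memory, so treat the exact English text as approximate):

// Legacy RC feed message: a sentence fragment that cannot be translated
// correctly into many languages.
'1movedto2' => 'moved [[$1]] to [[$2]]',

// Log formatting message: a full sentence, with GENDER support.
'logentry-move-move' => '$1 {{GENDER:$2|moved}} page $3 to $4',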

The way forward would be to decouple the formatting so that you can choose
to use the new messages in the output.
  -Niklas

On Thu, 23 Sep 2021 at 22:01, lens0021 lens0021 (lorentz0...@gmail.com)
wrote:
>
> TLDR: I don't think the system messages for IRC are just for IRC. So I
> hope to change it.
>
> Hi,
> I am an extension developer and I have recently been developing an
> extension[1] that provides a custom RCFeedEngine and an RCFeedFormatter.
> The purpose of the extension is to stream the recent changes in a wiki
> to a given Discord webhook.*
>
> The RC feed engines send messages in a freely configurable format to an
> engine set in $wgRCEngines[2], and every RC log entry has a system message
> for RC feed output. For instance, logs for moving a page have a message
> named "1movedto2", which is rendered as "moved [[$1]] to [[$2]]" in English.
>
> Disappointingly, I have found that [[MediaWiki:1movedto2/qqq]] and similar
> messages on translatewiki include the {{ignored}} template, which says "This
> message is ignored on export for MediaWiki. Translating it is a waste of
> your effort!", and it seems to be true. The main cause of this is probably
> that the messages are only for irc.wikimedia.org, and irc.wikimedia.org will
> be replaced with EventStreams? I couldn't find the exact reason.
>
> In my opinion, the log entry messages are not just for IRC. IRC is one
> implementation of RCFeed, and RCFeed is a general interface that can be
> extended in many ways, as even the core includes multiple RCFeedEngines. So,
> if I'm not wrong, I'd like to create a ticket for enabling translation and
> request your opinions.
>
> Regards.
> -User:Lens0021
>
> * There are a few extensions with a similar purpose, but those extensions
> are not RCFeedEngines; they define their own system messages and use them
> instead of using the log entry messages.
>
> ---
> [1] https://www.mediawiki.org/wiki/Extension:DiscordRCFeed
> [2] https://www.mediawiki.org/wiki/Manual:$wgRCEngines

Re: [Wikitech-l] TechCom meeting 2020-12-16

2020-12-16 Thread Niklas Laxström
I'll also add that I wrote an implementation plan (or rather, a proposal) for
translatable modules. I'd appreciate feedback on the open questions and in
general. Do note, though, that there is currently no timeline for the
implementation.
https://www.mediawiki.org/wiki/Translatable_modules/Technical_implementation

  -Niklas


[Wikitech-l] TechCom meeting 2020-12-16

2020-12-15 Thread Niklas Laxström
This is the weekly TechCom board review in preparation of our meeting on
Wednesday. If there are additional topics for TechCom to review, please let
us know by replying to this email. However, please keep discussion about
individual RFCs to the Phabricator tickets.

Activity since Tuesday 2020-12-08 on the following boards:

https://phabricator.wikimedia.org/tag/techcom/
https://phabricator.wikimedia.org/tag/techcom-rfc/

Committee inbox: (none)

Committee board activity:

- T42787: Remove legacy ajax interface
  - I moved from Inbox to Watching
- T267213: Create WikiTeq group on Gerrit
  - Kizule is asking for an update.
  - It seems there are two things to do: create the group and update
    the policy.

New RFCs: (none)

Phase progression: (none)

IRC meeting request: (none)

Other RFC activity:

- T263841: RFC: Expand API title generator to support other generated data
  - There is a new proposal text. Please comment.
- T259771: RFC: Drop support for database upgrade older than two LTS releases.
  - Discussion on the task. To me it looks mostly in favor, or commenting
    that some of the problems are not fully solved by this proposal.
- T268326: RFC: Amendment to the Stable interface policy (November 2020)
  - Feedback given about the proposed 3-month minimum period
- T119173: RFC: Discourage use of MySQL's ENUM type.
  - Amir showed how to automatically generate on-wiki documentation for
    tables.
- T133452: RFC: Create temporary accounts for anonymous editors.
  - Many comments after Tim proposed to use cloaks (see his comment in the
    task for details)
- T214362: RFC: Store WikibaseQualityConstraint check data in persistent
  storage.
  - Krinkle asked for clarifications. Lucas responded with a comment and
    Lydia offered to have it explained in a call.
- T208776: RFC: Introduce PageIdentity to be used instead of WikiPage.
  - Krinkle and Daniel discuss implementation details in the context of
    what's best for type safety, clarity and migration.


  -Niklas


[Wikitech-l] TechCom meeting 2020-10-21

2020-10-19 Thread Niklas Laxström
This is the weekly TechCom board review in preparation of our meeting on
Wednesday. If there are additional topics for TechCom to review, please let
us know by replying to this email. However, please keep discussion about
individual RFCs to the Phabricator tickets.

Activity since Monday 2020-10-15 on the following boards:

https://phabricator.wikimedia.org/tag/techcom/
https://phabricator.wikimedia.org/tag/techcom-rfc/

Committee inbox:

- T239742: Should npm packages maintained by Wikimedia be scoped or unscoped?
  - Still in inbox

Committee board activity:

- T263904: Are traits part of the stable interface?
  - Daniel moved it to in progress (see last week's email thread)

New RFCs: (none)

Phase progression:

- T262946: Bump Firefox version in basic support to 3.6 or newer
  - P3 -> P4: ready to go on last call

IRC meeting request: (none)

Other RFC activity:

- T119173: RFC: Discourage use of MySQL's ENUM type
  - Concerns about removing existing uses, but it doesn't seem necessary
    to remove them


Re: [Wikitech-l] 📈 Wikimedia production errors help

2020-09-15 Thread Niklas Laxström
On Mon, 14 Sep 2020 at 23:49, Tyler Cipriani (tcipri...@wikimedia.org) wrote:
> The number of new tasks being created with this tag in a given week is
> outpacing the number of tasks being closed in a given week: this past
> week we added 41 tasks and only closed 22.

The majority of the recently created tasks are frontend JavaScript errors.
The logging of these errors has only started recently. These issues
may have been present for years already, but they are only being
reported now.
> This is beginning to be unsustainable :(

If there were an increase in the number of real new issues and/or a
decrease in the number of issues fixed, then I would be worried. Given
what I said above, it's difficult to tell whether this is the case.

Regardless, I do agree that we should aim to minimize production
errors to make it easier to spot any new issues. I would encourage all
maintainers and development teams to ensure that they have a regular
process for checking and triaging any production issues in the code
they maintain.

I think we should expect the number to go up while the backlog of
previously unreported frontend errors is being reported, and then to
start going down as developers work to reduce the backlog of reported
issues. It will probably stabilize at some level higher than before,
indicating that some areas of code lack maintainers or maintenance
resources.

Ending with a question: do we want to have both frontend and backend
errors on the same tag/board, or should they be on separate ones?

  -Niklas



[Wikitech-l] TechCom board review 2020-09-07

2020-09-07 Thread Niklas Laxström
This is the weekly TechCom board review in preparation of our meeting
on Wednesday. If there are additional topics for TechCom to review,
please let us know by replying to this email. However, please keep
discussion about individual RFCs to the Phabricator tickets.

Activity since Monday 2020-08-31 on the following boards:

https://phabricator.wikimedia.org/tag/techcom/
https://phabricator.wikimedia.org/tag/techcom-rfc/

Committee inbox:
* RFC: Parsoid Extension API - see TechCom weekly digest 2020-09-02

Committee board activity: none
New RFCs: none
Phase progression: none
IRC meeting request: none

Other RFC activity:
* MediaWiki's anonymous edit token leaves wiki installations (incl.
Wikipedia) open to mass anonymous spam we can't block
 - dbarratt suggests blocking anonymous POST requests whose Origin
header contains a value that is not in the allowlist (sketched below).
 - Bawolff thinks it could work if done correctly.
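
A minimal sketch of that idea (not the actual proposal; the configuration
variable and the placement of the check are hypothetical):

$origin = $request->getHeader( 'Origin' );
if ( $user->isAnon() && $request->wasPosted() && $origin !== false
	&& !in_array( $origin, $wgOriginAllowlist, true )
) {
	// Reject the request: anonymous POST from a non-allowlisted origin.
}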

  - Niklas



[Wikitech-l] Service wiring conventions in extensions

2020-07-15 Thread Niklas Laxström
The example below works, and I see it used in some extensions, but it
provides no autocompletion and does not catch typos.

Services.php:
class TranslateServices implements ContainerInterface {
	public function getParsingPlaceholderFactory(): ParsingPlaceholderFactory {
		return $this->container->get( 'Translate:ParsingPlaceholderFactory' );
	}

	public function getTranslatablePageParser(): TranslatablePageParser {
		return $this->container->get( 'Translate:TranslatablePageParser' );
	}
}

ServiceWiring.php:
return [
	'Translate:ParsingPlaceholderFactory' => function (): ParsingPlaceholderFactory {
		return new ParsingPlaceholderFactory();
	},

	'Translate:TranslatablePageParser' => function ( MediaWikiServices $services ): TranslatablePageParser {
		return new TranslatablePageParser(
			$services->get( 'Translate:ParsingPlaceholderFactory' ) # <-- typo-prone string lookup
		);
	},
];

Do you see any downsides of using code like below instead?

'Translate:TranslatablePageParser' => function (): TranslatablePageParser {
	$services = TranslateServices::getInstance();
	return new TranslatablePageParser(
		$services->getParsingPlaceholderFactory()
	);
},

I looked at other extensions and I noticed a lot of small differences
among them:
* Some extensions use static methods as opposed to wrapping the core
service container
* Some extensions use constants for service identifiers (see the sketch below)
* Lots of different implementations of "To avoid name conflicts, the
service names should be prefixed with the extension's name.":
** ExtensionService
** Extension.Service
** Extension:Service
** Extension_Service

Are we yet at a stage where we can agree on some (additional) conventions
and document them somewhere? Maybe at
https://www.mediawiki.org/wiki/Dependency_Injection
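
For example, the constants approach could look roughly like this (a sketch
with hypothetical names, not code from any particular extension):

class TranslateServices implements ContainerInterface {
	// One definition per service name: the wiring file and the typed
	// getter reference the same constant, so typos surface early.
	public const TRANSLATABLE_PAGE_PARSER = 'Translate:TranslatablePageParser';

	public function getTranslatablePageParser(): TranslatablePageParser {
		return $this->container->get( self::TRANSLATABLE_PAGE_PARSER );
	}
}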

  -Niklas


Re: [Wikitech-l] Documentation/Examples on enabling localization in Gadgets

2019-11-27 Thread Niklas Laxström
On Wed, 27 Nov 2019 at 0:38, Egbe Eugene (agboreug...@gmail.com) wrote:
>
> Hi All,
>
> Is there any documentation or Gadget I can have a quick look at to be able
> to learn how to enable translation in gadgets?

The only example I know of is
https://commons.wikimedia.org/wiki/Help:Gadget-ProveIt and it is using
Gerrit and translatewiki.net.
  -Niklas


Re: [Wikitech-l] Avatars coming to gerrit

2018-10-09 Thread Niklas Laxström
On Sat, 6 Oct 2018 at 3:44, Paladox via Wikitech-l
(wikitech-l@lists.wikimedia.org) wrote:

> We also have a license file for you to specify the license of your image at 
> https://gerrit.wikimedia.org/r/plugins/gitiles/All-Avatars/+/refs/heads/master/LICENSE
> If none is specified then it defaults to GPL 2.0+.

What if someone is uncomfortable with or unwilling to license a photo of
themselves under an open source licence?
  -Niklas


[Wikitech-l] You can now translate Phabricator to your language

2018-03-02 Thread Niklas Laxström
It's now possible to translate Phabricator in translatewiki.net thanks
to the Phabricator developers, Mukunda Modell, and many others who
participated in https://phabricator.wikimedia.org/T225

We are currently in an experimental phase, where these translations
are only used in https://phabricator.wikimedia.org, but the plan is to
propose these translations to Phabricator upstream.

A few languages are already available and the language setting can be
changed at https://phabricator.wikimedia.org/settings

You can help our multilingual user base by translating Phabricator.
You can find some more info about translating this project at
https://translatewiki.net/wiki/Translating:Phabricator, and you can
start translating directly at
https://translatewiki.net/w/i.php?title=Special:Translate&group=phabricator

The whole Phabricator project is large, with over 17,000 strings to
translate. Some simple and familiar strings can be found under the
"Maniphest" and "Project" sub-groups. "Maniphest" includes strings for
creating and searching tasks, and "Project" includes strings for
managing project workboards with columns. Here are the direct links to
them:
* https://translatewiki.net/w/i.php?title=Special:Translate&group=phabricator-phabricator-maniphest
* https://translatewiki.net/w/i.php?title=Special:Translate&group=phabricator-phabricator-project

  -Niklas


Re: [Wikitech-l] MediaWiki Stakeholders' meta-repository of extensions not in gerrit

2017-12-25 Thread Niklas Laxström
2017-12-24 1:05 GMT+02:00 Mark A. Hershberger :

>
> This afternoon, while procrastinating by cleaning up my clone of
> gerrit's mediawiki extensions repository, I decided to make a meta
> repository for some non-WMF-hosted extensions.  (Hopefully this will
> mean less need to clean up the clone in the future.)
>
> I got some low-hanging fruit to start it off, but I'm hoping I can get
> some pull requests to add more repositories.
>

Some more:
https://github.com/wikimedia/translatewiki/blob/master/repoconfig.yaml#L314-L433

Why submodules? Is someone planning to keep those up to date?

  -Niklas

[Wikitech-l] Language engineering monthly reports for September-November 2017

2017-12-05 Thread Niklas Laxström
The language engineering monthly reports for September, October and
November 2017 are ready.

**Highlights for these months**
* Lots of usability and design improvements to the Content Translation
dashboard.
* There is now a way to link directly to a particular message in
Special:Translate.
* MediaWiki Language Extension Bundle 2017.10 was released.
* Style updates in Universal Language Selector and Translate to align
with WikimediaUI styleguide.
* Translatable pages are now editable with the 2017 wikitext editor.
* An issue that made it difficult to log in or save translations in
translatewiki.net was investigated and resolved.
* Translate recent changes filters now support the new filter user interface.
* The old Translate extension translation view interface was finally
removed after four and a half years.
* Universal Language Selector search is much improved. It now returns
more consistent and comprehensive results and autocompletion
suggestions are more relevant.
* There is now a Latin-Cyrillic language converter for Crimean Tatar.

For more information about these and other updates, please see the full reports.

**Full reports**
https://www.mediawiki.org/wiki/User:Nikerabbit/Monthly_report/2017-09
https://www.mediawiki.org/wiki/User:Nikerabbit/Monthly_report/2017-10
https://www.mediawiki.org/wiki/User:Nikerabbit/Monthly_report/2017-11


For those unfamiliar with this report, its goal is to summarize all
technical changes to internationalization, translation tools and other
language support products. It also highlights the diversity of
contributors to this area and that many of them are volunteers.

  -Niklas


Re: [Wikitech-l] Try out the commit message validator

2017-11-06 Thread Niklas Laxström
2017-11-07 7:58 GMT+02:00 Kunal Mehta :
>
> But sometimes people aren't familiar with the guidelines, or more likely
> we make a typo somewhere. Here's where the commit-message-validator[2]
> comes in handy!
>

Does the tool check for typos? Typos in commit messages get merged all the
time, including in my own. I would like to avoid that :)

  -Niklas

[Wikitech-l] Language engineering monthly report for August 2017

2017-10-09 Thread Niklas Laxström
The language engineering monthly report for August 2017 is ready.

**Highlights for this month**
The language database is now an independent project, to make reuse easier.
It used to be part of the jquery.uls library. It contains over 500 entries
detailing basic information about languages, such as their autonym, their
writing script, and the region(s) of the world where they are spoken.
https://github.com/wikimedia/language-data

August was, so far, the most active month of 2017 for translatewiki.net
with the highest number of translation updates (55k) and translation
reviews (15k) by over 350 translators.

**Full report**
https://www.mediawiki.org/wiki/User:Nikerabbit/Monthly_report/2017-08


For those unfamiliar with this report, its goal is to summarize all
technical changes to internationalization, translation tools and other
language support products. It also highlights the diversity of contributors
to this area and that many of them are volunteers.

[Wikitech-l] Language engineering monthly report for July 2017

2017-09-06 Thread Niklas Laxström
The language engineering monthly report for July 2017 is ready.

*Highlights for this month*
The Content Translation dashboard has received a major facelift that aligns
it with the Wikimedia style guide and makes it easier to use.

Translatewiki.net now imports new messages up to 9 times per day. New
messages are made available for translation automatically, but any changed
messages must be checked by a human. We want to ensure that users get
new features in their own language. Please contact me if you want to help
us achieve robust around-the-clock coverage.

*Full report*
https://www.mediawiki.org/wiki/User:Nikerabbit/Monthly_report/2017-07


For those unfamiliar with this report, its goal is to summarize all
technical changes to internationalization, translation tools and other
language support products. It also highlights the diversity of contributors
to this area and that many of them are volunteers.

  -Niklas

Re: [Wikitech-l] VisualEditor in 1.28 - fails after upgrade from 1.27

2016-12-19 Thread Niklas Laxström
Maybe related: having both Semantic MediaWiki and VisualEditor
installed will still reliably trigger this error.

This was filed as https://phabricator.wikimedia.org/T134562 (which
imho should be reopened).

  -Niklas


[Wikitech-l] About the frontend development tools we use

2016-11-18 Thread Niklas Laxström
I was reading http://stateofjs.com/2016/introduction/#sections and
could not help noticing that the frameworks and technologies we use
are not among the most popular or most liked among the participants of
this survey.

Examples:
* Frontend frameworks: We use jQuery and OOjs UI. The latter does not
appear in the list at all, and jQuery is not in the top ten. This question
might be biased, though, by what people perceive as a framework.

* Testing frameworks: We mostly use QUnit, Cucumber and Selenium. Of these,
only Cucumber appears in the top 6, and it has very low satisfaction
(people who have used it do not like it).

* CSS tools: We use plain CSS and Less. Less has considerably lower
satisfaction than SASS/SCSS and is less popular.

* Build tools: We don't use these in core to my knowledge, but many
extensions seem to use Grunt for running linting tools. Again, Grunt
has very low satisfaction compared to other tools.

It is natural that, as a large and complex project, we do not jump to the
latest cool thing. I am not advocating changing tools that work well
for us, but I don't remember seeing a public discussion of whether they
work well or not. Though I am seeing some changes, for example
jscs+jshint being replaced with eslint.

We could possibly go faster or write better software with better tools
(of course this would need a careful evaluation). And while doing that
we could perhaps lower the barrier for new developers by using
something they already know. The topic of how to attract new
developers to our movement has been popular lately (for example [1]).

[1] https://phabricator.wikimedia.org/T148911



For me, one pain point is automated testing of JavaScript code. It
seems that testing frameworks, development practices and the way code
is written could all be improved to make automated testing easier.
Would there be interest in sharing how you do this, and whether what
you do works well for you?

  -Niklas


Re: [Wikitech-l] Discovery Weekly Update for the week starting 2016-10-03

2016-10-11 Thread Niklas Laxström
2016-10-10 1:25 GMT+03:00 Chris Koerner :
> = Discussions =
>
> == Search ==
> * Translate extension updated to allow searches in a specific language
> (without translations) [3]

Actually it was CirrusSearch that was updated.

Translate already supports filtering search results by language
when using the database search backend, and there is
Special:SearchTranslations when using ElasticSearch as the translation
memory backend.

  -Niklas


Re: [Wikitech-l] Gerrit 2.12.2 test instance - PLEASE TEST

2016-07-12 Thread Niklas Laxström
There are many good and bad changes I can live with. One thing I will miss:
* The Columns setting no longer wraps the diff lines. Horizontal scrolling
is now unavoidable and cumbersome, due to having to use either the mouse
or the arrow keys to move the cursor along the line.

Tips for others:
* I previously used 'r' to go straight to publishing my comments. This
shortcut is now 'a'.
* 'x' expands all history – the preference for having it always
expanded seems to be gone.

  -Niklas


[Wikitech-l] Announcements from Wikimedia Language team

2016-06-17 Thread Niklas Laxström
The Wikimedia Language team has been assembling monthly reports about
language support activities for one year. You can read the latest
report at:
https://www.mediawiki.org/wiki/Wikimedia_Language_engineering/Reports/2016-May

Highlights for May include: Special:Translate got an edit summary
field, and web font formats were modernized: woff2 is in, eot is out.

Due to the nature of our work, the Language team [1] (Amir, Kartik,
Pau, Runa, Santhosh, and myself) alone cannot adequately support all
the languages of the Wikimedia movement. That is why the report
includes work by volunteers. We have bolded the names of those who we
believe are contributing as volunteers.

This report focuses on technical activities. You won't find future
plans or high-level roadmap items in it. There is currently one major
omission: the i18n work on MediaWiki core itself. That is lacking
because it is more difficult to filter those activities and also
because we have not had much time for MediaWiki core i18n work.

To acknowledge the work of volunteers and to support them better, the
Language team released a statement of intent for code review [2] about
six months ago. To summarize: we attempt to review patches not authored
by us within a week, and patches stalled for three months with no
updates after review will be abandoned -- unless we feel they are worth
fixing ourselves.

When we released the statement, we also agreed to reduce the existing
backlog of open patches. The results so far are positive, even though
it is easy to find examples where we have not been able to follow our
intent. The Translate extension had 35 open patches when we started in
February -- at the end of May it had only 12 open patches [3]. Universal
Language Selector has gone from 10 to 6, with fewer of them unreviewed.
Content Translation has gone from 15 to zero. Our jquery repositories
on GitHub have not fared as well, but we hope to achieve similar
results there in the future.

We excluded many repositories from the statement of intent for fear
that we would add too much of a burden to ourselves. To our delight,
except for MediaWiki core i18n, all those repositories have had swift
reviews, and I count only two open patches in them.

  - Niklas (on behalf of the Language team)

[1] https://www.mediawiki.org/wiki/Wikimedia_Language_engineering
[2] 
https://www.mediawiki.org/wiki/Wikimedia_Language_engineering/Code_review_statement_of_intent
[3] The numbers change constantly. As of 2016-06-17 Translate has 23
open patches, but only 10 of them not from our team. Universal
Language Selector has 13 patches, 5 of them not from our team. Content
Translation currently has 6, one of them not from our team.


Re: [Wikitech-l] 2016-05-25 Scrum of Scrums meeting notes

2016-05-31 Thread Niklas Laxström
2016-06-01 9:32 GMT+03:00 Niklas Laxström :
> 2016-05-31 19:23 GMT+03:00 Grace Gellerman :
>> https://www.mediawiki.org/wiki/Scrum_of_scrums/2016-05-25
>> === Release Engineering ===
>> * '''Blocking''': ???
>> * '''Blocked''': none
>> * '''Updates''':
>> ** wmf.3 is rolling forward this week
>> ** rc.0 of 1.27 should be out this week
>
> According to the deployment calendar wmf.4 is going out normally. I
> assume that one is right?

Okay I just realized this is for last week. Sorry for the noise.

  -Niklas


Re: [Wikitech-l] 2016-05-25 Scrum of Scrums meeting notes

2016-05-31 Thread Niklas Laxström
2016-05-31 19:23 GMT+03:00 Grace Gellerman :
> https://www.mediawiki.org/wiki/Scrum_of_scrums/2016-05-25
> === Release Engineering ===
> * '''Blocking''': ???
> * '''Blocked''': none
> * '''Updates''':
> ** wmf.3 is rolling forward this week
> ** rc.0 of 1.27 should be out this week

According to the deployment calendar wmf.4 is going out normally. I
assume that one is right?

  -Niklas


Re: [Wikitech-l] update: wikipedia.org portal

2016-05-24 Thread Niklas Laxström
2016-05-25 2:58 GMT+03:00 Dan Garry :
> Hey Purodha,
>
> On 21 May 2016 at 05:13, Purodha Blissenbach 
> wrote:
>
>> On the long run, I think, these portals and their texts should
>> be translatable. Browser settings determining the target language.
>> Looking forward to have them on translatewiki.net !
>>
>
> I agree that localising these strings would be helpful and in-line with our
> practices. That's definitely something that we're interested in doing, and
> we're going to be doing an investigation on that soon. We're hoping it'll
> be fairly straightforward to get this done... but if it's not, we may need
> to deprioritise the work. We'll see.

Dan,

Localising a few static strings is not difficult, and I (and others)
have already met with your team to share information on how to do this.
In my opinion, deprioritizing this work would be against our mission
and values. I am available to help if you run into any issues during
the process.

  -Niklas


Re: [Wikitech-l] Security release later today + 1.28 stuff

2016-05-20 Thread Niklas Laxström
I realize no time is a good time for security releases, but this will
be Friday evening or even Saturday night in some parts of the world.

  -Niklas


[Wikitech-l] MediaWiki codesniffer updates on extensions

2016-05-10 Thread Niklas Laxström
I noticed that lots of extensions were recently updated to use
mediawiki-codesniffer 0.7.1 [1].

This has some implications I think are worth being aware of:

1) Many open patches will need manual rebasing and conflict resolution.
2) Those extensions will now depend on at least PHP 5.4, due to the short
array syntax [2] (illustrated below), instead of PHP 5.3.

Both can cause inconveniences, (1) mainly for developers and (2)
mostly for users.
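
For reference, the syntax change in question (a minimal illustration):

// PHP 5.3: only the long array syntax is available.
$languages = array( 'fi', 'sv', 'en' );

// PHP 5.4+: the short array syntax the updated sniffs produce.
$languages = [ 'fi', 'sv', 'en' ];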

Some patches doing this update in a more thorough and tested way [3]
existed, but they were ignored. It would be nice to avoid that in the
future.

Please also watch out for code that might have been removed
unexpectedly due to a bug [4]. On quick inspection, I did not see this
happening in those patches.

[1] https://gerrit.wikimedia.org/r/#/q/status:merged+topic:bump-dev-deps,n,z
[2] https://secure.php.net/manual/en/migration54.new-features.php
[3] For example https://gerrit.wikimedia.org/r/#/c/287315/
[4] https://phabricator.wikimedia.org/T134857

  -Niklas


Re: [Wikitech-l] [Mediawiki-i18n] Providing the effective language of messages

2016-04-13 Thread Niklas Laxström
2016-04-12 14:01 GMT+03:00 Adrian Heine :
> Hi everyone,
>
> as some of you might know, I'm a software developer at Wikimedia
> Deutschland, working on Wikidata. I'm currently focusing on improving
> Wikidata's support for languages we as a team are not using on a daily
> basis. As part of my work I stumbled over a shortcoming in MediaWiki's
> message system that – as far as I see it – prevents me from doing the right
> thing(tm). I'm asking you to verify that the issue I see indeed is an issue
> and that we want to fix it. Subsequently, I'm interested in hearing your
> plans or goals for MediaWiki's message system so that I can align my
> implementation with them. Finally, I am hoping to find someone who is
> willing to help me fix it.

First of all, thanks for working on this issue. It is a real issue,
but not often requested. I think that is because manually checking in
every place whether the language code is unexpected (different from
the one in the current context) would be cumbersome, and always
outputting language codes on every tag would be bloaty. It would be
best if this checking were automated in a templating library, but so
far templating hasn't been much adopted in MediaWiki core. But of
course this information needs to be exposed first, which is what I
understand you are doing.


> == The issue ==
>
> On Wikidata, we regularly have content in different languages on the same
> page. We use the HTML lang and dir attributes accordingly. For example, we
> have a table with terms for an entity in different languages. For missing
> terms, we would display a message in the UI language within this table. The
> corresponding HTML (simplified) might look like this:
>
> <table>
>   <td lang="OTHERLANG1" dir="OTHERLANG1_DIR">
>     TERM1
>   </td>
>   <td lang="OTHERLANG2" dir="OTHERLANG2_DIR">
>     TERM2
>   </td>
>   <td lang="UILANG" dir="UILANG_DIR">
>     MISSING_LABEL_MESSAGE
>   </td>
> </table>
>
> This works great as long as the missing label message is available in the UI
> language. If that is not the case, though, the message is translated
> according to the defined language fallbacks. In that case, we might end up
> with something like this:
>
> <td lang="UILANG" dir="UILANG_DIR">No label defined</td>
>
> That's obviously wrong, and I'd like to fix it.
>
> == Fixing it ==
>
> For fixing this, I tried to make MessageCache provide the language a message
> was taken from [1]. That's not too straight-forward to begin with, but while
> working on it I realized that MessageCache is only responsible for following
> the language fallback chain for database translations. For file-based
> translations, the fallbacks are directly merged in by LocalisationCache, so
> the information is not there anymore at the time of translating a message. I
> see some ways to fix this:
>
> * Don't merge messages in LocalisationCache, but perform the fallback on
> request (possibly caching the result)
> * Tag message strings in LocalisationCache with the language they are in
> (sounds expensive to me)
> * Tag message strings as being a fallback in LocalisationCache (that way we
> could follow the fallback until we find a language in which the message
> string is not tagged as being a fallback)
>
> What do you think?

The current localisation cache implementation quite obviously trades
space for speed. In this light, I would suggest option two: tagging the
actual language the string is in.

However, this trade-off might not make sense anymore, as we have more
languages and more messages, resulting in almost gigabyte-size caches.
See also, for example, https://phabricator.wikimedia.org/T99740. I added
wikitech-l to CC in the hope that people who have worked on the
localisation cache more recently will comment on whether option one,
not merging messages, would make more sense nowadays.
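
To make the consumer side concrete, here is roughly how the Wikidata case
could use the exposed information (a sketch only: the getEffectiveLanguage()
accessor does not exist yet, and the message key is illustrative):

$msg = wfMessage( 'wikibase-label-empty' )->inLanguage( $uiLang );
$actualLang = $msg->getEffectiveLanguage(); // hypothetical accessor
$html = Html::element(
	'td',
	[ 'lang' => $actualLang->getHtmlCode(), 'dir' => $actualLang->getDir() ],
	$msg->text()
);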

>
> [1] https://gerrit.wikimedia.org/r/282133
>

  -Niklas


Re: [Wikitech-l] <translate> tags are a usability nightmare for editing on mediawiki.org

2016-04-11 Thread Niklas Laxström
2016-04-05 8:51 GMT+03:00 Jon Robson :
> Special:Translate doesn't work [1] and the current plan is to make it
> redirect to desktop which is disappointing and I'd guess loses us lots of
> potential editors (myself included).

When we developed the new interface for Special:Translate (aka TUX), we
did some testing to make sure it works on tablets. Is there a way to mark
it as suitable for tablets alone, given that we have not designed it for
smartphones?

And what can we do for the rest? Let them access TUX too, acknowledging
that it will be heavy and clunky? Would it make sense to generate a minimal
non-JavaScript version for everyone else, with which they can get the
job done if they are desperate, but without all the advanced features
of the regular TUX UI?

Or in short, I am wondering whether "mobile support" is all or
nothing, or whether there is some middle way where we can have some
quick wins if the alternative is to have no support at all.


> To take a more concrete example of impact on mobile - we've made the mobile
> skin play nicely with language interwiki links (we've even dedicated this
> entire quarter to improving language switching on mobile web [2] ). On the
> other hand, the languages tag does not work the same way as an interwiki
> link. It does its own thing, which is sadly suffering from usability issues
> on mobile [3].

This is technical debt. After an over-a-year-long push on Content
Translation, the Language team is now dedicating some time to addressing
high-priority bugs in Translate and Universal Language Selector. There
is some related work happening, such as taking "compact interlanguage
links" out of beta and a soon two-year-old patch in Translate to improve
the display of the language list [1]. It would be useful to look at
the big picture here, but there likely isn't enough time beyond fixing
the most obvious issues for now.

[1] https://gerrit.wikimedia.org/r/#/c/149585/

> I hope my views on this are a little clearer to you now and apologies for
> putting you on the defensive if I did.

Thanks for the explanation, because it was not obvious to me what the
pain points are from your point of view. As for the usability issues you
listed, they do affect desktop users as well, but I do appreciate that
they are more severe on mobile. For performance, is that only about
the unsuitable output from the <languages> tag, or is there more to it?


> I'd love to see our new language switcher compatible with the output of the
> translate tag and the translation mechanisms available on mobile phones, so
> readers can view translations and edit seamlessly around our projects.

This is helpful, and I will consider it when prioritizing work. I am
not well aware of your current priorities [2], but if you think this
is important, would you consider helping us with this issue?

[2] I am having a hard time navigating through outdated pages on
mediawiki.org. Is this the place to look:
https://www.mediawiki.org/wiki/Reading/Web/Projects ?

  -Niklas


Re: [Wikitech-l] productivity of mediawiki developers

2016-04-04 Thread Niklas Laxström
2016-04-04 17:02 GMT+03:00 Quim Gil :
> The first question to answer is what information are you looking for when
> you want to measure developers' "productivity". What would be the
> motivation of that estimation? What is the motivation behind this thread?

One reason comes to mind. My gut feeling is that we are not very
good at consistently giving recognition for technical work. One
possible reason is that we do not have clear and understandable
metrics, or do not promote those metrics enough. Nor am I aware of any
process for awards and celebration (the Academy Awards would be an
example in another context; also Wikipedian of the Year).

As an example, I vaguely recall that during the Bugzilla times we used
to have regular emails on wikitech-l listing the people who closed the
most bugs.

Having some metrics for different activities could stir up some
healthy competition (and also unhealthy competition, if we are not
careful), and of course there is a lot of important work that is not
visible from the numbers alone.

I am no expert on this subject, but I think developers (especially
volunteers, but also others) are more likely to stick around if they
feel that their work is recognized and appreciated. For the latter, we
already know that we should improve our code review process.

  -Niklas


Re: [Wikitech-l] <translate> tags are a usability nightmare for editing on mediawiki.org

2016-04-04 Thread Niklas Laxström
2016-04-04 16:00 GMT+03:00 Subramanya Sastry :
> Niklas and the language team: thanks for your efforts in enabling
> translation features. They are truly important and necessary.

And I want to thank you for your positive and constructive approach
for solving this issue.


> 1. Given the success of CX, and the increasing use of VE, is explicit
> wikitext markup still necessary for enabling translation?

I am actually eager to try out a markup-free approach for Translate's
page translation feature. I am not aware of any fundamental reason it
could not work without additional wikitext markup. But I would like to
clarify some things about the relation between Translate (specifically
its page translation feature) and Content Translation (CX).

The use case for CX is very different from Translate's page
translation. The former supports one-time "free form" translation from
one language to another. The latter supports maintaining
content-preserving translations from one language to hundreds of
languages, so that translations can be kept in sync when the source
text changes.

Along the same lines, the occasional suggestions to replace either
extension with the other are not practical. My opinion is that the
way to bring these closer together is to take the best parts of both
(and also of others like VE) and combine them to produce tools which are
tailored for each use case and which are consistent in appearance and
functionality. Translate does so much more that it is easier to add a
new translation interface to Translate than it is to re-implement all
the backend tracking functionality in CX.


> 2. Identifying document fragments for translation is another instance of the
> same problem of associating metadata with document fragments *across edits*.
> Citations, content-translation, comments-as-documentation, authorship
> information, maintaining-association-between-translated-fragments, etc. are
> all different instances of this problem. Translate extension seems to be
> using comments in wikitext markup as one way to maintain this
> cross-edit-association. Maybe there is value in thinking about this problem
> broadly.

Definitely. As we have discussed earlier, the definition of what is a
"breaking change" does vary between the different use cases, and in
fact it is left to humans (translation admins) to decide in the page
translation feature. But the other approach, of each tool implementing
it from scratch, is not ideal either.

One thing to remember here is that there are two goals for the
current markup. The association of content across edits is only one
of them (these are the T-comments you refer to). The other goal is to
specify what is and what is not translatable. For example, one
frequent use case is to mark for translation only the image captions
but not the rest of the image wikitext markup. I am hopeful that some
heuristics, with additional tools for manual correction, would reach
good results for this goal.

Yet another related thing is that we will want to support
WYSIWYG/HTML translation in Translate. As you might already know,
Special:Translate has two modes: the list view and the page view. The
page view should be replaced with an interface not unlike CX, but we
also need some support for the list view.


> 3. Are there incremental steps we can take to start deprecating the existing
> <translate> extension solution and move towards some structural metadata
> solution? Given that translate extension is not used on *all wikis*, looking
> at the wikis where translation is enabled, is there a possibility of making
> this a VE-only / CX-only feature? CX, for example, is already doing work to
> maintain association between translated fragments across document edits.

What to do with the translate tags is a delicate question, and whether
to stop supporting them altogether depends on many things, such as how
long there will continue to be third-party wikis (where Translate is
also used a lot) that prefer to use wikitext instead of VE and Parsoid
or alternatives. See also my thoughts on your first question.

Right now my answer is no: we cannot make this implementation of the
feature VE- or CX-only. My preference is to start developing a
parallel markup-less system and to provide migration tools. This new
system can depend on VE, Parsoid and other modern functionality. Its
implementation will require cross-team planning and collaboration.

  -Niklas


Re: [Wikitech-l] <translate> tags are a usability nightmare for editing on mediawiki.org

2016-04-04 Thread Niklas Laxström
To Brion and other people who think the page translation markup is
annoying and a usability issue: As the (then volunteer) developer who
created it, I can only agree.

The way page translation currently works, which is extensively
documented at [0], is the result of lots of experimenting with what
works and what does not. When developing this feature, I got some
first-hand experience working with the MediaWiki parser, which was not
always easy.

Yes, the wikipage translation feature can and should be improved as
technology progresses: we now have a visual editor which did not exist
when this feature was developed. However, as shown by statistics [1],
these issues do not prevent the feature from being used to translate
thousands of pages, including the weekly tech news. The markup issue
is not a reason to stop using this feature.

For this quarter the Language team is going to address high-priority
issues in Translate that can prevent proper use of the page
translation feature [2]. Our team is small and has a huge scope of
work. Only with the help of others is it possible to proceed faster
and cover more ground.

To remind us of our visions: we should "make efforts to support the
translation of key documents into multiple languages" [3] and we
should "provide the essential infrastructure for the support and
development of multilingual wiki projects" [4].

I do not wish to spend my time arguing repeatedly for these goals.
Instead I want to work on making them happen. My request is that we
(especially those of us who are English speakers and developers)
accept some inconvenience when necessary to support multilingualism. [5]

[0] https://www.mediawiki.org/wiki/Help:Extension:Translate
[1] 
https://www.mediawiki.org/wiki/Wikimedia_Language_engineering/Reports/2016-March#Usage_data_2
[2] https://phabricator.wikimedia.org/project/profile/1854/
[3] https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Guiding_Principles
[4] https://wikimediafoundation.org/wiki/Mission_statement
[5] Not limited to this thread; see
https://gerrit.wikimedia.org/r/#/c/214893/ and
https://phabricator.wikimedia.org/T39797 for examples which have been
stuck for a long time.

  -Niklas


Re: [Wikitech-l] <translate> tags are a usability nightmare for editing on mediawiki.org

2016-04-03 Thread Niklas Laxström
2016-04-03 11:29 GMT+03:00 Jon Robson :
> The Translate tag has always seemed like a hack that I've never quite
> understood.

I am happy to direct anyone who asks to our documentation [1], or to
explain if the documentation is not sufficient.

The word hack can have both positive and negative meanings, and it is
unclear what you mean by it here.

> The translate tag causes lots of issues on mobile (impacting usability and
> performance) due to not playing well with the rest of the language
> ecosystem.

I was under the impression that mobile does not work with the wikitext
directly. Translate tags never appear in the parsed wikitext output.
Can you please point me to the tasks in Phabricator where these issues
are explained? I am especially curious about the "rest of the language
ecosystem" part, because having worked on many parts of it, I don't
feel the same way.

[1] https://www.mediawiki.org/wiki/Help:Extension:Translate

  -Niklas


Re: [Wikitech-l] Closing wikibugs-l and mediawiki-commits firehoses?

2015-11-12 Thread Niklas Laxström
2015-11-13 4:22 GMT+02:00 Chad :
>
> Considering ~50% of subscribers weren't even using the list, and
> we only have 130 remaining subscribers between the two, who would
> be terribly upset at closing one or both of these lists?


It would break my workflow of following all merged commits. I have the
following filters:

list:"mediawiki-commits.lists.wikimedia.org" -{"Gerrit-MessageType: merged"}
--> Delete
list:"mediawiki-commits.lists.wikimedia.org" "Gerrit-MessageType: merged"
--> Apply tag

  -Niklas

Re: [Wikitech-l] GSoC & Outreachy IRC Showcase

2015-09-22 Thread Niklas Laxström
2015-09-22 13:37 GMT+03:00 Niharika Kohli :

> last but not the least, an awesome new search feature for TranslateWiki.
>

The new features are not only for translatewiki.net; they also work on all
Wikimedia sites (and other wikis) where Translate is installed.

For example, https://meta.wikimedia.org/wiki/Special:SearchTranslations

  -Niklas

Re: [Wikitech-l] [MediaWiki-announce] MediaWiki Security and Maintenance Releases: 1.25.2, 1.24.3, 1.23.10

2015-08-11 Thread Niklas Laxström
2015-08-11 0:54 GMT+03:00 Chad :

> I would like to announce the release of MediaWiki 1.25.2, 1.24.3, and
> 1.23.10.
>

Can "merge patches to master" be added to the release checklist so that
sites running form master, such as translatewiki.net, can update promptly,
please. This is not the first time it has been delayed.

  -Niklas

Re: [Wikitech-l] MediaWiki Language Extension Bundle 2014.11 release

2014-12-01 Thread Niklas Laxström
2014-12-01 8:41 GMT+02:00 David Chamberlain :
> I run 5 wikis for a total of 6,000 pages and still use solr in 6 languages.

Thanks for the information, David. We will keep the code for the Solr backend.
  -Niklas


[Wikitech-l] Plural rule changes for many languages - MediaWiki migrates to CLDR 26

2014-10-27 Thread Niklas Laxström
MediaWiki is upgrading its plural rules to match CLDR version 26. The
updates include incompatible changes for plural forms in Russian,
Prussian, Tagalog, Manx and several languages that fall back to
Russian [1]. In addition there are minor changes for other languages.

In January 2014, CLDR 24 introduced several changes in the plural
forms for some of these languages, including Russian, and we updated
MediaWiki's plural rules to comply with the CLDR standard. Some of
these changes are now being reverted. Below is a detailed explanation
of the changes.

For the migration period, from Monday, 27th October 2014 to Thursday
6th November 2014, we have disabled LocalisationUpdate at Wikimedia
wikis to reduce the chance of ungrammatical translations being
displayed in the interface.

Developers do not need to take special actions, but if you use master
and do not update core, extensions and skins all at the same time, you
might see ungrammatical translations in the languages mentioned above.

I recommend that sysadmins avoid mixing different versions of core and
extensions, for the same reason as above. If you have users in the
above languages and are also using the LocalisationUpdate extension,
you should consider disabling the extension until your version of core
includes the plural rule patch: https://gerrit.wikimedia.org/r/#/c/161920/

More details of the actual rule changes can be found at
https://translatewiki.net/wiki/Thread:Support/Plural_rule_changes_for_many_languages

[1] Abkhaz (ab), Avaric (av), Bashkir (ba), Buryat (bxr), Chechen
(ce), Crimean Tatar (crh-cyrl), Chuvash (cv), Ingush (inh),
Komi-Permyak (koi), Karachay-Balkar (krc), Komi (kv), Lak (lbe),
Lezghian (lez), Eastern Mari (mhr), Western Mari (mrj), Yakut (sah),
Tatar (tt), Tatar-Cyrillic (tt-cyrl), Tuvinian (tyv), Udmurt (udm),
Kalmyk (xal).

  -Niklas


Re: [Wikitech-l] Fixing PollNY -- ResourceLoader woes

2014-09-17 Thread Niklas Laxström
> mw.loader.using( 'ext.pollNY.lightBox', function() {
> LightBox.init();
> } );
>
> ((Any other code using LightBox.* should probably do the same thing,
> athough mw.loader.using is using jQuery's "promises" (I really want to
> call this something else, since jQuery doesn't implement proper
> promises) incorrectly so I'm not sure how well that will work))

That's difficult to read with the multiple levels of nesting and parentheses.

1) jQuery's promises are not compatible with the Promises/A+ or ES6 promise specs.
** How is this relevant?
2) mw.loader.using uses promises incorrectly?
** How? Is there a bug report? What issues does it cause?

Also [1], it is possible to use the regular syntax for jQuery promises:
mw.loader.using( 'ext.pollNY.lightBox' ).done( function () { ... } );

[1] Since 1.23, if you dig through the git history. There is no mention
in the documentation of when it was added.

  -Niklas


Re: [Wikitech-l] The future of skins

2014-08-26 Thread Niklas Laxström
2014-08-27 1:53 GMT+03:00 Jon Robson :
> 2) We need a templating system in core. Trevor is going to do some
> research on server side templating systems. We hope that the
> templating RFC [1] can get resolved however we are getting to a point
> that we need one as soon as possible and do not want to be blocked by
> the outcome of this RFC, especially given a mustache based templating
> language can address all our current requirements.
>
> [1] 
> https://www.mediawiki.org/wiki/Requests_for_comment/HTML_templating_library

At this point just about everyone needs a templating library as soon
as possible and does not want to be blocked by the RFC. Please, let's
all work together to complete the RFC for the benefit of everyone.

  -Niklas


Re: [Wikitech-l] Using Composer to manage libraries for mediawiki/core on Jenkins and Foundation cluster

2014-06-02 Thread Niklas Laxström
2014-05-30 0:57 GMT+03:00 Bryan Davis :
> I think bug 65188 [0] is the solution suggested by Ori that you are
> referring to. Would this alone be enough to fix the problems for
> translatewiki.net? More directly, is translatewiki.net using the top
> level composer.json file to manage anything other than extensions? In
> the near term is there any better work around for you (and others in a
> similar position) other than running `git update-index
> --assume-unchanged composer.json` in your local checkout to make it
> ignore anything that happens to composer.json?

Now that composer.json also includes dependencies for core, ignoring
changes to it would also break things.

To make things worse, I noticed in my development environment that our
own scap equivalent will just go on to run composer update even if the
file has a merge conflict. This causes it to remove the extensions and
libraries we currently install via Composer, also breaking the site.

  -Niklas


Re: [Wikitech-l] Using Composer to manage libraries for mediawiki/core on Jenkins and Foundation cluster

2014-05-29 Thread Niklas Laxström
2014-05-29 20:27 GMT+03:00 Bryan Davis :
> What use cases did I miss? What other concerns do we have for this process?

The email subject does not cover third-party users, so apologies if
this is not the correct place for this.

Currently, updating the translatewiki.net codebase is annoying, as git
does not handle files replaced with symlinks well.

I am not happy, since just a short while ago I had to spend unplanned
effort to be able to install extensions via Composer. We replace
composer.json with a symlink to our configuration repo, which includes
the extensions we install via Composer.

Ori has come up with a good solution for this issue, but that solution
requires changes to Composer. To me it looks like nobody is currently
working on that.

Is there a way I can expedite this fix?

Thank you for your efforts to bring composer support to MediaWiki. I
wish the process was less rough for me.

  -Niklas


Re: [Wikitech-l] Output from Zurich Hackathon - yet another maps extension!

2014-05-15 Thread Niklas Laxström
2014-05-14 1:34 GMT+03:00 Jon Robson :
> During the Zurich hackathon, DJ Hartman, Aude and I knocked up a
> generic maps prototype extension [1]. We have noticed that many maps
> like extensions keep popping up and believed it was time we
> standardised on one that all these extensions could use so we share
> data better.

1) Will it be moved to Gerrit?
2) One of my use cases is creating maps of translators [1]. Will it be
able to display OSM maps based on Semantic MediaWiki properties?

[1] https://translatewiki.net/wiki/Map_of_translators

2014-05-14 19:43 GMT+03:00 Dan Andreescu :
> For the short term, I think further exploration of the Map namespace is
> great, but I think generic visualization work could go into the
> Visualization namespace.  My suggestion for a name for this namespace may
> seem a bit obscure.  It's a word that means "to illuminate": Limn [5].

3) It's a word that is difficult to translate.

  -Niklas


Re: [Wikitech-l] Upcoming jQuery upgrade (breaking change)

2014-05-09 Thread Niklas Laxström
Is there any way to deliver the deprecation messages to a server-side
log? I'd rather not spend time inspecting all user scripts and gadgets
manually (I'm thinking of non-WMF sites).

  -Niklas


Re: [Wikitech-l] Performance issue with regard to JSON-based localisation format

2014-04-14 Thread Niklas Laxström
2014-04-14 15:35 GMT+03:00 Adrian Lang :
> you are right, although I'm not using an older version of MediaWiki,
> I'm on 77bc489c2731827b1c61a6509177eed23193d694 from 2014-04-11.

Is something loading the i18n files manually, then? Or is there a
missing $wgMessagesDirs definition matching $wgExtensionMessagesFiles?
LC was made so that it skips loading the PHP shims when the JSON
alternative is present.
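
For reference, a typical transitional setup registers both, roughly
like this ("MyExtension" and the paths are illustrative):

// JSON messages; preferred by LocalisationCache when present:
$wgMessagesDirs['MyExtension'] = __DIR__ . '/i18n';
// PHP shim, kept only for compatibility with MediaWiki <= 1.22:
$wgExtensionMessagesFiles['MyExtension'] = __DIR__ . '/MyExtension.i18n.php';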

  -Niklas


Re: [Wikitech-l] Performance issue with regard to JSON-based localisation format

2014-04-14 Thread Niklas Laxström
I think the technically easiest solution is to modify the i18n.php files:

-$GLOBALS['wgHooks']['LocalisationCacheRecache'][] = function ( $cache, $code, &$cachedData ) {
+$GLOBALS['wgHooks']['LocalisationCacheRecache'][__FILE__] = function ( $cache, $code, &$cachedData ) {

This makes it so that if the file is included again, it will just
override the previous callback set in that file, instead of adding a
new one.

The downside of this approach is that someone needs to change this in
all the hundreds of extensions.

Modifying LC itself does not help users like you who are running older
versions of MediaWiki [1].

[1] The shims are only used in <= 1.22.

  -Niklas


Re: [Wikitech-l] Bug 24620 - Log entries are difficult to localize; rewrite logs system

2014-04-05 Thread Niklas Laxström
The rewrite did happen, so what is actually left, as Nemo pointed out,
is converting the remaining places in the code that use the old
logging system to the new one.

I tried to clarify that by changing the bug title and blockers.

PS: the new logging system also brings other benefits than better i18n.

  -Niklas


Re: [Wikitech-l] Implementation JSON based localisation format for MediaWiki nearly completed

2014-04-03 Thread Niklas Laxström
2014-04-03 16:53 GMT+03:00 Justin Folvarcik :
> Just a question: Are the messages still able to be accessed like they were
> before, or will new methods be introduced to gather them and decode the
> JSON?

There are no visible changes for developers using the messages api:
https://www.mediawiki.org/wiki/Manual:Messages_API

  -Niklas


Re: [Wikitech-l] Implementation JSON based localisation format for MediaWiki nearly completed

2014-04-02 Thread Niklas Laxström
>> use tabs for indentation instead of spaces to be like the rest of
>> mediawiki?
>>
>
> I was going to say the same thing. Why wasn't that caught in code review?

Spaces were chosen because that is what we get with FormatJson::encode,
and there is no way to change it except by post-processing. I'm fine
with both tabs and spaces.

Unless you change FormatJson::encode (which is used by many other
things as well), you would need to apply this post-processing in
multiple places, making it harder to alter these files from PHP code,
and you would take a small performance hit for the extra processing.
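
A minimal sketch of such a post-processing step, assuming the messages
array and target file name are at hand (FormatJson::encode with
$pretty = true indents with four spaces):

$json = FormatJson::encode( $messages, true );
// Convert each four-space indentation level into one tab.
$json = preg_replace_callback(
	'/^(?:    )+/m',
	function ( $matches ) {
		return str_repeat( "\t", strlen( $matches[0] ) / 4 );
	},
	$json
);
file_put_contents( $fileName, $json . "\n" );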

> Also, I see we've lost the helpful comments that used to be in some of
> these files to visually divide things into sections.

That is true. JSON does not allow comments.

On the other hand, we can now stop updating messages.inc.

  -Niklas


Re: [Wikitech-l] Encoding (Localisation updates from https://translatewiki.net.)

2014-03-26 Thread Niklas Laxström
For UTF-8, adding ensure_ascii=False to json.dumps would fix it. For
HTML, there is no simple way as far as I know. With some searching you
can find some workarounds. Or you can consider using
https://github.com/simplejson/simplejson

I did point out this issue almost a week ago:
https://gerrit.wikimedia.org/r/#/c/119637/4/i18n/qqq.json
  -Niklas


Re: [Wikitech-l] Webfonts

2014-03-14 Thread Niklas Laxström
I just want to clarify that I was highlighting the possibility of
considering webfonts for *typography*.

I expect everyone to know by now that the tofu issue is not yet solved
and that people are working on it.

  -Niklas


[Wikitech-l] Webfonts

2014-03-13 Thread Niklas Laxström
There have recently been questions about whether WMF is able to serve
webfonts. Some people think that because of the issues that led to
disabling webfonts by default in the Universal Language Selector (ULS),
WMF is not ready to consider webfonts for typography.

I don't think that way. ULS is not a good comparison point because of
the following.

1) The Universal Language Selector is trying to solve a much harder
issue than what webfonts are usually used for. It is trying to avoid
tofu (missing fonts), which brings a whole list of issues that are
otherwise absent or much smaller:
* large fonts for complex scripts,
* detecting which fonts are missing,
* many fonts per page,
* the systems with greatest need of fonts often have bad renderers.

2) WMF has a lot of experience working with webfonts by now. We know
how to handle different formats, how to optimally compress fonts and
how to check the results in different systems and browsers. In some
areas, such as non-Latin fonts, we are even ahead of Google.

Thus, I think that delivering relatively small fonts for simple
scripts like Latin and Cyrillic is possible *if* we are willing to
accept that it will take some bandwidth and that the page load
experience can be affected* if the font is not cached or present
locally.

  -Niklas

* The unwanted effects of using webfonts are getting smaller and
smaller with modern browsers.


Re: [Wikitech-l] Gerrit Commit Wars

2014-03-08 Thread Niklas Laxström
2014-03-08 0:39 GMT+02:00 George Herbert :
> This is not disrespecting development, which is extremely important by any
> measure.  But we're running a top-10 worldwide website, a key worldwide
> information resource for humanity as a whole.  We cannot cripple
> development to try and maximize stability, but stability has to be priority
> 1.  Any large website's teams will have the same attitude.

Please do not forget the contributors who want to improve MediaWiki
for their own needs. We also have to balance how much we inconvenience
them to meet the requirements of WMF. In my opinion, the balance is
already in favor of WMF.

  -Niklas


Re: [Wikitech-l] GSOC 2014 idea

2014-02-28 Thread Niklas Laxström
2014-02-28 11:09 GMT+02:00 Roman Zaynetdinov :
> From which source gather the data?
>
> Wiktionary is the best candidate, it is an open source and it has a wide
> database. It also suits for growing your project by adding different
> languages.

It's not obvious why you have reached this conclusion.

1) There are many Wiktionaries, and they do not all work the same or
have the same content.
2) The Wiktionary data is relatively free-form text, so it is hard to
parse to find the relevant bits.
3) Dozens of people have mined Wiktionary already. It would make sense
to see if they have made the resulting databases available.
4) There are many sources of data, some of them also open, which can
have better coverage, or coverage of speciality areas where
Wiktionaries are lacking.
5) I expect that the best results will be achieved by using multiple
data sources.

> Growth opportunities
>
> I am leaving in Finland right now and I don't know Finnish as I should to
> understand locals, therefore this project can be expanded by adding more
> languages support for helping people like me reading, learning and
> understanding texts in foreign languages.

I hope you have enjoyed your stay here. I do not know how much Finnish
you have learned, but after a while it should be obvious that just
searching for the exact string the user clicked or selected will not
work, because of the agglutinative nature of the language. I advocate
for features which work in all languages (or at least in many :). If
you implement this for English only first, it is likely that you will
have to rewrite it to support other languages.

  -Niklas


Re: [Wikitech-l] How to retrieve current wiki time

2014-02-26 Thread Niklas Laxström
There is also the parameter 'createonly': don't edit the page if it
already exists.
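
For example, a create-only edit through the action API would look
roughly like this (title, text and token values are illustrative):

api.php?action=edit&title=Sandbox&text=Hello&createonly=1&token=...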
  -Niklas


[Wikitech-l] Help me to avoid local patches

2014-02-10 Thread Niklas Laxström
On one of my wikifarms I only have one local patch to MediaWiki core
left. I'm hoping to have that merged as well, so let me bring this to
your attention: https://gerrit.wikimedia.org/r/#/c/98078/

On a related note, I'm using HTML templates and I am in need of a way
to deliver them with ResourceLoader. Max Semenik has already made a
patch for that, so please have a look at that too:
https://gerrit.wikimedia.org/r/#/c/111250/

  -Niklas


Re: [Wikitech-l] MediaWiki-Vagrant can run MediaWiki under HHVM!

2014-01-19 Thread Niklas Laxström
Do you know where I can find these hhvm-nightly packages if I want to
try them out on my own?

Last time I tested HHVM on translatewiki.net, there were FastCGI
parameter-passing problems which blocked further testing there.
  -Niklas


Re: [Wikitech-l] RFC cluster summary: HTML templating

2013-12-27 Thread Niklas Laxström
2013/12/27 Tyler Romeo 

> If we want a comprehensive templating system in PHP core,
>

I want a templating system that can be used both in PHP and JavaScript and
fits in our way of doing i18n. And a bunny.

  -Niklas

Re: [Wikitech-l] How can I retrieve patrol token for specific revision

2013-11-29 Thread Niklas Laxström
Action tokens should give it:
https://en.wikipedia.org/w/api.php?action=tokens&type=patrol
  -Niklas

Re: [Wikitech-l] MediaWiki core code coverage report

2013-10-18 Thread Niklas Laxström
2013/10/18 Antoine Musso :
> Hello,
>
> Too long, dont want to read:
>   https://integration.wikimedia.org/cover/mediawiki-core/master/php/

This is very cool, apart from the numbers of course.

Is it hard to set up for extensions as well? Lots of development
happens in extensions nowadays. I read from the linked bug that it is
currently not possible due to a bug in PHP.

  -Niklas





Re: [Wikitech-l] [RFC] Isolate custom jQuery libraries

2013-10-12 Thread Niklas Laxström
2013/10/12 Daniel Friesen :
> I've been bothered for awhile by the mess we have in resources/jquery/ –
> 3rd party libraries, custom libraries we have to maintain, and directly
> MW related code using mediaWiki.* APIs all mixed together in the same
> directory. So I went and audited the .js we have inside
> resources/jquery/ and have wrote up an RFC on it:
>
> https://www.mediawiki.org/wiki/Requests_for_comment/Isolate_custom_jQuery_libraries

Thanks for doing the research. I recently came across this [1], but
based on your writing it was just the tip of the iceberg.

  -Niklas

[1] https://gerrit.wikimedia.org/r/#/c/82116/


[Wikitech-l] What are DeferredUpdates good for?

2013-09-12 Thread Niklas Laxström
All the documentation I could find is in docs/deferred.txt. Let me
paste the paragraph:

"A few of the database updates required by various functions here can be
deferred until after the result page is displayed to the user.  For example,
updating the view counts, updating the linked-to tables after a save, etc.  PHP
does not yet have any way to tell the server to actually return and disconnect
while still running these updates (as a Java servelet could), but it might have
such a feature in the future."

That text has been there at least since 2005. Given that, to my
knowledge, there is still no such feature: I've spent hours
investigating why DeferrableUpdates delayed page delivery, because I
incorrectly assumed they would run after the page had been delivered,
and trying to figure out whether it is possible to make them actually
work that way with PHP-FPM and nginx.

Should we just get rid of them? That should be easy, by either moving
stuff to the jobqueue or just executing the code immediately.

Or if they are useful for something, can we at least document the
*class* to reflect how it actually works and what it is useful for?
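
For context, the usage pattern in question is roughly this (class name
and the update body are illustrative):

class ViewCountUpdate implements DeferrableUpdate {
	public function doUpdate() {
		// ... perform the non-urgent database write here ...
	}
}

// Queued during the request; currently executed near the end of the
// request, before the response is flushed - not after, as the quoted
// documentation suggests.
DeferredUpdates::addUpdate( new ViewCountUpdate() );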

  -Niklas


Re: [Wikitech-l] Fwd: Double session this Thursday: Security and Automated QA

2013-06-25 Thread Niklas Laxström
On 25 June 2013 18:52, Brad Jorsch (Anomie)  wrote:
> On Tue, Jun 25, 2013 at 10:32 AM, Quim Gil  wrote:
>> Wikimedia Tech Talk
>> Attack vectors & MediaWiki /// OWASP ZAP
>> Special guest: Mike Gagnon is an independent security researcher and a
>> software engineer at Twitter.
>> http://www.mediawiki.org/wiki/Meetings/2013-06-27-midday
>
> The times listed at that link are confused, it says both 19:00 PDT and
> 12:30 PDT.

It also says Wednesday, but I assume Thursday is intended.
  -Niklas

Re: [Wikitech-l] migrating hooks doc to doxygen?

2013-06-04 Thread Niklas Laxström
On 4 June 2013 19:00, Antoine Musso  wrote:
> Hello,
> Thoughts ?

I had taken another approach in Translate which was designed to be
easy to sync to wiki:
* 
https://git.wikimedia.org/blob/mediawiki%2Fextensions%2FTranslate.git/2cd676fd53e4d2dd45ac22972175739f0b3e2bf0/hooks.txt
* https://www.mediawiki.org/wiki/Help:Extension:Translate/Hooks

  -Niklas


Re: [Wikitech-l] Code review of the Wikidata code

2013-05-18 Thread Niklas Laxström
On 17 May 2013 16:57, Denny Vrandečić  wrote:
> Hey,
>
> in order to sanity check the code we have written in the Wikidata project,
> we have asked an external company to review our code and discuss it with
> the team. The effort was very instructional for us.
>
> We want to share the results with you. The report looks dauntingly big, but
> this is mostly due to the appendixes. The first 20 pages are quite worth a
> read.

I read half of the report. Here is what I got out of it:

* We should do dependency injection to make code testable
* We should adhere to the single responsibility principle to make code
testable and less complex
* MediaWiki in many places (special pages, the API, hooks, to name a
few) makes it impossible or at least very hard to do the above for
most classes; a sketch follows below
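
To illustrate the first two points, a hypothetical sketch (not code
from the report; class and method names are made up):

// Hard to test: the collaborator is created internally from global state.
class PageMover {
	public function move( Title $from, Title $to ) {
		$dbw = wfGetDB( DB_MASTER ); // hidden dependency
		// ...
	}
}

// Easier to test: the dependency is injected and can be replaced with
// a mock in unit tests.
class InjectedPageMover {
	private $dbw;

	public function __construct( DatabaseBase $dbw ) {
		$this->dbw = $dbw;
	}

	public function move( Title $from, Title $to ) {
		// ... uses $this->dbw ...
	}
}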

They proposed a solution (a workaround in Wikibase) to allow writing
better code in Wikibase. In section 4.1.6 it reads "The sketched
approach fulfills all presented requirements: The dependency injection
mechanism is kept simple and easy to understand". But as a reader I
did not understand how it works, or what the practical gains would be
relative to the code complexity that the solution itself introduces. I
would attribute some part of this to the many typos, mistakes and
non-fluent language in the text and examples. One example:

"For example, if the a DependencyManager as configured in Illustration 1:
DependencyManager is asked to retrieve the object with the key “mail”, it will
ask the registered DatabaseBuilder to create that object and return it."

I am curious whether there were any discussions about if and how these
issues could be fixed in MediaWiki core. Or do we need[1] to figure
that out on our own?

[1] I see that I already jumped to the "how to get there" part, but I
guess I should first ask: is there a sufficient mass of people who
think there is an issue and that the issue should be fixed? I think
that there is definitely a need for core to adopt better coding
practices to reduce code complexity and allow easier unit testing.
Just like the move from SVN to Git and Gerrit, it will need a lot of
effort and can annoy people, but in the end we will be better off than
before.

  -Niklas



Re: [Wikitech-l] MathJax opt-in for all visitors

2013-05-03 Thread Niklas Laxström
On 3 May 2013 08:13, Peter Krautzberger  wrote:
> Hi,
>
> Here's an idea somebody suggested to me.
>
> I would like to propose a way for any visitor to opt-in to MathJax on the
> fly. (Oh, maybe I should add a disclaimer: I work for MathJax.)
>
> This would be simply a button on pages with math that would switch MathJax
> on (and possibly off via a cookie).

What is the state of MathJax i18n currently? Before we show it to lots
of users it should be translatable.

  -Niklas



Re: [Wikitech-l] New git-review version revives year-old bug

2013-04-09 Thread Niklas Laxström
On 9 April 2013 00:54, Roan Kattouw  wrote:
> The workaround is the same as last year: if git-review complains and
> you see bogus commits in the list, respond "no" to abort, run "git
> fetch gerrit", then rerun git-review. This will ensure git-review has
> an up-to-date view of the remote master.

Fortunately we still had the workaround in our scripts at
translatewiki.net from last time, so l10n-bot is not affected.

  -Niklas




Re: [Wikitech-l] wmf12 rollback, all wikis (except test2) are on wmf11

2013-03-22 Thread Niklas Laxström
On 21 March 2013 20:11, Greg Grossmeier  wrote:
> Tim rolled back wmf12 after a nasty bug last night:
> https://bugzilla.wikimedia.org/show_bug.cgi?id=46397

I assume this included all the extensions as well.
  -Niklas


[Wikitech-l] Who is responsible for communicating changes in MediaWiki to WMF sites?

2013-03-21 Thread Niklas Laxström
I've seen a couple of instances where changes to MediaWiki are blocked
until someone informs the community.

That someone is a volunteer.

The community is actually just the Wikimedia project communities. Or
at least the biggest ones, which are expected to complain and where
the complaining would hurt.

This situation seems completely unfair to me. WMF should be able to
communicate upcoming changes itself, not throw the task to volunteers.
Volunteers can help, but they should not be responsible for making
this happen.

  -Niklas


Re: [Wikitech-l] Reminder about the best way to link to bugs in commits

2013-03-20 Thread Niklas Laxström
On 1 March 2013 23:46, Chad  wrote:
> Bug: 1234
> Change-Id: Ia90.
> """
>
> So when you do this, you're able to search for "bug:1234" via Gerrit.
> By doing this, you're also removing it from the first line (which was
> our old habit, mostly from SVN days), providing you more space to
> be descriptive in that first line.

A few questions:

1) Why is "Bug:43778" different from "bug:43778" when searching?

2) Can we do the same for all things in the footer? I tried it but
"bug" seems to be a special case and nothing else works.

  -Niklas


Re: [Wikitech-l] Replacement for tagging in Gerrit

2013-03-14 Thread Niklas Laxström
On 14 March 2013 01:07, Christian Aistleitner
 wrote:
> Wouldn't "git notes" be a good match [1]?
>
> They're basically annotations attached to commits. You can add/remove
> them to any commit without changing the actual commit. And support
> comes directly with git, as for example in
>
>   git log --show-notes
>
> Queries are just a grep away, and setting them can be done through git
> as well, until we integrate that into gerrit.

Except that you need to have the repository cloned locally first.

And often you need to query multiple repositories.

You can work around both of the above... but then we are again on the
"more than non-significant effort needed" path.

  -Niklas



Re: [Wikitech-l] Replacement for tagging in Gerrit

2013-03-10 Thread Niklas Laxström
On 10 March 2013 03:11, Rob Lanphier  wrote:
> Hi folks,
>
> Short version: This mail is fishing for feedback on proposed work on
> Gerrit-Bugzilla integration to replace code review tags.

Interesting idea. I can imagine Bugzilla integration working fine for
filing fixmes - though I am not so sure people would fix them this way
either.

I am wondering if it would be overkill for tags like i18ndeploy or
backport - at the least, there should be an easy way to query a list
of those kinds of bugs.

As far as I understand, the filed bugs would not be displayed on the
Gerrit side, nor could you search by tag in Gerrit, so there is a
discoverability problem.

  -Niklas


Re: [Wikitech-l] Module namespace

2013-02-28 Thread Niklas Laxström
As observed, it is not possible to translate extension namespaces at
translatewiki.net [1][2].

All the usual routes can be used, depending on your technical skills:
* File a [[Support]] request at translatewiki.net
* File a bug in Bugzilla against the extension
* Submit a patch to Gerrit

  -Niklas

[1] this is the preferred way to write it
[2] patches welcome


Re: [Wikitech-l] wfMsg and wfMsgForContent are deprecated. What to use?

2013-02-21 Thread Niklas Laxström
On 19 February 2013 19:10, Krenair  wrote:
> See
> https://www.mediawiki.org/wiki/Manual:Messages_API#Deprecated_wfMsg.2A_functions

The whole page is recommended reading. I refactored it about a week
ago to make it more accessible. Thank you to everyone who has already
provided feedback or cleaned up after my mistakes.
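
In short, the replacements look like this ('mykey' is an illustrative
message key):

// Deprecated:
$text = wfMsg( 'mykey' );
$text = wfMsgForContent( 'mykey' );

// Messages API replacements:
$text = wfMessage( 'mykey' )->text();
$text = wfMessage( 'mykey' )->inContentLanguage()->text();
// Escaped output for HTML contexts:
$html = wfMessage( 'mykey' )->escaped();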

  -Niklas



Re: [Wikitech-l] Extensions and LTS

2013-02-12 Thread Niklas Laxström
On 13 February 2013 01:48, Mark A. Hershberger  wrote:

> I'm going to start paying close attention to people who have problems
> upgrading from 1.19 over the next couple of years so that when we hit
> the next LTS (1.25) in 2015, we'll have fewer issues for the people
> moving from 1.19 to 1.25.

There is a near-zero chance that I will keep supporting 1.19 for that
long in the master branch of all of my extensions (and this is
assuming support for 1.19 is dropped immediately when 1.25 is
released). It's already hard, since 1.19 is missing some features I
need.

People need to realize that they can't get the latest shiny extensions
on a years-old MediaWiki. For extensions, LTS support would mean
creating a branch with security fixes and other important bug fixes
for 1.19. MLEB will probably just declare some release as the last
release that works with 1.19.

1.19 was released on 2012-02-09. For comparison, Ubuntu has released
an LTS version every two years.

> I'd like to think that extensions in [[Category:Stable_extensions]] will
> be maintained, but maybe that isn't right.  I certainly haven't tried
> all of them against 1.19.

Maintained doesn't necessarily mean support for 1.19 is kept.

> Developers can help out with this.  On Debian, for example, developers
> can announce that they're orphaning a package and it needs a new
> maintainer.  Ideally, a developer would find someone to maintain his own
> extension, but if you see an orphaned extension or just don't feel like
> maintaining one any more, please add {{Unmaintained extension}} to its
> page on MediaWiki.org.

Has any extension been adopted this way?

  -Niklas


Re: [Wikitech-l] Minimalist MediaWiki? (was Re: Merge Vector extension into core)

2013-02-06 Thread Niklas Laxström
On 6 February 2013 18:42, Mark A. Hershberger  wrote:
> On 02/06/2013 03:53 AM, Petr Bena wrote:
>> don't merge anything into core pls. rip the stuff out of core and make
>> extensions from them, and finally, please make core a lightweight and faster
>
> The speed of MediaWiki isn't directly related to the amount of code in
> core.  Speed is all about keeping the average code path to execute any
> given request short.

Rather than absolute size, we should care about a core that is modular
and has well-defined interfaces. Moving stuff to extensions is one way
to encourage this, as then the functionality cannot be relied on, or
it can be replaced with another extension.

More refactorings and rewrites are needed (like ContentHandler) to get
MediaWiki back to enabling the development of new ideas rather than
limiting it. At the same time, constantly changing interfaces and
breaking existing extensions will annoy people, depending on how much
effort we put into keeping backwards compatibility. We can't please
everyone.

  -Niklas


Re: [Wikitech-l] FOSDEM presentation - feedback welcome

2013-01-31 Thread Niklas Laxström
On 30 January 2013 20:20, Quim Gil  wrote:
>
> This Sunday at FOSDEM I will have a lightning session at FOSDEM:
>
> How to hack on Wikipedia
>
> It is in fact an intro to MediaWiki & Wikimedia tech contributions, designed 
> to be reusable and customized by others for other occasions.
>
> I just uploaded a new version at
> https://commons.wikimedia.org/wiki/File:How_to_hack_on_Wikipedia.pdf
>
> Still working on details & credits. Your feedback is welcome!

I've noticed people tend to get the wrong impression when we talk
about the API (they think it is something internal to MediaWiki), and
hence I've started calling it the WebAPI to make it more explicit.

  -Niklas



Re: [Wikitech-l] getting Request object from EditPage

2012-12-18 Thread Niklas Laxström
> On Tue, Dec 18, 2012 at 4:48 PM, Yury Katkov  wrote:
>> Hi guys!
>>
>> I'm writing the EditPage::showEditForm:fields and I want to get a
>> Request object. The use of wgRequest considered to be deprecated, so
>> how is it possible to get request object in my hook function?
>>
>>  static public function showBacklinks($editpage, &$output){
>>  return true;
>>  }

OutputPage is a context source, so you can do $output->getRequest(). A
less nice way is $editpage->getArticle()->getContext()->getRequest().
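
A minimal sketch of the handler from the question using the
context-source route ('someparam' is illustrative):

static public function showBacklinks( $editpage, &$output ) {
	$request = $output->getRequest();
	$someValue = $request->getVal( 'someparam' );
	// ... append extra fields to $output here ...
	return true;
}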

  -Niklas


[Wikitech-l] Recent test failures for many extensions

2012-12-11 Thread Niklas Laxström
Jenkins has been failing all commits for certain extensions today. I
assume it is somehow related to the accidental addition of submodules
to the core master branch. I've been told that some of the tests that
Jenkins runs have been disabled.

I'm assuming someone else will post a follow-up to explain all the
"somes" above and make sure that all tests are re-enabled and passing
for all affected extensions.

  -Niklas


Re: [Wikitech-l] gerrit support question: how to show raw file content after the commit in the browser, not as zip download

2012-12-11 Thread Niklas Laxström
On 11 December 2012 11:42, Thomas Gries  wrote:
> While tracking an iusse I came to
> https://gerrit.wikimedia.org/r/#/c/7986/ (example case)
> In the list of files I clicked on
> https://gerrit.wikimedia.org/r/#/c/7986/14/includes/UserMailer.php
>
> Now I am "desperately seeking" a link to _show the raw file content
> after the commit in the __browser__,__
> _but only found a link "(Download)" which starts a zip download. This is
> not what I wanted.
>
> Is there a solution which I have overlooked?

If you just want to view the content: in the diff view, click
Preferences, at the bottom left choose "Whole File" as the context,
and click Update. It is no good for copy-pasting, though.

  -Niklas



Re: [Wikitech-l] Unit tests scream for attention

2012-12-08 Thread Niklas Laxström
On 8 December 2012 21:22, Platonides  wrote:
> They mostly run for me.

Yep, mostly:
Tests: 4653, Assertions: 556494, Failures: 119, Errors: 17,
Incomplete: 3, Skipped: 15.

I've not even looked at most of these failures because of the fatal
errors. Thanks to Jeroen and Brad there is now one fatal error less
(and at least one remains), though I did spend two hours debugging
that issue.
  -Niklas


Re: [Wikitech-l] Unit tests scream for attention

2012-12-07 Thread Niklas Laxström
On 7 December 2012 22:51, Brad Jorsch  wrote:
> While that unit test mentioned there does seem screwed up, why is your
> PHPUnit installation not respecting convertErrorsToExceptions="true"
> and related settings in MediaWiki's provided suite.xml?

I honestly don't know. I barely got a working PHPUnit installed in the
first place (3.7.8 - see my previous phpunit thread). I also don't
know why it occasionally segfaults.

  -Niklas



[Wikitech-l] Unit tests scream for attention

2012-12-07 Thread Niklas Laxström
Now that tests need +2 to be run, at least temporarily, I'm going to
point out that I've not been able to run tests in my development
environment in ages. I mentioned broken unit tests on Oct 4 on this
list:
http://article.gmane.org/gmane.science.linguistics.wikipedia.technical/64390

There are multiple fatal bugs (not to mention the numerous test
failures) that halt test runs without any info except the error. Some
bugs I've reported:

* https://bugzilla.wikimedia.org/41491
* https://bugzilla.wikimedia.org/42145 (skip the first few comments)

Today I tried again and there is new one:

Catchable fatal error: Argument 2 passed to
OutputPage::addWikiTextTitle() must be an instance of Title, null
given, called in /www/dev.translatewiki.net/w/includes/OutputPage.php
on line 1426 and defined in
/www/dev.translatewiki.net/w/includes/OutputPage.php on line 1472

This might be just a variant of 42145, but I can't tell for sure. I
could add an exception there, but the other fatal errors make phpunit
not display backtraces. I haven't yet had time to find out which test
it is.

This situation is starting to feel like a bad horror movie, so I ask
everyone to give some tender loving care to our unit tests so that I
don't have to come up with even worse analogies.

  -Niklas


Re: [Wikitech-l] Fwd: MediaWiki Language Extension Bundle launches

2012-11-28 Thread Niklas Laxström
On 28 November 2012 23:54, Platonides  wrote:
> On 28/11/12 12:28, Niklas Laxström wrote:
>> * Running of PHPUnit tests is currently broken
> Why?

Because of https://bugzilla.wikimedia.org/42529

There is also one test failure I don't know how to fix. The test is
marked as databaseless, but it runs a hook in Translate that accesses
a database table.
1) ExtraParserTest::testBug8689
DBQueryError: A database error has occurred.  Did you forget to run
maintenance/update.php after upgrading?  See:
https://www.mediawiki.org/wiki/Manual:Upgrading#Run_the_update_script
Query: SELECT  tmi_value  FROM translate_messageindex  WHERE tmi_key =
'0:unit_test'  LIMIT 1
Function: DatabaseMessageIndex::get
Error: 1 no such table: translate_messageindex

  -Niklas



[Wikitech-l] Fwd: MediaWiki Language Extension Bundle launches

2012-11-28 Thread Niklas Laxström
FYI. I will not be posting release announcements here, but feedback on
the implementation is welcome.

Code is at: 
https://gerrit.wikimedia.org/r/gitweb?p=translatewiki.git;a=tree;f=melange

Maybe there is interest in making it more generic and using it to make
other extension bundles too. In that case we could create a separate
repo for it. Some known TODOs off the top of my head:
* Signing the archive
* Running of PHPUnit tests is currently broken
* QUnit tests need to be run manually

  -Niklas


-- Forwarded message --

The Wikimedia Language Engineering team is pleased to announce the
first release of the MediaWiki Language Extension Bundle. The bundle
is a collection of selected MediaWiki extensions needed by any wiki
which desires to be multilingual.

This first bundle release (2012.11) is compatible with MediaWiki 1.19,
1.20 and 1.21alpha.
Get it from https://www.mediawiki.org/wiki/MLEB

The Universal Language Selector is a must-have, because it provides
essential functionality for any user regardless of the number of
languages he/she speaks: language selection, font support for
displaying scripts badly supported by operating systems and input
methods for typing languages that don't use Latin (a-z) alphabet.

Maintaining multilingual content in a wiki is a mess without the
Translate extension, which is used by Wikimedia, KDE and
translatewiki.net, where hundreds of pieces of documentation and
interface translations are updated every day; with Localisation Update
your users will always have the latest translations freshly out of the
oven. The Clean Changes extension keeps your recent changes page
uncluttered from translation activity and other distractions.

Don't miss the chance to practice your rusty language skills and use
the Babel extension to mark the languages you speak and to find other
speakers of the same language in your wiki. And finally the cldr
extension is a database of language and country translations.

We are aiming to make new releases every month, so that you can easily
stay on the cutting edge with the constantly improving language
support. The bundle comes with clear installation and upgrade
instructions. The bundle is tested against MediaWiki release
versions, so you can avoid most of the temporary breaks that would
happen if you were using the latest development versions instead.

Because this is our first release, there can be some rough edges.
Please provide us a lot of feedback so that we can improve for the
next release.

  -Niklas


Re: [Wikitech-l] Problems with PHPUnit

2012-10-26 Thread Niklas Laxström
On 26 October 2012 15:57, Tim Landscheidt  wrote:
> Niklas Laxström  wrote:
>
>> [...]
>>> All of this shouldn't be necessary, as PHPUnit 3.6.12's
>>> package.xml clearly declares a dependency on PHP_Timer
>>>>= 1.0.1 and <= 1.0.3.  In other words: Whatever Niklas used
>>> for update seems to be broken and should be fixed.
>
>> Not too clearly apparently, I had this when installing invoker:
>
>> sudo pear install phpunit/PHP_Invoker
>> phpunit/phpunit requires package "phpunit/PHP_Invoker" (version >=
>> 1.1.0, version <= 1.1.1), downloaded version is 1.1.2
>
>> Then I installed version 1.1.1.
>
> Your problem lies (inter alia) with:
>
> | Fatal error: Call to undefined function php_timer_autoload() in
>   ^
> | /usr/share/php/PHPUnit/Util/GlobalState.php on line 381

Well, wrong package, but I get PHP_Timer 1.0.4 nevertheless. I did
some uninstalling and installed 1.0.3 first, but then it broke on some
other package.

So I got pissed off and uninstalled everything from pear. Now when I
try to install phpunit again I get:
sudo pear install --alldeps phpunit/phpunit
Duplicate package channel://pear.phpunit.de/File_Iterator-1.3.3 found
Duplicate package channel://pear.phpunit.de/File_Iterator-1.3.2 found
install failed

  -Niklas


Re: [Wikitech-l] Problems with PHPUnit

2012-10-26 Thread Niklas Laxström
On 26 October 2012 15:02, Tim Landscheidt  wrote:
> Antoine Musso  wrote:
>
>>> I recently updated packages via PEAR. I don't remember what exactly,
>>> but right now I have
>>> PHPUnit-3.6.12
>>> PHP_Invoker-1.1.1
>
>>> When I try to run tests for Translate I get:
>>> 1) SpecialPagesTest::testSpecialPage with data set #3 ('LanguageStats')
>
>>> Fatal error: Call to undefined function php_timer_autoload() in
>>> /usr/share/php/PHPUnit/Util/GlobalState.php on line 381
>>> make: *** [default] Error 255
>
>> Could you paste all your versions obtained with:
>
>>  pear list -c phpunit

twn:~$ pear list -c phpunit
Installed packages, channel pear.phpunit.de:

Package            Version State
DbUnit             1.1.2   stable
File_Iterator      1.3.2   stable
PHPUnit            3.6.12  stable
PHPUnit_MockObject 1.1.1   stable
PHP_CodeCoverage   1.1.2   stable
PHP_Invoker        1.1.1   stable
PHP_Timer          1.0.4   stable
PHP_TokenStream    1.1.5   stable
Text_Template      1.1.2   stable

>
>> php_timer_autoload() was defined in PHP_Timer 1.0.3 apparently and got
>> removed in 1.0.4.  It seems PHPUnit 3.6 still rely on it so either you
>> have to downgrade PHP_Timer to 1.0.3 or possibly 1.0.2 or upgrade
>> PHPUnit to 3.7.x (recommended).

3.7.x is not marked stable; can I install it just by specifying the
version number?
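
(If pear's usual package-version syntax applies here, presumably
something like "sudo pear install phpunit/phpunit-3.7.8" would do it,
but I have not tested that.)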

> All of this shouldn't be necessary, as PHPUnit 3.6.12's
> package.xml clearly declares a dependency on PHP_Timer
> >= 1.0.1 and <= 1.0.3.  In other words: Whatever Niklas used
> for update seems to be broken and should be fixed.

Not too clearly, apparently; I had this when installing invoker:

sudo pear install phpunit/PHP_Invoker
phpunit/phpunit requires package "phpunit/PHP_Invoker" (version >=
1.1.0, version <= 1.1.1), downloaded version is 1.1.2

Then I installed version 1.1.1.

  -Niklas


[Wikitech-l] Problems with PHPUnit

2012-10-26 Thread Niklas Laxström
I recently updated packages via PEAR. I don't remember what exactly,
but right now I have
PHPUnit-3.6.12
PHP_Invoker-1.1.1

When I try to run tests for Translate I get:
1) SpecialPagesTest::testSpecialPage with data set #3 ('LanguageStats')

Fatal error: Call to undefined function php_timer_autoload() in
/usr/share/php/PHPUnit/Util/GlobalState.php on line 381
make: *** [default] Error 255

Any ideas?

  -Niklas


Re: [Wikitech-l] 1.20rc2 is about 50 times slower (on openSUSE)

2012-10-25 Thread Niklas Laxström
On 26 October 2012 01:07, Platonides  wrote:
> On 25/10/12 19:03, Bryan Tong Minh wrote:
>> Your log file:
>>
>> CACHES: XCacheBagOStuff[main] XCacheBagOStuff[message]
>> XCacheBagOStuff[parser]
>> [...]
>> MessageCache::load: Loading en... cache is empty, loading from database,
>> loading FAILED - cache is disabled
>> [...]
>> MessageCache::load
>> 1 10002.965 10002.96598.312%  1670  (10002.965 -
>> 10002.965) [1]
>>
>>
>> Shouldn't it fallback to the DB if XCache is not found? Were there changes
>> with respect to this in 1.20?
>>
>> Bryan
>
> That's not necessarily better. The current approach of disabling the
> MessageCache seems good.
>
> The strange thing is that it will note the cache error before loading
> the messages (and not use them). And the cache is then put into disabled
> mode. So I don't know why it would take so much time in MessageCache::load()

It tries to acquire the lock until MSG_WAIT_TIMEOUT has passed, which
is 10 seconds. The lock is there to prevent multiple threads from
building the cache concurrently. For reasons unknown to me, there is
another lock inside ->lock()/->unlock() which fails again and disables
the message cache.
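
The shape of that wait is roughly the following; a minimal sketch of
the pattern with assumed names, not the actual MessageCache code:

$cache = wfGetCache( CACHE_ANYTHING ); // any BagOStuff-backed cache
$lockKey = wfMemcKey( 'messages', 'en', 'status' ); // key name assumed
$timeout = 10; // MSG_WAIT_TIMEOUT
$deadline = microtime( true ) + $timeout;
// Only one thread should rebuild the cache, so try to take the lock.
while ( !$cache->add( $lockKey, 1, $timeout ) ) {
    if ( microtime( true ) > $deadline ) {
        // Waited the full 10 seconds: give up and disable the cache.
        $disabled = true;
        break;
    }
    usleep( 100000 ); // back off briefly before retrying
}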

  -Niklas

-- 
Niklas Laxström


Re: [Wikitech-l] whether to do Google Code-In

2012-10-25 Thread Niklas Laxström
On 25 October 2012 19:55, Platonides  wrote:

> An easy way to get articles would be in the documentation front, asking
> for a couple of wiki pages documenting something, a tutorial (with
> screenshots) on installing MediaWiki... Creating X new translations for
> MediaWiki or its extensions on translatewiki would also be an easy way
> of producing tasks.

Translation tasks are not allowed as far as I remember, but some open
support tasks at translatewiki.net could be suitable.

Perhaps 'document X messages' would be allowed. That could include
taking screenshots and the like.

  -Niklas

-- 
Niklas Laxström


Re: [Wikitech-l] Pretty Timestamps

2012-10-25 Thread Niklas Laxström
On 25 October 2012 08:07, Tyler Romeo  wrote:
> So recently https://gerrit.wikimedia.org/r/15746 was merged. It implements
> a pretty timestamp function. Yet it was somehow completely ignored that we
> actually have an MWTimestamp class made specifically for timestamp objects
> in MediaWiki.

It does use wfTimestamp in many places.

If you are saying that the new functionality should be in the
MWTimestamp class, then I disagree. wfTimestamp and MWTimestamp have
never* contained anything but code that parses and formats timestamps
in the various formats used in databases, the HTTP protocol, image
files, etc. This is clearly visible in the design of the class: it
does not take any kind of language or user context that could affect
the formatting.

The Language class has always been responsible for formatting time and
date expressions for users.
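
Concretely, the division of labour is something like this; a sketch
using 1.20-era calls, with the exact method choice assumed rather than
prescribed by this thread:

// wfTimestamp converts between storage formats, context-free:
$ts = wfTimestamp( TS_MW, '2012-10-25 08:07:00' );
// User- and language-aware rendering belongs to Language:
global $wgLang, $wgUser;
$pretty = $wgLang->userTimeAndDate( $ts, $wgUser );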

* Well, not until someone added getHumanTimestamp, which I guess should
be deprecated now.

  -Niklas

-- 
Niklas Laxström


Re: [Wikitech-l] Make Extensions Aware of the ContentHandler

2012-10-10 Thread Niklas Laxström
On 10 October 2012 16:52, Daniel Kinzler  wrote:
> Hi!
>
> I could use some help looking over extensions.

What is the difference between "$content instanceof TextContent" and
"$title->getContentModel() === CONTENT_MODEL_WIKITEXT"?
  -Niklas

-- 
Niklas Laxström


Re: [Wikitech-l] Merging the ContentHandler into master

2012-10-09 Thread Niklas Laxström
On 8 October 2012 19:33, Daniel Kinzler  wrote:
> Hi all!
>
> As discussed last week with Rob, I have now prepared a merge request that
> introduces the ContentHandler into MediaWiki core. This is a major building
> block for the Wikidata project. I hope the merge will be completed soon, since
> this will grow stale fast.
>
> The merge request is here: https://gerrit.wikimedia.org/r/27194
>
> Since Gerrit doesn't show nice diffs for merges,
> here's a squashed version: https://gerrit.wikimedia.org/r/27191
>
> Please let us know very soon if there are any serious problems. The branch has
> been reviewed before, and I resolved several remaining issues over the last
> days, so I hope there are no more big issues left.

And it's merged. Congrats!

I'm sure it is just an oversight, but some of the review comments in
[1] were not addressed before the merge. For example, MessagesEn.php (a
fix has been submitted by someone else) and MessageCache.php [2].

[1] https://gerrit.wikimedia.org/r/#/c/25736/1 (it took me a while to
find that change again)
[2] I see now that I misunderstood the code, but at least the comment
needs to be updated.

  -Niklas

-- 
Niklas Laxström


[Wikitech-l] Please fix the PHP unit tests

2012-10-04 Thread Niklas Laxström
Currently there are so many failures in the core unit tests that I
can't usefully run the tests locally: any new errors would simply hide
among all the other failures. Most of these are caused by the tests
assuming a certain configuration, and some issues come from extensions.
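
Until they are fixed, narrowing the run at least keeps new failures
visible; for example, with PHPUnit's standard --filter option (the
class name is only a placeholder):

php tests/phpunit/phpunit.php --filter MessageCacheTest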

I reported these bugs about 10 days ago, but so far the responses have
been either 1) silence, 2) blaming me for having a broken
configuration, or 3) suggesting a fix.

Here is a list:
https://bugzilla.wikimedia.org/show_bug.cgi?id=40491
https://bugzilla.wikimedia.org/show_bug.cgi?id=40490
https://bugzilla.wikimedia.org/show_bug.cgi?id=40489
https://bugzilla.wikimedia.org/show_bug.cgi?id=40488
https://bugzilla.wikimedia.org/show_bug.cgi?id=40487
https://bugzilla.wikimedia.org/show_bug.cgi?id=40484 [CentralNotice]
https://bugzilla.wikimedia.org/show_bug.cgi?id=40483 [SMW]
https://bugzilla.wikimedia.org/show_bug.cgi?id=40432 [Maps]

I don't have time to fix all these myself.

  -Niklas

FAILURES!
Tests: 4685, Assertions: 320062, Failures: 62, Errors: 45, Incomplete:
5, Skipped: 6.
make: *** [safe] Error 2

-- 
Niklas Laxström


Re: [Wikitech-l] Skin pages on MW.org, and Skin repos in Gerrit

2012-09-26 Thread Niklas Laxström
On 26 September 2012 10:08, Krinkle  wrote:
> Another problem I found in the current setup is that it's a bit
> counter-intuitive how to manage the directory structure for developers. I
> mean, most of us probably have this:
>
> - mediawiki
> - /core (clone mediawiki/core.git)
> - /extensions (directory with clones of individual extensions or clone of 
> mediawiki/extensions.git tracking repo)

In the SVN days extensions were a subdirectory of MediaWiki core, and I
doubt that everyone has suddenly decided to change that. At least I
haven't.
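
For concreteness, the SVN-era layout being referred to (paths
illustrative, not an exact checkout):

mediawiki/            (core working copy)
  includes/
  maintenance/
  extensions/         (checked out inside core)
    Translate/
    Babel/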
  -Niklas

-- 
Niklas Laxström


Re: [Wikitech-l] Problem with unit tests and dataProviders

2012-09-24 Thread Niklas Laxström
On 24 September 2012 13:27, Antoine Musso  wrote:
> One possibility would be to only set it once using the
> setUpBeforeClass() and tearDownAfterClass(). Those are only run once in
> the class. So you could create a new test file having a class dedicated
> to this test.

Those are static methods; I cannot store the users anywhere if I use them.
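
(For completeness: state set in a static method can only live in static
class members; a sketch with assumed names, leaving aside whether that
plays well with the database fixtures:)

class TranslatePagesTest extends MediaWikiTestCase {
    /** @var User */
    protected static $user1;

    public static function setUpBeforeClass() {
        parent::setUpBeforeClass();
        self::$user1 = User::newFromName( 'Translate test user 1' );
        self::$user1->addToDatabase();
    }
}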

> Can you possibly send the code in Gerrit so we can have a look at it?

Yes.

  -Niklas
-- 
Niklas Laxström


[Wikitech-l] Problem with unit tests and dataProviders

2012-09-23 Thread Niklas Laxström
I'm writing unit tests for one of the Translate classes.

In setUp I need to create a few pages, and I also need to control
the user IDs of the revisions. This seems to work well except for two
things:
* dataProvider methods are called *before* setUp, so I cannot use the
user IDs I have stored in setUp.
* setUp and tearDown are called for *every* item in the dataProvider.
This seems very wasteful - no wonder the tests take minutes or so to
run.

This just doesn't make any sense to me. I'm considering dropping
@dataProvider in this case (see the sketch after the code below) - any
other ideas?

The code in setUp is something like this:

$title = Title::makeTitle( NS_MEDIAWIKI, 'Key1/fi' );
$user = User::newFromName( 'Translate test user 1' );
$user->addToDatabase();
WikiPage::factory( $title )->doEdit( 'trans1', __METHOD__, 0, false, $user );
$this->user1 = $user;
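
One way to drop @dataProvider without losing the cases is to iterate
them inside a single test method, so setUp runs only once; a sketch
with assumed names, not code from this thread:

public function testAllCases() {
    foreach ( self::provideCases() as $case ) {
        list( $input, $expected ) = $case;
        // One assertion per former data set, labelled for failures.
        $this->assertSame( $expected, $this->process( $input ),
            "data set: $input" );
    }
}

The trade-off is losing PHPUnit's per-data-set isolation and reporting.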

  -Niklas

-- 
Niklas Laxström

