[Wikitech-l] CI now uses NodeJS version 6

2017-01-17 Thread Antoine Musso

Hello,

The Wikimedia Jenkins system now uses NodeJS version 6 (previously version 4).

That is part of a plan to upgrade the Wikimedia cluster to NodeJS 6, which is
tracked at: https://phabricator.wikimedia.org/T149331


If you see any issues with the *node-6-jessie Jenkins jobs, please file
them as subtasks of T149331 and add the #contint Phabricator project.


Thanks to Paladox for the CI configuration change, and to everyone who has
been involved in ensuring our most important software is compatible.


--
Antoine "hashar" Musso



Re: [Wikitech-l] PHP fatal error: Call to undefined method stdClass::get()

2017-01-17 Thread Andre Klapper
On Wed, 2017-01-11 at 11:36 -0800, Pine W wrote:
> I just started getting a lot of these errors when trying to load
> pages on multiple Wikimedia sites.

If you'd like to report a software bug, Phabricator is the best place:
https://www.mediawiki.org/wiki/How_to_report_a_bug

If a bug is urgent, #wikimedia-tech on Freenode IRC is the best place:
https://www.mediawiki.org/wiki/MediaWiki_on_IRC

Cheers,
andre
-- 
Andre Klapper | Wikimedia Bugwrangler
http://blogs.gnome.org/aklapper/


Re: [Wikitech-l] PHP fatal error: Call to undefined method stdClass::get()

2017-01-17 Thread Pine W
Hi Andre,

*Usually. At the time I reported this bug I didn't have quick access to
IRC, and it appeared to be an urgent situation. In this case I made a
judgement call that speed was important, and email was the tool that was
most readily available.

Pine


On Tue, Jan 17, 2017 at 7:38 AM, Andre Klapper wrote:

> On Wed, 2017-01-11 at 11:36 -0800, Pine W wrote:
> > I just started getting a lot of these errors when trying to load
> > pages on multiple Wikimedia sites.
>
> If you'd like to report a software bug, Phabricator is the best place:
> https://www.mediawiki.org/wiki/How_to_report_a_bug
>
> If a bug is urgent, #wikimedia-tech on Freenode IRC is the best place:
> https://www.mediawiki.org/wiki/MediaWiki_on_IRC
>
> Cheers,
> andre
> --
> Andre Klapper | Wikimedia Bugwrangler
> http://blogs.gnome.org/aklapper/
>

Re: [Wikitech-l] [Wikimedia Labs][Announce] NFS (only labs projects) maintenance on 2017-01-18

2017-01-17 Thread Madhumitha Viswanathan
Reminder: This is happening tomorrow, starting at 09:00 PST (16:00 UTC).

On Wed, Jan 4, 2017 at 1:43 PM, Madhumitha Viswanathan <mviswanat...@wikimedia.org> wrote:

> Hello,
>
> Continuing the storage redundancy and reliability efforts for Labs
> (https://phabricator.wikimedia.org/T126083), the final migration to the new
> NFS storage cluster for Labs projects with NFS enabled is upcoming. The
> migration is planned for 2017-01-18, starting at 09:00 PST (16:00 UTC). This
> *does not* affect tools, maps, or any other projects that don't have /home
> or /data/project mounted. The migration window is expected to be fairly
> short (<3 hours), but could last up to 6 hours.
>
> During the migration, no new data will be written to NFS (/home or
> /data/project), but existing data will, for the most part, be accessible in
> read-only mode. After the migration, any services or jobs that were running
> on top of NFS (/home or /data/project) will require manual restarts. Jobs
> running on top of /scratch or /public/dumps will be unaffected. I will keep
> the lists and #wikimedia-labs updated on progress during and after the
> migration.
>
> The Labs projects that will be affected by this migration are:
>
> - catgraph
> - account-creation-assistance
> - contributors
> - wikidata-topicmaps
> - sugarcrm
> - wikidumpparse
> - video
> - openstack
> - testlabs
> - wikidata-dev
> - quarry
> - huggle
> - editor-engagement
> - utrs
> - wmt
> - cvn
> - fastcci
> - toolsbeta
> - project-proxy
> - dumps
> - bots
> - snuggle
> - math
> - wikisource-tools
>
> The tracking task on Phabricator is https://phabricator.wikimedia.org/T154336.
> Let us know if you have any questions or concerns on the list or in
> #wikimedia-labs.
>
> --
> Madhu Viswanathan
> Operations Engineer, Wikimedia Labs
>



-- 
--Madhu :)

[Wikitech-l] New 1.27/1.28 login mechanism

2017-01-17 Thread Aran
Hello,

I have a login system that extends AbstractPrimaryAuthenticationProvider and
uses an AuthenticationRequest that returns an empty fields array, so that the
login form is bypassed completely and the login is determined by some other
environmental parameters. But in 1.27 this does not bypass the login form.
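
For clarity, the setup looks roughly like this (a simplified sketch rather
than my real code; the REMOTE_USER check merely stands in for the
environmental parameters, and the stub methods are placeholders):

<?php
use MediaWiki\Auth\AbstractPrimaryAuthenticationProvider;
use MediaWiki\Auth\AuthenticationRequest;
use MediaWiki\Auth\AuthenticationResponse;

// A request with no user-visible fields, so the login form has nothing to render.
class EnvAuthenticationRequest extends AuthenticationRequest {
    public function getFieldInfo() {
        return [];
    }
}

class EnvPrimaryAuthenticationProvider extends AbstractPrimaryAuthenticationProvider {
    public function getAuthenticationRequests( $action, array $options ) {
        return [ new EnvAuthenticationRequest() ];
    }

    public function beginPrimaryAuthentication( array $reqs ) {
        // Stand-in for the environmental check; pass if the web server set a user.
        if ( isset( $_SERVER['REMOTE_USER'] ) ) {
            return AuthenticationResponse::newPass( $_SERVER['REMOTE_USER'] );
        }
        return AuthenticationResponse::newAbstain();
    }

    // The remaining abstract methods, stubbed out because this provider never
    // stores credentials or creates accounts itself.
    public function testUserExists( $username, $flags = User::READ_NORMAL ) {
        return false;
    }

    public function providerAllowsAuthenticationDataChange(
        AuthenticationRequest $req, $checkData = true
    ) {
        return StatusValue::newGood( 'ignored' );
    }

    public function providerChangeAuthenticationData( AuthenticationRequest $req ) {
    }

    public function accountCreationType() {
        return self::TYPE_NONE;
    }

    public function beginPrimaryAccountCreation( $user, $creator, array $reqs ) {
        return AuthenticationResponse::newAbstain();
    }
}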

What is the proper way to have the login page determine the login without
showing a login form?

Thanks,
Aran



[Wikitech-l] Divide XML dumps by page.page_namespace (and figure out what to do with the "pages-articles" dump)

2017-01-17 Thread MZMcBride
Hi.

Re: https://phabricator.wikimedia.org/T99483

This task proposes dividing the XML dumps by numeric page namespace ID
(for example, 2 for User pages). Please share your thoughts on the task.

It's currently unclear whether implementing this task would result in
getting rid of the "pages-articles" dump. We could keep generating it, but
it costs a non-negligible amount of disk space to do so. If you regularly
use the "pages-articles" XML dump format and have thoughts about keeping
it as-is or changing it, please comment on the task.
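
For context, each <page> element in the dumps records its namespace in an
<ns> child, which is the value a per-namespace split would key on. Here's a
rough sketch (the local file name is made up) that tallies pages per
namespace:

<?php
// Rough sketch: tally pages per namespace in a MediaWiki XML dump.
// 'pages-articles.xml' is a hypothetical local copy of a dump file.
$reader = new XMLReader();
$reader->open( 'pages-articles.xml' );
$counts = [];
while ( $reader->read() ) {
    if ( $reader->nodeType === XMLReader::ELEMENT && $reader->localName === 'ns' ) {
        // Each <page> has exactly one <ns> child holding the numeric namespace ID.
        $ns = (int)$reader->readString();
        $counts[$ns] = isset( $counts[$ns] ) ? $counts[$ns] + 1 : 1;
    }
}
$reader->close();
ksort( $counts );
foreach ( $counts as $ns => $pages ) {
    echo "namespace $ns: $pages pages\n";
}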

If someone could forward this e-mail to the xmldatadumps-l and
wiki-research-l mailing lists, I would very much appreciate it.

MZMcBride




Re: [Wikitech-l] Final Call for Comments on the new Deprecation Policy for PHP code

2017-01-17 Thread Daniel Kinzler
Hi all!

Sorry for dropping the ball on reporting from the Architecture Committee. After
the holidays and the Developer Summit, I'm working on catching up. Things should
be back to normal by next week.

This final call regarding the Deprecation Policy for PHP Code has passed without
any new concerns being raised.

The RFC is thus approved.

The deprecation policy as described in the RFC should be considered official
policy.
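
For illustration, the usual hard-deprecation pattern in MediaWiki PHP code
looks roughly like this (the class, method, and version number below are
made up):

<?php
class ExampleService {
    /**
     * @deprecated since 1.28, use doThingProperly() instead
     */
    public function doThing() {
        // Logs a deprecation warning (or raises one, depending on configuration).
        wfDeprecated( __METHOD__, '1.28' );
        return $this->doThingProperly();
    }

    /** Replacement for the deprecated doThing(). */
    public function doThingProperly() {
        // ... actual implementation ...
    }
}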

Regards,
Daniel


On 16 Dec 2016 at 07:04, Daniel Kinzler wrote:
> Hi all!
> 
> At the ArchCom office hour (aka RFC IRC meeting) on December 14, we discussed
> Legoktm's proposal for a deprecation policy for PHP code [1]. This would
> supersede the current guideline [2], allowing deprecated code to be removed
> after a couple of releases, even if it may still be used by some extensions.
> 
> At the IRC meeting, we decided to approve this policy for a final call [3][4].
> This means that it will be adopted as official policy if no new pertinent
> issues are raised within a week.
> 
> 
> Cheers,
> Daniel
> 
> 
> [1] https://www.mediawiki.org/wiki/Requests_for_comment/Deprecation_policy
> [2] https://www.mediawiki.org/wiki/Deprecation
> [3]
> https://tools.wmflabs.org/meetbot/wikimedia-office/2016/wikimedia-office.2016-12-14-22.04.html
> [4]
> https://tools.wmflabs.org/meetbot/wikimedia-office/2016/wikimedia-office.2016-12-14-22.04.log.html
> 


-- 
Daniel Kinzler
Principal Platform Engineer

Wikimedia Deutschland
Gesellschaft zur Förderung Freien Wissens e.V.


Re: [Wikitech-l] Final Call for Comments on the Content Model Storage RFC

2017-01-17 Thread Daniel Kinzler
Hi all!

Again, sorry for dropping the ball on ArchCom reporting; things should be back
to normal by next week.

This final call regarding Content Model Storage has passed without any new
concerns being raised.

The RFC is thus approved as proposed, and implementation can go forward.

Implementation work and the deployment schedule should be coordinated with the
imminent work on Multi-Content Revisions. If you are going to work on Content
Model Storage, please talk to me first.
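
As a rough illustration of the idea described in the quoted summary below
(the function, table, and column names here are made up, not an actual
schema):

<?php
// Hypothetical sketch: look up (or assign) a small integer ID for a content
// model name, so revision rows can store the ID instead of the full string.
// $dbw is assumed to be a MediaWiki master database handle.
function acquireContentModelId( $dbw, $modelName ) {
    $id = $dbw->selectField(
        'content_models', 'cm_id', [ 'cm_name' => $modelName ], __METHOD__
    );
    if ( $id === false ) {
        // Not seen before: insert it, ignoring races with other requests.
        $dbw->insert(
            'content_models', [ 'cm_name' => $modelName ], __METHOD__, [ 'IGNORE' ]
        );
        $id = $dbw->selectField(
            'content_models', 'cm_id', [ 'cm_name' => $modelName ], __METHOD__
        );
    }
    return (int)$id;
}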

Thanks,
Daniel


On 14 Dec 2016 at 12:29, Daniel Kinzler wrote:
> Hi all!
> 
> This is a Final Call for Comments on the RFC on Content Model Storage [1][2].
> If no new and serious objections are raised within a week, the Architecture
> Committee will approve this RFC and drive its implementation.
> 
> The RFC on Content Model Storage was originally approved in 2015, but was then
> postponed in favor of another RFC, which proposes to create a separate content
> meta-data table [3] as part of the move towards multi-content revisions
> (MCR) [4].
> 
> However, MCR in turn got stuck on database performance concerns. So we now
> propose to go ahead with implementing the original RFC. The idea is to assign
> a number to every content model (and content format), and then use these
> numbers to refer to the models and formats in the database, instead of
> repeating the same string millions of times (which is my fault btw, sorry
> about that).
> 
> Since the original RFC was already approved, and the situation does not seem
> to have changed since then, we see no need for another round of discussions.
> If nobody raises any new and serious objections within a week, this should be
> good to go.
> 
> 
> Cheers,
> Daniel
> 
> 
> [1] https://phabricator.wikimedia.org/T105652
> [2] https://www.mediawiki.org/wiki/Requests_for_comment/Content_model_storage
> [3] https://phabricator.wikimedia.org/T142980
> [4]
> https://www.mediawiki.org/wiki/Multi-Content_Revisions/Content_Meta-Data#Database_Schema
> 


-- 
Daniel Kinzler
Principal Platform Engineer

Wikimedia Deutschland
Gesellschaft zur Förderung Freien Wissens e.V.
