Re: [Wikitech-l] Status of the new PDF Renderer

2014-05-29 Thread Andre Klapper
Hi,

On Mon, 2014-05-19 at 11:57 -0700, C. Scott Ananian wrote:
> That's a good question!  I'm in SFO this week, so it's probably worth
> setting aside a day to resync and figure out what the next steps for
> the new PDF renderer are.

Any news (or a public test instance available)?

As I wrote, I'd be interested in having a bugday on testing the new PDF
renderer by going through / retesting
https://bugzilla.wikimedia.org/buglist.cgi?resolution=---&component=Collection

Thanks,
andre
-- 
Andre Klapper | Wikimedia Bugwrangler
http://blogs.gnome.org/aklapper/


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Adding external libraries to core (SwiftMailer)

2014-05-29 Thread Tony Thomas
For a start, we will have to make the Jenkins bot run `composer install`
before running tests. Currently, Jenkins won't be able to find the
libraries and the tests fail;
ref: https://gerrit.wikimedia.org/r/#/c/135290.
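
For anyone who hasn't opened the change: the composer.json addition there is
roughly of this shape (the version constraint below is only a placeholder --
the exact requirement is in the patch itself):

    {
        "require": {
            "swiftmailer/swiftmailer": "5.*"
        }
    }
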
Thanks,
Tony Thomas
FOSS@Amrita

"where there is a wifi, there is a way"


On Wed, May 28, 2014 at 7:37 PM, Tony Thomas <01tonytho...@gmail.com> wrote:
> Yes, we will have the third-party code kept in a separate repo, though.
> Currently, MW users have to run:
>
> sudo apt-get install php-pear
> sudo pear install mail
> sudo pear install Net_SMTP
>
> to install PEAR and the mail packages to get mail working. After this
> patch, with Composer loaded with the Swift Mailer config as per
> (https://gerrit.wikimedia.org/r/#/c/135290/10/composer.json), the user
> just has to run
>
>  php composer.phar install
>
> from core, and Swift gets autoloaded.
>
>
> Thanks,
> Tony Thomas
> FOSS@Amrita
>
> "where there is a wifi, there is a way"
>
>
> On Wed, May 28, 2014 at 10:43 AM, Matthew Flaschen
>  wrote:
>>
>> On 05/27/2014 11:57 AM, Bryan Davis wrote:
>>>
>>> I think we should create a new repository in gerrit
>>> ("mediawiki/core/contrib"? Bikeshed as needed) where composer is used
>>> to manage importing specific versions of external libraries that are
>>> needed for the wiki[mp]edia cluster deployments.
>>
>>
>> Is the idea that third-party users would use Composer to install these 
>> libraries?
>>
>> Matt Flaschen
>>
>>
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Using Composer to manage libraries for mediawiki/core on Jenkins and Foundation cluster

2014-05-29 Thread Bryan Davis
My logging changes [0][1][2][3] are getting closer to being mergeable
(the first has already been merged). Tony Thomas' Swift Mailer change
[4] is also progressing. Both sets of changes introduce the concept of
specifying external library dependencies, both required and suggested,
to mediawiki/core.git via composer.json. Composer can be used by
people directly consuming the git repository to install and manage
these dependencies. I gave an example set of usage instructions in the
commit message for my patch that introduced the dependency on PSR-3
[0]. In the production cluster, on the Jenkins job runners, and in the
tarball releases, we will want a different solution.

My idea of how to deal with this is to create a new gerrit repository
(mediawiki/core/vendor.git?) that contains a composer.json file
similar to the one I had in patch set 7 of my first logging patch [5].
This composer.json file would be used to tell Composer the exact
versions of libraries to download. Someone would manually run Composer
in a checkout of this repository and then commit the downloaded
content, composer.lock file and generated autoloader.php to the
repository for review. We would then be able to branch and use this
repository as a git submodule in the wmf/1.2XwmfY branches that are
deployed to production and ensure that it is checked out along with
mw-core on the Jenkins nodes. By placing this submodule at $IP/vendor
in mw-core we would be mimicking the configuration that direct users
of Composer will experience. WebStart.php already includes
$IP/vendor/autoload.php when present, so integration with the rest of
mw-core should follow from that.
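
To make that concrete, the composer.json in mediawiki/core/vendor.git might
look roughly like this -- a sketch only; the package list, the pinned
versions, and the idea of pointing vendor-dir at the repository root are all
assumptions on my part (patch set 7 of the logging change has the real
starting point):

    {
        "name": "mediawiki/core-vendor",
        "description": "External libraries required by MediaWiki core on the WMF cluster",
        "require": {
            "psr/log": "1.0.0"
        },
        "config": {
            "vendor-dir": "."
        }
    }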

It would also be possible to add this repo to the tarballs for
distribution. There will probably need to be some adjustments for that
process, however, and the final result may be that release branches
update the mediawiki/core composer.json and provide a composer.lock
along with a pre-populated vendor directory. I would be glad to
participate in discussions of that use case, but we will have about 6
months before we need to solve it (and a new release management RFC to
resolve between now and then).

There are several use cases to consider for the general solution:

== Adding/updating a library ==
* Update composer.json in mediawiki/core/vendor.git
* Run `composer update` locally to download library (and dependencies)
* Run `composer dump-autoload --optimize` to make an optimized autoloader.php
* Commit changes
* Push changes for review in gerrit
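
As a concrete illustration, the first step above is usually just a small
edit to the require block of the vendor composer.json, e.g. (library names
and version constraints invented for the example):

    {
        "require": {
            "psr/log": "1.0.0",
            "monolog/monolog": "1.9.*"
        }
    }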

== Hotfix for an external library ==
At some point we will run into a bug or missing feature in a
Composer-managed library that we need to work around with a patch. Obviously
we will attempt to upstream any such fixes (otherwise what's the point of
this whole exercise?). To keep from blocking things for our production
cluster, we would want to fork the upstream, add our patch for local
use, and submit the patch upstream. While the patch is pending
review upstream, we would want to use our locally patched
version in production and on Jenkins.

Composer provides a solution for this with its "repositories" package
sources. The Composer documentation actually gives this exact example
in its discussion of the "vcs" repository type [6]. We would create
a gerrit repository tracking the external library, add our patch(es),
adjust the composer.json file in mediawiki/core/vendor.git to
reference our fork, and finally run Composer in
mediawiki/core/vendor.git to pull in our patched version.
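
In composer.json terms, the fork would be wired up roughly like this (the
repository URL, package name, and branch name are all invented for the sake
of the example):

    {
        "repositories": [
            {
                "type": "vcs",
                "url": "https://gerrit.wikimedia.org/r/mediawiki/libs/example-lib"
            }
        ],
        "require": {
            "upstream-vendor/example-lib": "dev-wmf-fixes"
        }
    }

The package name has to match the one declared in the fork's own
composer.json, and since custom repositories are consulted before Packagist,
Composer will pick up our patched branch.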

== Adding a locally developed library ==
The Platform Core team has been talking about extracting libraries
from mw-core and/or extensions to be published externally. This may be
done for any and all of the current $IP/includes/libs classes and
possibly other content from core such as FormatJson.

My idea for this would be to create a new gerrit repository for each
exported project. The project repo would contain a composer.json
manifest describing the project so that it can be published at
packagist.org like most Composer-installable libraries. In the
mediawiki/core/vendor.git composer.json file we would pull in these
libraries just like any third-party library. This isn't
functionally much different from the way that we use git submodules
today. There is one extra level of indirection when a library is
changed: the mediawiki/core/vendor.git repository will have to be
updated with the new library version before the hash for its git
submodule is updated in a deploy or release branch.
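
For reference, the manifest for such an extracted library would just be the
usual Packagist-style composer.json; something along these lines, with the
package name, namespace, license and requirements invented for the example:

    {
        "name": "mediawiki/example-utils",
        "description": "Example utility classes extracted from MediaWiki core",
        "license": "GPL-2.0+",
        "require": {
            "php": ">=5.3.3"
        },
        "autoload": {
            "psr-4": {
                "MediaWiki\\ExampleUtils\\": "src/"
            }
        }
    }

Once that is tagged and registered on packagist.org, the vendor.git
composer.json pulls it in exactly like any other dependency.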

== wmf/1.XwmfY branches ==
The make-wmf-branch script (found in mediawiki/tools/release.git) is
used to create the weekly release branches that are deployed by the
"train" on each Thursday. This script would be updated to branch the
new mediawiki/core/vendor.git repository and add the version-appropriate
branch as a submodule of mediawiki/core.git on the wmf/*
branch. This is functionally exactly what we do for extensions today.

== Updating a deployment branch ==

[Wikitech-l] Process change: New contributors getting editbugs on Bugzilla

2014-05-29 Thread Mark Holmquist
Hi,

For the nth time, in #mediawiki, I've had someone ask how to be able to mark a 
bug as resolved, or claim it, or mark it as a duplicate of another bug. I 
conceptually know that this means "getting editbugs permissions" but it happens 
so infrequently that I never know where to go.

Usually what happens is this:
1. I wrack my brain trying to remember the process for about 30 seconds
2. Failing, I try to ping Andre, Quim, and Sumana (none of whom are in the 
channel, sadly)
3. I search with duckduckgo and pull up nothing of any use
4. I search MediaWiki.org and find outdated status reports about greasemonkey 
scripts but nothing useful
5. I go to the developer hub pages and look at the welcome-to-the-community 
process but again find nothing describing this process

Solution: We've made every editbugs user able to add editbugs to an account. 
I've documented the process here: 
https://www.mediawiki.org/wiki/Bugzilla#Why_can.27t_I_claim_a_bug_or_mark_it_resolved.3F

Thanks to Chad for the quick resolution on this, hopefully this will be a 
positive change overall.

-- 
Mark Holmquist
Software Engineer, Multimedia
Wikimedia Foundation
mtrac...@member.fsf.org
https://wikimediafoundation.org/wiki/User:MHolmquist


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Unclear Meaning of $baseRevId in WikiPage::doEditContent

2014-05-29 Thread Aaron Schulz
Yes, it was for auto-reviewing new revisions. New revisions are seen as a
combination of (base revision, changes). If the base revision was reviewed
and the user is trusted, then so is the new revision. MW core had the
obvious cases of rollback and null edits, which are (base revision, no
changes). There is a lot more "base revision" detection in FlaggedRevs for
the remaining cases, some less obvious (user-supplied baseRevId, X-top edit
undo, fall back to prior edit).

If baseRevId is always set to the revision the user started from, it would
cause problems for that extension in the cases where it was previously
false.

It would indeed be useful to have a casRevId value that was the current
revision at the time of editing, just for CAS-style conflict detection.



--
View this message in context: 
http://wikimedia.7.x6.nabble.com/Unclear-Meaning-of-baseRevId-in-WikiPage-doEditContent-tp5028661p5028902.html
Sent from the Wikipedia Developers mailing list archive at Nabble.com.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] 404 errors

2014-05-29 Thread ENWP Pine
Hi, I'm getting some 404 errors consistently when trying to load some English 
Wikipedia articles. Other pages load ok. Did something break?

Pine
  
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] 404 errors

2014-05-29 Thread Dan Garry
I'm getting these errors too.

Judging by the chatter in #wikimedia-operations, this is being actively
looked into.

Dan


On 29 May 2014 13:34, ENWP Pine  wrote:

> Hi, I'm getting some 404 errors consistently when trying to load some
> English Wikipedia articles. Other pages load ok. Did something break?
>
> Pine
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l




-- 
Dan Garry
Associate Product Manager for Platform and Mobile Apps
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] 404 errors

2014-05-29 Thread Erik Moeller
It's being investigated, see #wikimedia-operations on irc.freenode.net.

Erik


On Thu, May 29, 2014 at 1:34 PM, ENWP Pine  wrote:

> Hi, I'm getting some 404 errors consistently when trying to load some
> English Wikipedia articles. Other pages load ok. Did something break?
>
> Pine
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l




-- 
Erik Möller
VP of Engineering and Product Development, Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Using Composer to manage libraries for mediawiki/core on Jenkins and Foundation cluster

2014-05-29 Thread Niklas Laxström
2014-05-29 20:27 GMT+03:00 Bryan Davis :
> What use cases did I miss? What other concerns do we have for this process?

The email subject does not cover third party users, so apologies if
this is not the correct place for this.

Currently, updating the translatewiki.net codebase is annoying, as git does
not handle files replaced with symlinks well.

I am not happy, since just a short while ago I had to spend unplanned effort
to be able to install extensions via Composer. We replace
composer.json with a symlink to our configuration repo, which
includes the extensions we install via Composer.

Ori has come up with a good solution for this issue, but that solution
requires changes to Composer. To me it looks like nobody is currently
working on that.

Is there a way I can expedite this fix?

Thank you for your efforts to bring composer support to MediaWiki. I
wish the process was less rough for me.

  -Niklas

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] 404 errors

2014-05-29 Thread ENWP Pine

Thanks Dan and Erik.

Pine

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] MediaWiki Security and Maintenance Releases: 1.19.16, 1.21.10 and 1.22.7

2014-05-29 Thread Markus Glaser
Hello everyone,

I would like to announce the release of MediaWiki 1.19.16, 1.21.10 and 1.22.7. 
This is a regular security and maintenance release. Download links are given at 
the end of this email. 

== Security ==
* (bug 65501) SECURITY: Don't parse usernames as wikitext on
  Special:PasswordReset.

== Bugfixes in 1.22.7 ==
* (bug 36356) Add space between two feed links.
* (bug 63269) Email notifications were not correctly handling the
  [[MediaWiki:Helppage]] message being set to a full URL. This is a regression
  from the 1.22.5 point release, which made the default value for it a URL.
  If you customized [[MediaWiki:Enotif body]] (the text of email notifications),
  you'll need to edit it locally to include the URL via the new variable
  $HELPPAGE instead of the parser functions fullurl and canonicalurl; otherwise
  you don't have to do anything.
* Add missing uploadstash.us_props for PostgreSQL.
* (bug 56047) Fixed stream wrapper in PhpHttpRequest.

== Bugfixes in 1.21.10 ==
* (bug 36356) Add space between two feed links.

Full release notes for 1.22.7:


Full release notes for 1.21.10:


Full release notes for 1.19.16:


Public keys:


**
1.22.7
**
Download:
http://releases.wikimedia.org/mediawiki/1.22/mediawiki-1.22.7.tar.gz

Patch to previous version (1.22.6):
http://releases.wikimedia.org/mediawiki/1.22/mediawiki-1.22.7.patch.gz

GPG signatures:
http://releases.wikimedia.org/mediawiki/1.22/mediawiki-core-1.22.7.tar.gz.sig
http://releases.wikimedia.org/mediawiki/1.22/mediawiki-1.22.7.tar.gz.sig
http://releases.wikimedia.org/mediawiki/1.22/mediawiki-1.22.7.patch.gz.sig

**
1.21.10
**
Download:
http://releases.wikimedia.org/mediawiki/1.21/mediawiki-1.21.10.tar.gz

Patch to previous version (1.21.9):
http://releases.wikimedia.org/mediawiki/1.21/mediawiki-1.21.10.patch.gz

GPG signatures:
http://releases.wikimedia.org/mediawiki/1.21/mediawiki-core-1.21.10.tar.gz.sig
http://releases.wikimedia.org/mediawiki/1.21/mediawiki-1.21.10.tar.gz.sig
http://releases.wikimedia.org/mediawiki/1.21/mediawiki-1.21.10.patch.gz.sig

Note:
There is no i18n patch as there are no changes in translation.

**
1.19.16
**
Download:
http://releases.wikimedia.org/mediawiki/1.19/mediawiki-1.19.16.tar.gz

Patch to previous version (1.19.15):
http://releases.wikimedia.org/mediawiki/1.19/mediawiki-1.19.16.patch.gz

GPG signatures:
http://releases.wikimedia.org/mediawiki/1.19/mediawiki-core-1.19.16.tar.gz.sig
http://releases.wikimedia.org/mediawiki/1.19/mediawiki-1.19.16.tar.gz.sig
http://releases.wikimedia.org/mediawiki/1.19/mediawiki-1.19.16.patch.gz.sig

Note:
There is no i18n patch as there are no changes in translation.


Markus Glaser
(Release Team)

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Using Composer to manage libraries for mediawiki/core on Jenkins and Foundation cluster

2014-05-29 Thread Bryan Davis
On Thu, May 29, 2014 at 2:38 PM, Niklas Laxström
 wrote:
> 2014-05-29 20:27 GMT+03:00 Bryan Davis :
>> What use cases did I miss? What other concerns do we have for this process?
>
> The email subject does not cover third party users, so apologies if
> this is not the correct place for this.

We need to hash out the third-party issues somewhere. I suppose this
thread is as good a place as any to bring up your issues.

> Currently, updating the translatewiki.net codebase is annoying, as git does
> not handle files replaced with symlinks well.
>
> I am not happy, since just a short while ago I had to spend unplanned effort
> to be able to install extensions via Composer. We replace
> composer.json with a symlink to our configuration repo, which
> includes the extensions we install via Composer.
>
> Ori has come up with a good solution for this issue, but that solution
> requires changes to Composer. To me it looks like nobody is currently
> working on that.
>
> Is there a way I can expedite this fix?

I think bug 65188 [0] is the solution suggested by Ori that you are
referring to. Would this alone be enough to fix the problems for
translatewiki.net? More directly, is translatewiki.net using the top-level
composer.json file to manage anything other than extensions? In
the near term, is there any better workaround for you (and others in a
similar position) than running `git update-index
--assume-unchanged composer.json` in your local checkout to make git
ignore any local changes to composer.json?

> Thank you for your efforts to bring composer support to MediaWiki. I
> wish the process was less rough for me.

Thanks for the thanks. :) Sorry for the pain, but you are one of the
power users who helps us work out the kinks.

[0]: https://bugzilla.wikimedia.org/show_bug.cgi?id=65188

Bryan
-- 
Bryan Davis  Wikimedia Foundation
[[m:User:BDavis_(WMF)]]  Sr Software Engineer  Boise, ID USA
irc: bd808  v: 415.839.6885 x6855

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Process change: New contributors getting editbugs on Bugzilla

2014-05-29 Thread Brian Wolff
Oh yay, finally.

I've been wishing we did that for literally years.

Thanks for making this happen.

--bawolff
On May 29, 2014 2:57 PM, "Mark Holmquist"  wrote:
>
> Hi,
>
> For the nth time, in #mediawiki, I've had someone ask how to be able to
mark a bug as resolved, or claim it, or mark it as a duplicate of another
bug. I conceptually know that this means "getting editbugs permissions" but
it happens so infrequently that I never know where to go.
>
> Usually what happens is this:
> 1. I wrack my brain trying to remember the process for about 30 seconds
> 2. Failing, I try to ping Andre, Quim, and Sumana (none of whom are in
the channel, sadly)
> 3. I search with duckduckgo and pull up nothing of any use
> 4. I search MediaWiki.org and find outdated status reports about
greasemonkey scripts but nothing useful
> 5. I go to the developer hub pages and look at the
welcome-to-the-community process but again find nothing describing this
process
>
> Solution: We've made every editbugs user able to add editbugs to an
account. I've documented the process here:
https://www.mediawiki.org/wiki/Bugzilla#Why_can.27t_I_claim_a_bug_or_mark_it_resolved.3F
>
> Thanks to Chad for the quick resolution on this, hopefully this will be a
positive change overall.
>
> --
> Mark Holmquist
> Software Engineer, Multimedia
> Wikimedia Foundation
> mtrac...@member.fsf.org
> https://wikimediafoundation.org/wiki/User:MHolmquist
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Status of the new PDF Renderer

2014-05-29 Thread Matthew Walker
I'm happy to report that, after a LONG time fighting with deployment, the
test instance is available in beta labs (en.wikipedia.beta.wmflabs.org and
all others) via the WMF PDF option in Special:Collection and on the side
panel.

It is still very rough in terms of reliable rendering (it doesn't like to
clean up after itself) -- but now that I have deployment sorted and it
running stably, that's my next task. Play away :D

~Matt Walker
Wikimedia Foundation
Fundraising Technology Team


On Thu, May 29, 2014 at 7:02 AM, Andre Klapper 
wrote:

> Hi,
>
> On Mon, 2014-05-19 at 11:57 -0700, C. Scott Ananian wrote:
> > That's a good question!  I'm in SFO this week, so it's probably worth
> > setting aside a day to resync and figure out what the next steps for
> > the new PDF renderer are.
>
> Any news (or a public test instance available)?
>
> As I wrote, I'd be interested in having a bugday on testing the new PDF
> renderer by going through / retesting
>
> https://bugzilla.wikimedia.org/buglist.cgi?resolution=---&component=Collection
>
> Thanks,
> andre
> --
> Andre Klapper | Wikimedia Bugwrangler
> http://blogs.gnome.org/aklapper/
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Status of the new PDF Renderer

2014-05-29 Thread Matthew Walker
I should have also noted -- there is something strange going on with the
frontend to Special:Collection. You have to manually refresh to see status
updates...

~Matt Walker
Wikimedia Foundation
Fundraising Technology Team


On Thu, May 29, 2014 at 5:56 PM, Matthew Walker 
wrote:

> I'm happy to report that after a LONG time fighting with deployment the
> test instance is available in beta labs (en.wikipedia.beta.wmflabs.org
> and all others) via the WMF PDF option in Special:Collection and on the
> side panel.
>
> It is still very rough in terms of reliable rendering (it doesn't like to
> clean up after itself) -- but now that I have deployment sorted and it
> running stably, that's my next task. Play away :D
>
> ~Matt Walker
> Wikimedia Foundation
> Fundraising Technology Team
>
>
> On Thu, May 29, 2014 at 7:02 AM, Andre Klapper 
> wrote:
>
>> Hi,
>>
>> On Mon, 2014-05-19 at 11:57 -0700, C. Scott Ananian wrote:
>> > That's a good question!  I'm in SFO this week, so it's probably worth
>> > setting aside a day to resync and figure out what the next steps for
>> > the new PDF renderer are.
>>
>> Any news (or a public test instance available)?
>>
>> As I wrote, I'd be interested in having a bugday on testing the new PDF
>> renderer by going through / retesting
>>
>> https://bugzilla.wikimedia.org/buglist.cgi?resolution=---&component=Collection
>>
>> Thanks,
>> andre
>> --
>> Andre Klapper | Wikimedia Bugwrangler
>> http://blogs.gnome.org/aklapper/
>>
>>
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>>
>
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Process change: New contributors getting editbugs on Bugzilla

2014-05-29 Thread Steven Walling
On Thursday, May 29, 2014, Mark Holmquist  wrote:

> Hi,
>
> For the nth time, in #mediawiki, I've had someone ask how to be able to
> mark a bug as resolved, or claim it, or mark it as a duplicate of another
> bug. I conceptually know that this means "getting editbugs permissions" but
> it happens so infrequently that I never know where to go.
>
> Usually what happens is this:
> 1. I wrack my brain trying to remember the process for about 30 seconds
> 2. Failing, I try to ping Andre, Quim, and Sumana (none of whom are in the
> channel, sadly)
> 3. I search with duckduckgo and pull up nothing of any use
> 4. I search MediaWiki.org and find outdated status reports about
> greasemonkey scripts but nothing useful
> 5. I go to the developer hub pages and look at the
> welcome-to-the-community process but again find nothing describing this
> process
>
> Solution: We've made every editbugs user able to add editbugs to an
> account. I've documented the process here:
> https://www.mediawiki.org/wiki/Bugzilla#Why_can.27t_I_claim_a_bug_or_mark_it_resolved.3F
>
> Thanks to Chad for the quick resolution on this, hopefully this will be a
> positive change overall.
>
> --
> Mark Holmquist


Thank you! This is extremely sensible. Hopefully we can make sure to avoid
duplicating this problem in Phabricator.



> Software Engineer, Multimedia
> Wikimedia Foundation
> mtrac...@member.fsf.org 
> https://wikimediafoundation.org/wiki/User:MHolmquist
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] 404 errors

2014-05-29 Thread Ori Livneh
On Thu, May 29, 2014 at 1:34 PM, ENWP Pine  wrote:

> Hi, I'm getting some 404 errors consistently when trying to load some
> English Wikipedia articles. Other pages load ok. Did something break?
>

TL;DR: A package update went badly.

Nitty-gritty postmortem:

At 20:25 (all times UTC), change Ie5a860eb9[0] ("Remove
wikimedia-task-appserver from app servers") was merged. There were two
things wrong with it:

1) The appserver package was configured to delete the mwdeploy and apache
users upon removal. The apache user was not deleted because it was logged
in, but the mwdeploy user was. The mwdeploy account was declared in Puppet,
but there was a gap between the removal of the package and the next Puppet
run during which the account would not be present.

2) The package included the symlinks /etc/apache2/wmf and
/usr/local/apache/common, which were not Puppetized. These symlinks were
unlinked when the package was removed.

Apache was configured to load configuration files from /etc/apache2/wmf,
and these include the files that declare the DocumentRoot and Directory
directives for our sites. As a result, users were served with 404s. At
20:40 Faidon Liambotis re-installed wikimedia-task-appserver on all
Apaches. Since 404s are cached in Varnish, it took another five minutes for
the rate of 4xx responses to return to normal (20:45).[1]

[0]: https://gerrit.wikimedia.org/r/#/c/136151/
[1]:
https://graphite.wikimedia.org/render/?title=HTTP%204xx%20responses%2C%202014-05-29&from=20:00_20140529&until=21:00_20140529&target=reqstats.4xx&hideLegend=true
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Process change: New contributors getting editbugs on Bugzilla

2014-05-29 Thread Legoktm

On 5/29/14, 11:57 AM, Mark Holmquist wrote:


Solution: We've made every editbugs user able to add editbugs to an account. 
I've documented the process here: 
https://www.mediawiki.org/wiki/Bugzilla#Why_can.27t_I_claim_a_bug_or_mark_it_resolved.3F


Does this only apply to every user who has editbugs right now, or will 
it also apply to those we give editbugs to in the future?



Thanks to Chad for the quick resolution on this, hopefully this will be a 
positive change overall.


Thank you for finally getting this done! :)

-- Legoktm

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Process change: New contributors getting editbugs on Bugzilla

2014-05-29 Thread Jasper Deng
If I recall correctly, it used to be the default, but it was removed after
some Bugzilla vandalism in 2011.


On Thu, May 29, 2014 at 10:40 PM, Legoktm 
wrote:

> On 5/29/14, 11:57 AM, Mark Holmquist wrote:
>
>  Solution: We've made every editbugs user able to add editbugs to an
>> account. I've documented the process here: https://www.mediawiki.org/
>> wiki/Bugzilla#Why_can.27t_I_claim_a_bug_or_mark_it_resolved.3F
>>
>
> Does this only apply to every user who has editbugs right now, or will it
> also apply to those we give editbugs to in the future?
>
>
>  Thanks to Chad for the quick resolution on this, hopefully this will be a
>> positive change overall.
>>
>
> Thank you for finally getting this done! :)
>
> -- Legoktm
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l