Re: [Wikitech-l] Wikimedia's GitHub org help

2020-08-25 Thread Addshore
Is Gerrit the main source of mirrors?
If so, could we not write a script that looks for .gitreview files and reads
the URL in there?
I imagine there is also some API for marking things as mirrored? (Or is it
more manual than that?)
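
Something like this might do as a first pass (a rough, untested sketch; it
uses unauthenticated GitHub API calls and ignores pagination, so a real run
would want a token and paging):

<?php
// List repos in the wikimedia GitHub org and check each one for a
// .gitreview file, printing the Gerrit project it points at.
function githubGet( string $url ) {
    $context = stream_context_create( [ 'http' => [
        'header' => "User-Agent: gitreview-scan\r\n",
        'ignore_errors' => true,
    ] ] );
    return json_decode( file_get_contents( $url, false, $context ), true );
}

$repos = githubGet( 'https://api.github.com/orgs/wikimedia/repos?per_page=100' );
foreach ( $repos as $repo ) {
    $file = githubGet(
        "https://api.github.com/repos/wikimedia/{$repo['name']}/contents/.gitreview"
    );
    if ( isset( $file['content'] ) ) {
        // .gitreview is INI-style; its "project" key names the Gerrit repo.
        $ini = parse_ini_string( base64_decode( $file['content'] ) );
        echo "{$repo['name']} -> Gerrit project: " . ( $ini['project'] ?? '?' ) . "\n";
    } else {
        echo "{$repo['name']} has no .gitreview (GitHub-only?)\n";
    }
}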

Another thought would be adding some .wmgithub file with structured info
about repos that are on GitHub.
Then, rather than maintaining a manual list that is likely to get out of
date, we could write a thin UI in front of the data in these files and the
GitHub API?
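
To make that concrete, such a file could be as small as something like this
(the shape is entirely made up):

{
    "canonical": "https://gerrit.wikimedia.org/g/mediawiki/core",
    "mirror": true,
    "maintainers": [ "Release-Engineering-Team" ]
}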

On Mon, 24 Aug 2020 at 23:47, Tyler Cipriani 
wrote:

> Hi all!
>
> If you've never created a repo or fork on the Wikimedia GitHub
> organization you can skip this email.
>
> I know that some repos are developed on our GitHub org for reasons.
> What is developed on our GitHub org? How many things are actively
> being developed on GitHub org? I have no idea :)
>
> I recently realized that there's not a great way to figure this
> out[0], but I've been able to narrow the scope a bit. Now I have a
> list of repos that are (a) in our GitHub org and (b) not in our Gerrit
> that I could use some help sorting through[1].
>
> == Help, please ==
>
> * Look through repos on The List™[1]
>
> If your repos are on the list, for each of your repos either:
>
> * Archive or Delete it if it's no longer maintained or empty/useless,
> respectively (and remove it from the list on mw.org)[2]
>
> Or:
>
> * put a "{{tick}}" in the "Active" column on the list on mw.org
>
> == Why ==
>
> In a more perfect future we could add the "mirror"[3] tag to repos on
> GitHub that are mirrored from Gerrit (with a link to their canonical
> repo locations; for example, gnome-desktop has this[4] and I'm very
> jealous).
>
> Hopefully, this will help folks wanting to contribute -- either a
> Wikimedia GitHub repo is a mirror (in which case there's a link to
> Gerrit in the description) or it's actively being developed on GitHub.
>
> <3
> -- Tyler
>
> [0]: 
> [1]: 
> [2]: <
> https://docs.github.com/en/github/creating-cloning-and-archiving-repositories/archiving-a-github-repository
> >
> [3]: <
> https://docs.github.com/en/github/getting-started-with-github/finding-ways-to-contribute-to-open-source-on-github#open-source-projects-with-mirrors-on-github
> >
> [4]: 
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Storing some values in the DB

2020-01-29 Thread Addshore
Sounds like you just want a simple key-value store.
I don't believe MediaWiki has a table or interface for this.

Slightly relevant would be https://phabricator.wikimedia.org/T227776, although
that RFC will likely come up with something that is in some way bound to a
page.
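
(For reference, a minimal, untested sketch of what reading and writing such
an extension-owned key-value table could look like, assuming a table like the
extension_value one sketched in the quoted reply below:)

use MediaWiki\MediaWikiServices;

// Hypothetical accessors for an extension-owned extension_value ( k, v ) table.
function getExtensionValue( string $key ): ?int {
    $dbr = MediaWikiServices::getInstance()->getDBLoadBalancer()
        ->getConnection( DB_REPLICA );
    $value = $dbr->selectField( 'extension_value', 'v', [ 'k' => $key ], __METHOD__ );
    return $value === false ? null : (int)$value;
}

function setExtensionValue( string $key, int $value ): void {
    $dbw = MediaWikiServices::getInstance()->getDBLoadBalancer()
        ->getConnection( DB_MASTER );
    $dbw->upsert(
        'extension_value',
        [ 'k' => $key, 'v' => $value ],
        [ 'k' ],
        [ 'v' => $value ],
        __METHOD__
    );
}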

On Tue, 28 Jan 2020 at 19:14, Jaime Crespo  wrote:

> > It complicates extension installation
>
> One could just copy and paste the existing migration code and just change
> the SQL to the right CREATE TABLE, and be done. If you just have to create
> a table, you will only have to do that once, no maintenance required.
> Trying to fit values into existing tables, however, will create a lot of
> maintenance headaches, as Brian said.
>
> # include/Hooks.php
> $base = __DIR__ . "/../sql";
> $updater->addExtensionTable( 'extension_value', $base . '/ExtensionValue.sql' );
> # sql/ExtensionValue.sql
> CREATE TABLE extension_value (
> k varchar(100) PRIMARY KEY,
> v int NOT NULL
> );
>
> If you are overwhelmed by SQL or the MediaWiki migration system, please just
> ask for help and people here, including me, will be able to help! :-D
> MediaWiki core code can be intimidating because of its support for several
> database systems and its long history of schema changes, but in your case
> adding a table should not take more than 2 files that are never touched
> again :-D. update.php will have to be run anyway on core updates. Handling
> MySQL and queries is far more difficult than creating the table in the first
> place!
>
> Or maybe we are not understanding what you are trying to achieve. 0:-)
>
> On Mon, Jan 27, 2020 at 5:00 PM Jeroen De Dauw 
> wrote:
>
> > Hey,
> >
> > > Why are you so reluctant to create a table?
> >
> > It complicates extension installation and adds yet another thing to
> > maintain. Which is a bit silly for a handful of values. I was hoping
> > MediaWiki would have a generic table to put such things already.
> Page_props
> > is exactly that, just bound to specific pages.
> >
> > Cheers
> >
> > --
> > Jeroen De Dauw | www.EntropyWins.wtf 
> > Professional wiki hosting and services: www.Professional.Wiki
> > 
> > Software Crafter | Speaker | Entrepreneur | Open Source and Wikimedia
> > contributor ~=[,,_,,]:3
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
>
>
> --
> Jaime Crespo
> 
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Collecting UI feedback for PolyGerrit - Gerrit

2018-10-04 Thread Addshore
I'm still using the old UI; here are some things that I have noticed.

#1 While comparing gerrit-review.googlesource.com and our gerrit install I
have found that the googlesource version seems easier to look at. I think
some of this is down to the greater contrast and the couple of extra
separating lines that they use. The main one that I would like to see on our
install is a line between the metadata on the left and the commit message in
the centre of the screen.

#2 My workflow now requires an extra click. I don't use git-review and
instead just use the links provided by gerrit to check out patches. Again,
this seems to be less of an issue on googlesource, which has a "download"
button on the main patch pages; ours, however, is hidden within the "more"
menu. I would like to see the download button straight away and avoid this
extra click.

#3 Upstream seems to have a "sizeBars" element on each changed file
indicating the additions and removals. I like this, and miss it in our
current "old UI".

#4 When Jenkins reports on a patch, the FAILURE or SUCCESS text is coloured
either red or green in the old UI. googlesource also has some coloured
representation of success or failure when reported by their CI, but our new
PolyGerrit doesn't seem to.

That's it from me right now.

On Wed, 3 Oct 2018 at 23:29, Thiemo Kreuz  wrote:

> Paladox wrote:
>
> > i am collecting feedback for Gerrit's New UI […]
>
> You might want to check out the CSS tweaks I developed for the old
> Gerrit UI. This stylesheet removes a lot of clutter, makes Gerrit
> usable on smaller laptop screens, and increases critical click
> regions. If the new Gerrit UI looks as clean as my tweaked version (or
> when similar tweaks can be applied to the new UI), I'm happy.
>
> https://meta.wikimedia.org/wiki/User:Thiemo_Kreuz_(WMDE)/userContent.css
>
> Best
> Thiemo
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Ops] Heads up: multi-content revision patches landed in master

2018-06-15 Thread Addshore
Per the plan in https://phabricator.wikimedia.org/T196585 a new branch was
created last night, including the merged patches.
The branch is called wmf.999, so as not to mess up the wmf.x branch numbers
while also not breaking tools.
The branch is now on group0 wikis.

On Thu, 14 Jun 2018 at 17:04, Daniel Kinzler 
wrote:

> Hello all!
>
> Several patches related to multi-content[1] revisions landed on master
> today.
> The big ones are:
>
> * https://gerrit.wikimedia.org/r/c/mediawiki/core/+/405015
> * https://gerrit.wikimedia.org/r/c/mediawiki/core/+/406595
> * https://gerrit.wikimedia.org/r/c/mediawiki/core/+/416465
>
> These patches *should* by themselves not change any behavior. All new
> features are still disabled, and the new database tables are not used yet.
>
> We plan to have these on testwiki for more than a week before they go live
> with
> the train in the week of the 25th. We made a plan with releng for this[2].
> Deployment to testwiki (group0) will probably happen tonight.
>
> Until then, the new code can already be tested on the beta cluster. We did
> extensive manual testing beforehand on a dedicated vps. Test logs are
> available
> at [3].
>
> If you find any problems, please file tickets on phabricator and tag them
> with
> the #Multi-Content-Revisions project.
>
> Overall progress of the MCR storage layer deployment is tracked on
> Phabricator
> [4]. Next steps will be to further refactor the code that is now in the
> (intermediate/internal) DerivedPageDataUpdater class, and to change core
> code to
> use the new interfaces. We will also look into providing alternatives to
> several
> of the hook points triggered during page edits.
>
> Cheers,
> Daniel
>
> [1]
> https://www.mediawiki.org/wiki/Requests_for_comment/Multi-Content_Revisions
> [2] https://phabricator.wikimedia.org/T196585
> [3]
>
> https://www.mediawiki.org/wiki/User:Daniel_Kinzler_(WMDE)/MCR-StorageLayerTesting
> [4] https://phabricator.wikimedia.org/T174044
>
> --
> Daniel Kinzler
> Principal Platform Engineer
>
> Wikimedia Deutschland
> Gesellschaft zur Förderung Freien Wissens e.V.
>
> ___
> Ops mailing list
> o...@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/ops
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] +2 nomination for Jakob in mediawiki/*

2018-05-20 Thread Addshore
Hi,

I've filed <https://phabricator.wikimedia.org/T195220>, nominating
Jakob for +2 in mediawiki/* repos.

-- Addshore
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] +2 nomination for WMDE-leszek in mediawiki/*

2018-01-23 Thread Addshore
Hi,

I've filed <https://phabricator.wikimedia.org/T185593>, nominating
WMDE-leszek for +2 in mediawiki/* repos.

-- Addshore
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Setting logged in user context on unit tests

2017-12-27 Thread Addshore
Take a look at MediaWikiTestCase::insertPage

https://phabricator.wikimedia.org/source/mediawiki/browse/master/tests/phpunit/MediaWikiTestCase.php;8eaee6fd06d9089ef90032530af9a9d25a52a1fc$997
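
Roughly like this (a quick sketch from memory, so double-check the exact
return shape against your MediaWiki version):

// Inside a MediaWikiTestCase subclass: create and save a wiki page in one call.
$page = $this->insertPage( 'TestPage', 'this is a test' );
$title = $page['title'];   // Title object for the created page
$pageId = $page['id'];     // page ID of the saved page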

On 26 December 2017 at 09:58, Tony Thomas <01tonytho...@gmail.com> wrote:

> Great. This would really help. I just posted the same in our tasks (which
> are GCI tasks now) so that people would use it. One more thing: in a unit
> test, is this the only way to create a WikiPage and save it to the DB?
>
> $title = Title::newFromText( 'TestPage' );
> $wikiPage = WikiPage::factory( $title );
> $content = new WikitextContent( $text='this is a test' );
> $wikiPage->doEditContent( $content, $summary='Test commit' );
>
> or are there some other simpler ways ?
>
>
> --
> Tony Thomas
> https://mediawiki.org/wiki/User:01tonythomas
> --
>
> On Tue, Dec 26, 2017 at 1:27 AM, Gergo Tisza  wrote:
>
> > On Mon, Dec 25, 2017 at 12:00 PM, Tony Thomas <01tonytho...@gmail.com>
> > wrote:
> >
> > > Came across a situation similar to [1] where I had to call submit()
> > > function of a FormSpecialPage during a unit test. We have something
> going
> > > on in the background, and $this->getUser() needs to return a valid
> user.
> > >
> > > Is there a way to mimic this context inside the unit test, so that I
> can
> > > manually set a $user in the unit test, and this would be used while
> > > performing inner operations ?
> > >
> >
> > No need to mimic, you can just inject it (as long as the special page
> > correctly uses $this->getUser() & co instead of using globals, which is
> > usually not a problem with special pages) :
> >
> > $specialPage = $this->newSpecialPage();
> > $context = new DerivativeContext( RequestContext::getMain() );
> > $context->setUser( $user );
> > $context->setRequest( ... );
> > $specialPage->setContext( $context );
> > $res = $specialPage->onSubmit( $input );
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> >
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] First step for MCR merged: Deprecate and gut the Revision class

2017-12-23 Thread Addshore
So, the patches that would need to be reverted can be found on wikitech
https://wikitech.wikimedia.org/wiki/User:Addshore/MCR_Revert

I have also created a patch with a switch wrapping the refactoring
https://gerrit.wikimedia.org/r/#/c/399881/

I'm going to continue testing this code on beta over the Christmas period,
patching any holes that I find as I do.

On 23 December 2017 at 00:14, Daniel Kinzler 
wrote:

> On 23.12.2017 at 00:03, C. Scott Ananian wrote:
> > I think a simple revert would be simplest.  Adding a feature flag adds
> new
> > possibilities of overlooked bugs, especially since this is "just" a
> > refactoring and so *in theory* shouldn't be changing anything.
> >
> > Maybe we could just cherry-pick a revert onto the Jan 2 branch, rather
> than
> > revert on master and then un-revert after the branch.
>
> A revert is certainly an option; I tried to keep this as isolated as
> possible.
> Reverting just on the branch would allow us to keep testing on beta without
> disruption, and without having to go back and forth on core code, causing
> merge conflicts.
>
> But there is another option to consider: Only deploy to testwiki (and
> friends)
> on Jan 2, and not to production wikis. This would give us a week to look
> at this
> in a production-like environment, on top of the time on beta, before it
> really
> goes live.
>
> -- daniel
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] First step for MCR merged: Deprecate and gut the Revision class

2017-12-22 Thread Addshore
So the plan going forward will be to create a feature flag for the MCR
Revision gutting.
I'll crack on with that this evening.

If that turns out to be too messy then we can revert the MCR patches for
the next wmf branch.
I'm currently keeping track of this @
https://wikitech.wikimedia.org/wiki/User:Addshore/MCR_Revert

On 22 December 2017 at 18:39, Ramsey Isler <ris...@wikimedia.org> wrote:

> Fantastic news! Great work handling this behemoth of a technical challenge.
>
> On Fri, Dec 22, 2017 at 2:26 AM, Daniel Kinzler <
> daniel.kinz...@wikimedia.de> wrote:
>
>> Hello all!
>>
>> Addshore last night merged the patch[1] that is the first major step
>> towards
>> Multi-Content-Revisions[2]: it completely guts the Revision class and
>> turns it
>> into a thin proxy for the new RevisionStore service. The new code is now
>> live
>> on beta.
>>
>> This is our second attempt: The first one, on December 18th, thoroughly
>> corrupted the beta database. It took us some time and a lot of help from
>> Aaron
>> and especially Roan to figure out what was happening. A detailed
>> post-mortem by
>> Roan can be found at [3].
>>
>> Anyway - this stage of MCR development introduces the new multi-revision
>> capable
>> interface for revision storage (and blob storage) [4]. It does not yet
>> introduce
>> the new database schema, that will be the next step [5][6]. While doing
>> the
>> refactoring, I tried to keep the structure of the existing code mostly
>> intact,
>> just moving functionality out of Revision into the new classes, most
>> importantly
>> RevisionRecord, RevisionStore, and BlobStore.
>>
>> Beware that with the next deployment (due January 2nd) the live sites
>> will start
>> using the new code. Please keep an eye out for any strangeness regarding
>> revision handling. Adam greatly improved test coverage of the relevant
>> code
>> (thanks Adam!), but it's always possible that we missed some edge case,
>> maybe
>> something about archived revisions that were partially migrated from an
>> old
>> schema or something similarly fun.
>>
>> Exciting times!
>>
>> Cheers
>> Daniel
>>
>>
>> [1] https://gerrit.wikimedia.org/r/#/c/399174/
>> [2] https://www.mediawiki.org/wiki/Requests_for_comment/Multi-
>> Content_Revisions
>> [3] https://phabricator.wikimedia.org/T183252#3853749
>> [4] https://phabricator.wikimedia.org/T174025
>> [5] https://phabricator.wikimedia.org/T174024
>> [6] https://phabricator.wikimedia.org/T174030
>>
>>
>> --
>> Daniel Kinzler
>> Principal Platform Engineer
>>
>> Wikimedia Deutschland
>> Gesellschaft zur Förderung Freien Wissens e.V.
>>
>>
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
>
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Announcing MediaWiki code search

2017-12-21 Thread Addshore
Really great!

I'll move my GitHub searches to this ASAP :D

On 21 December 2017 at 04:15, Kunal Mehta  wrote:

> -BEGIN PGP SIGNED MESSAGE-
> Hash: SHA512
>
> Hi,
>
> MediaWiki code search is a fully free software tool that lets you
> easily search through all of MediaWiki core, extensions, and skins
> that are hosted on Gerrit. You can limit your search to specific
> repositories, or types of repositories too. Regular expressions are
> supported in both the search string, and when filtering by path.
>
> Try it out: https://codesearch.wmflabs.org/search/
>
> I started working on this because the only other options for searching
> the entire MediaWiki codebase were either cloning everything locally
> (takes up space, and needs to be manually kept up to date) or using
> GitHub (not free software, has extraneous repositories). The backend
> is powered by hound, a code search tool written by Etsy, based on
> Google's Code Search.
>
> Please let me know what you think! More documentation and links are
> at: .
>
> - -- Legoktm
> -BEGIN PGP SIGNATURE-
>
> iQJLBAEBCgA1FiEE+h6fmkHn9DUCyl1jUvyOe+23/KIFAlo7NUoXHGxlZ29rdG1A
> bWVtYmVyLmZzZi5vcmcACgkQUvyOe+23/KJn/w//YYSD6Fer5EfQAXj+frd02rB5
> yx8cowO4ttPFG+52ZTt4RE24SdjSFcz42jnq6wuSQ47pQsZHgDc5qrr6JRsFGq9l
> Bvnh7NIYsHHOdQDTkxwHHwaHBTb31u35Bt8+qSHPqbB3cCAHMirJJjvs5+yoilIi
> wCmbjpxYoL4eUiMNeZRH/eYyUxpZJwHadc2FuuN3meUIgKoFAblHnKdxTmYoExqr
> 86PkjE36trbvOQkfrxaSyGJjG5Nm7l+83rm3pCo5pX9Fj/GZOdxcp0siRBKGaQ7W
> OciRofZAPjtqmiUunf2pe/wVEAK51VS7EkobgWraSSOwBf62PN7hHVLXQanRn8bh
> tQEcKHOxoVSXDlM/fl45cIBN/YGm9LEmRk0iB1HlZZ+QSC3XYj3kL/eMLlGorOuX
> MtKZ+J1KOjNJ2fmCMBZhGDzdHPSN70VSAN2Th3kqpDTGzXLTcn3D0VqIT0gQ6eiz
> lVyW0haiDuBS7JixZDdLFNr8RkMRLRWmJEdQQi/5VEp1I7K/UQmmt50HqzDBN4d6
> /0iKw8p5lANdmjP1rsVzmRrc5C94IS6GN68VznfXMPD+iXI4j1PEeJ6cgEn4aD3y
> oh2bD4nmX/T4YfBeigWxPVq3OyPHC5tPzTxdy8OHPNfko/xpwhlBMaf70fBIaBPy
> Ciq+thh5hlKuCT1HdXI=
> =Te+C
> -END PGP SIGNATURE-
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Announcing a new security testing tool for MediaWiki extensions "phan-taint-check-plugin"

2017-12-14 Thread Addshore
WMF CI currently uses a docker image to run phan tests.

This is currently at https://hub.docker.com/r/wmfreleng/mediawiki-phan/

Once we get this running in CI, there will also be a docker image for phan
with the security plugin.

On 14 December 2017 at 15:40, Brian Wolff  wrote:

> The 7.0 requirement is due to phan 0.8. You could try changing the version
> of phan to a higher one (I used phan 0.8 originally because that's what
> Wikimedia used in their continuous integration setup, which in retrospect
> really didn't matter). I have not tried it with higher versions of phan. I
> have no idea how stable the phan plugin API is, so it could totally work
> with higher versions of phan - I have no idea. That is really something I
> should test.
>
> You should be able to co-install both versions of PHP beside each other,
> with the PHP 7.0 binary named php7.0 instead of php. On Macs, Homebrew will
> let you do this, and I assume other installation methods will let you do
> this too.
>
> Thanks,
> Brian
>
> On Thursday, December 14, 2017, Tom Bishop, Wenlin Institute <
> tan...@wenlin.com> wrote:
> >
> >
> >> On Dec 11, 2017, at 4:09 PM, Brian Wolff  wrote:
> >>
> >> ...
> >> Note: the tool has a requirement of php 7.0 (neither higher nor lower)
> >> see
> https://www.mediawiki.org/wiki/Continuous_integration/Phan#Dependencies
> >> for how to install php 7.0 if your system doesn't have it.
> >
> > I'm interested in trying it. However, I'm on macOS with php 7.1.1 and
> reluctant to downgrade to php 7.0 or set up a virtual machine just for
> this. Has anybody tried it with macOS and/or php 7.1.1?
> >
> > Thanks!
> >
> > Tom
> >
> > Wenlin Institute, Inc. SPC (a Social Purpose Corporation)
> > 文林研究所社会目的公司
> > Software for Learning Chinese
> > E-mail: wen...@wenlin.com Web: http://www.wenlin.com
> > Telephone: 1-877-4-WENLIN (1-877-493-6546)
> > ☯
> >
> >
> >
> >
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] +2 nomination for WMDE-Fisch in mediawiki/*

2017-12-06 Thread Addshore
Hi,

I've filed , nominating
WMDE-Fisch for +2 in mediawiki/* repos.

-- Addshore
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] MediaWiki CI failed due to lack of tidy.so [solved]

2017-06-28 Thread Addshore
Thanks for fixing this!

This was failing on some of the WMDE repos yesterday and this saves us
looking into it!

On 27 June 2017 at 12:52, Jon Robson <jdlrob...@gmail.com> wrote:

> Thank you for shedding light on this!
>
>
> On Tue, 27 Jun 2017 at 12:44 zppix e <megadev44s.m...@gmail.com> wrote:
>
> > Thanks. Was wondering what went wrong.
> >
> > Zppix
> > Volunteer developer for WMF
> > enwp.org/User:Zppix
> >
> > On Jun 27, 2017 1:55 PM, "Antoine Musso" <hashar+...@free.fr> wrote:
> >
> > > Hello,
> > >
> > > Jenkins jobs relying on HHVM had trouble today, exiting with:
> > >
> > >   /usr/lib/x86_64-linux-gnu/hhvm/extensions/20150212/tidy.so:
> > >   cannot open shared object file: No such file or directory
> > >
> > > The root cause is that installing libtidy-dev ends up uninstalling the
> > > HHVM tidy extension (hhvm-tidy).
> > >
> > > The issue occurred between 16:20 UTC and 18:45 UTC.  To the best of my
> > > knowledge it is fully solved.
> > >
> > > Sorry for the inconvenience :(
> > >
> > >
> > > Ref:
> > > https://phabricator.wikimedia.org/T169004
> > > Follow up task:
> > > https://phabricator.wikimedia.org/T169008
> > >
> > > --
> > > Antoine "hashar" Musso
> > >
> > >
> > > ___
> > > Wikitech-l mailing list
> > > Wikitech-l@lists.wikimedia.org
> > > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>



-- 
Addshore
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] BetaFeatures grafana dashboard

2016-07-29 Thread Addshore
Recently WMDE started rolling out the RevisionSlider beta feature to a
handful of wikis.

We wanted to track how many people were using the feature as well as how
many people were enabling / disabling it per day.

Getting this data for all beta features is no more difficult than getting
it for the RevisionSlider only.
Thus we now have this dashboard. Enjoy!

https://grafana.wikimedia.org/dashboard/db/betafeatures

Any comments / suggestions are very welcome.

-- 
Addshore
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Best practice for WIP patches to help code review office hours

2016-05-13 Thread Addshore
Gerrit also has drafts...
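
If I remember right, you can push a change as a draft with something like the
following (worth double-checking against the docs for our Gerrit version):

git push origin HEAD:refs/drafts/master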

On 13 May 2016 at 01:56, Jon Robson <jdlrob...@gmail.com> wrote:

> .
> On 12 May 2016 4:18 p.m., "Stas Malyshev" <smalys...@wikimedia.org> wrote:
> >
> > Hi!
> >
> > > No, -2 is restricted to project owners and thus not an option
> > > for the vast majority of contributors.  For that purpose,
> > > I proposed a Gerrit label "WIP"
> > > (cf.
>
> http://permalink.gmane.org/gmane.science.linguistics.wikipedia.technical/84068
> ).
> >
> > This looks like a nice solution.
>
> Seconded. What's stopping us from adopting it? It seems nothing happened in
> that thread.
>
> Greg - is this something we can do?
>
> >
> > --
> > Stas Malyshev
> > smalys...@wikimedia.org
> >
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>



-- 
Addshore
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] WatchedItemStore - What, Why & How

2016-05-10 Thread Addshore
Over the past few months the TCB team at WMDE has been working on
refactoring code in core surrounding watchlists.

You can find a full blog post about what we did, why we did it and how we
did it at the link below:
http://addshore.com/2016/05/refactoring-around-watcheditem-in-mediawiki

tl;dr: This work was done to make introducing expiring watchlist entries
easier. Code was removed from various API modules, special pages and other
random places. The newly extracted code now has basically 100% unit &
integration test coverage.

-- 
Addshore
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Engineering] Looking for more additions to SWAT (deploy) team

2016-03-04 Thread Addshore
I'll echo exactly what David said.

I'm willing to help (essentially only morning SWAT, due to being in Europe).
I've never deployed anything on the cluster, so I'll need some help for the
first patches.

On 3 March 2016 at 13:43, David Causse <dcau...@wikimedia.org> wrote:

> Hi Greg,
>
> I'm willing to help (mostly morning SWAT).
> I've never deployed anything so I'll need some help for the first patches.
>
> David.
>
>
> Le 02/03/2016 23:21, Greg Grossmeier a écrit :
>
>> As you know, one of the quickest ways of getting your fix backported to
>> production (from master) is to use a SWAT window:
>> https://wikitech.wikimedia.org/wiki/SWAT_deploys
>>
>> And now, I ask for new volunteers!
>>
>> Would you like to help make developers happy and be a part of the crew
>> of people who deploy during these windows? Of course you would!
>>
>> These happen twice a day every work day (except Friday, naturally) at
>> 8:00 SF (16:00 UTC currently) and 16:00 SF (00:00 UTC).
>>
>> We're open to both those already familiar with deploying to Wikimedia
>> servers and those who are not; we're friendly and willing to teach :)
>>
>> Requirements are:
>> * A shell account on production
>> ** See: https://wikitech.wikimedia.org/wiki/Requesting_shell_access
>> * Availability during one of the two windows on a regular basis
>> * A willingness to learn and being comfortable with asking for
>>help/advice when you aren't sure, especially when you aren't sure what
>>a particular patch will actually do in production.
>>
>>
>> What you'll get:
>> * Fancy access to deploy to "A Top 10 Web Property"(TM)(C)
>> * Support from current SWATers to get started
>> * A t-shirt and/or sticker if you end up breaking, AND FIXING,
>>production
>>
>> Let me know if you're interested!
>>
>> Greg
>>
>>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>



-- 
Addshore
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Download link to properly packed extension

2016-02-09 Thread Addshore
Hi!

Take a look at https://www.mediawiki.org/wiki/Special:ExtensionDistributor

And thus http://extdist.wmflabs.org/dist/extensions/

Addshore

On 9 February 2016 at 08:16, Stephan Gambke <s7ep...@gmail.com> wrote:

> I now used
> https://github.com/wikimedia/mediawiki-extensions-FooBar/archive/someTag.zip
> GitHub will put the files in a folder named
> mediawiki-extensions-FooBar-someTag, which is not ideal, but better
> than risking a mess in the .../extensions folder.
> If there is a better solution I'd still be interested.
>
> Stephan
>
> On 8 February 2016 at 23:51, Stephan Gambke <s7ep...@gmail.com> wrote:
> > It is possible to download extensions in ZIP format from the WMF repo
> > using a link like this:
> >
> >
> http://git.wikimedia.org/zip/?r=mediawiki/extensions/FooBar.git=someTag=zip
> >
> > However, this will produce a package with the extension's files in its
> > root folder. An unsuspecting user will probably simply extract this
> > package into the .../extensions folder, with all its files ending up
> > there instead of in a dedicated subfolder.
> >
> > The ExtensionDistributor will provide correctly built packages, but it
> > is apparently only working on MW release tags, e.g. REL1_26.
> >
> > Is there any way to download correctly built packages with tags other
> > than the MW relases?
> >
> > Stephan
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>



-- 
Addshore
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] can't get git-review to work

2015-10-22 Thread Addshore
As someone who has never used git-review, I would
suggest not using git-review ;)

On 22 October 2015 at 10:32, Rosalie Perside <rosaliepers...@gmail.com>
wrote:

> I was in the channel yesterday and I think that Kaldari had this
> problem solved with advice d3r1ck gave him. A guy in #git gave
> Kaldari an idea and it worked, but he forgot to update the mailing
> list.
>
> Cheers Rosalie
>
> On 10/22/15, Antoine Musso <hashar+...@free.fr> wrote:
> > Le 21/10/2015 22:47, Ryan Kaldari a écrit :
> >> Recently I upgraded my Mac to OSX 10.10.5 and upgraded a lot of other
> >> stuff
> >> in the process. But now I can't get git-review to work. Whenever I try
> to
> >> run it, it gives the following stacktrace:
> > 
> >>
> "/Library/Python/2.7/site-packages/distribute-0.6.14-py2.7.egg/pkg_resources.py",
> >> line 552, in resolve
> >> raise DistributionNotFound(req)
> >> pkg_resources.DistributionNotFound: git-review
> >
> > This is an issue in setuptools pkg_resources. You want to upgrade
> > setuptools:
> >
> >  pip install --upgrade setuptools
> >
> > Should solve the issue.
> >
> > ...
> >> If I run "/usr/bin/easy_install --version":
> >> distribute 0.6.14
> >
> > easy_install is legacy; it has been replaced by pip.
> >
> >
> > --
> > Antoine "hashar" Musso
> >
> >
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>



-- 
Addshore
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Getting mediawiki core to pass codesniffer standard

2015-09-26 Thread Addshore
This is also the approach we have just taken on the Wikibase codebase.
It worked very well!

On 25 September 2015 at 22:45, Tyler Romeo <tylerro...@gmail.com> wrote:

> On Fri, Sep 25, 2015 at 2:08 PM, Legoktm <legoktm.wikipe...@gmail.com>
> wrote:
>
> > I've tried out a different approach in
> > <https://gerrit.wikimedia.org/r/#/c/241085/>, by disabling all the
> > failing rules. This will let us make the rules that do pass voting (e.g.
> > no closing ?> tags), and we can selectively enable failing rules instead
> > of trying to make giant patches and hope no one introduces regressions
> > while it's still non-voting.
> >
>
> Very much agree with this post. I have worked on phpcs compliance on other
> projects at my company, and we have found it vastly easier to stage it in
> over time; that way, at the very least, additional errors are not
> accidentally introduced.
>
> *-- *
> *Tyler Romeo*
> Stevens Institute of Technology, Class of 2016
> Major in Computer Science
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>



-- 
Addshore
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Wikidata-tech] Wikidata API breaking changes

2015-09-09 Thread Addshore
Magnus, is that on the live site?
If so, the breaking change is not being deployed until this evening! :)
I believe this was probably spotted on either test or beta!

On 9 September 2015 at 10:14, Magnus Manske <magnusman...@googlemail.com>
wrote:

> Hm, works for me:
>
> {"entities":{"Q42":{"pageid":138,"ns":0,"title":"Q42"
>
>
>
> On Wed, Sep 9, 2015 at 8:24 AM Lydia Pintscher <
> lydia.pintsc...@wikimedia.de> wrote:
>
>> On Wed, Sep 9, 2015 at 8:04 AM, John Mark Vandenberg <jay...@gmail.com>
>> wrote:
>> > The merged changeset included changes which were not advertised,
>> > causing pywikibot to break.  See T110559
>> >
>> > The wbgetentities JSON 'entities' is now an array/list of entities
>> > instead of a mapping/dictionary of 'Qd' => entity.
>>
>> We're looking into it now.
>>
>>
>> Cheers
>> Lydia
>>
>> --
>> Lydia Pintscher - http://about.me/lydia.pintscher
>> Product Manager for Wikidata
>>
>> Wikimedia Deutschland e.V.
>> Tempelhofer Ufer 23-24
>> 10963 Berlin
>> www.wikimedia.de
>>
>> Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
>>
>> Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
>> unter der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das
>> Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.
>>
>> ___
>> Wikidata-tech mailing list
>> wikidata-t...@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikidata-tech
>>
>
> ___
> Wikidata-tech mailing list
> wikidata-t...@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata-tech
>
>


-- 
Addshore
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Wikidata API breaking changes

2015-08-27 Thread Addshore
Hi all!


We have some breaking API changes that will soon be deployed to wikidata.org.


The deployment date should be: 9th September 2015 (just under 2 weeks)


The change making these breaks can be found at:

https://gerrit.wikimedia.org/r/#/c/227686/


The breaking changes are:

 - XML output: aliases are now grouped by language
 - XML output may no longer give elements when they are empty
 - XML output: any claim, qualifier, reference or snak elements that had an
'_idx' element will no longer have it
 - ALL output may now give empty elements, e.g. labels when an entity has none


If you want to see a wikipage explaining these changes take a look at:

https://www.wikidata.org/wiki/User:Addshore/API_Break_September_2015


If you have any questions regarding these breaking changes please ask!


Addshore
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] MediaWiki Technical Debt

2014-10-24 Thread Addshore
The maximum allowed limit is 5000 issues.
We currently only have 3886 (or that's what the scan is currently showing...).

It would be great to get the .scrutinizer.yml file in the git repo so that
we can see what inspections are actually being run and people can poke them!

I did some work toward getting scrutinizer some time ago including the
following pull req that has been merged (and I guess deployed now) as prior
to this the inspection crashed out.
https://github.com/scrutinizer-ci/php-analyzer/pull/133

Great work Hashar!

On 23 October 2014 at 23:07, Jeroen De Dauw <jeroended...@gmail.com> wrote:

> Hey,
>
> > Since a friend introduced me to Scrutinizer yesterday and the graph above
> > seems to be based on it,
>
> SensioLabsInsight != ScrutinizerCI
>
> > I added mediawiki/core to their interface:
> >
> > https://scrutinizer-ci.com/g/wikimedia/mediawiki/
> >
>
> Yay. I tried doing this a year ago or so, and back at that point the
> analysis just aborted due to too many issues. Guess the limit was raised :)
>
> > That being said, is there any point in fixing all those issues? And if
> > so how do we track them and make sure they are not reintroduced with new
> > patchsets?
> >
>
> Going through the issue list and getting rid of all the warnings is probably
> not a good use of your time. Going through and seeing if it points you to
> something pressing might be worthwhile. What I personally find very
> valuable is that you can get a list of classes sorted by complexity, or by
> coupling, or by quality rating, to get an idea of what areas of the
> codebase could use some love [0].
>
> At Wikimedia Deutschland we use ScrutinizerCI for most of our PHP
> components, and have it run for each commit merged into master. You can
> then see the changes per commit [1], get weekly reports [2] and view the
> overall trend [3].
>
> [0]
> https://scrutinizer-ci.com/g/wmde/WikibaseDataModel/code-structure/master
> [1]
> https://scrutinizer-ci.com/g/wmde/WikibaseDataModel/inspections/07fab814-f6bc-42df-aab4-80f745d7f0d9
> [2] https://scrutinizer-ci.com/g/wmde/WikibaseDataModel/reports/
> [3] https://scrutinizer-ci.com/g/wmde/WikibaseDataModel/statistics/
>
> Cheers
>
> --
> Jeroen De Dauw - http://www.bn2vs.com
> Software craftsmanship advocate
> Evil software architect at Wikimedia Germany
> ~=[,,_,,]:3
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l




-- 
Addshore
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l