Re: [Wikitech-l] Barkeep code review tool
Roan Kattouw wrote:
> Yes, ops essentially uses a post-commit workflow right now, and that
> makes sense for them.

ops also uses pre-commit review for non-ops people :-]

--
Antoine "hashar" Musso

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Barkeep code review tool
On Sat, Jun 30, 2012 at 11:53 PM, Antoine Musso hashar+...@free.fr wrote:
> Roan Kattouw wrote:
>> Yes, ops essentially uses a post-commit workflow right now, and that
>> makes sense for them.
> ops also uses pre-commit review for non-ops people :-]

Yeah, that's right. What I meant to say (and thought I had said in some
form later in that message) was that the puppet repo has post-commit
review for most changes by ops staff, and pre-commit review for
everything else (non-ops staff, volunteers, and certain changes by ops
staff in some cases).

Roan
Re: [Wikitech-l] Speed up tests, make @group Database smarter
Hi Platonides,

On Sat, Jun 30, 2012 at 03:45:14PM +0200, Platonides wrote:
> On 30/06/12 14:24, Christian Aistleitner wrote:
> [ Mocking the database ]
>> [...] One would have to abstract database access above the SQL layer
>> (separate methods for select, insert, ...) [...]
> You still need to implement some complex SQL logic.

One might think so, yes. But as I said, one would mock /above/ the SQL
layer. For typical database operations, SQL would not even get
generated in the first place!

Consider for example code containing

  $db->insert( $param1, $param2, ... );

The mock db's insert function would compare $param1, $param2, ...
against the invocations the test setup injected. If there is no match,
the test fails. If there is a match, the mock returns the corresponding
return value right away. No generating SQL. No call to $db->tableName.
No call to $db->makeList. No call to $db->query. No nothing. \o/

But maybe you hinted at DatabaseBase::query? DatabaseBase::query should
not be used directly, and it hardly is. We can go straight for
parameter comparison there as well. No need to parse the SQL.

Unit testing is about decoupling and testing things in isolation. With
DatabaseBase and the corresponding factories, MediaWiki has a layer
that naturally decouples business logic from direct database access.
Use the decoupling, Luke!

Christian

P.S.: As an example for decoupling and mocking in MW, consider
tests/phpunit/maintenance/backupTextPassTest.php:testPrefetchPlain.
This test is about dumping a wiki's database using prefetch. The idea
behind prefetch is to use an old dump and take texts from this old dump
instead of asking the database for every single text of the new dump.
To test dumping using prefetch /without/ mocking, one would have to set
up an XML file for the old dump. This old dump's XML would get read,
parsed, interpreted, ... upon each single test invocation. Tedious and
time consuming.
Upon each update of the XML format, we'd also have to update the XML
representation of the old dump. Yikes! [1] Besides, it duplicates
efforts, as reading dumps and interpreting them is a separate issue,
already dealt with in isolation in
tests/phpunit/maintenance/backupPrefetchTest.php

So the handling of the old dump reading, ... has been mocked out. All
that's necessary for this is lines 143--153 and line 160 of
tests/phpunit/maintenance/backupTextPassTest.php:testPrefetchPlain.
$prefetchMock is the mock for the prefetch (i.e.: old dump).
$prefetchMap models the expected parameters and return values of the
mocked method. So for example invoking

  $prefetchMock->prefetch( $this->pageId1, $this->revId1_1 )

yields "Prefetch_1Text1".

[1] Yes, we had that situation recently, when the parentid tag got
introduced [2]. The XML dumps of
tests/phpunit/maintenance/backupTextPassTest.php and
tests/phpunit/maintenance/backupPrefetchTest.php were updated. So the
tests assert that both dumping and prefetch work with parentid. But we
did not have to touch the mock, due to this decoupling.

[2] See commit d04b8ceea67660485245beaa4aca1625cf2170aa
https://gerrit.wikimedia.org/r/#/c/10743/

--
quelltextlich e.U.                   \\  Christian Aistleitner
Companies' registry: 360296y in Linz
Gruendbergstrasze 65a                Email: christ...@quelltextlich.at
4040 Linz, Austria                   Phone: +43 732 / 26 95 63
                                     Fax:   +43 732 / 26 95 63
                                     Homepage: http://quelltextlich.at/
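[Editor's note: a minimal sketch of the parameter-map mocking scheme
Christian describes, written in Python rather than MediaWiki's PHP
purely for brevity. The class and variable names (MockDatabase,
insert_map) are invented for illustration; they do not correspond to
actual MediaWiki code.]

```python
class MockDatabase:
    """Mock that sits above the SQL layer: no SQL is ever generated."""

    def __init__(self, insert_map):
        # insert_map: list of (expected_args, return_value) pairs
        # injected by the test setup.
        self.insert_map = insert_map

    def insert(self, *args):
        for expected, ret in self.insert_map:
            if args == expected:
                # Match: return the canned value right away.
                return ret
        # No match against the injected invocations: the test fails.
        raise AssertionError("unexpected insert%r" % (args,))


# Test setup injects the expected invocation and its return value:
db = MockDatabase([
    (("page", {"page_title": "Foo"}), True),
])
print(db.insert("page", {"page_title": "Foo"}))
```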
[Wikitech-l] Handling deletion using the latest revision
Hey,

I am writing some code that needs to be executed on every article
deletion, and which needs the latest revision of the article that was
deleted. Is there any place (hook) where I can put this without doing a
db read for the revision? And if not, is there any place where I can
put this where I actually get the revision id I need, or can obtain it
without writing my own query?

Cheers

--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil.
Re: [Wikitech-l] Barkeep code review tool
> Yeah, that's right. What I meant to say (and thought I had said in
> some form later in that message) was that the puppet repo has
> post-commit review for most changes by ops staff, and pre-commit
> review for everything else (non-ops staff, volunteers, and certain
> changes by ops staff in some cases).

And we'd gladly take better tools for doing post-commit review. Gerrit
handles this very poorly. Just having free-form tags in Gerrit would
likely fix this for our use case, though. Saved searches would be
amazing.

- Ryan
[Wikitech-l] git commit history
One thing I just noticed when looking at the git history via gitk (on
Ubuntu) is that the history looks totally spaghetti and it is hard to
make sense of it. This seems to have happened since the switch to git
and the post-commit review workflow. It might be worth considering this
as well.

git pull --rebase (which I assume is being used) usually helps
eliminate noisy merge commits, but I suspect something else is going on
-- post-review commit might be one reason.

Is this something that is worth fixing and can be fixed? Is there some
gerrit config that lets gerrit rebase before merge, to allow
fast-forwarding and eliminate noisy merges?

Subbu.
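[Editor's note: a throwaway demo of the merge-commit noise discussed
above. A branch merged without fast-forward leaves a two-parent merge
commit in history, which is what clutters gitk; rebasing the branch
first would have kept history linear. All file names and commit
messages here are invented for the demo.]

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q repo && cd repo
git config user.email demo@example.org
git config user.name  Demo
main=$(git symbolic-ref --short HEAD)   # works whether HEAD is master or main
echo a > a.txt && git add a.txt && git commit -qm "base"
git checkout -qb feature
echo b > b.txt && git add b.txt && git commit -qm "feature work"
git checkout -q "$main"
echo c > c.txt && git add c.txt && git commit -qm "mainline work"
# A non-fast-forward merge records an extra two-parent commit:
git merge -q --no-ff -m "merge feature" feature
merge_count=$(git log --merges --oneline | wc -l | tr -d ' ')
echo "merge commits: $merge_count"
```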
Re: [Wikitech-l] Date Formatting
I think I once added a parser function that formats dates according to
a user's preferences. It might be called {{#dateformat}}. You could
possibly steal the code that it uses.

On Tue, Jun 26, 2012 at 2:45 AM, Derric Atzrott
datzr...@alizeepathology.com wrote:
> How do you format dates according to user preference in Mediawiki? I
> presume it has something to do with DateFormatter
> (http://svn.wikimedia.org/doc/classDateFormatter.html), but I'm not
> 100% sure how to make use of this class.
>
> The extension I am working on displays, among other things, a last
> update time for its data. I want this time to display consistent with
> user preferences. Right now I am running (time() - (60*60*4)) through
> date(), but I feel this is a very inelegant solution.
>
> Thank you,
> Derric Atzrott
> Computer Specialist
> Alizee Pathology

--
Andrew Garrett
Wikimedia Foundation
agarr...@wikimedia.org
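[Editor's note: MediaWiki core does ship a parser function along these
lines, {{#dateformat:}} (also aliased {{#formatdate:}}), which renders
a date according to the viewing user's date preference. A hedged
wikitext sketch of a typical invocation follows; the second argument is
the fallback format for readers without a preference, and the exact
accepted values should be checked against your MediaWiki version's
documentation.]

```wikitext
{{#dateformat:2012-06-25|dmy}}
```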
Re: [Wikitech-l] Extension Skeleton Program
Ooh, I like this.

On Thu, Jun 28, 2012 at 7:37 AM, Ori Livneh o...@wikimedia.org wrote:
> On Monday, June 25, 2012 at 9:06 AM, Derric Atzrott wrote:
>> Would anyone be interested in a program that generates skeletons for
>> new extensions? I've noticed that when I make extensions I generally
>> go through the exact same process (with minor variations) each time.
> [ snip ]
>
> I've been working on something similar, so I decided to throw it up
> on GitHub, in case it's useful. It's a skeleton extension for
> JavaScript-centric MediaWiki extensions.
>
> https://github.com/atdt/skeljs
>
> Out of the box, you get:
> * QUnit test scaffold
> * ResourceLoader integration
> * MediaWiki-compatible JSHint config
> * Command-line build tool (grunt) for linting and running QUnit
>   tests using phantomjs
>
> I've only recently started developing on MediaWiki so this definitely
> needs to be reviewed. (*cough* Krinkle / Roan / Trevor?)
>
> Derric (& others): feel free to use this in any way you want. I'll
> migrate this to Gerrit when I get the chance.

--
Andrew Garrett
Wikimedia Foundation
agarr...@wikimedia.org
Re: [Wikitech-l] [Wmfall] Announcement: Matt Walker joins Wikimedia as Fundraising Engineer
I'm looking forward to meeting you when I get back from Wikimania!

On Sat, Jun 30, 2012 at 5:45 AM, Terry Chay tc...@wikimedia.org wrote:
> Hello everyone,
>
> It’s with great pleasure that I’m announcing that Matt Walker has
> joined the Wikimedia Foundation as a Fundraising Engineer.
>
> Before joining us, Matt was a software engineer at Rockwell Collins
> Control Technologies developing “a DO-178B level A qualified
> Real-Time Operating System in C and PowerPC assembly.” (Ask him about
> it.) He got his dual B.S. in EE and CS from the University of Tulsa
> with a minor in Mathematics. On the side, Matt enjoys tech theatre,
> glass blowing, SCUBA diving, hiking, and bicycling, so I’m not sure
> how he’ll fit in with his move to the Bay Area. During the reference
> check, the department chair of Electrical Engineering at his
> university regaled me with stories about a time-lapse video project
> he self-started, and about having to sign a permission slip for a
> high school prom.
>
> His first official day will be on July 9th, assuming he survives his
> cross-country trip through Wyoming and the Dakotas. (I’ve seen the
> trailers for Longmire
> http://en.wikipedia.org/wiki/Longmire_(TV_series) so it’s not a given
> — Wyoming sounds like a very dangerous place.) He will be working
> with the FR-Tech team trying to establish the lower bounds for the
> Ballmer Peak http://xkcd.com/323/ during their late-night programming
> sessions; Katie and Peter will be establishing the Long Tail. He’s
> also great friends with Peter Gehres, but we won’t hold that against
> him. :-)
>
> Please join me in welcoming Matt to the Wikimedia Foundation.
>
> Take care,
> Terry
>
> terry chay 최태리
> Director of Features Engineering
> Wikimedia Foundation
> “Imagine a world in which every single human being can freely share
> in the sum of all knowledge. That's our commitment.”
> p: +1 (415) 839-6885 x6832
> m: +1 (408) 480-8902
> e: tc...@wikimedia.org
> i: http://terrychay.com/
> w: http://meta.wikimedia.org/wiki/User:Tychay
> aim: terrychay

--
Andrew Garrett
Wikimedia Foundation
agarr...@wikimedia.org
[Wikitech-l] Bugzilla Weekly Report
MediaWiki Bugzilla Report for June 25, 2012 - July 02, 2012

Status changes this week:
  Bugs NEW:      266
  Bugs ASSIGNED:  81
  Bugs REOPENED:  33
  Bugs RESOLVED: 135

Total bugs still open: 8105

Resolutions for the week:
  Bugs marked FIXED:      87
  Bugs marked REMIND:      0
  Bugs marked INVALID:     6
  Bugs marked DUPLICATE:  28
  Bugs marked WONTFIX:     5
  Bugs marked WORKSFORME: 12
  Bugs marked LATER:       3
  Bugs marked MOVED:       0

Specific Product/Component Resolutions: User Metrics

New Bugs Per Component:
  Upload             7
  General/Unknown    7
  PageTriage         7
  WikidataRepo       7
  ArticleFeedbackv5  7

New Bugs Per Product:
  MediaWiki            19
  Wikimedia            18
  MediaWiki extensions 45
  Wikimedia Mobile      5
  Security              2

Top 5 Bug Resolvers:
  sam [AT] reedyboy.net            14
  krinklemail [AT] gmail.com       10
  daniel.kinzler [AT] wikimedia.de  9
  hashar [AT] free.fr               8
  yuvipanda [AT] gmail.com          6
Re: [Wikitech-l] git commit history
On Sun, Jul 1, 2012 at 5:30 PM, Subramanya Sastry ssas...@wikimedia.org wrote:
> One thing I just noticed when looking at the git history via gitk (on
> Ubuntu) is that the history looks totally spaghetti and it is hard to
> make sense of the history. [...] Is this something that is worth
> fixing and can be fixed? Is there some gerrit config that lets gerrit
> rebase before merge to let fast-forwarding and eliminate noisy
> merges?

Yes, this can be configured in Gerrit on a per-repo basis. I seem to
recall we had a reason for not enabling this, but I don't remember that
discussion very well, or whether it took place at all.

Roan
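[Editor's note: in Gerrit the submit strategy is indeed a per-project
setting, chosen in the project's admin UI or, in newer versions, in the
project.config file on the refs/meta/config branch. A sketch of what a
rebase-before-submit setup might look like follows; the exact section
and key names vary across Gerrit versions, so treat this as an
assumption to verify against your Gerrit's documentation.]

```ini
[submit]
    action = rebase if necessary
```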