[Wikitech-l] Please claim your Wikimedia commits in Ohloh
Hi, I have been drafting the first community metrics monthly report at http://www.mediawiki.org/wiki/User:Qgil/MediaWiki_Community_Metrics/Metrics_Report_2012-11-01 Feedback welcome!

The plan was to publish it on November 1st, but as with any good first release there have been a couple of unexpected last-minute glitches. :)

1. Ohloh still hasn't scanned all repos for all of October (we have hundreds). It's better to have all the data retrieved.

2. Sumana instantly found four cases of contributors detected as 'new' who seem to be not new at all: thedj, Ori, Tychay and ankur. This is because they started contributing commits from a new email address, and Ohloh considers them new contributors. There may be more cases, so why not give ourselves a chance to fix the obvious ones.

FIXING YOUR DATA IN OHLOH TAKES ONLY 5 MINUTES

This is of course voluntary:

1. Register or log in at http://ohloh.net
2. Go to https://www.ohloh.net/people and search for your name, nicks and email addresses.
3. Claim all the contributions that belong to you.
4. Also useful: if you are making these contributions as an affiliate of an organization (e.g. Wikimedia), make this explicit when claiming your work.

That's it. If you want to double-check, you can go to https://www.ohloh.net/orgs/wikimedia , find the project(s) you are contributing to, and your name or nick should appear in the list of contributors.

Thank you for helping us provide more accurate data! -- Quim ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] unsubscribe
Re: [Wikitech-l] unsubscribe
Hi, you can unsubscribe by going to the bottom of https://lists.wikimedia.org/mailman/listinfo/wikitech-l and logging in. andre -- Andre Klapper | Wikimedia Bugwrangler http://blogs.gnome.org/aklapper/
Re: [Wikitech-l] unsubscribe
Actually, you don't even need to log in; if you don't, it will send you a confirmation email with a link to click. Additionally, you can send an email to wikitech-l-requ...@lists.wikimedia.org (rather than wikitech-l@lists.wikimedia.org) with the subject "unsubscribe".

On Thu, Nov 1, 2012 at 8:59 AM, Andre Klapper aklap...@wikimedia.org wrote: Hi, you can unsubscribe by going to the bottom of https://lists.wikimedia.org/mailman/listinfo/wikitech-l and logging in. andre -- Andre Klapper | Wikimedia Bugwrangler http://blogs.gnome.org/aklapper/
Re: [Wikitech-l] Wikimedia in Ohloh (was Re: Seeking feedback on...)
Federico manually added no fewer than 326 SVN repositories from legacy MediaWiki extensions: https://www.ohloh.net/p/mediawiki-extensions-wmf-hosted-svn Thanks a bunch! Once the new data is added to the rest of the Ohloh stats, we will have more than 5 million LOC under the Wikimedia umbrella!

I didn't get any feedback about the two questions below. I'll wait a few more hours and then proceed accordingly, unless someone has good reasons to object.

On 10/25/2012 11:18 AM, Quim Gil wrote: Let's decide on a couple of important points ref https://www.ohloh.net/orgs/wikimedia

1. SCOPE

Is the scope based on WMF-driven projects? Plus MediaWiki extensions? Anything Wikimedia at *.mediawiki.org plus GitHub exceptions? Should we also add selected friends out there? Or just anything adding to Wikimedia and MediaWiki?

Default proposal: any commit and any OSS repository related to MediaWiki and the Wikimedia projects helps the overall WM/MW community, and therefore I would include everything. We have ways to organize repos within projects to identify what comes from where.

2. PROJECTS vs UMBRELLAS

What to do when one repository is listed in different Ohloh projects? For instance, Wiktionary Mobile has its own project and also appears under the Wikimedia Mobile umbrella. Some extensions have projects of their own, and they also appear under their related MediaWiki Extensions project umbrella.

Default proposal: this type of duplication is ok. If someone wants to create their own Ohloh project for a specific repo, this already shows a certain interest in watching and showing statistics specific to that project. The project umbrellas are also useful on their own, so there is no point in removing those repos from them. -- Quim
Re: [Wikitech-l] Marking /trunk/extensions as read-only
For what it's worth, thanks to Federico we now have nice stats showing the activity in the SVN repository:

https://www.ohloh.net/p/mediawiki-extensions-wmf-hosted-svn
https://www.ohloh.net/p/mediawiki-extensions-wmf-hosted-svn/contributors?sort=latest_commit

The contribution trend is clearly pointing to zero. You can read this in two ways:

- Nobody is contributing now, so we can change to read-only.
- Nobody is contributing anyway, so we can leave it as it is. :)

-- Quim
[Wikitech-l] Wikimedia engineering October 2012 report
Hi, The report covering Wikimedia engineering activities in October 2012 is now available.

Wiki version: https://www.mediawiki.org/wiki/Wikimedia_engineering_report/2012/October
Blog version: https://blog.wikimedia.org/2012/11/01/engineering-october-2012-report/

As of last month, we're also proposing a shorter and simpler version of this report for less technically savvy readers: https://www.mediawiki.org/wiki/Wikimedia_engineering_report/2012/October/summary

I'm going to attempt to include the full HTML text of the report below, as requested last month. I usually avoid sending HTML e-mails, especially on mailing lists, so I don't know how this is going to turn out.

== Engineering metrics in October:

- 110 unique committers contributed patchsets of code to MediaWiki.
- The total number of unresolved commits (https://gerrit.wikimedia.org/r/#q,status:open+project:%255Emediawiki.*,n,z) remained stable around 440.
- About 35 shell requests (https://www.mediawiki.org/wiki/Shell_requests) were processed.
- About 57 developers got access to Git and Wikimedia Labs (https://www.mediawiki.org/wiki/Developer_access).
- Wikimedia Labs (https://www.mediawiki.org/wiki/Wikimedia_Labs) now hosts 137 projects and 694 users; to date, 1268 instances have been created.

Major news in October includes:

- a redesign of the mobile site (https://blog.wikimedia.org/2012/10/24/wikipedia-mobile-gets-a-new-look/) emphasizing readability and navigation;
- the launch of a Wikipedia app (https://blog.wikimedia.org/2012/10/26/wikipedia-app-for-windows-8-and-windows-rt-tablets/) for Windows RT and Windows 8 tablets;
- a test of a redesigned account creation page (https://blog.wikimedia.org/2012/10/05/testing-new-signup-page-for-wikipedia/);
* Personnel

Work with us: https://wikimediafoundation.org/wiki/Work_with_us

Are you looking to work for Wikimedia? We have a lot of hiring coming up, and we really love talking to active community members about these roles.

- Senior Software Engineer - Experimental Features: http://hire.jobvite.com/Jobvite/Job.aspx?j=oHDiWfwi
- Software Engineer - Experimental Features: http://hire.jobvite.com/Jobvite/Job.aspx?j=opDhWfwZ
- Software Engineer - Visual Editor: http://hire.jobvite.com/Jobvite/Job.aspx?j=otYPWfwW
- Software Engineer (Mobile): http://hire.jobvite.com/Jobvite/Job.aspx?j=o1H6Vfwt
- Software Engineer: http://hire.jobvite.com/Jobvite/Job.aspx?j=oX2hWfwW
- Software Developer General (Mobile): http://hire.jobvite.com/Jobvite/Job.aspx?j=o4cKWfwG
- Git and Gerrit software development (Contract): http://hire.jobvite.com/Jobvite/Job.aspx?j=o4gIWfwI
- Senior Software Engineer: http://hire.jobvite.com/Jobvite/Job.aspx?j=ouLnWfwi
- Release Manager: http://hire.jobvite.com/Jobvite/Job.aspx?j=oZrQWfwW
- Visual Designer: http://hire.jobvite.com/Jobvite/Job.aspx?j=oomJWfw9
- Product Manager (Mobile): http://hire.jobvite.com/Jobvite/Job.aspx?j=oGWJWfw1
- RFP-Lucene Search Operations Engineer: http://hire.jobvite.com/Jobvite/Job.aspx?j=oC5fWfwC
- Operations Engineer: http://hire.jobvite.com/Jobvite/Job.aspx?j=ocLCWfwf
- IT Technician (Part-time): http://hire.jobvite.com/Jobvite/Job.aspx?j=o1TKWfwk
- Operations Engineer/Database Administrator: http://hire.jobvite.com/Jobvite/Job.aspx?j=obMOWfwr

Announcements

- Željko Filipin joined the Platform engineering team as QA Engineer (announcement: http://lists.wikimedia.org/pipermail/wikitech-l/2012-October/063608.html).
- Andre Klapper joined the Platform engineering team as Bug Wrangler (announcement: http://lists.wikimedia.org/pipermail/wikitech-l/2012-October/063616.html).
- Michelle Grover joined the Mobile engineering team as a QA contractor (announcement: http://lists.wikimedia.org/pipermail/wikitech-l/2012-October/063730.html).
- Luke Welling joined the Features engineering team as Senior Features Engineer (announcement: http://lists.wikimedia.org/pipermail/wikitech-l/2012-October/064025.html).
- Brad Jorsch joined the Platform engineering team as Software Engineer, working in the MediaWiki Core group (announcement: http://lists.wikimedia.org/pipermail/wikitech-l/2012-October/064120.html).
- Steven Bernardin joined the Operations team as Data Center Technician, working in our Tampa data center.

Technical Operations

*Site Infrastructure*

Mark Bergsma has successfully implemented the range-seeking feature in Varnish, fixed several video streaming bugs, and finally redeployed Varnish at Eqiad, replacing the upload Squids in Tampa. Mark is now working on replacing the upload Squids at Esams. He has provisioned 8 servers, and based on early testing, fewer may actually be needed. We are currently using 23 older servers for the upload Squids at Esams.
Re: [Wikitech-l] Wikimedia engineering October 2012 report
On 11/01/2012 01:01 PM, Guillaume Paumier wrote: Hi, The report covering Wikimedia engineering activities in October 2012 is now available. Thanks as always for aggregating and editing this report, Guillaume! I'm going to attempt to include the full HTML text of the report below, as requested last month. I usually avoid sending HTML e-mails, especially on mailing lists, so I don't know how this is going to turn out. Worked fine for me. -- Sumana Harihareswara Engineering Community Manager Wikimedia Foundation
[Wikitech-l] Wikimedia Foundation hiring for several positions
I know this might be off-topic, but people on this list seem likely to be interested or to know people who are. As pulled from the most recent engineering report:

Work with us: https://wikimediafoundation.org/wiki/Work_with_us

Are you looking to work for Wikimedia? We have a lot of hiring coming up, and we really love talking to active community members about these roles.

- Senior Software Engineer - Experimental Features: http://hire.jobvite.com/Jobvite/Job.aspx?j=oHDiWfwi
- Software Engineer - Experimental Features: http://hire.jobvite.com/Jobvite/Job.aspx?j=opDhWfwZ
- Software Engineer - Visual Editor: http://hire.jobvite.com/Jobvite/Job.aspx?j=otYPWfwW
- Software Engineer (Mobile): http://hire.jobvite.com/Jobvite/Job.aspx?j=o1H6Vfwt
- Software Engineer: http://hire.jobvite.com/Jobvite/Job.aspx?j=oX2hWfwW
- Software Developer General (Mobile): http://hire.jobvite.com/Jobvite/Job.aspx?j=o4cKWfwG
- Git and Gerrit software development (Contract): http://hire.jobvite.com/Jobvite/Job.aspx?j=o4gIWfwI
- Senior Software Engineer: http://hire.jobvite.com/Jobvite/Job.aspx?j=ouLnWfwi
- Release Manager: http://hire.jobvite.com/Jobvite/Job.aspx?j=oZrQWfwW
- Visual Designer: http://hire.jobvite.com/Jobvite/Job.aspx?j=oomJWfw9
- Product Manager (Mobile): http://hire.jobvite.com/Jobvite/Job.aspx?j=oGWJWfw1
- RFP-Lucene Search Operations Engineer: http://hire.jobvite.com/Jobvite/Job.aspx?j=oC5fWfwC
- Operations Engineer: http://hire.jobvite.com/Jobvite/Job.aspx?j=ocLCWfwf
- IT Technician (Part-time): http://hire.jobvite.com/Jobvite/Job.aspx?j=o1TKWfwk
- Operations Engineer/Database Administrator: http://hire.jobvite.com/Jobvite/Job.aspx?j=obMOWfwr

Our Job Openings page: http://wikimediafoundation.org/wiki/Job_openings also links to our non-discrimination policy, our pluralism, internationalism, and diversity policy, and our compensation practices. And it mentions a few systems administration jobs open in another WMF department.

Those positions are based in our San Francisco headquarters, but in many cases we are open to the possibility of people working remotely, unless otherwise noted in the job posting itself. -- Sumana Harihareswara Engineering Community Manager Wikimedia Foundation
[Wikitech-l] October 2012 Community Metrics Report
EXTRA! EXTRA! :) Read the new monthly Community Metrics Report: http://www.mediawiki.org/wiki/Community_Metrics/October_2012 It's fresh, and will be improved next month thanks to your feedback and better-identified code commits. -- Quim
[Wikitech-l] Gerrit - Github replication updates
We've finished replicating all core and extensions to Github now, as I announced yesterday. Per discussion, we're now replicating everything to the Wikimedia account to avoid confusion and duplication. The mediawiki organization was closed to avoid this confusion. All mediawiki/* repos are now being replicated, and have the same name as in Gerrit (with the caveat that slashes / are changed to dashes - due to Github naming conventions). Please let me know if you have any problems with the replicated repositories. https://github.com/organizations/wikimedia Next step: finding a way to get pull requests back into Gerrit :) -Chad
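The slash-to-dash mapping Chad describes can be sketched as a simple character substitution (the function name below is made up for illustration; only the replacement rule comes from the announcement):

```php
<?php
// Hypothetical helper: derive a GitHub mirror repository name from a
// Gerrit project path by replacing slashes with dashes, per the
// convention described in the announcement above.
function gerritToGithubName( $gerritProject ) {
    return str_replace( '/', '-', $gerritProject );
}

// Example: mediawiki/extensions/ParserFunctions
// maps to  mediawiki-extensions-ParserFunctions
echo gerritToGithubName( 'mediawiki/extensions/ParserFunctions' ), "\n";
```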
Re: [Wikitech-l] Gerrit - Github replication updates
On Thu, 01 Nov 2012 12:42:24 -0700, Chad innocentkil...@gmail.com wrote: We've finished replicating all core and extensions to Github now, as I announced yesterday. Per discussion, we're now replicating everything to the Wikimedia account to avoid confusion and duplication. The mediawiki organization was closed to avoid this confusion. All mediawiki/* repos are now being replicated, and have the same name as in Gerrit (with the caveat that slashes / are changed to dashes - due to Github naming conventions). ;) you mean naming limitations. Please let me know if you have any problems with the replicated repositories. https://github.com/organizations/wikimedia Next step: finding a way to get pull requests back into Gerrit :) -Chad -- ~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]
Re: [Wikitech-l] Gerrit - Github replication updates
On Thu, Nov 1, 2012 at 3:50 PM, Daniel Friesen dan...@nadir-seen-fire.com wrote: On Thu, 01 Nov 2012 12:42:24 -0700, Chad innocentkil...@gmail.com wrote: All mediawiki/* repos are now being replicated, and have the same name as in Gerrit (with the caveat that slashes / are changed to dashes - due to Github naming conventions). ;) you mean naming limitations. Yeah...never did really understand why Github did that. -Chad
[Wikitech-l] Unit-testing a tag extension (parser blows up)?
I'm trying to test a parser tag extension with phpunit and have run into a strange problem. Whenever my extension calls $parser->recursiveTagParse(), the unit test blows up in Parser.php, complaining that $parser->mOptions is a non-object.

The tag callback looks pretty normal:

static function render( $input, $argv, $parser, $frame ) {
    // ...
    $parser->recursiveTagParse( 'something' );
    // ...
}

and I have unit tests that call render() directly:

public function testMyTag() {
    global $wgParser;
    $this->assertEqual( MyTag::render( 'some text', array(), $wgParser, false ) );
}

(I don't like using $wgParser here, and maybe that's the root of my problems?)

The tag works perfectly in the browser. Just not when unit-testing on the command line. The blowup occurs in Parser.php::replaceVariables, when it calls $this->mOptions->getMaxIncludeSize().

Any advice appreciated! Thanks, DanB
Re: [Wikitech-l] Unit-testing a tag extension (parser blows up)?
On Thu, Nov 1, 2012 at 6:11 PM, Daniel Barrett d...@vistaprint.com wrote: I'm trying to test a parser tag extension with phpunit and have run into a strange problem. Whenever my extension calls $parser->recursiveTagParse(), the unit test blows up in Parser.php, complaining that $parser->mOptions is a non-object. [...] The blowup occurs in Parser.php::replaceVariables, when it calls $this->mOptions->getMaxIncludeSize(). Any advice appreciated! Thanks, DanB

You're calling recursiveTagParse() while the parser is not in a parsing state. The easiest way to fix this is to have some text like "yadda yadda <mytag>foo</mytag>" and pass it to $wgParser->parse(), letting the parser be the thing that calls your callback. (If you're going to take that approach, you can even use the older-style parser test text files with $wgParserTestFiles if you want, as it's basically doing the same thing.)

You can also work around this (I believe, anyhow) by doing something with Parser->startExternalParse() first and then doing what you are doing, but I've never really used that method and am not sure how it works.

Hope that helps, -bawolff
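A minimal sketch of bawolff's first suggestion might look like the following. This assumes a working MediaWiki PHPUnit environment of that era; the class name, the <mytag> tag, and the expected-output string are all hypothetical placeholders, not part of the original thread:

```php
<?php
// Hypothetical test sketch: instead of calling the tag callback directly,
// feed wikitext containing the tag to the parser, so the parser itself
// invokes the callback while it is in a parsing state (and $parser->mOptions
// is therefore initialized).
class MyTagTest extends MediaWikiTestCase {
	public function testMyTagViaFullParse() {
		global $wgParser;
		$title = Title::newFromText( 'TestPage' );
		$options = new ParserOptions();
		// parse() sets up the parser state, so recursiveTagParse()
		// inside the tag callback no longer blows up.
		$output = $wgParser->parse( 'yadda yadda <mytag>foo</mytag>', $title, $options );
		// Replace with whatever your tag is expected to render.
		$this->assertContains( 'expected rendering', $output->getText() );
	}
}
```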