Re: [Wikitech-l] [Wikimedia-l] Quarterly reviews of high priority WMF initiatives
Minutes and slides from last Thursday's quarterly review of the Foundation's Editing (formerly VisualEditor) team are now available at https://meta.wikimedia.org/wiki/Metrics_and_activities_meetings/Quarterly_reviews/Editing/June_2014 . (A separate but related quarterly review meeting of the Parsoid team took place on Friday; those minutes should be up tomorrow.)

On Wed, Dec 19, 2012 at 6:49 PM, Erik Moeller e...@wikimedia.org wrote:

Hi folks,

To increase accountability and create more opportunities for course corrections and resourcing adjustments as necessary, Sue has asked me and Howie Fung to set up a quarterly project evaluation process, starting with our highest-priority initiatives. These are, according to Sue's narrowing-focus recommendations, which were approved by the Board [1]:

- Visual Editor
- Mobile (mobile contributions + Wikipedia Zero)
- Editor Engagement (also known as the E2 and E3 teams)
- Funds Dissemination Committee and expanded grant-making capacity

I'm proposing the following initial schedule:

January:
- Editor Engagement Experiments

February:
- Visual Editor
- Mobile (Contribs + Zero)

March:
- Editor Engagement Features (Echo, Flow projects)
- Funds Dissemination Committee

We'll try doing this on the same day as, or adjacent to, the monthly metrics meetings [2], since the team(s) will give a presentation on their recent progress, which will help set some context that would otherwise need to be covered in the quarterly review itself. This will also create open opportunities for feedback and questions.

My goal is to do this in a manner where, even though the quarterly review meetings themselves are internal, the outcomes are captured as meeting minutes and shared publicly, which is why I'm starting this discussion on a public list as well.
I've created a wiki page here which we can use to discuss the concept further: https://meta.wikimedia.org/wiki/Metrics_and_activities_meetings/Quarterly_reviews

The internal review will, at minimum, include:
- Sue Gardner
- myself
- Howie Fung
- Team members and relevant director(s)
- Designated minute-taker

So, for example, for Visual Editor the review team would be the Visual Editor / Parsoid teams, Sue, me, Howie, Terry, and a minute-taker.

I imagine the structure of the review roughly as follows, with a duration of about 2 1/2 hours divided into 25-30 minute blocks:

- Brief team intro and recap of the team's activities through the quarter, compared with goals
- Drill into goals and targets: did we achieve what we said we would?
- Review of challenges, blockers, and successes
- Discussion of proposed changes (e.g. resourcing, targets) and other action items
- Buffer time, debriefing

Once again, the primary purpose of these reviews is to create improved structures for internal accountability, escalation points in cases where serious changes are necessary, and transparency to the world.

In addition to these priority initiatives, my recommendation would be to conduct quarterly reviews for any activity that requires more than a set amount of resources (people/dollars). These additional reviews may, however, be conducted in a more lightweight manner and internally to the departments. We're slowly getting into that habit in engineering. As we pilot this process, the format of the high-priority reviews can help inform and support reviews across the organization.

Feedback and questions are appreciated.
All best,
Erik

[1] https://wikimediafoundation.org/wiki/Vote:Narrowing_Focus
[2] https://meta.wikimedia.org/wiki/Metrics_and_activities_meetings

--
Erik Möller
VP of Engineering and Product Development, Wikimedia Foundation
Support Free Knowledge: https://wikimediafoundation.org/wiki/Donate

___
Wikimedia-l mailing list
wikimedi...@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l

--
Tilman Bayer
Senior Operations Analyst (Movement Communications)
Wikimedia Foundation
IRC (Freenode): HaeB

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Bugday on older MediaWiki bugs with high priority set on Tue, June 24 2014, 17:00UTC
Reminder: The triage starts in 60 minutes.

andre

On Tue, 2014-06-17 at 16:42 +0200, Andre Klapper wrote:

Hi everybody,

You are invited to join us on the next Bugday: Tuesday, June 24, 2014, 17:00 to 18:30 UTC [1] in #wikimedia-office on Freenode IRC [2].

We will be triaging open Bugzilla tickets under the product MediaWiki which have had high priority set for more than one year. Everyone is welcome to join, and no technical knowledge is needed! It's an easy way to get involved or to give something back.

All information can be found here: https://www.mediawiki.org/wiki/Bug_management/Triage/20140624

For more information on triaging in general and what it means, check out https://www.mediawiki.org/wiki/Bug_management/Triage

See you there?
andre

[1] Timezone converter: http://www.timeanddate.com/worldclock/converter.html
[2] See http://meta.wikimedia.org/wiki/IRC for more info on IRC chat

--
Andre Klapper | Wikimedia Bugwrangler
http://blogs.gnome.org/aklapper/
[Wikitech-l] What namespaces should we use when we use them
Chris Steipp submitted a patch to add namespaces to the OAuth extension [0] in order to fix a compatibility issue with HHVM [1]. This change led to a bit of bikeshedding on the proper namespace to use for the extension. Chris initially chose `MWOAuth` but later amended it to use `MediaWiki\Extensions\OAuth` at my suggestion. There is some discussion in gerrit, but my primary arguments were made in IRC (and in an unlogged channel, no less), so I thought this topic would be worth dragging out in front of everyone for a little debate.

I don't know of anyone who is actively trying to change MediaWiki core to use namespaces, but extensions are using them. This leads me to think that we should think as a group about the right way to use namespaces and how to choose them.

I suggested MediaWiki\Extensions\OAuth because it seems like the most natural naming to me. PSR-4 [2] is very opinionated about namespaces. It requires a top-level (or vendor) namespace. I chose MediaWiki because ... MediaWiki. Beyond that, sub-namespaces are optional, but the file system path from the base directory to the PHP file must match. Assuming that $IP is the base directory led me to suggest MediaWiki\Extensions\OAuth.

The loudest argument I've heard here (and elsewhere) about using namespaces like MediaWiki\Extensions\OAuth is that there are too many characters to type. This, in my opinion, is really a we-fear-change argument. With full PSR-4 namespacing, the longest fully qualified class name would be MediaWiki\Extensions\OAuth\Frontend\SpecialPages\SpecialMWOAuthConsumerRegistration.php [3]. I will give no argument that this is a short string.
I will say, however, that I spent enough of my career working in Java [4] on a codebase with fully qualified class names like org.accessidaho.apps.itd.dlr.common.pLAZma_RestrictionModel [5] to find out two things: you typically type a fully qualified class name once per file at most, and any reasonable editor can be configured to have tab completion and/or macro expansion for commonly typed strings.

[0]: https://gerrit.wikimedia.org/r/#/c/141608/
[1]: https://bugzilla.wikimedia.org/show_bug.cgi?id=66929
[2]: http://www.php-fig.org/psr/psr-4/
[3]: This could be shortened a bit by removing the embedded pseudo-namespacing
[4]: A witch! Burn him!
[5]: Not meant as an example of a good class name

Bryan
--
Bryan Davis
Wikimedia Foundation
bd...@wikimedia.org
[[m:User:BDavis_(WMF)]]
Sr Software Engineer
Boise, ID USA
irc: bd808
v: 415.839.6885 x6855
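[Editor's note: the PSR-4 mapping Bryan describes, from fully qualified class name to file path under a base directory, can be sketched as follows. This is an illustrative sketch in Python, not MediaWiki's actual autoloader; the `$IP`-based paths and the `psr4_path` helper are hypothetical.]

```python
# Sketch of the PSR-4 rule: the portion of a fully qualified class name
# after the vendor prefix maps to a file path by treating each namespace
# separator as a directory separator under the base directory.

def psr4_path(base_dir: str, vendor_prefix: str, fqcn: str) -> str:
    """Map a fully qualified class name to a file path, PSR-4 style."""
    if not fqcn.startswith(vendor_prefix + "\\"):
        raise ValueError("class is outside the vendor namespace")
    relative = fqcn[len(vendor_prefix) + 1:]   # strip "MediaWiki\"
    return base_dir + "/" + relative.replace("\\", "/") + ".php"

# The long class name from the thread, assuming MediaWiki\ maps to $IP:
path = psr4_path("$IP", "MediaWiki",
                 "MediaWiki\\Extensions\\OAuth\\Frontend\\SpecialPages"
                 "\\SpecialMWOAuthConsumerRegistration")
# -> $IP/Extensions/OAuth/Frontend/SpecialPages/SpecialMWOAuthConsumerRegistration.php
```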
Re: [Wikitech-l] What namespaces should we use when we use them
On Wed, Jun 25, 2014 at 12:30 AM, Bryan Davis bd...@wikimedia.org wrote:

> Chris initially chose `MWOAuth` but later amended to use
> `MediaWiki\Extensions\OAuth` at my suggestion. [...] I suggested
> MediaWiki\Extensions\OAuth because it seems like the most natural
> naming to me.

+2. Abbreviations are mostly annoying and not suited for class (or namespace) names. There's always the risk that someone else already picked your two- or three-letter abbreviation, which would eventually cause some issues for everyone involved, whereas there's only one MediaWiki out there.

> The loudest argument I've heard here (and elsewhere) about using
> namespaces like MediaWiki\Extensions\OAuth is that there are too many
> characters to type.

That could then be used as an argument against the supermajority of our well-established coding conventions. Why call a variable $enableEditing when $e (or $ee) requires far less typing? Let's just name variables from $a to $z, and once we run out of valid single-letter vars, let's just use $aa, $bb, and so on, because it requires far less typing! (And produces far more messy and unreadable code. Oops!)

> you typically type a fully qualified class name once per file at most,
> and any reasonable editor can be configured to have tab completion
> and/or macro expansion for commonly typed strings.

Ctrl+C and Ctrl+V have been a thing for quite a while, no?

Regards,
--
Jack Phoenix
MediaWiki developer
[Wikitech-l] Browser tests for core
I just +2'ed a change to add a few basic Selenium tests to core [1]. I think it will benefit us all to have a set of automated tests to quickly make sure MediaWiki is working correctly. From a security perspective, this also takes a step towards more efficient security testing, which I'm also a fan of (if you've tried blindly scanning MediaWiki, you know what I'm talking about...).

I think the QA group is working on vagrant-izing these, but if you have Ruby 1.9.3 and Firefox, then setting up and running these tests on your local dev system is 4-6 commands:

$ cd tests/browser
$ gem update --system
$ gem install bundler
$ bundle install

You can either set your environment variables yourself, or edit environment_variables and run `source environment_variables` to set them. Then it's just:

$ bundle exec cucumber features/

to run the tests. They currently complete in 36 seconds on my laptop.

I'd like to see more tests added and backported to REL1_23 to make sure we have an ongoing suite to check releases against for the next few years that we support that LTS. If anyone is interested in both MediaWiki core and browser tests, I'm sure the QA team would like to get you involved.

Big thanks to hashar, Chris McMahon, and Dan Duvall for indulging me and getting this done. I'll let them jump in with all the details I've missed.

[1] https://gerrit.wikimedia.org/r/#/c/133507/
Re: [Wikitech-l] Browser tests for core
On 24 June 2014 15:46, Chris Steipp cste...@wikimedia.org wrote:

> I just +2'ed a change to add a few basic selenium tests to core [1]. I
> think it will benefit us all to have a set of automated tests to quickly
> make sure mediawiki is working correctly.

Great news; thank you all!

J.
--
James D. Forrester
Product Manager, Editing
Wikimedia Foundation, Inc.
jforres...@wikimedia.org | @jdforrester
Re: [Wikitech-l] Browser tests for core
Nice work guys! One slight issue though. The user Selenium_user (as set in environment_variables) was just indefinitely blocked on en.wiki :(

Ryan Kaldari

On Tue, Jun 24, 2014 at 3:46 PM, Chris Steipp cste...@wikimedia.org wrote:

> I just +2'ed a change to add a few basic selenium tests to core [1]. I
> think it will benefit us all to have a set of automated tests to quickly
> make sure mediawiki is working correctly. [...]
Re: [Wikitech-l] Browser tests for core
On 2014-06-24 15:56:31 -0700, Ryan Kaldari wrote:

> Nice work guys! One slight issue though. The user Selenium_user (as set
> in environment_variables) was just indefinitely blocked on en.wiki :(

That shouldn't affect these tests. These run on your local box/vagrant.

--
| Greg Grossmeier            GPG: B2FA 27B1 F7EB D327 6B8E |
| identi.ca: @greg                A18D 1138 8E47 FAC8 1C7D |
Re: [Wikitech-l] Browser tests for core
On Tue, Jun 24, 2014 at 3:56 PM, Ryan Kaldari rkald...@wikimedia.org wrote:

> Nice work guys! One slight issue though. The user Selenium_user (as set
> in environment_variables) was just indefinitely blocked on en.wiki :(

Yes, and I think we should keep it that way. See my msg. to the mobile-l and qa lists earlier today: http://lists.wikimedia.org/pipermail/mobile-l/2014-June/007435.html

These tests are intended as acceptance tests for a new MediaWiki install and smoke tests for a test env like beta labs. Let's not run them against production.

-Chris

On Tue, Jun 24, 2014 at 3:46 PM, Chris Steipp cste...@wikimedia.org wrote:

> I just +2'ed a change to add a few basic selenium tests to core [1]. I
> think it will benefit us all to have a set of automated tests to quickly
> make sure mediawiki is working correctly. [...]
>
> [1] - https://gerrit.wikimedia.org/r/#/c/133507/
Re: [Wikitech-l] Browser tests for core
Thanks for reviewing that, Chris! Chris McMahon did most of the hard work there, porting everything from qa/browsertests into core. I was glad to help on the Vagrant side of things, though: I have a mediawiki-vagrant commit standing by that will do all the heavy installation lifting that you mentioned. :) I'll double-check that it's backwards compatible and send it up for review shortly.

On Tue, Jun 24, 2014 at 3:46 PM, Chris Steipp cste...@wikimedia.org wrote:

> I just +2'ed a change to add a few basic selenium tests to core [1]. I
> think it will benefit us all to have a set of automated tests to quickly
> make sure mediawiki is working correctly. [...]
>
> [1] - https://gerrit.wikimedia.org/r/#/c/133507/

--
Dan Duvall
Automation Engineer
Wikimedia Foundation
http://wikimediafoundation.org
Re: [Wikitech-l] Browser tests for core
On 24 June 2014 18:56, Ryan Kaldari rkald...@wikimedia.org wrote:

> Nice work guys! One slight issue though. The user Selenium_user (as set
> in environment_variables) was just indefinitely blocked on en.wiki :(

Sorry to be a bit OT, but if you guys are going to test, please don't do it in article space on enwiki, or this is what is going to happen to the accounts. We've had to almost kick WMF staff off enwiki before because they kept testing in live article space; please don't do that. Just be glad that the blocking admin didn't make it a block with IP autoblock for 24 hours, or there would have been a bigger problem.

Testwiki is for testing, and if you must test on enwiki, do it in userspace.

Risker/Anne
Re: [Wikitech-l] Browser tests for core
On Jun 24, 2014 9:05 PM, Risker risker...@gmail.com wrote:

> Sorry to be a bit OT, but if you guys are going to test, please don't do
> it in article space on enwiki, or this is what is going to happen to the
> accounts. [...] if you must test on enwiki, do it in userspace.

That sounds entirely reasonable to me.

--bawolff
Re: [Wikitech-l] Browser tests for core
On 24 June 2014 17:05, Risker risker...@gmail.com wrote:

> Sorry to be a bit OT, but if you guys are going to test, please don't do
> it in article space on enwiki, or this is what is going to happen to the
> accounts. We've had to almost kick WMF staff off enwiki before because
> they kept testing in live article space, please don't do that.

Or you'll accidentally insert vandalism into articles when testing the abuse filter. Because I've *totally never* done that by accident. :-)

If these browser tests don't interact with the site in a way that leaves anything user-facing behind (e.g. making edits or taking actions that create log entries), there's no problem with running them on enwiki. Otherwise, they should be using our test or beta sites.

Dan
--
Dan Garry
Associate Product Manager for Platform and Mobile Apps
Wikimedia Foundation
Re: [Wikitech-l] Browser tests for core
On Jun 24, 2014 6:13 PM, Dan Garry dga...@wikimedia.org wrote:

> If these browsers tests don't interact with the site that leaves anything
> user-facing behind (e.g. making edits or take actions that create log
> entries), there's no problem with running them on enwiki. Otherwise, they
> should be using our test or beta sites.

They do leave some artifacts (documented in the change, iirc). So yeah, they should only be used against your dev environment or beta.
Re: [Wikitech-l] What namespaces should we use when we use them
On 25/06/14 07:30, Bryan Davis wrote:

> I suggested MediaWiki\Extensions\OAuth because it seems like the most
> natural naming to me. PSR-4 [2] is very opinionated about namespaces. It
> requires a top-level (or vendor) namespace. I chose MediaWiki because ...
> MediaWiki. Beyond that sub-namespaces are optional, but the file system
> path from the base directory to the php file must match. Assuming that
> $IP is the base directory led me to suggest MediaWiki\Extensions\OAuth.

I think it's fine. In some cases, the vendor will be the extension name, which I think is fine also. For example, Packagist has a wikibase vendor and a data-values vendor, both created by Jeroen/WMDE, and they do already use Wikibase and DataValues respectively as their root namespaces, in accordance with their Packagist vendor names. Along the same lines, one can also imagine a phpBB integration plugin for MediaWiki being phpBB\MediaWiki rather than MediaWiki\Extensions\phpBB.

-- Tim Starling
[Wikitech-l] Including NS_MAIN in $wgNamespacesWithSubpages by default
Hi.

Re: https://www.mediawiki.org/wiki/Manual:$wgNamespacesWithSubpages

Is there a reason NS_MAIN is not included in $wgNamespacesWithSubpages by default? My understanding is that historically NS_MAIN wasn't included in this array because the English Wikipedia and others don't use subpages in their main namespace. But that's probably no longer really relevant. The current default array (copied below from DefaultSettings.php) seems like it should include NS_MAIN by default.

My feeling is that wikis that don't want to use subpages in the main namespace simply won't (the fallback behavior here is pretty clean unless you're running an AC/DC wiki), and I think most MediaWiki installations would want NS_MAIN included in the default (e.g., the manual page's first custom configuration entry addresses this issue... it comes up quite a bit).

Thoughts on adding NS_MAIN to $wgNamespacesWithSubpages in MediaWiki core? I briefly searched Bugzilla for a previous discussion of this topic, but couldn't find any relevant tickets.

Current default:
---
$wgNamespacesWithSubpages = array(
    NS_TALK => true,
    NS_USER => true,
    NS_USER_TALK => true,
    NS_PROJECT => true,
    NS_PROJECT_TALK => true,
    NS_FILE_TALK => true,
    NS_MEDIAWIKI => true,
    NS_MEDIAWIKI_TALK => true,
    NS_TEMPLATE_TALK => true,
    NS_HELP => true,
    NS_HELP_TALK => true,
    NS_CATEGORY_TALK => true
);
---

Proposed array addition:
---
NS_MAIN => true,
---

MZMcBride
Re: [Wikitech-l] Including NS_MAIN in $wgNamespacesWithSubpages by default
MZMcBride wrote:

> Current default: [...]
>
> Proposed array addition:
> ---
> NS_MAIN => true,
> ---

The more I look at this, the more I wonder why not instead invert the array:

---
$wgNamespacesWithoutSubpages = array(
    NS_FILE => true,
    NS_CATEGORY => true
);
---

I don't know why NS_TEMPLATE isn't true by default... isn't using Template:Foo/doc for Template:Foo's documentation a common pattern?

Though my real goal is saner default behavior for NS_MAIN, and inverting the array would be a larger change than a simple one-liner. Hmm.

MZMcBride
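[Editor's note: for readers puzzled by the "AC/DC wiki" aside above: when a namespace has subpages enabled, MediaWiki treats a slash in a title as a parent/subpage separator, so the article title "AC/DC" would become subpage "DC" of page "AC". A minimal sketch of that splitting behavior, as illustration only, with a hypothetical `split_subpage` helper rather than MediaWiki's actual Title code:]

```python
# Sketch of MediaWiki-style subpage splitting: in a namespace with
# subpages enabled, "A/B/C" is subpage "C" under parent "A/B". When
# subpages are disabled, the slash is just part of the title.

def split_subpage(title: str, subpages_enabled: bool):
    """Return (parent, leaf); parent is None when title is not a subpage."""
    if subpages_enabled and "/" in title:
        parent, _, leaf = title.rpartition("/")
        return parent, leaf
    return None, title

# With subpages on in the main namespace, the band article "AC/DC"
# would be parsed as subpage "DC" of "AC" -- the caveat in the thread.
print(split_subpage("AC/DC", True))    # ('AC', 'DC')
print(split_subpage("AC/DC", False))   # (None, 'AC/DC')
```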
[Wikitech-l] Summer contractors for Multimedia
Hi folks,

I just wanted to provide a brief (and belated) announcement regarding some added help we've enlisted over the summer on the Multimedia team.

Brian Wolff has been doing little bits of contracting work over the course of his school year (as well as a lot of volunteer development), and this summer will be doing a lot more contracting work. He plans to spend his time taking out various bugs in all multimedia components. He's been at it for a few weeks now, and has already made a lot of headway.

Neil Kandalgaonkar will be working with us this summer on unit tests for UploadWizard. Neil originally wrote and maintained UploadWizard as an employee of the Wikimedia Foundation between 2009 and 2012. We're delighted to have him back for a little while to help bootstrap our work on improving UploadWizard.

Welcome guys! \o/

Rob