Re: [Wikitech-l] Investigating building an apps content service using RESTBase and Node.js

2015-02-03 Thread Giuseppe Lavagetto
On Wed, Feb 4, 2015 at 5:42 AM, Tim Starling  wrote:
> On 04/02/15 12:46, Dan Garry wrote:
>> To address these challenges, we are considering performing some or all of
>> these tasks in a service developed by the Mobile Apps Team with help from
>> Services. This service will hit the APIs we currently hit on the client,
>> aggregate the content we need on the server side, perform transforms we're
>> currently doing on the client on the server instead, and serve the full
>> response to the user via RESTBase. In addition to providing a public API
>> end point, RESTBase would help with common tasks like monitoring, caching
>> and authorisation.
>
> I don't really understand why you want it to be integrated with
> RESTBase. As far as I can tell (it is hard to pin these things down),
> RESTBase is a revision storage backend and possibly a public API for
> that backend. I thought the idea of SOA was to separate concerns.
> Wouldn't monitoring, caching and authorization be best done as a
> node.js library which RESTBase and other services use?
>

I agree with Tim. Using RESTBase as an integration layer for
everything is SOA done wrong. If we need an authorization system
that is separate from our APIs, we need to build it separately,
not add levels of indirection.

Doing four things in one single service is basically rebuilding the
MediaWiki monolith, only in a different language :)

What you need, IMO, is a thin proxy layer in front of all the separate
APIs you have to call, including RESTBase for caching/revision
storage. It may be built into the app or, if it is consumed by
multiple apps, built as a thin proxy service itself.
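
A minimal sketch of what such a thin aggregation proxy could look like in
Node.js (assuming Node 18+ for the global fetch; the /page/ route and the
particular upstream calls are illustrative, not an actual Wikimedia
endpoint layout):

    // Minimal sketch of a thin aggregation proxy (assumes Node.js >= 18,
    // which provides a global fetch). Routes and upstream calls are
    // illustrative only.
    const http = require('http');

    const API = 'https://en.wikipedia.org/w/api.php';

    async function fetchJson(params) {
      const url = API + '?' + new URLSearchParams({ format: 'json', ...params });
      const res = await fetch(url);
      if (!res.ok) throw new Error('upstream returned ' + res.status);
      return res.json();
    }

    http.createServer(async (req, res) => {
      const m = req.url.match(/^\/page\/(.+)$/);
      if (!m) { res.writeHead(404); return res.end(); }
      const title = decodeURIComponent(m[1]);
      try {
        // One request from the app fans out to several upstream APIs.
        const [parsed, images] = await Promise.all([
          fetchJson({ action: 'parse', page: title, prop: 'text' }),
          fetchJson({ action: 'query', titles: title, prop: 'pageimages' }),
        ]);
        res.writeHead(200, { 'Content-Type': 'application/json' });
        res.end(JSON.stringify({ title, parsed, images }));
      } catch (e) {
        res.writeHead(502);
        res.end(String(e));
      }
    }).listen(8080);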

(I also don't get what "monitoring" means here, but someone could
probably explain it to me)

Cheers,

Giuseppe


Re: [Wikitech-l] Video Uploads to Commons

2015-02-03 Thread Manuel Schneider
Hi,

there is the tool by Holger which does that using WebM - but it is
hosted on WMF Labs. It uses OAuth for unified login and moving the files
to Commons.

Then, a more elaborate tool for
* storing raw material at the Internet Archive
* generating patent-free WebM proxy clips for editing
* rendering high-quality videos
* moving these rendered videos to Commons directly

is the Video Editing Server, developed by some Wikipedians and an MLT
developer, hosted by the Internet Archive:

https://wikimedia.meltvideo.com/

It also uses OAuth for login and moving files to Commons.

The workflow with this tool:

# upload all your raw files to the server for
## long-term storage
## making them available to other editors
## letting the server use them in the rendering process

# the server transcodes all files into WebM "proxy clips"

# editors download the WebM proxy clips
## do the editing on their own computers
## create an MLT project file (e.g. using Kdenlive or another MLT-based
video editor)

# upload the project file
## the server will replace the proxy clips with the raw material
## the server will render the video project
## the server will move the generated file to Commons
It comes with a search engine, metadata forms... it's still pretty new
(development started in December '14) but can be used.
We plan to add some more features, like tagging using Wikidata QIDs
(hence allowing multilingual / localised tagging and searching), adding
more project file formats and renderers, making old project file
revisions available for download, giving it a nice vector-based theme,
and giving it a better domain name and SSL certificate...

Play with it and have fun!

For source code or any issues refer to GitHub:
https://github.com/ddennedy/wikimedia-video-editing-server

also there see the wiki for the specs and a deployment guide:
https://github.com/ddennedy/wikimedia-video-editing-server/wiki


/Manuel
-- 
Wikimedia CH - Verein zur Förderung Freien Wissens
Lausanne, +41 (21) 34066-22 - www.wikimedia.ch


Re: [Wikitech-l] Investigating building an apps content service using RESTBase and Node.js

2015-02-03 Thread Erik Moeller
On Tue, Feb 3, 2015 at 5:46 PM, Dan Garry  wrote:
> To address these challenges, we are considering performing some or all of
> these tasks in a service developed by the Mobile Apps Team with help from
> Services. This service will hit the APIs we currently hit on the client,
> aggregate the content we need on the server side, perform transforms we're
> currently doing on the client on the server instead, and serve the full
> response to the user via RESTBase. In addition to providing a public API
> end point, RESTBase would help with common tasks like monitoring, caching
> and authorisation.

Using https://phabricator.wikimedia.org/T87824 as a reference point
for what you're talking about -

I think you will generally find agreement that moving client-side
transformations that currently live only in the app to server-side code
that enables access by multiple consumers, as well as caching, is a good
idea. If there are reasons not to do this, now would be a good time to speak up.

If not, then I think one thing to keep in mind is how to organize the
transformation code so that it doesn't just become a server-side
hodgepodge still only useful to one consumer, to avoid some of the
pitfalls Brian mentions. Say you want to reformat infoboxes on the
mobile web, but not do all the other stuff the mobile app does. Can you
get just that specific transformation? Are some transformations
dependent on others? Or say we want to make a change only for the
output that gets fed into the PDF generator, but not for any other
outputs. Can we do that?

Or, a more pressing concern for the app team itself: what about the alpha,
beta, and stable versions of the apps -- how would those get more or less
experimental versions of the output? Or languages -- are there cases
where we apply a transformation only in one language, but not another?

Do we need a way to register schemas so we can easily get a certain
set of inter-dependent transformations, like "mobile app stable",
"desktop web", etc.? Or are these all just API/service parameters?

Just some early questions as we're thinking this through.
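
On the schema-registration question, one possible shape for such a registry,
as a rough sketch (every transform and profile name here is invented for
illustration; a profile is just an ordered list of transforms, so consumers
can request exactly the set they need):

    // Hypothetical registry of composable transforms. A "profile" is an
    // ordered list of transform names, so "a change only for the PDF output"
    // means editing one list, and dependencies stay explicit.
    const transforms = {
      stripEditSections: (doc) => doc, // e.g. remove edit-section links
      reformatInfoboxes: (doc) => doc, // e.g. restructure infobox markup
      inlineLeadImage: (doc) => doc,   // e.g. hoist the lead image
    };

    const profiles = {
      'mobile-app-stable': ['stripEditSections', 'reformatInfoboxes', 'inlineLeadImage'],
      'mobile-app-beta': ['stripEditSections', 'reformatInfoboxes'],
      'pdf': ['stripEditSections'],
      'desktop-web': [],
    };

    function applyProfile(name, doc) {
      const steps = profiles[name];
      if (!steps) throw new Error('unknown profile: ' + name);
      return steps.reduce((d, t) => transforms[t](d), doc);
    }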

Erik

-- 
Erik Möller
VP of Product & Strategy, Wikimedia Foundation


Re: [Wikitech-l] Investigating building an apps content service using RESTBase and Node.js

2015-02-03 Thread Corey Floyd
tl;dr: Before the Mobile Apps Team embarks on its own path, I really think
we should work together as an organization to figure out a better strategy
for implementing a set of consistent services that address the needs of all
platforms.


So… I've been involved in similar conversations at other organizations.
They generally sound like this:

"These APIs were designed before mobile. Mobile needs mobile-friendly
services to be great. These services currently don't exist and/or aren't
being implemented fast enough for us to build the features we want. Let's
implement our own services!" (OK, that was a bit contrived, but you get the
point.)

The problem I have with this solution is twofold:

1. The owners of this services project should really be the Services team.
2. Building a separate service for mobile does not align the needs of the
mobile team with those of the rest of the MediaWiki API consumers.

To be clear - I totally agree that we should be working with services
engineers to collaboratively develop APIs. In fact, Mobile Apps, along with
the other front-end teams, should be proactively giving the Services team
input and support so they can build the services that we need. This results
in making the API not only better for Mobile Apps, but for the other
platforms as well.

Working around our services issues by providing the mobile apps with a
different set of APIs than the rest of the platforms comes with its own set
of problems. In the long run it creates more work (maintaining multiple
APIs) and leads to ambiguity ("Oh, that only works on the mobile API"). It
also creates a new point of failure that would only affect mobile apps.
It also means that at times consumers of the "real API" may have needs that
conflict with those of the mobile apps and compete for the time and
attention of the services engineers.

Being on iOS and Android already puts the mobile apps on an island because
of the language and framework differences. Using a different API won't help
that, but development of an API that serves the needs of all the platforms
aligns our needs with the rest of the organization while providing all
front-end teams a consistent way to access data.

This doesn't even address the fact that the mobile team is staffed with
experts in tuning native apps - which is quite a different skill set from
writing, deploying, and maintaining server-side JavaScript. Writing good,
scalable server code is difficult, and it will be extremely important
here since, for the foreseeable future, mobile traffic is expected to see
the most growth.

It should go without saying, but I'm obviously not against Mobile Apps
engineers submitting patches to the API if they want to, maybe even on hack
days (though rumor has it that iOS is a bit busy this quarter). But
this is quite different from owning the service.

After reading this, you would probably assume all my experiences with this
strategy were bad. Ironically, I was recently on a mobile team that
developed its own services successfully. I even helped come up with the
acronym: MISL (Mobile Intermediary Services Layer). :)

However, on that mobile team we had two experienced AND dedicated
JavaScript/Python engineers who were specifically tasked with writing
services/administration panels and who interfaced directly with the main
services team.

(Now, if we are going to start talking about embedding web services
engineers on the Mobile Apps team, I'm definitely in!)



Re: [Wikitech-l] Video Uploads to Commons

2015-02-03 Thread Matanya
This is the tool by prolineserver:
http://tools.wmflabs.org/videoconvert/



Re: [Wikitech-l] Investigating building an apps content service using RESTBase and Node.js

2015-02-03 Thread Marko Obrovac
On Tue, Feb 3, 2015 at 8:42 PM, Tim Starling 
wrote:

> I don't really understand why you want it to be integrated with
> RESTBase. As far as I can tell (it is hard to pin these things down),
> RESTBase is a revision storage backend and possibly a public API for
> that backend.


Actually, RESTBase's logic applies to the Mobile Apps case quite naturally.
When a page is fetched and transformed, it can be stored so that subsequent
requests can simply retrieve the transformed document from storage.


> I thought the idea of SOA was to separate concerns.
> Wouldn't monitoring, caching and authorization be best done as a
> node.js library which RESTBase and other services use?
>

Good point. Ideally, what we would need to do is provide the right tools for
developers to create services, which can then be placed "strategically"
around DCs (in cooperation with Ops, of course). For v1, however, we plan to
provide only logical separation (to a certain extent) via modules which can
be dynamically loaded/unloaded from RESTBase. In return, RESTBase will
provide them with routing, monitoring, caching and authorisation out of the
box. The good thing here is that this 'modularisation' eases the transition
to a more decomposed, orchestration-style SOA model. Going in that direction,
however, requires some prerequisites to be fulfilled, such as [1].
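
As an illustration of the module idea (this is not RESTBase's actual module
API, just a hypothetical sketch of a loadable module whose host injects
caching, metrics and routing):

    // Hypothetical shape of a loadable module; the hostServices object
    // (cache, metrics, fetchTransformed) is assumed to be injected by the
    // host, which owns routing, monitoring, caching and authorisation.
    module.exports = {
      paths: {
        '/mobile/{title}': {
          get: async (hostServices, req) => {
            const key = 'mobile:' + req.params.title;
            const cached = await hostServices.cache.get(key);
            if (cached) return { status: 200, body: cached };
            const body = await hostServices.fetchTransformed(req.params.title);
            await hostServices.cache.set(key, body);
            hostServices.metrics.increment('mobile.render');
            return { status: 200, body };
          },
        },
      },
    };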

Marko

[1] https://phabricator.wikimedia.org/T84923

Marko Obrovac
Senior Services Engineer
Wikimedia Foundation

Re: [Wikitech-l] Video Uploads to Commons

2015-02-03 Thread Tilman Bayer
On Tue, Feb 3, 2015 at 5:19 PM, Brian Wolff  wrote:
> On Feb 3, 2015 8:43 PM, "Nkansah Rexford"  wrote:
>
> Cool. Thanks for working on this sort of thing. The video uploading
> process would certainly benefit from some love.
>
> May I suggest using WebM (of the Vorbis/VP8 variety) as the output format?
> WebM gives higher-quality video at lower file sizes, so it is probably a
> better choice when converting from another format.
>
> For the 100MB thing - there are multiple ways to upload things with the
> API. The chunked method has a file size limit of 1GB. All the other methods
> have the 100MB limit.
See https://commons.wikimedia.org/wiki/Commons:Chunked_uploads for
more information.

>
> If you haven't already, I'd encourage mentioning this on [[Commons:VP]].
> "Real" users would probably be able to give much more specific feedback.
>
> Cheers,
> Bawolff
>
> P.S. I'm not sure, but I think User:Prolineserver might have been working on
> something similar, in case you are looking for collaborators.

That's https://tools.wmflabs.org/videoconvert/ - it seems to be down
right now, but usually it works very well for converting to WebM, and
can also upload the result directly to Commons using OAuth. It's
just not very fast (it can take several hours to generate a 200MB WebM).

Results: https://commons.wikimedia.org/wiki/Category:Uploaded_with_videoconvert


-- 
Tilman Bayer
Senior Analyst
Wikimedia Foundation
IRC (Freenode): HaeB


Re: [Wikitech-l] Investigating building an apps content service using RESTBase and Node.js

2015-02-03 Thread Dan Garry
Hey Kunal,

Responses in-line.

On 3 February 2015 at 21:09, Legoktm  wrote:
>
> Is there a bug for this and the other issues? I'm subscribed to [1], but
> I don't see anything like the issues you've mentioned on it.
>

There's this, which documents some of them, but it's more descriptive of a
specific proposed solution: https://phabricator.wikimedia.org/T87824

I welcome any other ideas that anyone has for solving our problems. We're
not tied to a specific solution.


> What are "read more recommendations"?


When you scroll to the end of an article, we give three suggestions as to
what you could read next. The suggestions are generated quite naively, by
running a full-text search using the current page's title as the query. Our
metrics have shown a lot of positive engagement with the feature so far.

Here's a screenshot on [[Bern]]: http://i.imgur.com/qnHcsdT.png
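
The naive approach described above can be reproduced against the public
search API; a small sketch, assuming Node 18+ for the global fetch:

    // Emulates the naive "read more" suggestions: a full-text search with
    // the current page's title as the query, limited to three results.
    const params = new URLSearchParams({
      action: 'query',
      list: 'search',
      srsearch: 'Bern', // the current page's title
      srlimit: '3',     // three suggestions
      format: 'json',
    });

    fetch('https://en.wikipedia.org/w/api.php?' + params)
      .then((res) => res.json())
      .then((data) => {
        for (const hit of data.query.search) console.log(hit.title);
      });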


> Is there an actual instance where an API change has broken an app or is
> this merely a hypothetical concern? Speaking as an API developer, we
> occasionally have to make breaking changes, and if we broke something
> too fast, it would be nice to have feedback if that was the case so we
> can improve in the future.


I'm referring to a possible upcoming breaking change in MobileFrontend,
which is the API we depend on for our content. A certain API parameter may
be removed. Since the apps were made under the assumption that that value
would always be in the API response, the app doesn't handle its absence
well; it leaves users unable to see file pages.

This issue is particularly prominent on apps because if the user never
updates the app, they'll never get the fix.

Dan

-- 
Dan Garry
Associate Product Manager, Mobile Apps
Wikimedia Foundation

Re: [Wikitech-l] Investigating building an apps content service using RESTBase and Node.js

2015-02-03 Thread Legoktm


On 02/03/2015 05:46 PM, Dan Garry wrote:
> *tl;dr: Mobile Apps will, in partnership with the Services team, investigate
> building a content service for the Mobile Apps.*
> 
> The Mobile Apps Team currently has quite a few pain points with the way we
> fetch article content:
> 
>- We have to make a lot of API requests to load an article: article
>HTML, lead image, read more recommendations, and more

Is there a bug for this and the other issues? I'm subscribed to [1], but
I don't see anything like the issues you've mentioned on it.

What are "read more recommendations"?

>- We send the user HTML that we then discard, needlessly increasing data
>usage
>- We do transforms to the HTML in JavaScript on the client side, which
>causes code duplication across the apps and degrades user-perceived
>performance
>- Trivial changes to the API (e.g. renaming a parameter) can break the
>app, which is problematic since apps can't be hotfixed easily

Is there an actual instance where an API change has broken an app or is
this merely a hypothetical concern? Speaking as an API developer, we
occasionally have to make breaking changes, and if we broke something
too fast, it would be nice to have feedback if that was the case so we
can improve in the future.

[1] https://phabricator.wikimedia.org/T75616

-- Legoktm


Re: [Wikitech-l] Investigating building an apps content service using RESTBase and Node.js

2015-02-03 Thread Tim Starling
On 04/02/15 12:46, Dan Garry wrote:
> To address these challenges, we are considering performing some or all of
> these tasks in a service developed by the Mobile Apps Team with help from
> Services. This service will hit the APIs we currently hit on the client,
> aggregate the content we need on the server side, perform transforms we're
> currently doing on the client on the server instead, and serve the full
> response to the user via RESTBase. In addition to providing a public API
> end point, RESTBase would help with common tasks like monitoring, caching
> and authorisation.

I don't really understand why you want it to be integrated with
RESTBase. As far as I can tell (it is hard to pin these things down),
RESTBase is a revision storage backend and possibly a public API for
that backend. I thought the idea of SOA was to separate concerns.
Wouldn't monitoring, caching and authorization be best done as a
node.js library which RESTBase and other services use?
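
A rough sketch of what such a shared library might look like (all names are
hypothetical; it just bundles minimal monitoring, caching and authorization
helpers that any Node.js service could import):

    // Sketch of a shared helper library; every name here is hypothetical.
    function createServiceHelpers() {
      const counters = new Map();
      const store = new Map();
      return {
        // monitoring: count named events, to be scraped/flushed elsewhere
        increment(metric) {
          counters.set(metric, (counters.get(metric) || 0) + 1);
        },
        counters,
        // caching: a trivial in-process TTL cache
        cache: {
          get(key) {
            const e = store.get(key);
            return e && e.expires > Date.now() ? e.value : undefined;
          },
          set(key, value, ttlMs = 60000) {
            store.set(key, { value, expires: Date.now() + ttlMs });
          },
        },
        // authorization: check a bearer token with an injected validator
        authorize(req, isValidToken) {
          const token = (req.headers.authorization || '').replace(/^Bearer /, '');
          return isValidToken(token);
        },
      };
    }

    module.exports = { createServiceHelpers };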

-- Tim Starling



Re: [Wikitech-l] Investigating building an apps content service using RESTBase and Node.js

2015-02-03 Thread Brian Gerstle
Thanks for getting this ball rolling, Dan! Couldn't agree more with the
points you raised—having in fact raised a few of them myself. Put me down
as one of the mobile/full-stack engineers who wants to work on this service
:-).





-- 
EN Wikipedia user page: https://en.wikipedia.org/wiki/User:Brian.gerstle
IRC: bgerstle

[Wikitech-l] Investigating building an apps content service using RESTBase and Node.js

2015-02-03 Thread Dan Garry
*tl;dr: Mobile Apps will, in partnership with the Services team, investigate
building a content service for the Mobile Apps.*

The Mobile Apps Team currently has quite a few pain points with the way we
fetch article content:

   - We have to make a lot of API requests to load an article: article
   HTML, lead image, read more recommendations, and more
   - We send the user HTML that we then discard, needlessly increasing data
   usage
   - We do transforms to the HTML in JavaScript on the client side, which
   causes code duplication across the apps and degrades user-perceived
   performance
   - Trivial changes to the API (e.g. renaming a parameter) can break the
   app, which is problematic since apps can't be hotfixed easily

To address these challenges, we are considering performing some or all of
these tasks in a service developed by the Mobile Apps Team with help from
Services. This service will hit the APIs we currently hit on the client,
aggregate the content we need on the server side, perform transforms we're
currently doing on the client on the server instead, and serve the full
response to the user via RESTBase. In addition to providing a public API
end point, RESTBase would help with common tasks like monitoring, caching
and authorisation.

So the Mobile Apps Team is going to spend a bit of time investigating
whether using RESTBase with Node.js is an option for building a content
service for the Wikipedia app to replace our current method of retrieving
article content. Our initial scope for this is feature parity with our
current content retrieval method.

Our action items are as follows:

   - Wait for RESTBase to be deployed.
  - Timescale: Weeks
  - Owner: All of us :-)
   - Figure out what information the service should serve for the first
   iteration (i.e. for feature parity) and what APIs it needs to hit to do that
  - Timescale: Wed 4th Feb
  - Owner: Dan Garry
   - Start implementing the service and see whether it meets our needs
  - Timescale: Planning a spike for next apps sprint (16th Feb - 27th Feb)
  to perform initial investigation
  - Owner: Currently undecided engineer from Mobile Apps, with Services
  engineers serving as consultants

As always, feel free to ask if there are any questions.

Dan

Re: [Wikitech-l] Video Uploads to Commons

2015-02-03 Thread Brian Wolff
On Feb 3, 2015 8:43 PM, "Nkansah Rexford"  wrote:

Cool. Thanks for working on this sort of thing. The video uploading
process would certainly benefit from some love.

May I suggest using WebM (of the Vorbis/VP8 variety) as the output format?
WebM gives higher-quality video at lower file sizes, so it is probably a
better choice when converting from another format.

For the 100MB thing - there are multiple ways to upload things with the
API. The chunked method has a file size limit of 1GB. All the other methods
have the 100MB limit.
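
A rough sketch of the chunked-upload flow (assuming Node 18+ for
fetch/FormData/Blob; login and CSRF-token handling are omitted, and the
parameter names follow the documented chunked-upload API but should be
double-checked against the docs):

    const fs = require('fs');

    const API = 'https://commons.wikimedia.org/w/api.php';
    const CHUNK = 5 * 1024 * 1024; // 5 MiB per chunk

    async function uploadChunked(path, filename, csrfToken) {
      const data = fs.readFileSync(path);
      let filekey;
      for (let offset = 0; offset < data.length; offset += CHUNK) {
        const form = new FormData();
        form.append('action', 'upload');
        form.append('format', 'json');
        form.append('stash', '1');
        form.append('filename', filename);
        form.append('filesize', String(data.length));
        form.append('offset', String(offset));
        form.append('token', csrfToken);
        if (filekey) form.append('filekey', filekey);
        form.append('chunk', new Blob([data.subarray(offset, offset + CHUNK)]), filename);
        const res = await (await fetch(API, { method: 'POST', body: form })).json();
        filekey = res.upload.filekey; // each chunk response returns the stash key
      }
      // The final request assembles the stashed chunks into the actual file.
      const form = new FormData();
      form.append('action', 'upload');
      form.append('format', 'json');
      form.append('filename', filename);
      form.append('filekey', filekey);
      form.append('comment', 'Converted video');
      form.append('token', csrfToken);
      return (await fetch(API, { method: 'POST', body: form })).json();
    }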

If you haven't already, I'd encourage mentioning this on [[Commons:VP]].
"Real" users would probably be able to give much more specific feedback.

Cheers,
Bawolff

P.S. I'm not sure, but I think User:Prolineserver might have been working on
something similar, in case you are looking for collaborators.

Re: [Wikitech-l] Video Uploads to Commons

2015-02-03 Thread Ricordisamoa

Very nice!
However, according to https://meta.wikimedia.org/wiki/Terms_of_use, 
users' passwords should never be disclosed to any third party.
Please read OAuth/For Developers and convert your
tool to use OAuth instead ;-)




[Wikitech-l] Video Uploads to Commons

2015-02-03 Thread Nkansah Rexford
I couldn't find a tool to convert my videos from whatever format into .ogv
outside my PC before pushing to Commons. I guess something like that might
exist, but perhaps I can't find it. So I made one for myself. Maybe it might
be useful to others too.

I call it CommonsConvert. Upload any video in whatever format, enter your
username and password, and get the video converted into .ogv format and
pushed to Commons all in one shot. You upload the original video in whatever
format, and get it on Commons in .ogv.

Some rough edges currently, such as no means (that I know of) to append a
license to the uploaded video, among other issues. Working on the ones I
can asap.

It uses the mwclient module via Django, based on Python. Django gives the
user info to Python, Python calls avconv in a subprocess, and the converted
file is relayed to Commons through the mwclient module via the MediaWiki API.

I think not everyone has the means/technical know-how/interest/time to
convert videos taken on their PC to an Ogg-Vorbis-whatever format before
uploading to Commons.

Doing the conversion on a server leaves room for the user to focus on
getting videos rather than on processing them.

I don't know if this is or will be of any interest to anyone who might want
to use it, but I personally would enjoy having a server sitting somewhere
converting the videos I want to save onto Commons, rather than using my
local computer for that boring task.

In an email to this list a week or so ago, I 'ranted' about why Commons
wants a specific format (which, if not for Commons, I would never come
across anywhere), but has no provision for converting any videos thrown at
it into that format of its choice. Well...

The tool can be found here: khophi.co/commonsconvert

And this is a sample video uploaded using the tool:
https://commons.m.wikimedia.org/wiki/File:Testing_file_for_upload_last.ogv
(will be deleted soon, likely)

What I do not know, or have not experimented with yet, is whether uploading
via the API also has the 100MB upload restriction.

Will appreciate feedback.

Re: [Wikitech-l] [Wikimedia-l] Quarterly reviews of high priority WMF initiatives

2015-02-03 Thread Tilman Bayer
Minutes and slides from last week's quarterly review meeting of the
Foundation's Collaboration team (which was formerly called the Core
features team and is working on the Flow project) have appeared here:
https://meta.wikimedia.org/wiki/WMF_Metrics_and_activities_meetings/Quarterly_reviews/Collaboration/January_2015

On Wed, Dec 19, 2012 at 6:49 PM, Erik Moeller  wrote:
> Hi folks,
>
> to increase accountability and create more opportunities for course
> corrections and resourcing adjustments as necessary, Sue's asked me
> and Howie Fung to set up a quarterly project evaluation process,
> starting with our highest priority initiatives. These are, according
> to Sue's narrowing focus recommendations which were approved by the
> Board [1]:
>
> - Visual Editor
> - Mobile (mobile contributions + Wikipedia Zero)
> - Editor Engagement (also known as the E2 and E3 teams)
> - Funds Dissemination Committee and expanded grant-making capacity
>
> I'm proposing the following initial schedule:
>
> January:
> - Editor Engagement Experiments
>
> February:
> - Visual Editor
> - Mobile (Contribs + Zero)
>
> March:
> - Editor Engagement Features (Echo, Flow projects)
> - Funds Dissemination Committee
>
> We’ll try doing this on the same day or adjacent to the monthly
> metrics meetings [2], since the team(s) will give a presentation on
> their recent progress, which will help set some context that would
> otherwise need to be covered in the quarterly review itself. This will
> also create open opportunities for feedback and questions.
>
> My goal is to do this in a manner where even though the quarterly
> review meetings themselves are internal, the outcomes are captured as
> meeting minutes and shared publicly, which is why I'm starting this
> discussion on a public list as well. I've created a wiki page here
> which we can use to discuss the concept further:
>
> https://meta.wikimedia.org/wiki/Metrics_and_activities_meetings/Quarterly_reviews
>
> The internal review will, at minimum, include:
>
> Sue Gardner
> myself
> Howie Fung
> Team members and relevant director(s)
> Designated minute-taker
>
> So for example, for Visual Editor, the review team would be the Visual
> Editor / Parsoid teams, Sue, me, Howie, Terry, and a minute-taker.
>
> I imagine the structure of the review roughly as follows, with a
> duration of about 2 1/2 hours divided into 25-30 minute blocks:
>
> - Brief team intro and recap of team's activities through the quarter,
> compared with goals
> - Drill into goals and targets: Did we achieve what we said we would?
> - Review of challenges, blockers and successes
> - Discussion of proposed changes (e.g. resourcing, targets) and other
> action items
> - Buffer time, debriefing
>
> Once again, the primary purpose of these reviews is to create improved
> structures for internal accountability, escalation points in cases
> where serious changes are necessary, and transparency to the world.
>
> In addition to these priority initiatives, my recommendation would be
> to conduct quarterly reviews for any activity that requires more than
> a set amount of resources (people/dollars). These additional reviews
> may however be conducted in a more lightweight manner and internally
> to the departments. We’re slowly getting into that habit in
> engineering.
>
> As we pilot this process, the format of the high priority reviews can
> help inform and support reviews across the organization.
>
> Feedback and questions are appreciated.
>
> All best,
> Erik
>
> [1] https://wikimediafoundation.org/wiki/Vote:Narrowing_Focus
> [2] https://meta.wikimedia.org/wiki/Metrics_and_activities_meetings
> --
> Erik Möller
> VP of Engineering and Product Development, Wikimedia Foundation
>
> Support Free Knowledge: https://wikimediafoundation.org/wiki/Donate
>



-- 
Tilman Bayer
Senior Analyst
Wikimedia Foundation
IRC (Freenode): HaeB


Re: [Wikitech-l] Provisional proto-roadmap mind-map

2015-02-03 Thread Brion Vibber
Per some comments, the horizontal format is hard to scroll through, so I'm
going to switch it around to vertical when I make the on-wiki version. Keep
adding notes, and thanks for the input so far! :)

-- brion


[Wikitech-l] Provisional proto-roadmap mind-map

2015-02-03 Thread Brion Vibber
One of the things we talked about at the MW dev summit recently was the
need to have a clearer roadmap of what's being worked on, what's going to
be worked on, and how that's going to fit in with third-party users as well
as Wikimedia's big sites.

I've started putting together a 'mind-map'-esque roadmap diagram, currently
in a Google Drive drawing:

https://docs.google.com/a/wikimedia.org/drawings/d/18fpigtf0mXIu9ShvJmqEQJvXdmTrSxzT5rIdfw4x1AA/edit

This is open for commenting but not for editing as I don't quite trust
Google's versioning tools. :) But I would love more feedback on this,
especially for third-party goals that we (WMF Engineering) may need to
support or plan around even if we don't do it ourselves.


Currently the diagram contains some major projects already being worked on
at WMF or WMDE (color-coded green), plus blue boxes for things that seem
to be popular ideas, purple for stuff I think might be awesome, and
red for some crazy stuff that I still think would be awesome. ;)

The projects are roughly divided up horizontally by area and vertically by
depth/relation.

Please note this is a preliminary document and should not be taken as WMF
plans or endorsements of which projects will get done (even when it's
complete, it's going to include some "maybes" and "crazies" :)


If a major initiative seems to be left off, don't be offended! Let me know
and I'll try to work it into the map. :)

-- brion

Re: [Wikitech-l] Urlencoding strip markers

2015-02-03 Thread Brion Vibber
Special page inclusions shouldn't be able to do anything privileged;
they're meant for public data. If that's not being enforced right now I'd
recommend reworking or killing the special page inclusion system...

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Urlencoding strip markers

2015-02-03 Thread Brad Jorsch (Anomie)
On Fri, Jan 30, 2015 at 4:04 PM, Brion Vibber  wrote:

> On Fri, Jan 30, 2015 at 12:11 PM, Jackmcbarn  wrote:
> > On Fri, Jan 30, 2015 at 2:02 PM, Brion Vibber wrote:
> > > I'd be inclined to unstrip the marker *and squash HTML to plaintext*, then
> > > encode the plaintext...
> >
> > I don't see how that addresses the security issue.
>
> Rollback tokens in the Special:Contributions HTML would then not be
> available in the squashed text that got encoded. Thus it could not be
> extracted and used in the timing attack.
>

While it would avoid *this* bug, it would still allow the attack if there
is ever sensitive data on some transcludable special page that isn't
embedded in HTML tag attributes.


-- 
Brad Jorsch (Anomie)
Software Engineer
Wikimedia Foundation

Re: [Wikitech-l] Urlencoding strip markers

2015-02-03 Thread Arlo Breault
On Friday, January 30, 2015 at 1:04 PM, Brion Vibber wrote:
> On Fri, Jan 30, 2015 at 12:11 PM, Jackmcbarn wrote:
> > On Fri, Jan 30, 2015 at 2:02 PM, Brion Vibber wrote:
> > > On Thu, Jan 29, 2015 at 5:38 PM, Brad Jorsch (Anomie) wrote:
> > > > On Thu, Jan 29, 2015 at 2:47 PM, Arlo Breault wrote:
> > > > > https://gerrit.wikimedia.org/r/#/c/181519/
> > > >
> > > > To clarify, the possible solutions seem to be:
> > > >
> > > > 1. Unstrip the marker and then encode the content. This is a security hole
> > > > (T73167)
> > >
> > > I'd be inclined to unstrip the marker *and squash HTML to plaintext*, then
> > > encode the plaintext...
> >
> > I don't see how that addresses the security issue.
>
> Rollback tokens in the Special:Contributions HTML would then not be
> available in the squashed text that got encoded. Thus it could not be
> extracted and used in the timing attack.

Is this what you mean by “squash HTML to plaintext”?
urlencode( strip_tags( $parser->mStripState->unstripBoth( $s ) ) );

Is strip_tags reliable enough to not get confused and leave those
tokens lying around?

  





Re: [Wikitech-l] New "Reference Tooltips" beta feature

2015-02-03 Thread Ricordisamoa

How do you think a Popups renderer would integrate with OOjs UI?
Why shouldn't ReferenceTooltips be put into Cite?

On 03/02/2015 05:45, Prateek Saxena wrote:

On Tue, Feb 3, 2015 at 8:25 AM, Nick Wilson (Quiddity)
 wrote:

See also https://phabricator.wikimedia.org/T67114 ("Hovercards: Show cards
for references")
CCing Prateek.

And https://gerrit.wikimedia.org/r/#/c/139827/ for a very old patch
that implements this.


On 2 February 2015 at 16:49, Ricordisamoa 
wrote:

It could share some code with the Popups extension
(https://www.mediawiki.org/wiki/Extension:Popups) but, unlike the latter,
the former does not depend on either TextExtracts or PageImages.

It could easily share code with the Popups. You can register a new
renderer that corresponds to a certain kind of link. There is no
compulsion to use TextExtracts or PageImages within a renderer. I'd
love to see this functionality within the extension.


—prtksxna




Re: [Wikitech-l] Fwd: Phabricator monthly statistics - 2015-01

2015-02-03 Thread Jon Robson
When comparing to Bugzilla, engagement (e.g. the number of replies) would be
a better measure of success than the number of bugs opened/closed.

I sense more involvement from community members since the move, thanks to
the LDAP integration, which IMO is huge.

I personally am creating more bugs there, as Phabricator is a tool for
tasks, which are a superset of bugs.

Re: [Wikitech-l] Fwd: Phabricator monthly statistics - 2015-01

2015-02-03 Thread Quim Gil
On Mon, Feb 2, 2015 at 10:03 PM, Gergo Tisza  wrote:

> On Sun, Feb 1, 2015 at 11:49 AM, Quim Gil  wrote:
>
> > Number of active users (any activity) in (2015-01): 669
> > Number of task authors in (2015-01): 401
> > Number of users who have closed tasks in (2015-01): 199
> >
>
> How does that compare to the last months of Bugzilla?
>

So far Phabricator has 100-150 more monthly users than Bugzilla. See the
reports at https://www.mediawiki.org/wiki/Community_metrics#Reports

-- 
Quim Gil
Engineering Community Manager @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil

[Wikitech-l] Promote GSoC, Outreachy, and Wikimedia Hackathon in France

2015-02-03 Thread Quim Gil
This is a call to all Wikimedia tech contributors with contacts in France.
We need your help reaching out to new developers!

We have the Wikimedia Hackathon in Lyon (23-25 May), which is a good excuse
to focus our developer outreach efforts on France right now. Google
Summer of Code and Outreachy (formerly the FOSS Outreach Program for Women)
are around the corner. Can we coordinate an action between you, your
contacts, Wikimedia France, the WMF Engineering Community team...?

Please subscribe and participate in these tasks:

Promote GSoC, FOSS OPW, and Wikimedia Hackathon in France
https://phabricator.wikimedia.org/T88274

Engage with established technical communities at the Wikimedia Hackathon
2015
https://phabricator.wikimedia.org/T76325

PS: for similar calls focusing on Russia, China, Japan, and your preferred
country, see https://phabricator.wikimedia.org/T925

-- 
Quim Gil
Engineering Community Manager @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil