Re: [Wikitech-l] Fwd: IRC office hours with the Editor Engagement Experiments team

2013-02-04 Thread Gerard Meijssen
Hoi,
The UI obviously should be internationalised; that is what enables the
software to be used in other languages. But when a tour is to be translated,
that is not internationalisation, it is translation.

The point is that translation and internationalisation/localisation are
quite different activities, and each has its own volunteers working on it.
Thanks,
 GerardM


On 4 February 2013 22:54, Matthew Flaschen  wrote:

> On 02/04/2013 03:58 PM, Steven Walling wrote:
> > Sorry for cross-posting, but for MediaWiki developers, this is a good
> > opportunity to ask any questions you might have about the newly-released
> > Extension:GuidedTour, and how to leverage it to build any tours yourself.
>
> Specifically, any extension with a UI can include a tour with full
> internationalization support.
>
> I'd be glad to help you get started.
>
> Matt Flaschen
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Audio derivatives, turning on MP3/AAC & mobile app feature request.

2013-02-04 Thread Michael Dale
Yes, all that changed is that we added support for audio derivatives. We
have not enabled MP3 or AAC. The same code can be used for FLAC -> Ogg or
whatever we configure.
On Feb 3, 2013 2:33 AM, "Yuvi Panda"  wrote:

> Just to be sure that I'm reading this right - nothing actually changed yet.
> We still are a free-formats-only shop for A/V. Right?
>
> --
> Yuvi Panda T
> http://yuvi.in/blog
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Merge Vector extension into core

2013-02-04 Thread maiki
Core of MediaWiki, or core of the Vector skin? I am fine with doing that
for the skin, but I would keep it out of MediaWiki. The design is going
to change, and lots of folks use different skins and don't use the
extension.

maiki


On 02/04/2013 06:02 PM, Sébastien Santoro wrote:
> Good evening,
> 
> Hashar and I discussed the Vector extension this morning on
> #wikimedia-tech, and we wonder whether the code shouldn't be merged into
> core.
> 
> Rationale: Vector is the main design of our product, and the Vector
> extension contains enhancements for this skin.
> 

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Engineering] RFC: Introducing two new HTTP headers to track mobile pageviews

2013-02-04 Thread Asher Feldman
On Mon, Feb 4, 2013 at 5:21 PM, Asher Feldman wrote:

> On Mon, Feb 4, 2013 at 4:59 PM, Arthur Richards 
> wrote:
>
>> In the case of the cookie, the header would actually get set by the
>> backend
>> response (from Apache) and I believe Dave cooked up or was planning on
>> cooking some magic to somehow make that information discernible when
>> results are cached.
>>
>
> Opting into the mobile beta as it is currently implemented bypasses
> varnish caching for all future mobile pageviews for the life of the
> cookie.  So this probably isn't quite right (at least the "when results are
> cached" part.)
>

Thinking about this further: so long as all beta opt-ins bypass all caching
and always have to hit an Apache, it would be fine for MobileFrontend to set
a response header reflecting the version of the site the opt-in cookie
triggers (but only if there is an opt-in; avoid setting it for the standard
site). I'd just prefer this to be logged without adding a field to the
entire udplog stream, where it would generally just be wasted space. Mobile
already has one dedicated udplog field, currently intended for Zero
carriers, that wastes log space for nearly every request. Make it a
key/value field that can contain multiple keys, e.g. "zc:orn;v:b1" (zero
carrier = Orange or whatever, version = beta1).
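
To illustrate, building and parsing such a field is trivial -- a sketch only
(the layout just follows the "zc:orn;v:b1" example above; none of this is
existing udp2log code):

    // Build a single log field holding multiple key/value pairs,
    // e.g. { zc: 'orn', v: 'b1' } -> 'zc:orn;v:b1'
    function encodeLogField( pairs ) {
        return Object.keys( pairs ).map( function ( key ) {
            return key + ':' + pairs[ key ];
        } ).join( ';' );
    }

    // Parse 'zc:orn;v:b1' back into { zc: 'orn', v: 'b1' }
    function decodeLogField( field ) {
        var result = {};
        field.split( ';' ).forEach( function ( part ) {
            var i = part.indexOf( ':' );
            if ( i > 0 ) {
                result[ part.slice( 0, i ) ] = part.slice( i + 1 );
            }
        } );
        return result;
    }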

If by some chance the mobile beta gets implemented in a way that doesn't
kill frontend caching for its users (maybe solely via different JS behavior
based on the presence of the opt-in cookie?), the above won't be applicable
anymore, and using the event-log facility / pixel service to note beta
usage becomes more appropriate. If beta usage is going to be driven
upwards, I hope this approach is seriously considered. Mobile currently has
only around a 58% edge cache hit rate as it is, and it sounds like upcoming
features will place significant new demands on the Apaches and on memcached
space. If a non-cache-busting beta site is doable, go now for the logging
method that will later be compatible with it, to avoid having to change
processing methods.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Merge Vector extension into core

2013-02-04 Thread Sébastien Santoro
Good evening,

Hashar and I discussed the Vector extension this morning on
#wikimedia-tech, and we wonder whether the code shouldn't be merged into
core.

Rationale: Vector is the main design of our product, and the Vector
extension contains enhancements for this skin.

-- 
Best Regards,
Sébastien Santoro aka Dereckson
http://www.dereckson.be/

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Engineering] RFC: Introducing two new HTTP headers to track mobile pageviews

2013-02-04 Thread Asher Feldman
On Mon, Feb 4, 2013 at 4:59 PM, Arthur Richards wrote:

> In the case of the cookie, the header would actually get set by the backend
> response (from Apache) and I believe Dave cooked up or was planning on
> cooking some magic to somehow make that information discernible when
> results are cached.
>

Opting into the mobile beta as it is currently implemented bypasses varnish
caching for all future mobile pageviews for the life of the cookie.  So
this probably isn't quite right (at least the "when results are cached"
part.)
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Engineering] RFC: Introducing two new HTTP headers to track mobile pageviews

2013-02-04 Thread Asher Feldman
On Mon, Feb 4, 2013 at 4:24 PM, Arthur Richards wrote:

>
> Asher, I understand your hesitation about using HTTP header fields, but
> there are a couple problems I'm seeing with using query string parameters.
> Perhaps you or others have some ideas how to get around these:
> * We should keep user-facing URLs canonical as much as possible (primarily
> for link sharing)
> ** If we keep user-facing URLs canonical, we could potentially add query
> string params via javascript, but that would only work on devices that
> support javascript/have javascript enabled (this might not be a huge deal
> as we are planning changes such that users that do not support jQuery will
> get a simplified version of the stable site)
>

I was thinking of this as a solution for the X-MF-Req header, based on your
explanation of it earlier in the thread: "Almost correct - I realize I
didn't actually explain it correctly. This would be a request HTTP header
set by the client in API requests made by Javascript provided by
MobileFrontend."

I only meant to apply the query string idea to API requests, which can also
be marked to indicate non-standard versions of the site. I completely
missed the case of non-API requests for which beta/alpha usage data needs
to be collected. What about doing that via the eventlog service? Only for
users actually opted into one of these programs; there's no need to log
anything special for the majority of users getting the standard site.
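
Roughly what I mean on the client side, as a sketch only -- I'm assuming
jQuery and mediawiki.util the way MediaWiki front-end code usually has them,
and 'mfmode' is just the illustrative parameter name from earlier in the
thread, not a real API parameter:

    // Illustrative only: attach a no-op marker parameter to an API request
    // so the request logs can distinguish non-standard versions of the site.
    var siteVersion = 'beta'; // however the opt-in state ends up being determined

    $.getJSON( mw.util.wikiScript( 'api' ), {
        action: 'parse',
        page: 'San_Francisco',
        format: 'json',
        mfmode: siteVersion // ignored by the API, visible in the logs
    } ).done( function ( data ) {
        // use data.parse as usual
    } );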

> * How could this work for the first pageview request (eg a user clicking a
> link from Google or even just browsing to http://en.wikipedia.org)?


I think this is covered by the above, in that the data intended to go into
X-MF-Req doesn't apply to this sort of page view, and first views from
users opted into a trial can log the trial usage via EventLogging.
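
Something along these lines is all it would take (a sketch only -- the
schema name and fields are made up, the opt-in check is a placeholder, and
I'm assuming the EventLogging client module and its mw.eventLog.logEvent
interface):

    // Sketch only: record beta/alpha usage just for opted-in users.
    // 'MobileBetaPageView' is an invented schema, isOptedIntoMobileBeta()
    // is a placeholder for the real opt-in check, module name assumed.
    if ( isOptedIntoMobileBeta() ) {
        mw.loader.using( 'ext.eventLogging', function () {
            mw.eventLog.logEvent( 'MobileBetaPageView', {
                version: 'beta',
                page: mw.config.get( 'wgPageName' )
            } );
        } );
    }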
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Engineering] RFC: Introducing two new HTTP headers to track mobile pageviews

2013-02-04 Thread Arthur Richards
On Mon, Feb 4, 2013 at 5:49 PM, Brion Vibber  wrote:

> On Mon, Feb 4, 2013 at 4:38 PM, Arthur Richards  >wrote:
>
> > On Mon, Feb 4, 2013 at 5:30 PM, Brion Vibber  wrote:
> >
> > > On Mon, Feb 4, 2013 at 4:24 PM, Arthur Richards <
> aricha...@wikimedia.org
> > > >wrote:
> > > * How could this work for the first pageview request (eg a user
> clicking
> > a
> > > > link from Google or even just browsing to http://en.wikipedia.org)?
> > > >
> > >
> > > I think mainly we need the tracking on the API requests... that's all
> > > JavaScript-initiated, and all hidden from the user. The main problem
> with
> > > adding parameters would be for caching  but none of the API hits
> are
> > > currently cacheable so that's not an immediate issue perhaps.
> > >
> >
> > We also need to be able to differentiate between alpha/beta/stable
> versions
> > of the mobile site, without having to parse the cookie header (I believe
> as
> > a result of performance constraints around this? I think the analytics
> team
> > had looked into this previously).
> >
>
> Yeah that's probably not possible if you want to track that for initial
> page views. Cookie's the only thing guaranteed to have the data available,
> and we have no way to inject a header into mobile web browsers except for
> the XHR hits to the API.
>
> -- brion
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>

In the case of the cookie, the header would actually get set by the backend
response (from Apache), and I believe Dave cooked up, or was planning on
cooking up, some magic to somehow make that information discernible when
results are cached.


-- 
Arthur Richards
Software Engineer, Mobile
[[User:Awjrichards]]
IRC: awjr
+1-415-839-6885 x6687
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Engineering] RFC: Introducing two new HTTP headers to track mobile pageviews

2013-02-04 Thread Brion Vibber
On Mon, Feb 4, 2013 at 4:38 PM, Arthur Richards wrote:

> On Mon, Feb 4, 2013 at 5:30 PM, Brion Vibber  wrote:
>
> > On Mon, Feb 4, 2013 at 4:24 PM, Arthur Richards  > >wrote:
> > * How could this work for the first pageview request (eg a user clicking
> a
> > > link from Google or even just browsing to http://en.wikipedia.org)?
> > >
> >
> > I think mainly we need the tracking on the API requests... that's all
> > JavaScript-initiated, and all hidden from the user. The main problem with
> > adding parameters would be for caching  but none of the API hits are
> > currently cacheable so that's not an immediate issue perhaps.
> >
>
> We also need to be able to differentiate between alpha/beta/stable versions
> of the mobile site, without having to parse the cookie header (I believe as
> a result of performance constraints around this? I think the analytics team
> had looked into this previously).
>

Yeah that's probably not possible if you want to track that for initial
page views. Cookie's the only thing guaranteed to have the data available,
and we have no way to inject a header into mobile web browsers except for
the XHR hits to the API.

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Varnish Caching - Using ESI

2013-02-04 Thread Arthur Richards
On Mon, Feb 4, 2013 at 3:42 PM, Matthew Walker wrote:

> Mobile already does use ESI, but this would be extensions to it.
>

Not in production. We've done some experimentation with ESI and have some
implementations of Varnish ESI in MobileFrontend sitting in a remote branch
that by now has suffered serious code drift from master. This is definitely
something we'd like to see in production soon, but we currently have to
prioritize other things on our plate. We'll make some comments on the RFC :)

-- 
Arthur Richards
Software Engineer, Mobile
[[User:Awjrichards]]
IRC: awjr
+1-415-839-6885 x6687
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Engineering] RFC: Introducing two new HTTP headers to track mobile pageviews

2013-02-04 Thread Arthur Richards
On Mon, Feb 4, 2013 at 5:30 PM, Brion Vibber  wrote:

> On Mon, Feb 4, 2013 at 4:24 PM, Arthur Richards  >wrote:
>
> > Asher, I understand your hesitation about using HTTP header fields, but
> > there are a couple problems I'm seeing with using query string
> parameters.
> > Perhaps you or others have some ideas how to get around these:
> > * We should keep user-facing URLs canonical as much as possible
> (primarily
> > for link sharing)
> > ** If we keep user-facing URLs canonical, we could potentially add query
> > string params via javascript, but that would only work on devices that
> > support javascript/have javascript enabled (this might not be a huge deal
> > as we are planning changes such that users that do not support jQuery
> will
> > get a simplified version of the stable site)
>
> * How could this work for the first pageview request (eg a user clicking a
> > link from Google or even just browsing to http://en.wikipedia.org)?
> >
>
> I think mainly we need the tracking on the API requests... that's all
> JavaScript-initiated, and all hidden from the user. The main problem with
> adding parameters would be for caching  but none of the API hits are
> currently cacheable so that's not an immediate issue perhaps.
>

We also need to be able to differentiate between alpha/beta/stable versions
of the mobile site without having to parse the Cookie header (I believe
because of performance constraints around doing so? I think the analytics
team looked into this previously).
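
(To make that concrete: the version information only lives in a cookie, so
something on the client would have to derive it and attach it to requests,
rather than the caches parsing the Cookie header -- roughly the sketch
below, where the 'optin' cookie name is hypothetical.)

    // Rough client-side sketch only; 'optin' is a hypothetical cookie name,
    // not necessarily what MobileFrontend actually sets.
    function getMobileSiteVersion() {
        var match = document.cookie.match( /(?:^|;\s*)optin=([^;]*)/ );
        if ( !match ) {
            return 'stable';
        }
        return match[ 1 ] === 'alpha' ? 'alpha' : 'beta';
    }
    // The returned value could then be appended to API requests or an
    // EventLogging call, instead of having the caches inspect cookies.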

-- 
Arthur Richards
Software Engineer, Mobile
[[User:Awjrichards]]
IRC: awjr
+1-415-839-6885 x6687
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Engineering] RFC: Introducing two new HTTP headers to track mobile pageviews

2013-02-04 Thread Brion Vibber
On Mon, Feb 4, 2013 at 4:24 PM, Arthur Richards wrote:

> Asher, I understand your hesitation about using HTTP header fields, but
> there are a couple problems I'm seeing with using query string parameters.
> Perhaps you or others have some ideas how to get around these:
> * We should keep user-facing URLs canonical as much as possible (primarily
> for link sharing)
> ** If we keep user-facing URLs canonical, we could potentially add query
> string params via javascript, but that would only work on devices that
> support javascript/have javascript enabled (this might not be a huge deal
> as we are planning changes such that users that do not support jQuery will
> get a simplified version of the stable site)

> * How could this work for the first pageview request (eg a user clicking a
> link from Google or even just browsing to http://en.wikipedia.org)?
>

I think mainly we need the tracking on the API requests... that's all
JavaScript-initiated, and all hidden from the user. The main problem with
adding parameters would be for caching, but none of the API hits are
currently cacheable, so that's not an immediate issue, perhaps.

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Engineering] RFC: Introducing two new HTTP headers to track mobile pageviews

2013-02-04 Thread Arthur Richards
On Sun, Feb 3, 2013 at 2:35 AM, Asher Feldman wrote:

> Regarding varnish cacheability of mobile API requests with a logging query
> param - it would probably be worth making frontend varnishes strip out all
> occurrences of that query param and its value from their backend requests
> so they're all the same to the caching instances. A generic param name that
> can take any value would allow for adding as many extra log values as
> needed, limited only by the uri log field length.
>
> &l=mft2&l=mfstable etc.
>
> So still an edge cache change but the result is more flexible
> while avoiding changing the fixed field length log format across unrelated
> systems like text squids or image caches.
>
> On Sunday, February 3, 2013, Asher Feldman wrote:
>
> > If you want to differentiate categories of API requests in logs, add
> > descriptive noop query params to the requests. I.e &mfmode=2. Doing this
> in
> > request headers and altering edge config is unnecessary and a bad design
> > pattern. On the analytics side, if parsing query params seems challenging
> > vs. having a fixed field to parse, deal.
> >
>

Asher, I understand your hesitation about using HTTP header fields, but
there are a couple of problems I'm seeing with using query string parameters.
Perhaps you or others have some ideas for how to get around these:
* We should keep user-facing URLs canonical as much as possible (primarily
for link sharing)
** If we keep user-facing URLs canonical, we could potentially add query
string params via JavaScript, but that would only work on devices that
support JavaScript / have JavaScript enabled (this might not be a huge deal,
as we are planning changes such that users whose devices do not support
jQuery will get a simplified version of the stable site)
* How could this work for the first pageview request (e.g. a user clicking a
link from Google or even just browsing to http://en.wikipedia.org)?

I may be missing other potential problems - it would be great if others
from the mobile team could chime in.

-- 
Arthur Richards
Software Engineer, Mobile
[[User:Awjrichards]]
IRC: awjr
+1-415-839-6885 x6687
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Varnish Caching - Using ESI

2013-02-04 Thread Matthew Walker
Ah shoot -- well -- I'll merge them then and retire my RfC.

Thanks Kaldari

On Mon, Feb 4, 2013 at 2:45 PM, Ryan Kaldari  wrote:

> There's already an RfC on this:
> https://www.mediawiki.org/wiki/Requests_for_comment/Partial_page_caching
>
> Ryan Kaldari
>
>
> On 2/4/13 2:42 PM, Matthew Walker wrote:
>
>> All,
>>
>> The Fundraising and Mobile teams have been scheming about how we can start
>> to use the capabilities of the Varnish cache in our respective extensions.
>> Mobile already does use ESI, but this would be extensions to it.
>>
>> As I know multiple other parties have been thinking along the same lines,
>> I've taken the liberty of creating an RfC [1] to discuss the subject /
>> develop requirements / wishlists / gotchas. Ideally this will serve as a
>> method to start retiring Squid, moving the site to Varnish, and then being
>> able to separate the site chrome from site content. *evil laugh*
>>
>> [1]
>> https://www.mediawiki.org/wiki/Requests_for_comment/Caching_HTML_Fragments_(ESI)
>>
>>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l




-- 
~Matt Walker
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Varnish Caching - Using ESI

2013-02-04 Thread Ryan Kaldari

There's already an RfC on this:
https://www.mediawiki.org/wiki/Requests_for_comment/Partial_page_caching

Ryan Kaldari

On 2/4/13 2:42 PM, Matthew Walker wrote:

All,

The Fundraising and Mobile teams have been scheming about how we can start
to use the capabilities of the Varnish cache in our respective extensions.
Mobile already does use ESI, but this would be extensions to it.

As I know multiple other parties have been thinking along the same lines,
I've taken the liberty of creating an RfC [1] to discuss the subject /
develop requirements / wishlists / gotchas. Ideally this will serve as a
method to start retiring Squid, moving the site to Varnish, and then being
able to separate the site chrome from site content. *evil laugh*

[1]
https://www.mediawiki.org/wiki/Requests_for_comment/Caching_HTML_Fragments_(ESI)




___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Gerrit 2.6 - coming to a server near you

2013-02-04 Thread Chad
On Mon, Feb 4, 2013 at 6:44 AM, Chad  wrote:
> Hi,
>
> After much delay, Gerrit 2.6 will be coming to our servers. This release
> brings a *lot* of really cool features and fixes, but I'd like to outline a
> couple of the major ones:
>
> * A stable, documented RESTful api
> * Plugin support:
> ** We'll be replacing Gitweb with Gitblit once the initial dust of the
> upgrade settles
> ** We've got a plugin to let us delete projects
> ** We're working on plugins for renaming projects, as well as providing
> some Bugzilla integration
> * IE9 & IE10 are now supported
> * The code formatter got some updates, which should solve some of
> the ArrayIndexOutOfBounds errors you saw in some diffs.
> * Ability to leave comments on a whole file (instead of just a line in a
> file)
> * Search suggestions
> * More unicorns!
>
> We're planning to do this on 1:00-2:00UTC on February 12th (that's
> 17:00-18:00 PST on February 11th) -- that's one week from today.
>

I can't believe I forgot to mention two of my favorite new features:
* Editing topics and commit messages directly from the UI :)

-Chad

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Varnish Caching - Using ESI

2013-02-04 Thread Matthew Walker
All,

The Fundraising and Mobile teams have been scheming about how we can start
to use the capabilities of the Varnish cache in our respective extensions.
Mobile already does use ESI, but this would be an extension of that work.

As I know multiple other parties have been thinking along the same lines,
I've taken the liberty of creating an RfC [1] to discuss the subject and
develop requirements / wishlists / gotchas. Ideally this will serve as a
path toward retiring Squid, moving the site to Varnish, and then being
able to separate the site chrome from the site content. *evil laugh*

[1]
https://www.mediawiki.org/wiki/Requests_for_comment/Caching_HTML_Fragments_(ESI)
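
(For anyone who hasn't looked at ESI before, the core mechanism is a
placeholder tag in the otherwise-cached page that Varnish expands per
request -- a generic example, with a made-up fragment URL:)

    <div id="siteNotice">
      <esi:include src="/esi/fragment/sitenotice" onerror="continue"/>
    </div>

Varnish caches the surrounding page once and fetches the fragment
separately, which is what separating the site chrome from the site content
would amount to.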

-- 
~Matt Walker
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Audio derivatives, turning on MP3/AAC & mobile app feature request.

2013-02-04 Thread Yuri Astrakhan
It seems Opus is going full speed ahead with both Mozilla and Chrome
already supporting it in beta. Any plans for that?

http://www.infoworld.com/d/applications/webrtc-creates-interop-between-chrome-and-firefox-212230
http://news.slashdot.org/story/13/02/04/1944217/firefox-and-chrome-can-talk-to-each-other


On Mon, Feb 4, 2013 at 5:09 PM, Brion Vibber  wrote:

> On Sun, Feb 3, 2013 at 1:31 PM, Max Semenik  wrote:
>
> > On 03.02.2013, 2:02 Brion wrote:
> >
> > > On Feb 2, 2013 1:45 PM, "Platonides"  wrote:
> > >>
> > >> Your system does not seem to support OGG audio format. Downloading
> this
> > >> file without ogg support may need a significant amount of space and
> > >> bandwith. Downloading article.wav ... 20%
> >
> > > Something tells me that won't be feasible for video. :)
> >
> > We've no idea if your system supports OGG videos, so here's an .iso
> > with a video DVD inside.
> >
>
> Hehehe :)
>
> On a more serious note -- over on the Mobile apps team we're currently
> working on Android and iOS uploader apps for Commons. Currently we only
> support images, but it would be *awesome* to support videos and audio.
>
> There's been some preliminary work on transcoding audio to Ogg Vorbis on
> Android, which we could probably rig up on iOS as well, but for video it's
> probably not feasible to do WebM encoding in software on a relatively slow
> ARM processor.
>
> Native support for ingesting MP4 and AAC would simplify audio and
> especially video upload as well. Even if this happens separately from
> playback, it would be very valuable.
>
> -- brion
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Audio derivatives, turning on MP3/AAC & mobile app feature request.

2013-02-04 Thread Brion Vibber
On Sun, Feb 3, 2013 at 1:31 PM, Max Semenik  wrote:

> On 03.02.2013, 2:02 Brion wrote:
>
> > On Feb 2, 2013 1:45 PM, "Platonides"  wrote:
> >>
> >> Your system does not seem to support OGG audio format. Downloading this
> >> file without ogg support may need a significant amount of space and
> >> bandwith. Downloading article.wav ... 20%
>
> > Something tells me that won't be feasible for video. :)
>
> We've no idea if your system supports OGG videos, so here's an .iso
> with a video DVD inside.
>

Hehehe :)

On a more serious note -- over on the Mobile apps team we're currently
working on Android and iOS uploader apps for Commons. Currently we only
support images, but it would be *awesome* to support videos and audio.

There's been some preliminary work on transcoding audio to Ogg Vorbis on
Android, which we could probably rig up on iOS as well, but for video it's
probably not feasible to do WebM encoding in software on a relatively slow
ARM processor.

Native support for ingesting MP4 and AAC would simplify audio and
especially video upload as well. Even if this happens separately from
playback, it would be very valuable.

-- brion
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Fwd: IRC office hours with the Editor Engagement Experiments team

2013-02-04 Thread Matthew Flaschen
On 02/04/2013 03:58 PM, Steven Walling wrote:
> Sorry for cross-posting, but for MediaWiki developers, this is a good
> opportunity to ask any questions you might have about the newly-released
> Extension:GuidedTour, and how to leverage it to build any tours yourself.

Specifically, any extension with a UI can include a tour with full
internationalization support.

I'd be glad to help you get started.

Matt Flaschen

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Fwd: IRC office hours with the Editor Engagement Experiments team

2013-02-04 Thread Steven Walling
Sorry for cross-posting, but for MediaWiki developers, this is a good
opportunity to ask any questions you might have about the newly-released
Extension:GuidedTour, and how to leverage it to build any tours yourself.

-- Forwarded message --
From: Steven Walling 
Date: Mon, Feb 4, 2013 at 12:57 PM
Subject: IRC office hours with the Editor Engagement Experiments team
To: Wikimedia Mailing List 


Hi all,

This Wednesday at 22:00 UTC,[1] there will be an open discussion in
#wikimedia-office with our team, Editor Engagement Experiments.[2]

We've launched several new features since our last office hours --
including interactive "guided tours" and a getting started page for
newly-registered Wikipedians. We'll likely discuss these projects,
including testing results so far, as well as any questions people might
have.

Thanks,

-- 
Steven Walling
https://wikimediafoundation.org/

1. https://meta.wikimedia.org/wiki/IRC_office_hours
2. https://meta.wikimedia.org/wiki/Editor_engagement_experiments



-- 
Steven Walling
https://wikimediafoundation.org/
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] New repository "sandbox"

2013-02-04 Thread Chad
Hi everyone,

I've setup a new repository called "sandbox." The purpose of this
repository is simple--it's meant to be a home for one-liners, quick
patches, and other things that don't have a proper home.

I've made pushing & review permissions super liberal on it (as in,
granted to all registered Gerrit users), so anyone should be able
to do anything they want on it (push directly, make branches,
tag something, review, submit, etc.).

As always, please let me know if you have any problems.

-Chad

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Congrats Krenair, new core maintainer

2013-02-04 Thread Sumana Harihareswara
Just wanted to note that volunteer Alex Monk (Krenair) now has +2 rights
in MediaWiki core and all extensions.
https://www.mediawiki.org/w/index.php?title=Git%2FGerrit_project_ownership&diff=640776&oldid=638743

Alex joined our community as a developer about a year ago and has
consistently been helpful, especially in fixing bugs and reviewing
others' patches.  Alex, thanks for your work.

-- 
Sumana Harihareswara
Engineering Community Manager
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Gerrit 2.6 - coming to a server near you

2013-02-04 Thread Matthew Walker
>
> If we can come up with some sane rewrite rules, I think we could
> redirect gitweb urls to gitblit.


+1 -- like Mark, I've used gitweb URLs in several places that I wouldn't
like to see break (one of them is the original-source link for images moved
to Commons from git).

~Walker
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Extending Scribunto with new Lua functions

2013-02-04 Thread Gabriel Wicke
Hi Jens!

On 02/04/2013 05:46 AM, Jens Ohlig wrote:
> I'm currently working on the Wikidata project to include Lua
> functions for templates that access Wikidata entities.
> 
> I've toyed around a bit and extended LuaCommon.php with a getEntities
> function and a wikibase table to hold that function. Now I wonder if
> there are any plans for Lua extensions outside the mw.* namespace.

I'm wondering if some of the specialized functionality can be avoided by
fetching JSON data from wikibase / wikidata through a web API. This
would be more versatile, and could be used by alternative templating
systems.

Gabriel

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Extending Scribunto with new Lua functions

2013-02-04 Thread Daniel Werner
 2013/2/4 Brad Jorsch :
> You'd add your library object in Lua as a child of the "mw" object.

Sounds like the same little mistake we are already "dealing" with in
JavaScript; see:
http://www.mediawiki.org/wiki/Manual_talk:Coding_conventions/JavaScript#Place_for_Extensions_Objects_in_JavaScript_20654
Perhaps it's not too late to get this right in Scribunto from the beginning.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Extending Scribunto with new Lua functions

2013-02-04 Thread Brad Jorsch
On Mon, Feb 4, 2013 at 8:46 AM, Jens Ohlig  wrote:
>
> Here are my questions:
> 1. Is there an easy way to add your own Lua functions (that call PHP Api 
> functions) to Scribunto other than writing them into LuaCommon.php?

Yes. Probably the best example to look at right now is gerrit change
47109.[1] You could also look at 42050, 38013, or 46478. Keep in mind
that any functions you provide have at least the same sorts of
security and performance concerns as parser function hooks.

The major issue right now is that there's no way to add something to
Scribunto_LuaEngine::$libraryClasses from another extension. There's
also the issue of easily adding unit tests from another extension,
since ideally the unit tests should be run against both the
LuaStandalone and LuaSandbox engines. I'll probably have a look at
this today.

 [1]: https://gerrit.wikimedia.org/r/#/c/47109/

> 2. Is using your own namespace the way to go?

You'd add your library object in Lua as a child of the "mw" object.

> 3. Are there some kind of examples how to wrap PHP functions into the Lua 
> environment (using the frame etc)?

See #1. Although I can't think of a case where a PHP function would
need to use the frame instead of being passed whatever parameters it
needs explicitly by the Lua caller.

> 4. Is there any way to introspect or debug such wrapped functions?

I'm not sure what you mean here.


-- 
Brad Jorsch
Software Engineer
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Gerrit 2.6 - coming to a server near you

2013-02-04 Thread Chad
On Mon, Feb 4, 2013 at 10:21 AM, Sébastien Santoro
 wrote:
> On Mon, Feb 4, 2013 at 3:29 PM, Chad  wrote:
>>> Does this mean new links will be in place?  I've used links like
>>> https://gerrit.wikimedia.org/r/gitweb?p=mediawiki%2Fcore.git;a=commit;h=f04103185a071fc7ceb7f25daf1467791f2ae391
>>> and it would be nice if  those didn't break.
>>>
>>
>> Yes, the urls will be changing.
>>
>> If we can come up with some sane rewrite rules, I think we could
>> redirect gitweb urls to gitblit.
> For nginx or Apache?
>

Apache. The configuration of Gerrit can be found in the puppet
repo, in templates/apache/gerrit.wikimedia.org.erb.

-Chad

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Gerrit 2.6 - coming to a server near you

2013-02-04 Thread Sébastien Santoro
On Mon, Feb 4, 2013 at 3:29 PM, Chad  wrote:
>> Does this mean new links will be in place?  I've used links like
>> https://gerrit.wikimedia.org/r/gitweb?p=mediawiki%2Fcore.git;a=commit;h=f04103185a071fc7ceb7f25daf1467791f2ae391
>> and it would be nice if  those didn't break.
>>
>
> Yes, the urls will be changing.
>
> If we can come up with some sane rewrite rules, I think we could
> redirect gitweb urls to gitblit.
For nginx or Apache?

-- 
Sébastien Santoro aka Dereckson
http://www.dereckson.be/

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] testable javascript tutorial

2013-02-04 Thread Chris McMahon
Given the "Javascript LevelUp Bootcamp" sessions coming up, I thought this
would be of interest:
https://shanetomlinson.com/2013/testing-javascript-frontend-part-1-anti-patterns-and-fixes/
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Gerrit 2.6 - coming to a server near you

2013-02-04 Thread Chad
On Mon, Feb 4, 2013 at 9:19 AM, Mark A. Hershberger  wrote:
> On 02/04/2013 06:44 AM, Chad wrote:
>> ** We'll be replacing Gitweb with Gitblit once the initial dust of the
>> upgrade settles
>
> Does this mean new links will be in place?  I've used links like
> https://gerrit.wikimedia.org/r/gitweb?p=mediawiki%2Fcore.git;a=commit;h=f04103185a071fc7ceb7f25daf1467791f2ae391
> and it would be nice if  those didn't break.
>

Yes, the urls will be changing.

If we can come up with some sane rewrite rules, I think we could
redirect gitweb urls to gitblit.

-Chad

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Gerrit 2.6 - coming to a server near you

2013-02-04 Thread Mark A. Hershberger
On 02/04/2013 06:44 AM, Chad wrote:
> ** We'll be replacing Gitweb with Gitblit once the initial dust of the
> upgrade settles

Does this mean new links will be in place?  I've used links like
https://gerrit.wikimedia.org/r/gitweb?p=mediawiki%2Fcore.git;a=commit;h=f04103185a071fc7ceb7f25daf1467791f2ae391
and it would be nice if  those didn't break.



-- 
http://hexmode.com/

There is no path to peace. Peace is the path.
   -- Mahatma Gandhi, "Non-Violence in Peace and War"

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Extending Scribunto with new Lua functions

2013-02-04 Thread Jens Ohlig
Hello,

I guess this can be answered by Tim or Victor, but I'm grateful for any 
pointers that can help me with a rather specific problem with Scribunto.

I'm currently working on the Wikidata project to include Lua functions for 
templates that access Wikidata entities.

I've toyed around a bit and extended LuaCommon.php with a getEntities function 
and a wikibase table to hold that function. Now I wonder if there are any plans 
for Lua extensions outside the mw.* namespace. 

I've added a wikibase.lua file and a wikibase.* namespace in Lua. However, the 
way PHP and Lua play together and how Scribunto can be extended looks a bit 
like black magic (which is to be expected, given that Scribunto is far from 
finished).

Here are my questions:
1. Is there an easy way to add your own Lua functions (that call PHP API
functions) to Scribunto, other than writing them into LuaCommon.php?
2. Is using your own namespace the way to go?
3. Are there any examples of how to wrap PHP functions into the Lua
environment (using the frame, etc.)?
4. Is there any way to introspect or debug such wrapped functions?

Thanks for any suggestions!

Cheers,
Jens

-- 
Jens Ohlig
Software developer Wikidata project

Wikimedia Deutschland e.V.
Obentrautstr. 72
10963 Berlin
www.wikimedia.de

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
(Society for the Promotion of Free Knowledge)

Registered in the register of associations of the Amtsgericht
Berlin-Charlottenburg under number 23855 Nz. Recognized as a charitable
organization by the Finanzamt für Körperschaften I Berlin, tax number
27/681/51985.



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Gerrit 2.6 - coming to a server near you

2013-02-04 Thread Chad
Hi,

After much delay, Gerrit 2.6 will be coming to our servers. This release
brings a *lot* of really cool features and fixes, but I'd like to outline a
couple of the major ones:

* A stable, documented RESTful API (a quick usage sketch follows this list)
* Plugin support:
** We'll be replacing Gitweb with Gitblit once the initial dust of the
upgrade settles
** We've got a plugin to let us delete projects
** We're working on plugins for renaming projects, as well as providing
some Bugzilla integration
* IE9 & IE10 are now supported
* The code formatter got some updates, which should solve some of
the ArrayIndexOutOfBounds errors you saw in some diffs.
* Ability to leave comments on a whole file (instead of just a line in a
file)
* Search suggestions
* More unicorns!
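
To give a taste of the REST API, something like this lists a few open
changes (a sketch only, using jQuery for brevity -- adjust for wherever you
run it, since cross-origin rules apply in a browser; see the 2.6 REST docs
for the full details):

    // Sketch: list a few open changes via the REST API. Gerrit prefixes
    // its JSON responses with ")]}'" to prevent XSSI, so strip that first.
    $.ajax( {
        url: 'https://gerrit.wikimedia.org/r/changes/?q=status:open&n=5',
        dataType: 'text'
    } ).done( function ( body ) {
        var changes = JSON.parse( body.replace( /^\)\]\}'/, '' ) );
        changes.forEach( function ( change ) {
            console.log( change.status + ': ' + change.subject );
        } );
    } );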

We're planning to do this at 1:00-2:00 UTC on February 12th (that's
17:00-18:00 PST on February 11th) -- one week from today.

-Chad

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] talk about MediaWiki groups in Berlin

2013-02-04 Thread Quim Gil
Just a reminder, this is today. If you are in Berlin you are invited to 
join.


Explaining MediaWiki Groups is the main topic of the meetup, but of 
course we can discuss other topics people are interested in.


See you soon!

On 01/21/2013 04:03 PM, Lydia Pintscher wrote:

Heya :)

On the 4th of February Quim will be at the Wikimedia Germany office to
introduce MediaWiki groups. See https://www.mediawiki.org/wiki/Groups
for more info about the groups.

We'll meet at 18:30 in the office in Obentrautstr. 72, Berlin. Quim
will talk and answer questions for about 1 hour and then we'll move on
to Brauhaus Lemke for some food and drinks.

If you're going to attend please let me know soon so I can plan
better. I'd also be delighted if you could forward it to other people
who might be interested. I hope to see many of you there.


Cheers
Lydia

--
Lydia Pintscher - http://about.me/lydia.pintscher
Community Communications for Wikidata

Wikimedia Deutschland e.V.
Obentrautstr. 72
10963 Berlin
www.wikimedia.de



--
Quim Gil
Technical Contributor Coordinator @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l