Re: [Wikitech-l] MediaWiki to Latex Converter

2013-11-26 Thread C. Scott Ananian
The new PDF rendering pipeline does indeed use XeLaTeX.  I haven't
used it to typeset non-Latin scripts since a summer I spent at SIL in
1996 (and that might have been Omega, not XeLaTeX), so if you wanted
to pitch in and help out I'd greatly appreciate it.  To start with,
short example LaTeX articles typeset in your script would probably
help me ensure I've got all the prologue bits and packages right.
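Something as small as the following would already help; note this is only
a sketch from memory, not the pipeline's actual prologue, and it assumes a
locally installed OpenType Malayalam font such as Rachana (compile with
xelatex; polyglossia could be layered on top for hyphenation and other
language-specific details):

  % minimal sketch only, not the renderer's real preamble
  % assumes the Rachana font (or any OpenType Malayalam font) is installed
  \documentclass{article}
  \usepackage{fontspec}
  \setmainfont[Script=Malayalam]{Rachana}
  \begin{document}
  മലയാളം  % the word "Malayalam", as a quick shaping test
  \end{document}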
 --scott


On Mon, Nov 25, 2013 at 10:49 PM, Santhosh Thottingal
santhosh.thottin...@gmail.com wrote:
 To support complex scripts (https://en.wikipedia.org/wiki/Complex_text_layout)
 we need to use a TeX system that supports Unicode and complex script
 rendering. XeTeX (https://en.wikipedia.org/wiki/XeTeX) works very well with
 these scripts. I tried the MediaWiki to LaTeX converter with Malayalam
 script, and the result is buggy.

 Thanks
 Santhosh



-- 
(http://cscott.net)


[Wikitech-l] Deprecation notice for etherpad-old.wikimedia.org

2013-11-26 Thread Alexandros Kosiaris
Hello,

As many of you might be aware, etherpad.wikimedia.org was migrated a couple
of months ago from the old and no longer supported etherpad software to the
new, actively supported, etherpad-lite software. The move also involved
migrating pads from the old software to the new, a process which was quite
successful, albeit not without glitches. Since then the old installation has
been kept around at http://etherpad-old.wikimedia.org in a read-only state,
so that people can access pads that, for whatever reason, may not have made
it to the new installation unscathed (or at all). This service will be
discontinued and taken offline on Monday, 30 December 2013. That gives
people 30+ days to copy out any pads they need. With that in mind, the
operations team would like to remind everyone that etherpad.wikimedia.org
was never intended to be permanent storage for pads. Preserving a pad in
another format is up to the people interested in that pad.

Regards,

-- 
Alexandros Kosiaris akosia...@wikimedia.org


Re: [Wikitech-l] Engineering Community team updates

2013-11-26 Thread Quim Gil
Our monthly ECT Showcase video stream will start in approximately 30
minutes.

Agenda and link to the stream
https://www.mediawiki.org/wiki/Engineering_Community_Team/Meetings#2013-11-26

-- 
Quim Gil
Technical Contributor Coordinator @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil


Re: [Wikitech-l] Google Code-in update (WE NEED MORE TASKS!)

2013-11-26 Thread Quim Gil
Interesting data from the first week of GCI, shared by the organizers:

* 1929 students registered (already higher than what we had at the halfway
point last year)
* 84 countries represented
* 342 tasks completed by 162 students

Since we have completed 32 of those and there are 10 participating
organizations, we are just slightly below average. I would say this is
pretty good considering that it is our first time.

And no, it's not too late for you to join as a mentor and bring your tasks.  :)

On 11/25/2013 10:51 AM, Quim Gil wrote:
 Google Code-in weekly update. Summary:
 
 WE NEED MORE TASKS, URGENTLY!
 
 http://www.google-melange.com/gci/dashboard/google/gci2013#all_org_tasks
 
 We are expecting new tasks coming from Mobile, Wikidata, Language and
 Lua templates. Still, GCI students are crunching tasks faster than we
 are able to create new ones. Please join the party with your tasks!
 
 https://www.mediawiki.org/wiki/Google_Code-In
 
 
 On 11/19/2013 10:23 AM, Quim Gil wrote:
 GCI is moving fast. We need more mentors and tasks, especially for
 software development!
 
 This is still very true a week after starting Google Code-in. These are
 the numbers so far:
 
 * 32 tasks have been completed (28% of the current total of 90)
 
 MediaWiki core, PyWikiBot, Kiwix, and mediawiki.org have been the main
 beneficiaries so far. We have seen students following the process in
 Gerrit and Bugzilla as described, some picking things up quickly, some
 needing an initial push.
 
 * 24 are currently claimed, meaning that 24 students are currently
 working on them.
 
 * 3 need review, 6 need more work, 2 are possibly abandoned, 9 were left
 by students that had claimed them.
 
 * Only 13 tasks haven't been touched at all in this first week.
 
 As you can see, this is working.
 
 First lesson: the best GCI tasks are those expecting an exact result,
 e.g. an SVG with PNG fallback to replace a low-resolution icon. Tasks
 giving more room for creativity (write an article or a wiki page about a
 certain topic) have a higher risk of requiring a lot more mentorship and
 yielding mixed results.
 
 Second lesson: org admins can cover for mentors when the tasks are well
 defined and must be resolved via Gerrit and Bugzilla. The help received
 from non-mentor community members commenting on Gerrit changes and bug
 reports is priceless! Thank You Very Much for your help.
 


-- 
Quim Gil
Technical Contributor Coordinator @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil


[Wikitech-l] Solution for Third-Party Dependencies

2013-11-26 Thread Tyler Romeo
Hey everybody,

tl;dr - How do we add 3rd party libs to core: composer, git submodules, or
copying the code?

So I have a question to discuss concerning MW core that I was hoping to get
some feedback on: what is our policy on including third-party libraries in
core?

To clarify, by policy I don't mean what factors we take into account when
deciding to include a library (although feel free to weigh in on that if
you want to say something), but rather how one would go about doing it.

Here are the possibilities:
1) Use Composer to install dependencies
2) Use git submodules to store a reference to the repository
3) Copy the code and add a note somewhere of where it came from
(If I am missing an option, please enlighten me.)

My opinion on the matter is that option 1 is probably the best, primarily
because Composer was designed specifically for this purpose, and it is
widely used and is unlikely to randomly disappear in the near future. Also,
it makes the incorporation of these libraries trivial, since the autoloader
will be automatically registered using Composer. However, the method is not
without fault. A recent patch to core actually removed our composer.json
file, in hopes of allowing MediaWiki sysadmins to make their own custom
composer.json file so that extensions could be installed that way. Which is
more important: better maintenance of core dependencies, or allowing easier
extension installation? I don't know; that's for us to decide. I'm a bit
conflicted on the matter because I really do want to make extension
installation and management easier, but at the same time making sure the
core itself is easy to use should probably be a higher priority.
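
Just to illustrate what option 1 might look like in practice, a core
composer.json could contain something roughly like this (the package name
and version constraint below are purely illustrative, not a proposal):

  {
      "name": "mediawiki/core",
      "description": "Hypothetical example for this discussion only",
      "require": {
          "php": ">=5.3.2",
          "guzzle/guzzle": "~3.7"
      }
  }

Running composer install would then fetch the library into vendor/ and
generate vendor/autoload.php for core to include.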

The next option is quite similar to Composer in that you have a reference
to some external code that is downloaded when the user tells it to be.
However, it differs from Composer in a couple of ways: 1) when packaging
tarballs, the code has to be downloaded anyway, since submodules are
git-specific, and 2) we have to manage the autoloader manually. Not too
bad. If we decide the Composer option is not viable, I think this would be
a good alternative.

I don't like the final option at all, but it seems to be our current
approach. It's basically the same thing as git submodules except rather
than having a clear reference to where the code came from and where we can
update it, we have to add a README or something explaining it.

Also, just to clarify, this is not an out-of-the-blue request for comment.
I am currently considering whether we might want to replace our
HttpFunctions file with the third-party Guzzle library, since the latter is
very stable, much much more functional, and a lot easier to use. However,
this is out-of-scope for the discussion, so if you have an opinion on
whether doing this is a good/bad idea, please start another thread.

--
Tyler Romeo
Stevens Institute of Technology, Class of 2016
Major in Computer Science

Re: [Wikitech-l] Solution for Third-Party Dependencies

2013-11-26 Thread Antoine Musso
On 26/11/13 22:12, Tyler Romeo wrote:
 Hey everybody,
 
 tl;dr - How do we add 3rd party libs to core: composer, git submodules, or
 copying the code?

Hello,

Thank you for bringing the subject up here. A few folks are eager to have
Composer support for MediaWiki / extensions.  I could not find any time
to work on it, though :-(

Basically:

We tend to copy code and let it bit rot. At least there is no unexpected
breakage.

Git submodules are a mess and need code to be downloaded from
github/whatever. And I tend to dislike submodules.

Result: we need a dependency manager.
Fact: Composer is more modern than PEAR.


You should talk about it with Jeroen De Dauw. Since I talked about
Composer with him, he has converted all his extensions to it and seems to
really want us to finally use Composer for everything.  He largely
rewrote the lame page I dumped there a year ago at
https://www.mediawiki.org/wiki/Composer

As usual with Jeroen, it is fully documented with nice examples and step
by step tutorials. Definitely worth a read.

Also look at:
 
http://www.bn2vs.com/blog/2013/11/24/introduction-to-composer-for-mediawiki-developers/


snip
 Also, just to clarify, this is not an out-of-the-blue request for comment.
 I am currently considering whether we might want to replace our
 HttpFunctions file with the third-party Guzzle library, since the latter is
 very stable, much much more functional, and a lot easier to use. However,
 this is out-of-scope for the discussion, so if you have an opinion on
 whether doing this is a good/bad idea, please start another thread.

We reinvented the wheel but forgot to evangelize our classes or make them
easy to reuse.  So we are left with a bunch of code which is robust but
has counterparts which are more modern and popular.  I wish we had thought
years ago about code reusability and about publishing our classes as
easy-to-use modules.  We could have ended up as a leading PHP group; that
was not a priority, though; the Wikimedia sites were.


You could start an RFC to identify classes that could be replaced by better
third-party libraries.  I don't mind.

Symfony (a French PHP framework which is really Spring for PHP) has a
bunch of reusable components:

 http://symfony.com/components

Among them:
  Console : could replace a bunch of our Maintenance classes
  HttpFoundation : what you said, HTTP on rails
  Routing : do we have a router?
  ..

cheers,

-- 
Antoine hashar Musso



[Wikitech-l] Fwd: wikimedia dns issue

2013-11-26 Thread George Herbert
Fwd - Randy Bush in Tokyo is reporting WMF donation server DNS issues.


-- Forwarded message --
From: Randy Bush ra...@psg.com
Date: Tue, Nov 26, 2013 at 3:44 PM
Subject: wikimedia dns issue
To: North American Network Operators' Group na...@nanog.org


leslie, you out there?

ryuu.psg.com:/Users/randy host links.email.donate.wikimedia.org.
links.email.donate.wikimedia.org is an alias for recp.mkt41.net.
recp.mkt41.net has address 74.121.50.40

   $ dscacheutil -flushcache

does not clear it

this is all in tokyo.  so i ssh into a server in seattle westin.

psg.com:/usr/home/randy host links.email.donate.wikimedia.org.
links.email.donate.wikimedia.org is an alias for recp.mkt41.net.
recp.mkt41.net has address 74.121.50.40

the real or unreal CNAME RR is pretty prevalent

pretoria

root@afnog:~ # host links.email.donate.wikimedia.org.
links.email.donate.wikimedia.org is an alias for recp.mkt41.net.
recp.mkt41.net has address 74.121.50.40

london starts to scare me

drinx.linx.net:/root# host links.email.donate.wikimedia.org.
;; connection timed out; no servers could be reached
drinx.linx.net:/root# host links.email.donate.wikimedia.org.
;; connection timed out; no servers could be reached

so i look at the nearest zone i can find

ryuu.psg.com:/Users/randy doc -p -w donate.wikimedia.org.
Doc-2.2.3: doc -p -w donate.wikimedia.org.
Doc-2.2.3: Starting test of donate.wikimedia.org.   parent is
wikimedia.org.
Doc-2.2.3: Test date - Tue Nov 26 10:46:48 JST 2013
SYSerr: No servers for donate.wikimedia.org. returned SOAs ...
Summary:
   YIKES: doc aborted while testing donate.wikimedia.org.  parent
wikimedia.org.
   Incomplete test for donate.wikimedia.org. (1)
Done testing donate.wikimedia.org.  Tue Nov 26 10:46:49 JST 2013

at least the parent is ok

ryuu.psg.com:/Users/randy doc -p -w wikimedia.org.
Doc-2.2.3: doc -p -w wikimedia.org.
Doc-2.2.3: Starting test of wikimedia.org.   parent is org.
Doc-2.2.3: Test date - Tue Nov 26 10:47:45 JST 2013
Summary:
   No errors or warnings issued for wikimedia.org.
Done testing wikimedia.org.  Tue Nov 26 10:47:48 JST 2013

aha!

ryuu.psg.com:/Users/randy ns donate.wikimedia.org.
donate-lb.eqiad.wikimedia.org.

i suspect a load balancer

yes, i reported this to wikimedia two days ago.  a response, then black
hole.  problem persists.

randy




-- 
-george william herbert
george.herb...@gmail.com

Re: [Wikitech-l] wikimedia dns issue

2013-11-26 Thread Matthew Walker
Randy,

Thanks for the concern -- if I understand your email correctly, the issue
is not that you cannot resolve donate.wikimedia.org, but that one of our
subdomains resolves to something that's not a WMF server?

If so, the behavior you're seeing (links.email.donate.wikimedia.org going
to ?.mkt41.net) is correct. What we're doing this year is using an external
bulk mailer to try to ensure deliverability of fundraising mail (it turns
out that we get about $1 per email, so getting good deliverability is
pretty important). The bulk mailer, Silverpop, does link rewriting in the
body of the email for statistical purposes (we A/B test our emails and use
their dashboards to do so).
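
If anyone wants to double-check this from their own vantage point, asking
an independent resolver should show the same chain (sketch only; per
Randy's output it should come back as recp.mkt41.net / 74.121.50.40):

  $ dig +short @8.8.8.8 links.email.donate.wikimedia.org.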

For what it's worth, our donor rights policy [1] applies to this type of data.

[1] http://wikimediafoundation.org/wiki/Donor_policy

~Matt Walker
Wikimedia Foundation
Fundraising Technology Team


On Tue, Nov 26, 2013 at 3:44 PM, Randy Bush ra...@psg.com wrote:

 leslie, you out there?

 ryuu.psg.com:/Users/randy host links.email.donate.wikimedia.org.
 links.email.donate.wikimedia.org is an alias for recp.mkt41.net.
 recp.mkt41.net has address 74.121.50.40

$ dscacheutil -flushcache

 does not clear it

 this is all in tokyo.  so i ssh into a server in seattle westin.

 psg.com:/usr/home/randy host links.email.donate.wikimedia.org.
 links.email.donate.wikimedia.org is an alias for recp.mkt41.net.
 recp.mkt41.net has address 74.121.50.40

 the real or unreal CNAME RR is pretty prevalent

 pretoria

 root@afnog:~ # host links.email.donate.wikimedia.org.
 links.email.donate.wikimedia.org is an alias for recp.mkt41.net.
 recp.mkt41.net has address 74.121.50.40

 london starts to scare me

 drinx.linx.net:/root# host links.email.donate.wikimedia.org.
 ;; connection timed out; no servers could be reached
 drinx.linx.net:/root# host links.email.donate.wikimedia.org.
 ;; connection timed out; no servers could be reached

 so i look at the nearest zone i can find

 ryuu.psg.com:/Users/randy doc -p -w donate.wikimedia.org.
 Doc-2.2.3: doc -p -w donate.wikimedia.org.
 Doc-2.2.3: Starting test of donate.wikimedia.org.   parent is
 wikimedia.org.
 Doc-2.2.3: Test date - Tue Nov 26 10:46:48 JST 2013
 SYSerr: No servers for donate.wikimedia.org. returned SOAs ...
 Summary:
YIKES: doc aborted while testing donate.wikimedia.org.  parent
 wikimedia.org.
Incomplete test for donate.wikimedia.org. (1)
 Done testing donate.wikimedia.org.  Tue Nov 26 10:46:49 JST 2013

 at least the parent is ok

 ryuu.psg.com:/Users/randy doc -p -w wikimedia.org.
 Doc-2.2.3: doc -p -w wikimedia.org.
 Doc-2.2.3: Starting test of wikimedia.org.   parent is org.
 Doc-2.2.3: Test date - Tue Nov 26 10:47:45 JST 2013
 Summary:
No errors or warnings issued for wikimedia.org.
 Done testing wikimedia.org.  Tue Nov 26 10:47:48 JST 2013

 aha!

 ryuu.psg.com:/Users/randy ns donate.wikimedia.org.
 donate-lb.eqiad.wikimedia.org.

 i suspect a load balancer

 yes, i reported this to wikimedia two days ago.  a response, then black
 hole.  problem persists.

 randy



[Wikitech-l] OpenID deployment delayed until sometime in 2014

2013-11-26 Thread Rob Lanphier
Hi everyone,

If you were following our planning process this past spring/summer, you
probably heard that we had planned to deploy both OAuth and OpenID by the
end of 2013.

The good news is that we were able to complete our OAuth deployment (see
Dan Garry's blog post on the subject [1]).  The bad news is that we weren't
able to get OpenID complete enough for deployment, so we've decided to put
OpenID on hold to make room for some of our other work.  Thomas Gries has
done some great work on this, and has been very responsive to our input
(we're a finicky bunch!).  We fully anticipate being able to deploy this
sometime in 2014, but we don't yet have an exact plan for making that
happen.

We've communicated this through other channels, but hadn't broadcast it
here, so we're a bit overdue for an update.

The Auth Systems page now has the latest information on what we were able
to do, and will have more on our future plans as we make them:  
https://www.mediawiki.org/wiki/Auth_systems

Rob

[1] "OAuth now available on Wikimedia wikis" by Dan Garry: 
https://blog.wikimedia.org/2013/11/22/oauth-on-wikimedia-wikis/

Re: [Wikitech-l] Solution for Third-Party Dependencies

2013-11-26 Thread Matthew Flaschen

On 11/26/2013 05:28 PM, Antoine Musso wrote:

  Git submodules are a mess, needs to download code from github/whatever.
And I tend to dislike submodules.


If we went with this option, we could just have straight Gerrit mirrors 
of any git repos we wanted to bundle.  When we updated the submodule, we 
would also update the mirror.


That way there's no additional risk in the unlikely event someone 
tampers with the third-party GitHub repo (as long as we find out before 
updating our submodule/mirror).
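
Roughly, the workflow could look like this (the Gerrit project path below
is made up, purely to show the shape of it):

  # sketch only; hypothetical mirror path on our Gerrit
  git submodule add https://gerrit.wikimedia.org/r/p/mediawiki/vendor/guzzle.git vendor/guzzle
  git commit -m "Pin guzzle at the mirrored, reviewed commit"

  # later, after the mirror has been synced from upstream and reviewed
  cd vendor/guzzle && git fetch origin && git checkout <reviewed-commit> && cd ../..
  git add vendor/guzzle && git commit -m "Bump guzzle submodule"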


Matt Flaschen


Re: [Wikitech-l] Solution for Third-Party Dependencies

2013-11-26 Thread Tyler Romeo
On Tue, Nov 26, 2013 at 5:28 PM, Antoine Musso hashar+...@free.fr wrote:

 You could start an RFC to identify classes that could be replaced by better
 third-party libraries.  I don't mind.

 Symfony (a French PHP framework which is really Spring for PHP) has a
 bunch of reusable components:

  http://symfony.com/components

 Among them:
   Console : could replace a bunch of our Maintenance classes
   HttpFoundation : what you said, HTTP on rails
   Routing : do we have a router?


Done.
https://www.mediawiki.org/wiki/Requests_for_comment/Third-party_components

I'll continue searching for more third-party components, but right now
those listed seem like the main candidates (although even some of those
would be really difficult to do and might not happen).

--
Tyler Romeo
Stevens Institute of Technology, Class of 2016
Major in Computer Science

Re: [Wikitech-l] Architecture Summit -- Gathering all relevant RfC's

2013-11-26 Thread MZMcBride
Diederik van Liere wrote:
If you have a MediaWiki-related RfC in a personal notepad, on your user
page, or in your mind, then this would be a great moment to write it or
move it under https://www.mediawiki.org/wiki/Requests_for_comment and add
an entry to the table. If you don't have 'move' rights then please let me
know and I can move it for you.

Thanks for the nudge. I've had one sitting as an e-mail draft for months
about hit counters in MediaWiki core. I've now posted my notes.

I added a comment to one of the tables that the current sorting seems a
bit wonky. For example, there are quite a few RFCs (no apostrophe
required) in the "in draft" section that should probably be in the "in
discussion" section.

If you know of a topic that *should* have an RfC but does not yet have an
RfC then please reply to this list mentioning the topic. I will check with
Tim/Brion to see how these topics can get an RfC.

We should encourage boldness. It's a wiki world: if you have an idea for
an RFC, the correct answer is to start writing. :-)  Anyone can
participate.

MZMcBride




Re: [Wikitech-l] [Ops] Status update on new Collections PDF Renderer

2013-11-26 Thread Erik Moeller
Thanks, Matt, for the detailed update, as well as for your leadership
throughout the project, and thanks to everyone who's helped with the
effort so far. :-)

As Matt outlined, we're going to keep moving on critical-path issues
until January and will do a second sprint then to get things ready for
production. Currently we're targeting January 6 to January 17 for the
second sprint. Will keep you posted.

All best,
Erik
-- 
Erik Möller
VP of Engineering and Product Development, Wikimedia Foundation
