Re: [Wikitech-l] MediaWiki core has reached its 400th contributor

2017-02-07 Thread Ricordisamoa

On 07/02/2017 08:56, Amir Ladsgroup wrote:

Hey,
Today I was checking mediawiki/core on GitHub:
https://github.com/wikimedia/mediawiki
And the number of contributors is now 400. It might be a little more if we
count the SVN era.
To me, it's an important milestone, more important than the number of commits
or releases. It's about people, not numbers or lines of code.

Best


Hi, according to the CREDITS file there are 646 contributors :)

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] PHP namespaces

2016-05-29 Thread Ricordisamoa

According to PSR-4 <http://www.php-fig.org/psr/psr-4/>

   The fully qualified class name MUST have a top-level namespace name,
   also known as a "vendor namespace".

Just as is being done within MediaWiki core, it seems wise to
keep existing classes without any top-level namespace for backwards
compatibility, while putting any new or substantially refactored class
into a new namespace to make the break more obvious.
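For illustration, a minimal sketch of that convention, with a hypothetical class under the Babel extension's vendor namespace; the class_alias() call is one way to keep old global-namespace callers working during a transition (path and class name are illustrative):

    <?php
    // includes/BabelBox/LanguageBabelBox.php (illustrative path)
    namespace MediaWiki\Babel\BabelBox;

    class LanguageBabelBox {
        // ...
    }

    // Optional: let legacy code keep using the old global name while it migrates.
    class_alias( LanguageBabelBox::class, 'LanguageBabelBox' );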


On 28/05/2016 18:02, Cyken Zeraux wrote:

Is the reason for the new namespaces for testing, or because the classes
are more suitable to a namespace?

On Sat, May 28, 2016 at 9:15 AM, Ricordisamoa <ricordisa...@openmailbox.org>
wrote:


With https://gerrit.wikimedia.org/r/288633 some new classes in the Babel
extension were put into a new namespace "MediaWiki\Babel\BabelBox".
Legoktm approved but Thiemo Mättig (WMDE) disagrees. PHPUnit tests are
already in namespace "Babel\Tests".

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] PHP namespaces

2016-05-28 Thread Ricordisamoa
With https://gerrit.wikimedia.org/r/288633 some new classes in the Babel 
extension were put into a new namespace "MediaWiki\Babel\BabelBox".
Legoktm approved but Thiemo Mättig (WMDE) disagrees. PHPUnit tests are 
already in namespace "Babel\Tests".


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] FW: File with an unknown copyright status on MediaWiki.org

2016-05-25 Thread Ricordisamoa

Why not move them to Commons after review?
They'd benefit from all the goodies: not only the Information template but
also categories, etc.


On 25/05/2016 14:39, Steinsplitter Wiki wrote:

There are hundreds of files in 
https://www.mediawiki.org/wiki/Category:Files_with_unknown_copyright_status

Likely they should be nominated automatically for deletion after one week or
more (using a parser function). Then a human can review them and press delete.

Likely Template:Information should be preloaded at Special:Upload (similar to
Commons), so we would also have machine-readable data for files.


Best,
--Steinsplitter


From: p858sn...@gmail.com
Date: Wed, 25 May 2016 21:11:13 +1000
To: wikitech-l@lists.wikimedia.org
Subject: Re: [Wikitech-l] FW: File with an unknown copyright status on  
MediaWiki.org

On 25 May 2016 at 20:40, Ricordisamoa <ricordisa...@openmailbox.org> wrote:

, while on mediawiki.org no one cares.

That's not true. It's a project I plan to work on shortly (I just need
to finish a few other things first).

And it's not a simple case of just hitting delete on everything.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] FW: File with an unknown copyright status on MediaWiki.org

2016-05-25 Thread Ricordisamoa

E.g. reuse of content from Wikipedia:
https://www.mediawiki.org/wiki/File:QuickSurveys-Desktop.png
From Wikivoyage:
https://www.mediawiki.org/wiki/File:Banner_example.png
Screenshot of proprietary software:
https://www.mediawiki.org/wiki/File:07-wikitech-phpstorm-deployment.png
On Commons they'd have been speedily deleted, while on mediawiki.org no 
one cares.


On 25/05/2016 12:11, Steinsplitter Wiki wrote:




Hello,

I hope this is the right place to ask; if not, please excuse me.

All files uploaded to MediaWiki.org must be available under a free license (no "fair use" 
or "noncommercial").
What do we do with files with unclear copyright status (no license, no author,
etc.), such as:

* https://www.mediawiki.org/wiki/File:Scott_profile_pic.jpeg
* 
https://www.mediawiki.org/wiki/File:EMWCon_Spring_2016_Mediawiki_in_the_Enterprise.pdf
* https://www.mediawiki.org/wiki/File:3d_extension_screenshot.png
* a lot of others

Nominating for deletion? Per the intro at 
https://www.mediawiki.org/wiki/Special:Upload such files must be deleted.

We have the {{Unknown copyright}} template, but it seems that nobody is working
on it?


Best,
--Steinsplitter

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Reducing the environmental impact of the Wikimedia movement

2016-05-23 Thread Ricordisamoa
Server consumption, of course, but what about the impact of email, food,
transport, etc.?

Earth Hour: switch the wikis to a dark skin

On 30/03/2016 09:27, Lukas Mezger wrote:

Dear readers of the Wikitech mailing list,

I am a member of the Wikipedia community and I have started a project to
reduce the environmental impact of the Wikimedia movement. The main idea is to
use renewable energy for running the Wikimedia servers and the main reason
for this is that by doing so, Wikipedia can set a great example for
environmental responsibility in the entire internet sector.

My project was started after Greenpeace USA published a report about the
energy consumption of the biggest sites on the Internet in 2015 and in
which Wikipedia, to my astonishment, performed poorly, receiving a "D"
score and only passing because of the Wikimedia Foundation's openness about
its energy consumption.

I would very much like to change that and set up a page called "Environmental
impact" on Meta. I
have already discussed the issue with a few people both from the Wikimedia
Foundation's management and from the Wikimedia community and have received
positive responses.

In order to further advance the project, I would like to learn more about
how much energy Wikipedia's servers use. As far as I can tell, these
figures are not public, but I believe they could very well be.

Also, I am interested to learn how changing a server site's energy sources
can be carried out on the operations side, since the United States energy
sector hasn't been completely deregulated yet.

So, thank you very much for any comments! Maybe there also is an even
better forum to discuss these questions?

Finally, if you would like to support my project, please consider adding
your name to this list.
Thank you.
Kind regards,

Lukas Mezger / User:Gnom 
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Proposal to invest in Phabricator Calendar

2016-05-14 Thread Ricordisamoa
If we're going to be investing money into improving Phabricator
upstream, I think we should start with making Differential usable (i.e.
a suitable replacement for Gerrit).


On 13/05/2016 21:36, Quim Gil wrote:

Annoyed by the difficulties of tracking events in the Wikimedia tech
community? Or by the difficulties of announcing events in an effective way?
Check this out:

Consolidate the many tech events calendars in Phabricator's calendar
https://phabricator.wikimedia.org/T1035

The hypothesis is that it is worth improving the current situation with
calendars in the Wikimedia tech community, and that Phabricator Calendar is
the best starting point. If we get a system that works for Wikimedia Tech,
I believe we can get a system that works for the rest of Wikimedia,
probably with some extra steps.

The Technical Collaboration team has some budget that we could use to fund
the Phabricator maintainers to prioritize some improvements in their
Calendar. If you think this is a bad idea and/or you have a better one,
please discuss in the task (preferred) or here. If you think this is a good
idea, your reassuring feedback is welcome too.  ;)

Thank you!




___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Kunal (User:Legoktm) moving to Parsing Team

2016-04-08 Thread Ricordisamoa

You mean subtract 1, or add -1?

On 05/04/2016 18:53, Subramanya Sastry wrote:


I suppose you've figured out that I don't know how to write citations. 
Subtract -1 from N for all [N] in the body. :-) -S.


On 04/05/2016 11:41 AM, Subramanya Sastry wrote:

Hi everyone,

We would like to let you know that Kunal (User:Legoktm for those who 
don’t already know) is moving inside the Editing Department from the 
Collaboration Team to the Parsing Team.


The Collaboration team is grateful for Kunal’s great work over the 
past year, especially on the backend for cross-wiki notifications. 
Prior to that, Kunal spent two years working on SUL finalization, 
without which a feature like cross-wiki notifications would not have 
been possible.


The Parsing team is very happy to have Kunal join them. Kunal is 
really interested to work on implementing shadow namespaces [2] which 
enables wikis to specify fallback wikis for pages that don’t resolve 
on the local wiki. Among other things, this could enable creation of 
global repositories for templates, for example, which interests us 
greatly.


There is this little detail of Kunal being a nominee for 
affiliate-selected board seats [3] and what happens if he gets 
elected [4]. We will cross that bridge when we get there. :-)


Trevor, Subbu, Roan.

[1] 
https://www.mediawiki.org/wiki/Requests_for_comment/Shadow_namespaces
[2] 
https://meta.wikimedia.org/wiki/Affiliate-selected_Board_seats/2016/Nominations/Kunal_Mehta
[3] 
https://meta.wikimedia.org/wiki/Affiliate-selected_Board_seats/2016/Questions#Kunal_Mehta_16 



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Kunal (User:Legoktm) moving to Parsing Team

2016-04-08 Thread Ricordisamoa

I wish there were a Legoktm Team within the Legoktm Department

On 05/04/2016 18:41, Subramanya Sastry wrote:

Hi everyone,

We would like to let you know that Kunal (User:Legoktm for those who 
don’t already know) is moving inside the Editing Department from the 
Collaboration Team to the Parsing Team.


The Collaboration team is grateful for Kunal’s great work over the 
past year, especially on the backend for cross-wiki notifications. 
Prior to that, Kunal spent two years working on SUL finalization, 
without which a feature like cross-wiki notifications would not have 
been possible.


The Parsing team is very happy to have Kunal join them. Kunal is 
really interested to work on implementing shadow namespaces [2] which 
enables wikis to specify fallback wikis for pages that don’t resolve 
on the local wiki. Among other things, this could enable creation of 
global repositories for templates, for example, which interests us 
greatly.


There is this little detail of Kunal being a nominee for 
affiliate-selected board seats [3] and what happens if he gets elected 
[4]. We will cross that bridge when we get there. :-)


Trevor, Subbu, Roan.

[1] https://www.mediawiki.org/wiki/Requests_for_comment/Shadow_namespaces
[2] 
https://meta.wikimedia.org/wiki/Affiliate-selected_Board_seats/2016/Nominations/Kunal_Mehta
[3] 
https://meta.wikimedia.org/wiki/Affiliate-selected_Board_seats/2016/Questions#Kunal_Mehta_16 


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Phabricator was down for a short time today (April 4th)

2016-04-04 Thread Ricordisamoa

Cool down, Phab. Cool down. We need you.

On 04/04/2016 19:57, Greg Grossmeier wrote:

Apologies for not sending out this announcement before hand.

Short summary: The machine that Phabricator is hosted on rebooted itself
last night due to high temperatures. It ended up just shutting itself
down.

Today we needed our DataCenter Technician to reapply the thermal paste
in an attempt to remedy the issue. That took less than 10 minutes but it
happened during the middle of the day.

Full details: https://phabricator.wikimedia.org/T131742

And yes, we are requesting a backup machine so issues like this won't
have as much of an impact on you (our users):
https://phabricator.wikimedia.org/T131775

Best,

Greg




___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Feelings

2016-04-01 Thread Ricordisamoa
This sounds like something that some random WMF team would actually 
implement in the near future...


On 01/04/2016 21:24, Legoktm wrote:

Hi,

It's well known that Wikipedia is facing threats from other social
networks and losing editors. While many of us spend time trying to make
Wikipedia different, we need to be cognizant that what other social
networks are doing is working. And if we can't beat them, we need to
join them.

I've written a patch[1] that introduces a new feature to the Thanks
extension called "feelings". When hovering over a "thank" link, five
different emoji icons will pop up[2], representing five different
feelings: happy, love, surprise, anger, and fear. Editors can pick one
of those options instead of just a plain thanks, to indicate how they
really feel, which the recipient will see[3].

Of course, some might consider this feature to be controversial (I
suspect they would respond to my email with "anger" or "fear"), so I've
added a feature flag for it. Setting
  $wgDontFixEditorRetentionProblem = true;
will disable it for your wiki.

Please give the patch a try; I've only tested it in MonoBook so far, so it
might need some extra CSS in Vector.

[1] https://gerrit.wikimedia.org/r/280961
[2] https://phabricator.wikimedia.org/F3810964
[3] https://phabricator.wikimedia.org/F3810963

-- Legoktm

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Medical leave

2016-02-17 Thread Ricordisamoa

Hi Frances,
no hurry. Rest well and come back stronger!

On 17/02/2016 02:16, Frances Hocutt wrote:

Dear all,

I’m writing to give a heads-up that I expect to be taking medical leave
from today through March 9, due in part to stress caused by the recent
uncertainty and organizational departures. In my absence, my work will be
in the capable hands of Ryan Kaldari and the rest of the Community Tech
team. Any questions you would have directed to me, direct to Ryan.

I hope to return as promptly as circumstance allows.

All the best,
Frances
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Mass migration to new syntax - PRO or CON?

2016-02-12 Thread Ricordisamoa

On 12/02/2016 20:26, Legoktm wrote:

Hi,

On 02/12/2016 07:27 AM, Daniel Kinzler wrote:

Now that we target PHP 5.5, some people are itching to make use of some new
language features, like the new array syntax.

Mass changes like this, or similar changes relating to coding style, tend to
lead to controversy. I want to make sure we have a discussion about this here,
to avoid having the argument over and over on any such patch.

Please give a quick PRO or CON response as a basis for discussion.

In essence, the discussion boils down to two conflicting positions:

PRO: do mass migration to the new syntax, style, or whatever, as soon as
possible. This way, the codebase is in a consistent form, and that form is the
one we agreed is the best for readability. Doing changes like this is
gratifying, because it's low hanging fruit: it's easy to do, and has large
visible impact (well ok, visible in the source).

I'll offer an alternative, which is to convert all of them at once using
PHPCS and then enforce that all new patches use [] arrays. You then only
have one commit which changes everything, not hundreds you have to go
through while git blaming or looking in git log.


CON: don't do mass migration to new syntax, only start using new styles and
features when touching the respective bit of code anyway. The argument here is
that touching many lines of code, even if it's just for whitespace changes,
causes merge conflicts when doing backports and when rebasing patches. E.g. if
we touch half the files in the codebase to change to the new array syntax, who
is going to manually rebase the couple of hundred patches we have open?

There's no need to do it manually. Just tell people to run the phpcs
autofixer before they rebase, and the result should be identical to
what's already there. And we can have PHPCS run in the other direction
for backports ([] -> array()).

But if we don't do that, people are going to start converting things
manually whenever they work on the code, and you'll still end up with
hundreds of open patches needing rebase, except it can't be done
automatically anymore.


My personal vote is CON. No rebase hell please! Changing to the new syntax
doesn't buy us anything.

Consistency buys us a lot. New developers won't be confused on whether
to use [] or array(). It makes entry easier for people coming from other
languages where [] is used for lists.


Objection: other languages may use [] for lists, but array() is more 
similar to {} used for hash tables.
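For reference, a quick sketch of the conversion in question; the short syntax (PHP 5.4+) covers both list-like and map-like arrays, and this is the mechanical rewrite a PHPCS autofixer would perform:

    // Long form:
    $langs = array( 'en', 'it' );
    $names = array( 'en' => 'English', 'it' => 'Italiano' );

    // Short form, post-conversion:
    $langs = [ 'en', 'it' ];
    $names = [ 'en' => 'English', 'it' => 'Italiano' ];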




I think you're going to end up in rebase hell regardless, so we should
rip off the bandaid quickly and get it over with, and use the automated
tools we have to our advantage.

So, if we're voting, I'm PRO.

-- Legoktm

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] FOSDEM presentation (Converting LQT to Flow)

2016-02-08 Thread Ricordisamoa

On 08/02/2016 18:42, Marius Hoch wrote:
For the sake of completeness, there were also two presentations held
by WMDE employees:


The presentations held by Lucie about the ArticlePlaceholder can be 
found at 
http://www.slideshare.net/frimelle/increasing-access-to-free-and-open-knowledge-for-speakers-of-underserved-languages-on-wikipedia


Slideshare is not download friendly. Ouch.



The presentation Julia and Jens held about the Game Jam can be found 
at 
https://commons.wikimedia.org/wiki/File:Free_Knowledge_Game_Jam_-_Presentation_at_FOSDEM_2016.pdf


Cheers,

Marius

On 08.02.2016 05:37, Matthew Flaschen wrote:
I've uploaded my slides for my FOSDEM 2016 presentation (Converting 
LiquidThreads to Flow: Or, how I learned to stop worrying and love 
the batch) to Commons:


https://commons.wikimedia.org/wiki/File:Converting_LQT_to_Flow_FOSDEM_2016.pdf 



Matt Flaschen

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Fwd: Your speaking schedule at FOSDEM

2016-01-28 Thread Ricordisamoa

I do understand that the conversion has been challenging.
However, I also believe that Flow shouldn't be misrepresented as a
complete success on behalf of the MediaWiki community, when its
development has in fact been halted, it is missing critical features, and
one of the biggest independent users of LiquidThreads has yet to make
the leap.


On 17/01/2016 18:28, Maarten Dammers wrote:

On 17-1-2016 at 8:23, Ricordisamoa wrote:
Or "Converting talk pages from an unmaintained system to a halted 
one" :-D

Not cool Ricordisamoa.

Thanks Matt for putting effort into this. For anyone who will be at 
FOSDEM: Please add yourself to https://phabricator.wikimedia.org/E52 
(if you haven't done already)


Maarten



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Close test2wiki?

2016-01-28 Thread Ricordisamoa

On 28/01/2016 02:30, Dan Garry wrote:

On 27 January 2016 at 17:16, Legoktm  wrote:


Especially when debugging and testing cross-wiki features, it is
extremely useful to have two test wikis to use. MassMessage,
GlobalCssJs, GlobalUserPage, and now cross-wiki notifications were all
initially deployed using testwiki as the "central" wiki, and test2wiki
as a "client" wiki.


That sounds like a good reason to keep it, especially since global
notifications is active, ongoing work. Perhaps, as an alternative to
shutting it down, we should just make it clearer that test2.wikipedia.org
is primarily intended for that purpose on that wiki's main page (or
anywhere else thought appropriate). If there's some specific overhead to
keeping test2 alive that might outweigh that benefit, now would seem to be
the time to make it clear. :-)

Dan



I second Legoktm's comment. And, for what it's worth, I don't think it 
makes much sense to limit test2wiki to a specific purpose.


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Fwd: Your speaking schedule at FOSDEM

2016-01-16 Thread Ricordisamoa

Or "Converting talk pages from an unmaintained system to a halted one" :-D

On 15/01/2016 23:17, Matthew Flaschen wrote:
I will be presenting at FOSDEM on the LiquidThreads to Flow 
conversion, at 2016-01-30, 16:00 Belgium time:


http://www.timeanddate.com/worldclock/fixedtime.html?msg=FOSDEM+LiquidThreads+to+Flow+talk+by+Matt+Flaschen%2C+Livestreamed=20160130T15 



It will be live-streamed at 
https://live.fosdem.org/watch.php?room=H.2215%20(Ferrer) (from what I 
can tell in advance).


Thanks,

Matt Flaschen

 Forwarded Message 
Subject: Your speaking schedule at FOSDEM
Date: Fri, 15 Jan 2016 22:13:33 +0100 (CET)
From: FOSDEM Programme Team 
Reply-To: speak...@fosdem.org
Organization: FOSDEM - https://fosdem.org/
To: Matt Flaschen 
CC: speak...@fosdem.org

Dear speaker,

Thank you for agreeing to speak at FOSDEM!

Our programme is now complete:
  https://fosdem.org/2016/schedule/

This is your schedule:

.------------.----------.-----------------.----------------------------------.
| day        | time     | room            | title                            |
+------------+----------+-----------------+----------------------------------+
| 2016-01-30 | 16:00:00 | H.2215 (Ferrer) | Converting LiquidThreads to Flow |
'------------'----------'-----------------'----------------------------------'

.---------------------------------------------.
| links                                       |
+---------------------------------------------+
| https://fosdem.org/2016/schedule/event/flow |
'---------------------------------------------'

Please check these links carefully.  If you already created an account 
at:

  https://fosdem.org/submit
you can login and update the information if need be.

In particular, please upload a photograph if you have not already done
so and enter or update your biography.

Changes take a few minutes to reach the website - the time it was last
updated appears at the bottom of this page:
  https://fosdem.org/2016/schedule/events/

If you find yourself unable to attend, please let your room's organiser
know this as soon as possible.

FOSDEM intends to record and stream the entire conference live - that's
over 300 hours of content!  The recordings will be published under the
same licence as all FOSDEM content (CC-BY).  You are agreeing to this by
taking part in the event.

Any slide decks you present should be uploaded by about half-an-hour
before the start of your talk so that people watching the live stream
can follow them locally if their video resolution leaves slide content
indistinct.  (Our system does not hold back publication, so if you don't
want people to see them too far in advance, don't upload them yet.)

Projectors work at 1024x768 but expect slides (and any demonstrations)
to be scaled down to 720x576 for the video, so try to make them
readable at this lower resolution if you can.

Please don't forget to bring a VGA adaptor if you hope to present from a
laptop that only has a different type of output connector!  We also
recommend bringing a spare copy of any slides on a USB stick.

As usual, we aim to provide free high-quality wireless network
connectivity throughout the event.

Best wishes,

The FOSDEM Programme Team



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] PHP 7.0.0 Released

2015-12-03 Thread Ricordisamoa
The PHP development team announces
<http://php.net/archive/2015.php#id2015-12-03-1> the immediate
availability of PHP 7.0.0.
The newphp <https://phabricator.wikimedia.org/tag/newphp/> project has
been created in Phabricator to track incompatibility issues.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] PHP 7.0.0 Released

2015-12-03 Thread Ricordisamoa

On 04/12/2015 01:37, MZMcBride wrote:

Ricordisamoa wrote:

The PHP development team announces
<http://php.net/archive/2015.php#id2015-12-03-1>  the immediate
availability of PHP 7.0.0.
The newphp<https://phabricator.wikimedia.org/tag/newphp/>  project has
been created in Phabricator to track incompatibility issues.

Regarding the Phabricator tag, it looks to have been created in November
2014, if I'm reading its history correctly.


For some reason I keep mixing up 2014 and 2015 :-[


The tag description says "Tag for issues involving changes in upcoming/not
yet released PHP versions" and PHP 7.0.0 is now neither, right?


For most MediaWiki developers it is yet to come, isn't it?

T115249 <https://phabricator.wikimedia.org/T115249>, T115250
<https://phabricator.wikimedia.org/T115250> and T115254
<https://phabricator.wikimedia.org/T115254> have all been tagged #newphp by
the Bugwrangler.



In general, it seems a bit silly/shortsighted to use "new" in a name. A
more specific project name such as #php7 might be better.

MZMcBride



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Because visualizations are cool

2015-11-27 Thread Ricordisamoa

On 27/11/2015 14:16, Joaquin Oltra Hernandez wrote:

While we are at it, Mike Bostock's blocks page is a wonderful place to
wander around and be amazed. The guy is the creator of d3, works at the NYT
interactive news department


https://twitter.com/mbostock/status/595252571658260481


and is just plain genius.

http://bl.ocks.org/mbostock  (scroll down for more, it has infinite
scrolling)


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Introducing WikiToLearn to developers

2015-11-27 Thread Ricordisamoa

What better proof that itwikiversity has failed.
But it has been getting restarted recently.


On 27/11/2015 14:22, Riccardo Iaconelli wrote:

Hi all,

I would like to introduce to the Wikimedia community WikiToLearn, a FOSS
project of which I am a participant and which is lately getting a lot of
contributions and momentum.

It is a KDE project sponsored (among the others) by Wikimedia Italy and
recently joined by institutions such as HEP Software Foundation (CERN,
Fermilab, Princeton...) or Universities such as University of Pisa and Milano-
Bicocca. These institutions are already populating the website with content.

We aim to provide a platform where learners and teachers can complete, refine
and re-assemble lecture notes in order to create free, collaborative and
accessible textbooks, tailored precisely to their needs.

Although the project is quite young (only a few months old), it is already
growing in allure at an unexpected rate. Thanks to this we are now counting on
nearly 40 developers, and growing (including content developers).

We are different from Wikipedia and other WMF projects in several ways, and in
a sense, complementary. Our focus is on creating complete textbooks (and not
encyclopedic articles), drawing from a professor’s or a student’s own notes,
either existing or that have to be written down.

We also have a strong focus on offline use: all the content of WikiToLearn
should be easily printable by any student for offline use and serious
studying.

Besides a good team for content development, we can count on a small but
motivated team of developers, and we would like to improve communication with
upstream (a.k.a. you ;-) ), because we found ourselves developing a few
features which could probably be made available to the general public, with
some generalization and polishing. ;-)

Is this a right place to start such a discussion?

We would like to help as much as we can, but we might need some mentoring in
how to best approach MediaWiki development, as many of us are relatively new
to OSS/Web development.

Bye,
-Riccardo


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Please welcome Joe Matazzoni as Product Manager, Collaboration

2015-11-26 Thread Ricordisamoa

I hope he has at least some experience with the projects.
We are in desperate need of someone who is deeply rooted in the
community to clean up the Flow mess.

Good luck.

On 23/11/2015 21:00, James Forrester wrote:

Everyone,

It's my pleasure to welcome Joe Matazzoni to the Editing department as the
new product manager for the Collaboration team from today. Joe will help
the team develop aligned with the needs of our editors, prioritizing and
shaping the improvements we're making to notifications, watchlists, change
tracking, discussions and more. He replaces Danny Horn, who is now helping
the Community Tech team.

Joe has had a long career in digital publishing and media, particularly in
public broadcasting, first at NPR, where he helped rebuild the Web site and
then founded the Arts & Life section, and more recently at KQED, where he
created an Emmy-winning arts video team. Joe has a master’s degree in
interactive telecommunications from NYU and a background in UX design. He
was the director of UX at the Wall Street Journal, where he helped lead a
complete site and CMS redesign and relaunch.

After many years in New York and DC, Joe is delighted to be back in his
native Bay Area. He's a big reader, enjoys biking and swimming and likes to
dabble in the kitchen. He will be based in the office, which he plans to
commute to via the ferry from Marin, where he lives with his wife and
11-year-old son. Joe will be on IRC and of course on-wiki as [[User:JMatazzoni
(WMF)]].

Welcome, Joe!

Yours,


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [BREAKING CHANGE] IE 8 will go JavaScript-less starting January 2016

2015-11-12 Thread Ricordisamoa

Yes. Yes. YES!

On 12/11/2015 03:11, Krinkle wrote:

Hey all,

Starting in January 2016, MediaWiki will end JavaScript support for
Microsoft Internet Explorer 8. This raises the cut-off up from MSIE 7.
Users with this browser will still be able to browse, edit, and otherwise
contribute to the site. However, some features will not be available to
them. For example, the enhanced edit toolbar will not appear, and the
notification buttons will take you to a page rather than a pop-out.

This change will affect roughly 0.89% of all traffic to Wikimedia wikis (as
of October 2015). For comparison, 0.33% of traffic comes from Internet
Explorer 6, and 1.46% from Internet Explorer 7. Support for these was
dropped in August and September 2014 respectively.

Providing JavaScript for IE 8 adds a significant maintenance burden. It
also bloats the software we ship to all users, without proportionate
benefit. This enables us to simplify and streamline the JavaScript codebase
for all other users. Users unable to upgrade from Internet Explorer 8 will
have a faster experience going forward, based on well-tested and more
stable code.

This change will land in the development branch in January, and so will be
part of MediaWiki 1.27 (to be released around May 2016).

Tech News will announce this change as well, but please help carry this
message into your communities. In January, we will send a reminder before
the change happens.

Yours,
-- Krinkle

For details about the JavaScript-less experience, see
https://www.mediawiki.org/wiki/Compatibility
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Parsoid still doesn't love me

2015-11-09 Thread Ricordisamoa

On 09/11/2015 15:52, Brad Jorsch (Anomie) wrote:

On Fri, Nov 6, 2015 at 3:29 PM, Ricordisamoa <ricordisa...@openmailbox.org>
wrote:


What if I need to get all revisions (~2000) of a page in Parsoid HTML5?
The prop=revisions API (in batches of 50) with mwparserfromhell is much
quicker.


That's a tradeoff you get with a highly-cacheable REST API: you generally
have to fetch each 'thing' individually rather than being able to batch
queries.

If you already know how to individually address each 'thing' (e.g. you
fetch the list of revisions for the page first) and the REST API's ToS
allow it, multiplexing requests might be possible to reduce the impact of
the limitation. If you have to rely on "next" and "previous" links in the
content to address adjacent 'things' hateoas-style you're probably out of
luck.
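For what it's worth, a minimal sketch of that multiplexing approach in PHP, assuming the revision IDs were already fetched via prop=revisions and the wiki exposes the RESTBase HTML endpoint; the title, revision IDs, and user agent are placeholders, and a real client should respect the API's rate limits:

    <?php
    $revisions = [ 123456, 123457, 123458 ]; // hypothetical revision IDs
    $base = 'https://en.wikipedia.org/api/rest_v1/page/html/Example/';

    $mh = curl_multi_init();
    $handles = [];
    foreach ( $revisions as $rev ) {
        $ch = curl_init( $base . $rev );
        curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true );
        curl_setopt( $ch, CURLOPT_USERAGENT, 'revision-fetcher/0.1 (me@example.org)' );
        curl_multi_add_handle( $mh, $ch );
        $handles[$rev] = $ch;
    }

    // Drive all transfers concurrently instead of one request at a time.
    do {
        $status = curl_multi_exec( $mh, $running );
        if ( $running ) {
            curl_multi_select( $mh );
        }
    } while ( $running && $status === CURLM_OK );

    $html = [];
    foreach ( $handles as $rev => $ch ) {
        $html[$rev] = curl_multi_getcontent( $ch ); // Parsoid HTML5+RDFa
        curl_multi_remove_handle( $mh, $ch );
        curl_close( $ch );
    }
    curl_multi_close( $mh );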




All of this seems overkill in the first place...

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Parsoid still doesn't love me

2015-11-06 Thread Ricordisamoa
What if I need to get all revisions (~2000) of a page in Parsoid HTML5? 
The prop=revisions API (in batches of 50) with mwparserfromhell is much 
quicker.
And what about ~400 revisions from a wiki without Parsoid/RESTBase? I 
would use /transform/wikitext/to/html then.
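For the no-RESTBase case, a minimal sketch of calling a public transform endpoint from PHP; the domain is a placeholder, and the exact route and form field should be checked against the instance's API docs:

    <?php
    $url = 'https://en.wikipedia.org/api/rest_v1/transform/wikitext/to/html';
    $ch = curl_init( $url );
    curl_setopt_array( $ch, [
        CURLOPT_POST => true,
        CURLOPT_POSTFIELDS => http_build_query( [ 'wikitext' => "'''Hello''', world" ] ),
        CURLOPT_RETURNTRANSFER => true,
    ] );
    $html = curl_exec( $ch ); // Parsoid HTML for the given wikitext
    curl_close( $ch );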

Thanks in advance.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Parsoid still doesn't love me

2015-11-06 Thread Ricordisamoa

I mean RESTBase can't fetch more than one revision at once?

On 06/11/2015 21:39, Subramanya Sastry wrote:


Parsoid is simply a wikitext -> html and a html -> wikitext conversion 
service. Everything else would be tools and libs built on top of it.


Subbu.

On 11/06/2015 02:29 PM, Ricordisamoa wrote:
What if I need to get all revisions (~2000) of a page in Parsoid 
HTML5? The prop=revisions API (in batches of 50) with 
mwparserfromhell is much quicker.
And what about ~400 revisions from a wiki without Parsoid/RESTBase? I 
would use /transform/wikitext/to/html then.

Thanks in advance.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Community Tech: October report

2015-11-03 Thread Ricordisamoa

On 04/11/2015 00:29, Risker wrote:

On 3 November 2015 at 17:09, Danny Horn  wrote:


In the Community Tech team, we're constantly striving to make the world
better by creating helpful things and fixing unhelpful things. We're
basically superheroes, and we wear capes at all times. Here's what we've
been up to this month.
* We built a new Special:GadgetUsage report that's live on all wikis; it
lists gadgets used on the wiki, ordered by the number of users. Not to be
clickbait or anything, but THE RESULTS WILL SHOCK YOU. Check it out at
https://commons.wikimedia.org/wiki/Special:GadgetUsage or your own
favorite
wiki.

* HotCat is one of the most popular gadgets -- see: GadgetUsage report
above -- which helps people remove, change and add categories. We fixed
HotCat on over 100 wikis where it was broken, including Wikipedias in
Egyptian Arabic, Ripuarian, Buginese and Navajo, and five projects in Farsi
-- Wikinews, Wikiquote, Wikisource, Wikivoyage and Wiktionary. You're
welcome, Farsi! (More info on
https://en.wikipedia.org/wiki/Wikipedia:HotCat
)

* CitationBot is a combination tool/on-wiki gadget that helps to expand
incomplete citations. We got it running again after the https change,
updated it, and fixed some outstanding bugs, including handling multiple
author names. (See http://tools.wmflabs.org/citations/doibot.html for more
info.)

* We also built a prototype of a new tool called RevisionSlider, which
helps editors navigate through diff pages without having to go back and
forth to the history page. The prototype is live now on test.wp, and we'd
love to get your feedback -- visit
https://meta.wikimedia.org/wiki/Community_Tech/RevisionSlider

Coming up in November:
* We're starting a big cross-project Community Wishlist Survey on November
9th, inviting contributors from any wiki to propose and vote on the
features and fixes they'd like our team to work on. The survey page is on
Meta, at https://meta.wikimedia.org/wiki/2015_Community_Wishlist_Survey --
please join us there on Monday to add your proposals.

* While that's going on, we're currently considering work in a few
different areas, including completing Gadgets 2.0 and building some modules
to help WikiProjects.

You can keep track of what we're working on by watching Community Tech/News
on Meta: https://meta.wikimedia.org/wiki/Community_Tech/News -- and feel
free to leave questions or comments on the talk page. Thanks!





Okay, so I'm not going to say they shocked me; in fact, they're pretty
much what I expected. However, I notice on the stats for English
Wikipedia[1]  that multiple gadgets appear twice, once with a higher number
and a second time with a "-" in front of them, and a low number.


I think this is T117440 <https://phabricator.wikimedia.org/T117440>.



Example:

wikEd 33462
-wikEd 7

Twinkle 32487
-Twinkle 7

Are the negative numbers the number of users who had previously enabled the
gadget and then subsequently disabled it?  If not, what are they?

Thanks for targeting the cleanup and broader distribution of those high-use
tools and gadgets.

Risker/Anne



[1] https://en.wikipedia.org/wiki/Special:GadgetUsage
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] JetBrains IDE licenses for MW developers - php, js/node, python, C/C++, puppet, ruby, ...

2015-10-21 Thread Ricordisamoa
How come the FOSS community hasn't managed to develop a suitable 
alternative yet?!?


On 19/10/2015 22:02, Yuri Astrakhan wrote:

Developers, we now have licenses for JetBrains' IDEA Ultimate
(JavaScript, PHP, Python, ...) and CLion (C/C++). The IDE supports debugging in
Vagrant and on-the-fly static code analysis.

If you are a volunteer developer, and want to use an IDE, contact me for a
license. Those with access to the office wiki may register directly at the
licenses page.

Please try it for a bit before getting a license -- just in case you don't
like it.

P.S. This is only for those who want JetBrains' IDE. If you are happy with
Vim, good for you. If you like Atom or Sublime, great. If notepad is the
best text editor of all times, all the power to you.  Bikeshedding should
go into a separate thread ))
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Tools repository

2015-10-15 Thread Ricordisamoa

On 14/10/2015 00:51, Quim Gil wrote:

On Mon, Oct 12, 2015 at 7:04 PM, Ricordisamoa <ricordisa...@openmailbox.org>
wrote:


On 12/10/2015 18:56, Quim Gil wrote:


Interesting. Has there been any discussion about having an authoritative
and well promoted catalog of Wikimedia tools?


Yes.
https://lists.wikimedia.org/pipermail/labs-l/2015-March/thread.html#3462


After reading the (short) discussion, I think it is worth creating a task
to assure that at least this idea is not forgotten again in another mailing
list archive. Ricordisamoa, do you want to create it and associate it at
least with the #Developer-Relations project?


Yes.
https://phabricator.wikimedia.org/T115650



I also believe that we need a catalog of tools. Even when some experts in
our community know or can find what is available and functional, most
potential users of those tools will have a hard time connecting with them.




___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Tools repository

2015-10-12 Thread Ricordisamoa

On 12/10/2015 18:56, Quim Gil wrote:

The fact that Nick could offer a selection of links, and links of links,
could make us happy or very sad... To start with, anybody able to find
wikitech-l should find the tools repository / catalog in the first place.

On Sun, Oct 11, 2015 at 10:54 PM, Magnus Manske 

Re: [Wikitech-l] PHP CodeSniffer is now voting on MediaWiki core patches

2015-09-26 Thread Ricordisamoa

foreach( $people_Involved as $person): $person->thank( ) ; endforeach ;

On 27/09/2015 01:28, Legoktm wrote:

Hi,

Thanks to the hard work of a lot of different people, PHP CodeSniffer is
now voting on all MediaWiki core patchsets! It checks for basic code
style issues automatically so new contributors (and experienced ones!)
can fix basic issues without a human needing to point them out.

I added some brief instructions to [1] on how to run it locally, or you
can read the jenkins output.

There are still a few code style rules that are disabled, [2] is
tracking fixing those issues.

Please file any bugs or feature requests in the MediaWiki-CodeSniffer[3]
project on Phabricator.

[1] https://www.mediawiki.org/wiki/Continuous_integration/PHP_CodeSniffer
[2] https://phabricator.wikimedia.org/T102609
[3] https://phabricator.wikimedia.org/tag/mediawiki-codesniffer/

-- Legoktm

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] HTMLForm and default values

2015-09-23 Thread Ricordisamoa

Finally found a task for this: https://phabricator.wikimedia.org/T38210

On 16/09/2015 20:27, Ricordisamoa wrote:
For https://gerrit.wikimedia.org/r/233423 I need 'default' values of 
HTMLForm fields to overwrite values set via query parameters, when the 
latter are set to empty strings. Do you know a clean way to do it?

Thanks in advance.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] I love Parsoid but it doesn't want me

2015-09-17 Thread Ricordisamoa
Stephen Niedzielski: "it seems like, as soon as you get the HTML, the
first thing you want to do, perhaps a little bit ironically because it's
called Parsoid, is parse the output a little bit more"

https://www.youtube.com/watch?v=3WJID_WC7BQ&t=35m14s

On 23/07/2015 22:02, C. Scott Ananian wrote:

HTML5+RDFa is a machine-readable format.  But I think what you are asking
for is either better documentation of the template-related stuff (did you
read through the slides inhttps://phabricator.wikimedia.org/T105175  ?) or
HTML template parameter support (https://phabricator.wikimedia.org/T52587)
which is in the codebase but not enabled by default in production.
  --scott
​
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] HTMLForm and default values

2015-09-16 Thread Ricordisamoa
For https://gerrit.wikimedia.org/r/233423 I need 'default' values of 
HTMLForm fields to overwrite values set via query parameters, when the 
latter are set to empty strings. Do you know a clean way to do it?

Thanks in advance.
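One possible direction, as a minimal sketch rather than a confirmed fix: override loadDataFromRequest() so an explicitly empty parameter falls back to the field's default. The subclass name is hypothetical:

    <?php
    class HTMLTextFieldWithDefault extends HTMLTextField {
        public function loadDataFromRequest( $request ) {
            $value = $request->getText( $this->mName );
            // getText() returns '' both for absent and explicitly empty
            // parameters; treat both cases as "use the default".
            if ( $value === '' ) {
                return $this->getDefault();
            }
            return $value;
        }
    }

    // Then, in the form descriptor:
    // 'myfield' => [ 'class' => 'HTMLTextFieldWithDefault', 'default' => '...' ]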

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] tools.wmflabs.org certificate expired

2015-09-15 Thread Ricordisamoa

https://phabricator.wikimedia.org/T112608

On 15/09/2015 11:43, Strainu wrote:

Hi guys,

The tools.wmflabs.org certificate has expired this morning. This
directly impacts wikis using services, such as maps, in their pages.

Thanks,
   Strainu

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Wikidata-tech] Wikidata API breaking changes

2015-09-09 Thread Ricordisamoa

It has only been deployed on test.wikidata so far.
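Until that settles, clients can normalize both response shapes defensively (the old 'Q42' => entity mapping and the new list). A minimal sketch in PHP, assuming $response is the decoded wbgetentities JSON (json_decode(..., true)):

    $byId = [];
    foreach ( $response['entities'] as $key => $entity ) {
        // Old shape: keyed by ID ( 'Q42' => entity ); new shape: a numeric
        // list, with the ID available inside each entity record.
        $id = is_string( $key ) ? $key : $entity['id'];
        $byId[$id] = $entity;
    }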

On 09/09/2015 10:14, Magnus Manske wrote:

Hm, works for me:

{"entities":{"Q42":{"pageid":138,"ns":0,"title":"Q42"



On Wed, Sep 9, 2015 at 8:24 AM Lydia Pintscher 
wrote:


On Wed, Sep 9, 2015 at 8:04 AM, John Mark Vandenberg 
wrote:

The merged changeset included changes which were not advertised,
causing pywikibot to break.  See T110559

The wbgetentities JSON 'entities' is now an array/list of entities
instead of a mapping/dictionary of 'Qd' => entity.

We're looking into it now.


Cheers
Lydia

--
Lydia Pintscher - http://about.me/lydia.pintscher
Product Manager for Wikidata

Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.

Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
unter der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das
Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.

___
Wikidata-tech mailing list
wikidata-t...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-tech


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Question: Should Wikimedia wikis such as Wikipedia be updated with a user-friendly design

2015-09-09 Thread Ricordisamoa
There is some work on getting the mobile skin on desktop (T71366
<https://phabricator.wikimedia.org/T71366>), and Blueprint powers the
Living Style Guide.
However, no matter how ancient Vector may look, the little updates it 
has received over the years make me think it isn't that bad for those 
who use it.


On 08/09/2015 19:53, Thomas Mulhall wrote:

  Hi, this is a question: shouldn't Wikimedia wikis such as Wikipedia be
updated with a user-friendly design? Currently Vector is becoming out of date,
because nowadays you see sites with beautiful colours, not old ones as they
were in 2010 when Vector came out. We could create another skin to replace
Vector, as we did with MonoBook, or update Vector with a new look that has bold
colours, goes along with MediaWiki code, and is also mobile-optimised; even
though we have the MobileFrontend extension, some users may not want to install
it, instead hoping the skin is mobile-optimised.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] npm/composer entry points in release branches

2015-09-08 Thread Ricordisamoa

On 08/09/2015 10:26, Legoktm wrote:

Hi,

On 09/07/2015 01:20 PM, Antoine Musso wrote:

Jenkins jobs start failing on some repositories' release branches because
they are lacking the CI [entry points].  For example if the 'npm' job
has been enabled, the master branch has a 'package.json' but the release
branches do not.  That causes CI to reject backports.

Another idea I've been thinking about is making the "npm" and
"composer-test" jobs pass if no package.json/composer.json is present.
In addition to fixing the release branch issue, it also would allow us
to make CI/zuul more self-service. We could enable those jobs for all
repos, and people could add npm/composer tests and immediately get
feedback and start using them, without having to wait for someone to
update the zuul config.


Yes! Also tox.ini please.




The commands being run would need be tweaked but at the very minimal
each branches should check:

* the i18n files using grunt-banana-checker (in package.json)
* php lint (jakub-onderka/php-parallel-lint) (in composer.json)
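For illustration, minimal sketches of those two entry points; the version constraints are placeholders, and the package.json assumes a Gruntfile that configures the banana task:

    composer.json:

        {
            "require-dev": {
                "jakub-onderka/php-parallel-lint": "0.9.*"
            },
            "scripts": {
                "test": [ "parallel-lint . --exclude vendor" ]
            }
        }

    package.json:

        {
            "scripts": {
                "test": "grunt test"
            },
            "devDependencies": {
                "grunt": "0.4.*",
                "grunt-banana-checker": "0.2.*"
            }
        }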

phplint should be doable, but not all branches may pass banana-checker.
Many repositories still don't pass, and many more have only recently
begun to pass (see blockers on [1]).

[1] https://phabricator.wikimedia.org/T94547

-- Legoktm

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Collaboration team reprioritization

2015-09-03 Thread Ricordisamoa

I appreciate the acknowledgement of failure.
It's time for an even braver move from the community: let's take full
control of Flow's development and get it to be actually usable.


On 01/09/2015 23:26, Danny Horn wrote:

For a while now, the Collaboration team has been working on Flow, the
structured discussion system. I want to let you know about some changes in
that long-term plan.

While initial announcements about Flow said that it would be a universal
replacement for talk pages, the features that were ultimately built into
Flow were specifically forum-style group discussion tools. But article and
project talk pages are used for a number of important and complex processes
that those tools aren't able to handle, making Flow unsuitable for
deployment on those kinds of pages.

To better address the needs of our core contributors, we're now focusing
our strategy on the curation, collaboration, and admin processes that take
place on a variety of pages. Many of these processes use complex
workarounds -- templates, categories, transclusions, and lots of
instructions -- that turn blank wikitext talk pages into structured
workflows. There are gadgets and user scripts on the larger wikis to help
with some of these workflows, but these tools aren't standardized or
universally available.

As these workflows grow in complexity, they become more difficult for the
next generation of editors to learn and use. This has increased the
workload on the people who maintain those systems today. Complex workflows
are also difficult to adapt to other languages, because a wiki with
thousands of articles may not need the kind of complexity that comes with
managing a wiki with millions of articles. We've talked about this kind of
structured workflow support at Wikimania, in user research sessions, and on
wikis. It's an important area that needs a lot of discussion, exploration,
and work.

Starting in October, Flow will not be in active development, as we shift
the team's focus to these other priorities. We'll be helping core
contributors reduce the stress of an ever-growing workload, and helping the
next generation of contributors participate in those processes. Further
development on these projects will be driven by the needs expressed by wiki
communities.

Flow will be maintained and supported, and communities that are excited
about Flow discussions will be able to use it. There are places where the
discussion features are working well, with communities that are
enthusiastic about them: on user talk pages, help pages, and forum/village
pump-style discussion spaces. By the end of September, we'll have an opt-in
Beta feature available to communities that want it, allowing users to
enable Flow on their own user talk pages.

I'm sure people will want to know more about these projects, and we're
looking forward to those conversations. We'll be reaching out for lots of
input and feedback over the coming months.

Danny Horn
Collaboration team, PM
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Collaboration team reprioritization

2015-09-03 Thread Ricordisamoa

On 04/09/2015 01:24, Brandon Harris wrote:

On Sep 3, 2015, at 4:06 PM, Ricordisamoa <ricordisa...@openmailbox.org> wrote:

I appreciate the acknowledgement of failure.

I don't think that's what was said at all.  You may wish to re-read all 
of this.


Putting "active development" on hold when the software is little used in 
production and even some features a MVP should have had are missing, 
really sounds like a failure to me, although Danny has been very good at 
not making it sound like it.
"To better address the needs of our core contributors", "we shift the 
team's focus to these other priorities", "communities that are excited 
about Flow discussions will be able to use it"



---
Brandon Harris ::bhar...@gaijin.com  :: made of steel wool and whiskey


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] "Try the free Wikipedia app" banners

2015-09-02 Thread Ricordisamoa

On 02/09/2015 22:26, Antoine Musso wrote:

On 01/09/2015 17:30, Ori Livneh wrote:

We appear to be running a banner campaign on the mobile web site, driving
people to download the mobile app:

https://en.m.wikipedia.org/wiki/?banner=Aug2015_app_banner_2
https://en.m.wikipedia.org/wiki/?banner=Aug2015_app_banner_1

Campaign definition:
https://meta.wikimedia.org/w/index.php?title=Special:CentralNotice=noticeDetail=Android_app

This isn't cool. This isn't us. We don't drive people from an open platform
to a closed one.

There are other Android app distribution systems, my favourite
being [F-Droid], which hosts only Free and Open Source Software.

The system is open source, all apps are open source, and they work hard
on stripping out unfree code and flagging privacy infringements.


I proposed the app back in October 2014 and they apparently keep it
updated. If you look at the app page, they link to the Privacy policy
and Terms of use and warn about the app's tracking activity:


https://lists.wikimedia.org/pipermail/wikitech-l/2015-May/thread.html#81740



https://f-droid.org/repository/browse/?fdid=org.wikipedia


Maybe we can advertise that platform instead? We will have to get in touch
with them since our banner could well overwhelm their infrastructure.


[F-Droid] https://f-droid.org/





___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] "Try the free Wikipedia app" banners

2015-09-01 Thread Ricordisamoa

Il 02/09/2015 07:39, Matthew Flaschen ha scritto:

On 09/01/2015 11:30 AM, Ori Livneh wrote:
We appear to be running a banner campaign on the mobile web site, 
driving

people to download the mobile app:

https://en.m.wikipedia.org/wiki/?banner=Aug2015_app_banner_2
https://en.m.wikipedia.org/wiki/?banner=Aug2015_app_banner_1

Campaign definition:
https://meta.wikimedia.org/w/index.php?title=Special:CentralNotice=noticeDetail=Android_app 



This isn't cool. This isn't us. We don't drive people from an open 
platform

to a closed one.


I don't necessarily think it's a great idea to push people from web to 
apps either, especially when we also have people working on mobile web.


I also do most of my mobile Wikipedia browsing on mobile web.

That said, I think that assessment is overly critical.

* The Android mobile app is fully free and open source (obvious, since 
all of our stuff is, but worth re-iterating).


* They've done a great job on the app.  In particular, they've 
implemented features that are easier on app (or only feasible there), 
like a user-friendly saved pages list and a nice UI in general.


* I don't know this for sure, but I would guess the app works on 
fully-FOSS versions of Android (e.g. Replicant), since an updated 
version is in the fully-free app store 
(https://f-droid.org/wiki/page/org.wikipedia).  If it doesn't work on 
Replicant (or some similar fully-FOSS Android), that does seem like 
something important to address.


* No one is going to install proprietary software as a result of this 
ad.  It only shows to people who are *already* running Android and 
asks them to install free and open source software.


It's no different than recommending to a Windows user that they 
install Inkscape because it's a great piece of free and open source 
software.


Finally, this is indeed only configured for Finland.


Linus' birthplace...



Matt Flaschen

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] renaming Wikimedia domains

2015-08-26 Thread Ricordisamoa
Also note that Wikibase sitelinks are keyed by database names like 
'be_x_oldwiki', which would not automatically change with a domain rename...


Il 26/08/2015 07:20, Amir E. Aharoni ha scritto:

Hi,

Is it possible in 2015 to rename Wikimedia domains?

The usual domain name structure for a Wikimedia project is
languageCode.project.org: en.wikipedia.org, it.wikisource.org, etc.

In a few projects the language code in the domain is different from the
actual language code: 'als' is a code for Tosk Albanian, but
als.wikipedia.org is written in Alemanic; 'no' is a code for both Bokmal
Norwegian and Nynorsk Norwegian, but no.wikipedia.org is only Bokmal; and
there are several other examples.

In the past when requests to rename such domains were raised, the usual
replies were along the lines of "it's impossible" or "it's not worth the
technical effort", but I don't know the details.

Is this still correct in 2015?

I would love to get that done, because these inconsistent and non-standard
codes repeatedly cause issues in various languages applications, the
current big one being ContentTranslation.

Thanks!

--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
‪“We're living in pieces,
I want to live in peace.” – T. Moore‬
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] RFC: Replace Tidy with HTML 5 parse/reserialize

2015-08-19 Thread Ricordisamoa

Il 19/08/2015 15:46, Brandon Black ha scritto:

On Wed, Aug 19, 2015 at 1:22 PM, MZMcBride z...@mzmcbride.com wrote:

Bartosz wrote:

We really do need this feature. Not anything else that Tidy does, most
of its behavior is actually damaging, but we need to match the open and
close tags to prevent the interface from getting jumbled.

My reading of this thread is that this is the consensus view. The problem,
as I see it, is that Tidy has been deployed long enough that some users
are also relying on all of its other bad behaviors. It seems to me that a
replacement for Tidy either has to reimplement all of its unwanted
behaviors to avoid breakage with current wikitext or it has to break an
unknown amount of current wikitext.

My $0.02 from the peanut gallery: If we fixed up the bulk of the most
common cases we can (where the bad HTML is not the result of an edit
error), could we keep a Tidy/HTML5 type of thing around, but move it
to edit validation rather than render output processing?  We could
start by leaving the current output-side code alone, and warning (to
the user as a minor info blurb on edit submission, and in our logs)
about edits that fail validation, so that we can get some idea of the
scope and causes of the problem, fix what we can, and then evaluate
whether we can eventually start flat-out rejecting the minority of
edits that fail validation and then eventually remove the tidy on the
output side.  That ignores the whole problem of existing bad html
already in the DB, of course, but that could probably be fixed with a
one-time job...


Keep in mind that a lot of templates intentionally consist of 'broken' 
HTML that is then 'put back together' in articles, e.g. paired start/end 
templates that each emit only one half of a table...


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] I love Parsoid but it doesn't want me

2015-08-15 Thread Ricordisamoa

Great!
As a further improvement, it should be a separate package using the 
service's REST API.


Il 14/08/2015 00:20, C. Scott Ananian ha scritto:

Good news: https://doc.wikimedia.org/Parsoid/master/#!/guide/jsapi now
documents the new friendlier API for Parsoid.
  --scott

On Tue, Aug 11, 2015 at 3:03 PM, Ricordisamoa ricordisa...@openmailbox.org
wrote:


Il 03/08/2015 22:08, C. Scott Ananian ha scritto:


On Sat, Aug 1, 2015 at 2:23 AM, Ricordisamoaricordisa...@openmailbox.org
wrote:

Il 31/07/2015 21:08, C. Scott Ananian ha scritto:

I agree that we have not (to date) spent a lot of time on APIs supporting

direct editing of the Parsoid DOM.  I tend to do things directly using
the
low-level DOM methods myself (and that's how I presented my Parsoid
tutorial at wikimania this year) but I can see the attractiveness of the
`mwparserfromhell` API in abstracting some of the details of the
representation.

Thankfully you can have it both ways!  Over the past week I've cloned the
`mwparserfromhell` API, built on top of the Parsoid DOM.  The initial
patches have been merged, but there's a little work to do to get the API
docs up on docs.wikimedia.org properly.  Once that's done I'll post
here
with pointers.

Thanks!

Unfortunately, that still requires using Node.js and depending on the
parsoid package.

Clearly you're just trying to bait me into porting my code to python.

I'm baiting you into exposing a mwparserfromhell-like AST from RESTBase.
Then I can deal with a Python client, a PHP one, etc. :-)

I assure you there is nothing JavaScript-specific about this; there are

HTML
DOM-manipulation libraries available in all major programming languages.
HTML *is* an AST (in this case, at least).
   --scott



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l







___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Where to store OAuth application information?

2015-08-11 Thread Ricordisamoa

Il 11/08/2015 20:10, Risker ha scritto:

On 11 August 2015 at 13:07, Mr. Stradivarius misterst...@gmail.com wrote:


On Wed, Aug 12, 2015 at 1:44 AM, Pine W wiki.p...@gmail.com wrote:


Would keeping sensitive pages in wikitext format under full protection
(meaning that only local administrators can edit) be sufficient?


This is asking for trouble. Even if all our admins acted sensibly all the
time - and if you've been around here long enough, you know that's not true
- there is still the very real possibility of admin accounts being
compromised. I have personally fixed XSS flaws in widely used user scripts,
and a determined attacker would be highly likely to find others. This is
best kept out of the control of admins so that if an admin account is
compromised it will not affect other accounts.
___


Just so we're clear here - locking down these kinds of pages is pretty
much what the Superprotect extension does. It is (to put it mildly) not
well-loved by the Wikimedia community; however, it may be possible to
persuade them that there are certain key pages that must not even be
altered by local admins (copyright being the primary example, but probably
some others as well).

This would require very diplomatic discussion.  And given that this is the
'anniversary' of the introduction of Superprotect, it might be better to
wait for a while to really have that conversation.

Risker/Anne
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


OAuth applications' details must remain editable by the app's author.
Superprotect does not account for them.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] I love Parsoid but it doesn't want me

2015-08-11 Thread Ricordisamoa

Il 03/08/2015 22:08, C. Scott Ananian ha scritto:

On Sat, Aug 1, 2015 at 2:23 AM, Ricordisamoaricordisa...@openmailbox.org
wrote:


Il 31/07/2015 21:08, C. Scott Ananian ha scritto:


I agree that we have not (to date) spent a lot of time on APIs supporting
direct editing of the Parsoid DOM.  I tend to do things directly using the
low-level DOM methods myself (and that's how I presented my Parsoid
tutorial at wikimania this year) but I can see the attractiveness of the
`mwparserfromhell` API in abstracting some of the details of the
representation.

Thankfully you can have it both ways!  Over the past week I've cloned the
`mwparserfromhell` API, built on top of the Parsoid DOM.  The initial
patches have been merged, but there's a little work to do to get the API
docs up on docs.wikimedia.org properly.  Once that's done I'll post here
with pointers.


Thanks!
Unfortunately, that still requires using Node.js and depending on the
parsoid package.


Clearly you're just trying to bait me into porting my code to python.


I'm baiting you into exposing a mwparserfromhell-like AST from RESTBase.
Then I can deal with a Python client, a PHP one, etc. :-)


I assure you there is nothing JavaScript-specific about this; there are HTML
DOM-manipulation libraries available in all major programming languages.
HTML *is* an AST (in this case, at least).
  --scott



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Mediawiki-api] Bikeshedding a good name for the api.php API

2015-08-10 Thread Ricordisamoa
Since the current REST API is available under v1, my take is the v0 
API :-)


Il 10/08/2015 08:52, S Page ha scritto:

tl;dr: PHP action API

I'm organizing content in the mediawiki.org API namespace,
https://phabricator.wikimedia.org/T105133 , and so back to this bikeshed
from August 2014.

We do now have the extra APIs. https://en.wikipedia.org/api/ names them
   *  PHP action API
   *  REST content API

I don't know who came up with the first name. I like it, it straddles Brad
Jorsch's

seems like action API and api.php are the two contenders.

I'm changing the API navigation accordingly,
https://www.mediawiki.org/wiki/Template:API but the shed isn't going
anywhere :)

FWIW in writing documentation, I've found the core API is misleading
because extensions add API modules to it. Is Wikidata part of the core
API when only one wiki implements all its wbXXX modules? A lot of API
clients rely on the extracts and pageimages modules, but they're not part
of core.

Cheers,


it was twelve months ago... 

On Fri, Aug 15, 2014 at 10:00 AM, Tyler Romeo tylerro...@gmail.com wrote:


Agreed with Aaron. When these proposed additional APIs are actually
implemented, then we can start arguing about what to call them.

I know that I personally will continue to call the API the “core web API”
or sometimes just the “web API”, if it is clear based on the context in
which I am talking.
--
Tyler Romeo
0x405D34A7C86B42DF

From: Aaron Halfaker ahalfa...@wikimedia.org
Cc: MediaWiki API announcements  discussion 
mediawiki-...@lists.wikimedia.org
Subject:  Re: [Wikitech-l] [Mediawiki-api] Bikeshedding a good name for
the api.php API

As a heavy user, I generally just refer to the things api.php does as "the
API", or "MediaWiki's web API" when I'm feeling verbose.

I'd be confused about "action API" since I generally use it to read,
which isn't really "action" -- even though it corresponds to action=query

As for the proposed REST API, I don't think that proposed things should
affect the naming scheme of things we already know and love.

Also, I think that all bike sheds should match the color of the house to
(1) denote whose bike shed it is and (2) help tie the yard together like
furniture in a living room.


On Fri, Aug 15, 2014 at 3:50 PM, Sumana Harihareswara 
suma...@wikimedia.org

wrote:
I like action API.

Sumana Harihareswara
Senior Technical Writer
Wikimedia Foundation


On Thu, Aug 14, 2014 at 5:06 PM, Brad Jorsch (Anomie) 
bjor...@wikimedia.org

wrote:
Summing up, it seems like action API and api.php are the two
contenders.

api.php is least likely to be confused with anything (only its own

entry

point file). But as a name it's somewhat awkward.

action API might be confused with the Action class and its

subclasses,

although that doesn't seem like a big deal.


As for the rest:

Just API is already causing confusion. Although it'll certainly

continue

to be used in many contexts.

MediaWiki API, Web API, and MediaWiki web API are liable to be
confused with the proposed REST API, which is also supposed to be
web-accessible and will theoretically part of MediaWiki (even though

I'd

guess it's probably going to be implemented as an -oid). MediaWiki web
APIs may well grow to encompass the api.php action API, the REST API,

and

maybe even stuff like Parsoid.

MediaWiki API and Core API are liable to be confused with the

various

hooks and PHP classes used by extensions.

JSON API wouldn't be accurate for well into the future, and would

likely

be confused with other JSON-returning APIs such as Parsoid and maybe

REST.

Classic API makes it sound like there's a full replacement.

All the code name suggestions would be making things less clear, not

more.

If it had started out with a code name there would be historical

inertia,

but using a code name now would just be silly.



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] I love Parsoid but it doesn't want me

2015-08-01 Thread Ricordisamoa

Il 31/07/2015 21:08, C. Scott Ananian ha scritto:

I agree that we have not (to date) spent a lot of time on APIs supporting
direct editing of the Parsoid DOM.  I tend to do things directly using the
low-level DOM methods myself (and that's how I presented my Parsoid
tutorial at wikimania this year) but I can see the attractiveness of the
`mwparserfromhell` API in abstracting some of the details of the
representation.

Thankfully you can have it both ways!  Over the past week I've cloned the
`mwparserfromhell` API, built on top of the Parsoid DOM.  The initial
patches have been merged, but there's a little work to do to get the API
docs up on docs.wikimedia.org properly.  Once that's done I'll post here
with pointers.


Thanks!
Unfortunately, that still requires using Node.js and depending on the 
parsoid package.
Were the mwparserfromhell-like 'AST' exposed by RESTBase directly, 
there'd easily be lots of thin manipulation libraries in different 
programming languages.




Eventually I'd like to put the pieces together and implement something like
a `pywikibot` clone based on this API and using the RESTBase APIs for
read/write access to the wiki.  As has been mentioned, the RESTBase API for
saving edits is not yet quite complete (
https://phabricator.wikimedia.org/T101501); once that is done there should
be no problem connecting the dots.  (In the meantime you can use the API I
just implemented to reserialize the wikitext and then use the standard PHP
APIs, but that's a little bit clunky.)
  --scott
​
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] I love Parsoid but it doesn't want me

2015-08-01 Thread Ricordisamoa

Il 01/08/2015 01:20, Subramanya Sastry ha scritto:

On 07/31/2015 12:55 PM, Ricordisamoa wrote:


Hi Subbu,
thank you for this thoughtful insight.


And thank you for starting this thread. :-)

HTML is not a barrier by itself. The problem seems to be Parsoid 
being built primarily with VisualEditor in mind.


While we want the DOM to be VE-friendly, we definitely don't want the 
DOM to be VE-centric and that has been the intention from the very 
beginning. Flow, CX also use the Parsoid DOM for their functionality. 
There are other users too [1].


VE, Flow, CX all take advantage of HTML. And I can't make any sense out 
of editProtectedHelper.js 
https://en.wikipedia.org/wiki/User:Jackmcbarn/editProtectedHelper.js :'(


We definitely want Parsoid's output to be useful and usable more 
broadly as the canonical output representation of wikitext and are 
open to fixing whatever prevents that.


As Scott noted in the other email on the thread, inspired (and maybe 
challenged :-) ) by mwparserfromhell's utilities, he has already 
whipped up a layer that provides an easier interface for manipulating 
the DOM.


It is not clear to me how a single DOM serving both view and edit 
modes can avoid redundancy.


You are right that there are some redundancies in information 
representation (because of having to serve multiple needs), but as far 
as I know, it is mostly around image attributes. If there is anything 
else specific (beyond image attributes) that is bothering you, can you 
flag that?


https://www.mediawiki.org/wiki/Parsoid/MediaWiki_DOM_spec#Transclusion_content
All template parameters are in data-mw but not parsed. Parameters ending 
up in the 'final' wikitext are parsed separately.




I see huge demand for alternative wikignome-style editors. The more 
Parsoid's DOM is predictable, concise and documented, the more users 
you get. 


I think Parsoid's DOM is predictable :-) but, can you say more about 
what prompted you to say that? 


For example, to find images I have to search elements whose typeof is 
one of mw:Image, mw:Image/Thumb, mw:Image/Frame, mw:Image/Frameless, 
then see if it's a figure or a span, and expect either a figcaption or 
data-mw accordingly. Add that the img tag's parent can be an a or a span...
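
A minimal sketch of that dance, for the record (lxml over Parsoid HTML
obtained elsewhere; the typeof values are from the DOM spec, everything
else is illustrative):

```
import lxml.html

IMAGE_TYPES = {'mw:Image', 'mw:Image/Thumb',
               'mw:Image/Frame', 'mw:Image/Frameless'}

def find_images(parsoid_html):
    doc = lxml.html.fromstring(parsoid_html)
    for el in doc.iter('figure', 'span'):
        typeofs = set((el.get('typeof') or '').split())
        if typeofs & IMAGE_TYPES:
            img = el.find('.//img')          # parent may be an a or a span
            caption = el.find('figcaption')  # None for inline images: their
                                             # caption lives in data-mw instead
            yield el, img, caption
```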

Instead, this is what I'd expect a proper structure to look like:

Image
.src = title, internal or external link?
.repository?
.page = number or null
.language = code or null
.format = thumb etc.
.caption = wikitext parsed recursively
.link = internal or external link or null
.size
 .original
  .width = 1234
  .height = 4321
 .specified
  .width = 2468
 .computed
  .width = 2468
  .height = 8642

As for documentation, we document the DOM we generate and its 
semantics here [2]. 


It seems that some sections need updates, e.g. noinclude / includeonly / 
onlyinclude 
https://www.mediawiki.org/wiki/Parsoid/MediaWiki_DOM_spec#noinclude_.2F_includeonly_.2F_onlyinclude


As for size, I just looked at the Barack Obama page and here are some 
size numbers.


By concise I meant an antonym for redundant, not lengthy :-)



1540407 /tmp/Barack_Obama.parsoid.html
1197318 /tmp/Barack_Obama.parsoid.no-data-mw.html
1045161 /tmp/Barack_Obama.php-parser.output.footer-stripped.html

Right now, because we inline template and other editable information 
(as inline JSON attributes of the DOM), it is a bit bulky. However, we 
have always had plans to move the data-mw attribute into its own 
bucket which we might at some point in which case the size will be 
closer to the current PHP parser output. If we moved page properties 
and other metadata out, it will shrink it a little bit more.


For views that don't need to support editing or any other manipulation 
or analyses, we can strip more aggressively from the HTML without 
affecting the rendering


Stripping HTML altogether would be a huge step forward. :-)

and get close to or even shrink the size below the PHP parser output 
size (there might be use cases where that might be the appropriate thing 
to do). I could get this down to under 1M by stripping rel attributes, 
element ids, and about ids for identifying template output.


But, for editing (not just in VE) use cases, because of additional 
markup in place on the page (element ids, other markup for 
transclusions, extensions, links, etc.), the output will probably be 
somewhat larger than the corresponding PHP parser output. If we can 
keep it under 1.1x of php parser output size, I think we are good.



I hope we can meet in the middle :-)


Please file bugs and continue to report things that get in the way of 
using Parsoid.


Subbu.

[1] https://www.mediawiki.org/wiki/Parsoid/Users
[2] http://www.mediawiki.org/wiki/Parsoid/MediaWiki_DOM_spec

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] I love Parsoid but it doesn't want me

2015-07-31 Thread Ricordisamoa

Il 24/07/2015 15:53, Subramanya Sastry ha scritto:

On 07/23/2015 01:07 PM, Ricordisamoa wrote:

Il 23/07/2015 15:28, Antoine Musso ha scritto:

Le 23/07/2015 08:15, Ricordisamoa a écrit :

Are there any stable APIs for an application to get a parse tree in
machine-readable format, manipulate it and send the result back 
without

touching HTML?
I'm sorry if this question doesn't make any sense.

You might want to explain what you are trying to do and which wall you
have hit when attempting to use Parsoid :-)



For example, adding a template transclusion as a new parameter in 
another template.

XHTML5+RDFa is the wall :-(
Can't Parsoid's deserialization be caught at some point to get a 
higher-level structure like mwparserfromhell 
https://github.com/earwig/mwparserfromhell's?


Parsoid and mwparserfromhell have different design goals and hence do 
things differently.


Parsoid is meant to support HTML editing and hence provides semantic 
information as annotations over the HTML document. It effectively 
maintains a bidirectional/reversible mapping between segments of 
wikitext and DOM trees. You can manipulate the DOM trees and get back 
wikitext that represents the edited tree. As for useless information 
and duplicate information -- I think if you looked at the Parsoid DOM 
spec [1], you will know what to look for and what to manipulate. The 
information on the DOM is meant to (a) render accurately, (b) support 
the various bots / clients / gadgets that look for specific kinds of 
information, and (c) be editable easily. If that spec has holes or 
needs updates or fixing, we are happy to do that. Do let us know.


mwparserfromhell is an entirely wikitext-centric library as far as I 
can tell. It is meant to manipulate wikitext directly. It is a neat 
library which provides a lot of utilities and makes it easy to do 
wikitext transformations. It doesn't know about or care about HTML 
because it doesn't need to. It also seems to effectively give you 
some kind of wikitext-centric AST. These are all impressions based on 
a quick scan of its docs -- so pardon any misunderstandings.


Parsoid does not provide you a wikitext AST directly since it doesn't 
construct one. All wikitext information shows up indirectly as DOM 
annotations (either attributes or JSON information in attributes). As 
Scott showed, you can still do document (wikitext) manipulations 
using DOM libraries, CSS-style queries, or directly by walking the 
DOM. There are lots of ways you can edit mediawiki pages without 
knowing about wikitext and using the vast array of HTML libraries. 
That happens to be our tagline: we deal with wikitext so you don't 
have to.


But, you are right. It can indeed seem cumbersome if you want to 
directly manipulate wikitext without the DOM getting in between or 
having to deal with DOM libraries. But that is not the use case we 
target. There are a vastly greater number of libraries in all kinds of 
languages (and developers) that know about HTML and can render, 
handle, and manipulate HTML easily than know how to (or want to) 
manipulate wikitext programmatically. Kind of the difference between 
the wikitext editor and the visual editor. They each have their 
constituencies and roles.


All that said, as Scott noted, it is possible to develop a 
mwparserfromhell like layer on top of the Parsoid DOM annotations if 
you want a wikitext-centric view (as opposed to a DOM-centric view 
that most editing clients seem to want). But, since that is not a use 
case that we target, that hasn't been on our radar. If someone does 
want to take that on, and thinks it would be useful, we are happy to 
provide assistance. It should not be too difficult.


Does that help summarize this issue and clarify the differences and 
approaches of these two tools? I am on vacation :-)  so responses 
will be delayed.


Subbu.

[1] http://www.mediawiki.org/wiki/Parsoid/MediaWiki_DOM_spec



Hi Subbu,
thank you for this thoughtful insight.
HTML is not a barrier by itself. The problem seems to be Parsoid being 
built primarily with VisualEditor in mind. It is not clear to me how 
a single DOM serving both view and edit modes can avoid redundancy.
I see huge demand for alternative wikignome-style editors. The more 
Parsoid's DOM is predictable, concise and documented, the more users you 
get. I hope we can meet in the middle :-)


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] I love Parsoid but it doesn't want me

2015-07-24 Thread Ricordisamoa

Il 24/07/2015 15:56, C. Scott Ananian ha scritto:

On Fri, Jul 24, 2015 at 12:34 AM, Ricordisamoa ricordisa...@openmailbox.org

wrote:
Parsoid's expressiveness seems to convey useless information, overlook
important details, or duplicate them in different places.
If I want to resize an image, am I supposed to change data-file-width
and data-file-height? width and height? Or src?


These are great points, and reports from folks like you will help to
improve our documentation.  My goal for Parsoid's DOM[1] is that every bit
of information from the wikitext is represented exactly *once* in the
result.


Be it so!



In your example, `data-file-width` and `data-file-height` represent the
*unscaled* size of the *source* image.  Many image scaling operations want
to know this, so we include it in the DOM.  It is ignored when you convert
back to wikitext.

The `width` and `height` attributes are what you should modify if you want
to resize an image, just like you would do for any naive html editor.
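
Concretely, that kind of resize would be something like this rough sketch
(lxml over already-fetched Parsoid HTML; the 300px target is arbitrary):

```
import lxml.html

def resize_images(parsoid_html, new_width=300):
    doc = lxml.html.fromstring(parsoid_html)
    for img in doc.xpath('//*[contains(@typeof, "mw:Image")]//img'):
        old_w, old_h = int(img.get('width')), int(img.get('height'))
        img.set('width', str(new_width))
        # keep the aspect ratio; data-file-width/-height stay untouched
        img.set('height', str(round(old_h * new_width / old_w)))
    return lxml.html.tostring(doc, encoding='unicode')
```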


AFAICS there's still no way to know exactly how an image's size was 
specified in the original wikitext.




The `src` attribute is again mostly ignored (sigh); the 'resource'
attribute specifies the url of the unscaled image.  Of course if 'resource'
is missing we'll try to make do with `src`; we really try hard to do
something reasonable with whatever we're given.
   --scott

[1] There is a tension between don't repeat yourself and the use of
Parsoid DOM for read views.  Certain attributes (like alt and title)
get duplicated by default by the PHP parser.  So far I think we've been
mostly successful in not letting this sort of thing infect the Parsoid DOM,
but there may be corner cases we accommodate for the sake of ease-of-use for
viewers.




___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] I love Parsoid but it doesn't want me

2015-07-24 Thread Ricordisamoa

Thanks Marko. Replies inline

Il 24/07/2015 15:07, Marko Obrovac ha scritto:

On 24 July 2015 at 07:34, Ricordisamoa ricordisa...@openmailbox.org wrote:


Il 24/07/2015 06:35, C. Scott Ananian ha scritto:


Well, it's really just a different way of thinking about things.  Instead
of:
```
import mwparserfromhell

text = "I has a template! {{foo|bar|baz|eggs=spam}} See it?"
wikicode = mwparserfromhell.parse(text)
templates = wikicode.filter_templates()
```

you would write:
```
js> Parsoid = require('parsoid');
js> text = "I has a template! {{foo|bar|baz|eggs=spam}} See it?";
js> Parsoid.parse(text, { document: true }).then(function(res) {
        templates = res.out.querySelectorAll('[typeof~="mw:Transclusion"]');
        console.log(templates);
    }).done();
```

That said, it wouldn't be hard to clone the API of

http://mwparserfromhell.readthedocs.org/en/latest/api/mwparserfromhell.html


Parsoid's expressiveness seems to convey useless information, overlook
important details, or duplicate them in different places.
If I want to resize an image, am I supposed to change data-file-width
and data-file-height? width and height? Or src?
I think what I'm looking for is sort of an 'enhanced wikitext' rather than
'annotated HTML'.

  and that would probably be a great addition to the parsoid package API.

HTML is just a tree structured data representation.  Think of it as XML if
it makes you happier.  It just happens to come with well-defined semantics
and lots of manipulation libraries.

I don't know about edits tagged as "VisualEditor".  That seems like
something that should only be done by VE.


All edits made via visualeditoredit 
https://www.mediawiki.org/w/api.php?action=help&modules=visualeditoredit
are tagged.

  I take it you would like an easy work flow to

fetch a page, make edits, and then write the new revision back?


Right.


RESTBase could help you there. With one API call, you can get the (stored)
latest HTML revision of a page in Parsoid format [1], but without the need
to wait for Parsoid to parse it (if the latest revision is in RESTBase's
storage).


What if it isn't?


There is also section API support (you can get individual HTML
fragments of a page by ID, and send only those back for transformation into
wikitext [2]). There is also support for page editing (aka saving), but
these endpoints have not yet been enabled for WMF wikis in production due
to security concerns.


Then I guess HTML would have to be converted into wikitext before 
saving? +1 API call
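
If I read the docs right, the full round trip is something like this sketch
(requests against the endpoints below; the title is assumed already
URL-encoded, and the save still has to go through the action API since the
RESTBase save endpoints aren't enabled):

```
import requests

BASE = 'https://en.wikipedia.org/api/rest_v1'

def roundtrip(title):
    # call 1: stored Parsoid HTML of the latest revision
    html = requests.get('{}/page/html/{}'.format(BASE, title)).text
    html = html.replace('Foo', 'Bar')  # stand-in for a real DOM edit
    # call 2: transform the edited HTML back into wikitext
    r = requests.post('{}/transform/html/to/wikitext/{}'.format(BASE, title),
                      data={'html': html})
    return r.text  # wikitext, to be saved via the action API (call 3)
```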




[1]
https://en.wikipedia.org/api/rest_v1/?doc#!/Page_content/page_html__title__get
[2]
https://en.wikipedia.org/api/rest_v1/?doc#!/Transforms/transform_sections_to_wikitext__title___revision__post

Cheers,
Marko




mwparserfromhell doesn't actually seem to have that functionality
It is actually pretty easy to do with Pywikibot.
But since Parsoid happens to work server-side, it makes sense to request
and send back the structured tree directly.

  , but it

would also be nice to facilitate that use case if we can.
--scott

​
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Thanks for your time.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l







___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] I love Parsoid but it doesn't want me

2015-07-24 Thread Ricordisamoa

Il 24/07/2015 17:18, James Forrester ha scritto:

On 23 July 2015 at 22:34, Ricordisamoa ricordisa...@openmailbox.org wrote:


Il 24/07/2015 06:35, C. Scott Ananian ha scritto:


I don't know about edits tagged as "VisualEditor".  That seems like
something that should only be done by VE.
All edits made via visualeditoredit 
https://www.mediawiki.org/w/api.php?action=help&modules=visualeditoredit
are tagged.


Yes. That's because that is the *private* API for VisualEditor. It
absolutely should not ever be used by anyone else. It's not like any of
the 'real' APIs in MediaWiki – it is designed for exactly one use case
(VisualEditor), makes huge assumptions about the world and what is needed
(like tagging edits), and we make breaking changes all the time.
Unfortunately, the request to badge internal APIs got turned into flagging
it and similar APIs in MediaWiki as "This module is internal or unstable.",
which isn't strong enough on just how bad an idea it is to use it. I would
extremely strongly suggest that you do not use it, ever.


Oops. https://test.wikipedia.org/w/index.php?title=Tablez&action=history



As Marko, Subbu and Scott point out, we have actual public APIs for this
kind of stuff, in the forms of RESTbase and Parsoid, and that's what you
should use.

Yours,



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] I love Parsoid but it doesn't want me

2015-07-23 Thread Ricordisamoa

Il 24/07/2015 06:35, C. Scott Ananian ha scritto:

Well, it's really just a different way of thinking about things.  Instead
of:
```
import mwparserfromhell

text = "I has a template! {{foo|bar|baz|eggs=spam}} See it?"
wikicode = mwparserfromhell.parse(text)
templates = wikicode.filter_templates()
```

you would write:
```
js> Parsoid = require('parsoid');
js> text = "I has a template! {{foo|bar|baz|eggs=spam}} See it?";
js> Parsoid.parse(text, { document: true }).then(function(res) {
        templates = res.out.querySelectorAll('[typeof~="mw:Transclusion"]');
        console.log(templates);
    }).done();
```

That said, it wouldn't be hard to clone the API of
http://mwparserfromhell.readthedocs.org/en/latest/api/mwparserfromhell.html


Parsoid's expressiveness seems to convey useless information, overlook 
important details, or duplicate them in different places.
If I want to resize an image, am I supposed to change data-file-width 
and data-file-height? width and height? Or src?
I think what I'm looking for is sort of an 'enhanced wikitext' rather 
than 'annotated HTML'.



and that would probably be a great addition to the parsoid package API.

HTML is just a tree structured data representation.  Think of it as XML if
it makes you happier.  It just happens to come with well-defined semantics
and lots of manipulation libraries.

I don't know about edits tagged as "VisualEditor".  That seems like
something that should only be done by VE.


All edits made via visualeditoredit 
https://www.mediawiki.org/w/api.php?action=help&modules=visualeditoredit
are tagged.



I take it you would like an easy work flow to
fetch a page, make edits, and then write the new revision back?


Right.


  mwparserfromhell doesn't actually seem to have that functionality


It is actually pretty easy to do with Pywikibot.
But since Parsoid happens to work server-side, it makes sense to request 
and send back the structured tree directly.
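
For comparison, the whole fetch-edit-save loop with Pywikibot plus
mwparserfromhell is only a few lines (the page title, template name and
summary are invented for illustration):

```
import mwparserfromhell
import pywikibot

site = pywikibot.Site('en', 'wikipedia')
page = pywikibot.Page(site, 'Project:Sandbox')
wikicode = mwparserfromhell.parse(page.text)
for template in wikicode.filter_templates():
    if template.name.matches('foo'):
        template.add('eggs', 'spam')  # add or overwrite a parameter
page.text = str(wikicode)
page.save(summary='demo: set eggs=spam in {{foo}}')
```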



, but it
would also be nice to facilitate that use case if we can.
   --scott

​
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Thanks for your time.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Brad Jorsch - Senior Software Engineer

2015-07-23 Thread Ricordisamoa

Hooray! Looking forward to his next evolution :-)

Il 09/07/2015 19:38, Bryan Davis ha scritto:

I'm pleased to announce that Brad Jorsch has been promoted to Senior
Software Engineer.

Brad joined the WMF in October of 2012 as a member of the MediaWiki
Core team under RobLa [0]. Prior to joining the WMF, Brad was a strong
contributor in a volunteer capacity both on-wiki and via code
contributions. He has community earned admin rights on enwiki and his
bots (AnomieBOT, AnomieBOT II, MediationBot, MedcabBot, ...) have made
over 1.8 million edits on project wikis [1]. His community
contributions led directly to his recruitment and hiring following the
2012 Berlin Hackathon.

During his tenure at the WMF, Brad has become the de-facto owner of
the Scribunto extension and the primary maintainer of the MediaWiki
web API (api.php). During the 2014-2015 fiscal year, Brad has focused
primarily on the MediaWiki web API (api.php). This largely solo
project has included writing an RfC [2] and working on implementation
not only in MediaWiki core but also across many affected extensions.
Recent improvements have included JSON format updates [3] and the
simplification of continuation mode processing for clients [4].

I look forward to working with Brad on the Wikimedia Reading
Infrastructure team to continue stewardship of the MediaWiki web API
[5] and expect to see continued excellence from his MediaWiki and
Wikimedia contributions.

[0]: https://lists.wikimedia.org/pipermail/wikitech-l/2012-October/064120.html
[1]: https://tools.wmflabs.org/guc/?user=AnomieBOT
[2]: https://www.mediawiki.org/wiki/Requests_for_comment/API_roadmap
[3]: 
https://lists.wikimedia.org/pipermail/mediawiki-api-announce/2015-May/82.html
[4]: 
https://lists.wikimedia.org/pipermail/mediawiki-api-announce/2015-June/84.html
[5]: https://lists.wikimedia.org/pipermail/wikitech-l/2015-May/081648.html

Bryan



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] I love Parsoid but it doesn't want me

2015-07-23 Thread Ricordisamoa
The slides are interesting, but for now it seems VisualEditor-focused 
and not nearly as powerful as mwparserfromhell.

I don't care about presentation. I don't want HTML.
And I hate getting all edits tagged as VisualEditor.

Il 23/07/2015 22:02, C. Scott Ananian ha scritto:

HTML5+RDFa is a machine-readable format.  But I think what you are asking
for is either better documentation of the template-related stuff (did you
read through the slides in https://phabricator.wikimedia.org/T105175 ?) or
HTML template parameter support (https://phabricator.wikimedia.org/T52587)
which is in the codebase but not enabled by default in production.
  --scott
​
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] I love Parsoid but it doesn't want me

2015-07-23 Thread Ricordisamoa
Are there any stable APIs for an application to get a parse tree in 
machine-readable format, manipulate it and send the result back without 
touching HTML?

I'm sorry if this question doesn't make any sense.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Wikipedia iOS app moving to GH

2015-07-23 Thread Ricordisamoa
The even shorter answer is: you can't amend other people's pull requests 
without being explicitly allowed to.


Il 23/07/2015 11:57, Brian Gerstle ha scritto:

The short answer is: yes. GitHub doesn't have the "patch" as a concept,
only commits, branches, and forks. We only plan on encountering forks when
merging volunteer contributions. Regardless of whether it's a fork, your
ability to push to a branch comes down to whether you're a collaborator
for that repo.

On Wednesday, July 22, 2015, Ricordisamoa ricordisa...@openmailbox.org
wrote:


Il 22/07/2015 23:43, Brian Gerstle ha scritto:


This isn't really about Gerrit vs. GitHub. To be clear, we're mainly doing
this for CI (i.e. Travis).

That said, we (the iOS team) plan for our workflow to play to GitHub's
strengths—which also happen to be our personal preferences.  In short,
this
means amending patches becomes pushing another commit onto a branch.
We've run into issues w/ rebasing & amending patches destroying our diff
in
Gerrit, and problems with multiple people collaborating on the same patch.


With GitHub it is not possible to amend other people's patches, is it?

  We think GitHub will not only provide integrations for free CI, but, as an

added bonus, also resolve some of the workflow deficiencies that we've
personally encountered with Gerrit.


On Wed, Jul 22, 2015 at 5:14 PM, Gergo Tisza gti...@wikimedia.org
wrote:

  On Wed, Jul 22, 2015 at 4:39 AM, Petr Bena benap...@gmail.com wrote:

Good job, you aren't the only one. The Huggle team has been using it for
quite some time. To be honest I still feel that GitHub is far superior to
our Gerrit installation and don't really understand why we don't use
it for other projects too.

  GitHub is focused on small projects; for a project with lots of patches

and committers it is problematic in many ways:
* poor repository management (fun fact: GitHub does not even log force
pushes, much less provides any ability to undo them)
* noisy commit histories due to poor support of amend-based workflows, and
also because of the poor message generation of the editing interface (Linus
wrote
a famous rant
https://github.com/torvalds/linux/pull/17#issuecomment-5654674 on
that)
* no way to mark patches which depend on each other
* diff view works poorly for large patches
* CR interface works poorly for large patches (no way to write draft
comments so you need to do two passes; discussions can be marked as
obsolete by unrelated code changes in their vicinity)
* hard to keep track of cherry-picks


___
Mobile-l mailing list
mobil...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mobile-l




___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l






___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Wikipedia iOS app moving to GH

2015-07-23 Thread Ricordisamoa

Il 22/07/2015 12:40, Brian Gerstle ha scritto:

Hey everyone,

I'm writing with plans for the Wikimedia iOS engineering team to move its
workflow to GitHub with Travis CI, much like RESTbase.

The Wikimedia iOS engineers have been maintaining their own CI and build
server and using Gerrit for code review. The more time efficient and
commonplace approach for open source iOS software development leans heavily
on GitHub with Travis CI instead (e.g., WordPress[0][1] and Firefox[2][3]).
By using GitHub with Travis CI, the team believes it will work faster,
improve testing, grow developer confidence in making code changes, and,
most importantly, deploy fewer bugs to production.


By the way, the Pywikibot team has been using Gerrit & Travis CI 
https://travis-ci.org/wikimedia/pywikibot-core for quite some time.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] I love Parsoid but it doesn't want me

2015-07-23 Thread Ricordisamoa

Il 23/07/2015 15:28, Antoine Musso ha scritto:

Le 23/07/2015 08:15, Ricordisamoa a écrit :

Are there any stable APIs for an application to get a parse tree in
machine-readable format, manipulate it and send the result back without
touching HTML?
I'm sorry if this question doesn't make any sense.

You might want to explain what you are trying to do and which wall you
have hit when attempting to use Parsoid :-)



For example, adding a template transclusion as a new parameter in another 
template.

XHTML5+RDFa is the wall :-(
Can't Parsoid's deserialization be caught at some point to get a 
higher-level structure like mwparserfromhell 
https://github.com/earwig/mwparserfromhell's?
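
For what it's worth, the specific edit I mentioned (a transclusion as a new
parameter of another template) is doable by rewriting data-mw, as in this
sketch; the shape follows the DOM spec's transclusion section, while the
template and parameter names are invented:

```
import json
import lxml.html

def add_nested_param(parsoid_html):
    doc = lxml.html.fromstring(parsoid_html)
    for el in doc.xpath('//*[contains(@typeof, "mw:Transclusion")]'):
        raw = el.get('data-mw')
        if not raw:
            continue
        data_mw = json.loads(raw)
        for part in data_mw.get('parts', []):
            tpl = part.get('template') if isinstance(part, dict) else None
            if tpl and tpl['target']['wt'].strip() == 'Infobox person':
                # the new parameter's value is itself a transclusion,
                # written as wikitext
                tpl['params']['footer'] = {'wt': '{{Navbox|state=collapsed}}'}
        el.set('data-mw', json.dumps(data_mw))
    return lxml.html.tostring(doc, encoding='unicode')
```

That works, but it rather proves the point: it doesn't feel like editing
wikitext.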

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Wikipedia iOS app moving to GH

2015-07-22 Thread Ricordisamoa

 * CR fragmentation
 * CI fragmentation
 * more reliance on proprietary software


Il 22/07/2015 12:40, Brian Gerstle ha scritto:

Hey everyone,

I'm writing with plans for the Wikimedia iOS engineering team to move its
workflow to GitHub with Travis CI, much like RESTbase.

The Wikimedia iOS engineers have been maintaining their own CI and build
server and using Gerrit for code review. The more time efficient and
commonplace approach for open source iOS software development leans heavily
on GitHub with Travis CI instead (e.g., WordPress[0][1] and Firefox[2][3]).
By using GitHub with Travis CI, the team believes it will work faster,
improve testing, grow developer confidence in making code changes, and,
most importantly, deploy fewer bugs to production.

Builds requiring sensitive information (e.g., prod certs) will
continue to run on WMF's Mac Mini. As is done for Android, when betas are
pushed, the team will notify mobile-l.

Feel free to reply or email me directly with any questions or comments.

Regards,

Brian

0: https://github.com/wordpress-mobile/WordPress-iOS
1: https://travis-ci.org/wordpress-mobile/WordPress-iOS
2: https://github.com/mozilla/firefox-ios
3: https://travis-ci.org/mozilla/firefox-ios



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [WikimediaMobile] Wikipedia iOS app moving to GH

2015-07-22 Thread Ricordisamoa

Il 22/07/2015 23:43, Brian Gerstle ha scritto:

This isn't really about Gerrit vs. GitHub. To be clear, we're mainly doing
this for CI (i.e. Travis).

That said, we (the iOS team) plan for our workflow to play to GitHub's
strengths—which also happen to be our personal preferences.  In short, this
means amending patches becomes pushing another commit onto a branch.
We've run into issues w/ rebasing & amending patches destroying our diff in
Gerrit, and problems with multiple people collaborating on the same patch.


With GitHub it is not possible to amend other people's patches, is it?


We think GitHub will not only provide integrations for free CI, but, as an
added bonus, also resolve some of the workflow deficiencies that we've
personally encountered with Gerrit.


On Wed, Jul 22, 2015 at 5:14 PM, Gergo Tisza gti...@wikimedia.org wrote:


On Wed, Jul 22, 2015 at 4:39 AM, Petr Bena benap...@gmail.com wrote:


Good job, you aren't the only one. The Huggle team has been using it for
quite some time. To be honest I still feel that GitHub is far superior to
our Gerrit installation and don't really understand why we don't use
it for other projects too.


GitHub is focused on small projects; for a project with lots of patches
and committers it is problematic in many ways:
* poor repository management (fun fact: GitHub does not even log force
pushes, much less provides any ability to undo them)
* noisy commit histories due to poor support of amend-based workflows, and
also because of the poor message generation of the editing interface (Linus wrote
a famous rant
https://github.com/torvalds/linux/pull/17#issuecomment-5654674 on that)
* no way to mark patches which depend on each other
* diff view works poorly for large patches
* CR interface works poorly for large patches (no way to write draft
comments so you need to do two passes; discussions can be marked as
obsolete by unrelated code changes in their vicinity)
* hard to keep track of cherry-picks


___
Mobile-l mailing list
mobil...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mobile-l







___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Phabricator filter by language

2015-07-17 Thread Ricordisamoa
You can already infer some languages from the project: Pywikibot → 
Python, Hierator → Java etc.
And nearly every other one would then have language-php. But for C++ it 
might still make sense.


Il 17/07/2015 10:04, Petr Bena ha scritto:

Hi,

What if we added extra projects to phabricator for programming
languages (such as language-php, language-c) which could be optionally
added to some tickets if help from people who know these languages is
needed, so that it would be possible, for example, for C++ experts to
filter out open tasks that need a C++ expert to look into them, and so on?

Currently I have a few such tasks that I would like to have experts
in some language to look at, but there isn't really an easy way to do
that.

What you think? Should we add these meta-projects?

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Yandex?

2015-07-02 Thread Ricordisamoa

Il 02/07/2015 21:55, Legoktm ha scritto:

I am also interested in the answer to Nemo's question about whether this
is the first piece of proprietary software ever entering use in the
Wikimedia projects land?


Also Qualtrics 
https://it.wikisource.org/wiki/Wikisource:Bar/Archivio/2013.10#Sondaggio_su_Wikisource




-- Legoktm

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Yandex?

2015-07-01 Thread Ricordisamoa

Il 02/07/2015 03:28, Legoktm ha scritto:

On 07/01/2015 11:29 AM, Grace Gellerman wrote:

https://www.mediawiki.org/wiki/Scrum_of_scrums/2015-07-01

I noticed: Yandex coming up soon! under ContentTranslation. Are there
more details about what this means?

-- Legoktm


https://phabricator.wikimedia.org/T89844 I think

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Freedom of Panorama banner campaign breaking mobile

2015-06-30 Thread Ricordisamoa

If the aim is to grab attention, then it's fulfilled.

Il 30/06/2015 23:49, Jon Robson ha scritto:

I noticed a banner on the mobile site that renders the site unusable:
http://imgur.com/qVGz3mZ

I'm not sure who is responsible for Freedom of Panorama in Europe in
2015 but can someone disable this on mobile asap or make it work on
mobile?

Please also reach out to us on the mobile-l mailing list ahead of
running these campaigns if you are unsure how to test campaigns, we're
happy to help.

Jon

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Upcoming SyntaxHighlight_GeSHi changes

2015-06-29 Thread Ricordisamoa

Il 29/06/2015 20:01, Brad Jorsch (Anomie) ha scritto:

On Mon, Jun 22, 2015 at 8:48 PM, Ori Livneh o...@wikimedia.org wrote:


Over the course of the next two days, a major update to the
SyntaxHighlight_GeSHi extension will be rolled out to Wikimedia wikis. The
change swaps geshi, the unmaintained PHP library which performs the lexical
analysis and output formatting of code, for another library, called
Pygments.
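
(For context: Pygments is a Python library. Used standalone it looks like
the sketch below; this is just an illustration of the library, not of the
extension's integration, which as far as I know shells out to the
pygmentize binary.)

```
from pygments import highlight
from pygments.lexers import PhpLexer
from pygments.formatters import HtmlFormatter

# turn a snippet of PHP into highlighted HTML, roughly the job the
# extension now hands to Pygments for a <syntaxhighlight> block
print(highlight("<?php echo 'Hello';", PhpLexer(), HtmlFormatter()))
```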


... Please tell me we're not really going to have the final state here be
an extension named SyntaxHighlight_GeSHi that doesn't use GeSHi anymore.




Here we are.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Interwiki gone from Recentchanges?

2015-06-27 Thread Ricordisamoa

Il 27/06/2015 13:14, Bartosz Dziewoński ha scritto:
On Sat, 27 Jun 2015 10:03:32 +0200, Ole Palnatoke Andersen 
palnat...@gmail.com wrote:



On https://da.wikipedia.org/wiki/Speciel:Seneste_%C3%A6ndringer and
https://sv.wikipedia.org/wiki/Special:Senaste_%C3%A4ndringar, there is
no interwiki. Has this bug been reported?


Possibly https://phabricator.wikimedia.org/T102888 ?



Yes, it's definitely that.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Data and Developer Hub protoype

2015-06-26 Thread Ricordisamoa

Il 26/06/2015 01:09, S Page ha scritto:

Our competition for developer mindshare is sites like
https://developers.google.com/ .


"Act fast when you see a new opportunity and monetize your web content 
with just a small snippet of JavaScript."

vs
"Imagine a world in which every single human being can freely share in 
the sum of all knowledge. That's our commitment."


Competition?

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Upcoming SyntaxHighlight_GeSHi changes

2015-06-23 Thread Ricordisamoa

Il 23/06/2015 07:30, John Mark Vandenberg ha scritto:

Since it is short, here is the full list of languages being de-supported.

6502acme
68000devpac
algol68
arm
avisynth
bibtex
bnf
cil
dot
e
email
euphoria
gml
ldif
lolcode


No!!! I need LOLCODE!!!


mirc
mmix
mpasm
oz
parigp
pcre
pic16
pli
q
robots
sas
teraterm
typoscript
unicon
whois
xpp


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] API BREAKING CHANGE: Default continuation mode for action=query will change at the end of this month

2015-06-13 Thread Ricordisamoa

Il 09/06/2015 11:32, Steinsplitter Wiki ha scritto:

Maybe someone with enough time and knowledge can fork compat and keep it 
alive...



Everyone is free to fork it, of course, but in this case it'd only bring 
more fragmentation to the ecosystem.

As John said, patches are always welcome.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Selecting a preview slide for a PDF

2015-06-12 Thread Ricordisamoa

Just add a page parameter to the file syntax, e.g.
[[File:Example.pdf|thumb|page=3|caption]]:
https://www.mediawiki.org/wiki/Extension:PdfHandler#Usage

This feature is actually quite old:
https://www.mediawiki.org/wiki/Special:Code/MediaWiki/25575

Il 12/06/2015 08:56, Pine W ha scritto:

When creating thumbnails I've been successful at selecting the particular
frame of a video to be shown as the thumbnail. Is there a way to do
something similar, to show a particular page of a PDF as the thumbnail in
an article?

Thanks,

Pine
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Please help a patch not to become two years old

2015-06-08 Thread Ricordisamoa

Discover why you should never merge these 10 patches...

Il 09/06/2015 00:18, Stephen Niedzielski ha scritto:

Devs hate him!

and

Congratulations! You have been selected to win a free iPatch.

On Mon, Jun 8, 2015 at 4:11 PM, Vi to vituzzu.w...@gmail.com wrote:


There's a weird loophole in devs' brain which will make any dev want to
merge your patches NOW!

Vito

2015-06-08 23:44 GMT+02:00 Brian Wolff bawo...@gmail.com:


On 6/8/15, Andre Klapper aklap...@wikimedia.org wrote:

On Sat, 2015-06-06 at 21:40 +0200, Ricordisamoa wrote:

https://gerrit.wikimedia.org/r/67588 needs love.
Less than 2 days left! Thanks in advance.

For future reference, adding basic context (e.g.: Scribunto here) is
very welcome if you would also like to reach people who normally do not
click links named "50 things that will make you say Oh!" or "You cannot
imagine what will happen at the end of this video".  :)

Thanks,
andre (obviously surprise-averse)
--
Andre Klapper | Wikimedia Bugwrangler
http://blogs.gnome.org/aklapper/


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

You won't believe this one weird trick to get your patch reviewed in
under 2 years.

--
bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Please help a patch not to become two years old

2015-06-08 Thread Ricordisamoa

Il 08/06/2015 12:56, Andre Klapper ha scritto:

On Sat, 2015-06-06 at 21:40 +0200, Ricordisamoa wrote:

https://gerrit.wikimedia.org/r/67588 needs love.
Less than 2 days left! Thanks in advance.

For future reference, adding basic context (e.g.: Scribunto here) is
very welcome if you would also like to reach people who normally do not
click links named "50 things that will make you say Oh!" or "You cannot
imagine what will happen at the end of this video".  :)

Thanks,
andre (obviously surprise-averse)


Thanks for the suggestion!
And the two years are over... see you in 2016 :-(

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Stalled task: page status indicator sorting

2015-06-08 Thread Ricordisamoa

Il 08/06/2015 14:08, Erwin Dokter ha scritto:

I would like some more eyes on https://phabricator.wikimedia.org/T94307.

Since their inception, page status indicators have had one major drawback: 
automatic sorting cannot be disabled.


It purports to give placement control as it sorts by indicator name, 
but this proves difficult, as none of the templates invoking indicators 
allow passing the name.


That depends on local wikis, not indicators.


And even then, order is not guaranteed due to the sorting algorithm used.

So while, in theory, it would make more sense to let sorting be 
governed by local consensus, we are now stuck with a solution that 
essentially does not allow any order control at all.


I submitted a patch that disables sorting, but the task and patch are 
stalled. So please, more voices and solutions on how we can solve this.


Regards,


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Please help a patch not to become two years old

2015-06-06 Thread Ricordisamoa

Yes, I'm an egoist.

Il 06/06/2015 22:10, Alex Monk ha scritto:

That's actually quite far down the list:
https://www.mediawiki.org/wiki/Gerrit/Reports/Oldest_open_changesets

On 6 June 2015 at 20:40, Ricordisamoa ricordisa...@openmailbox.org wrote:


https://gerrit.wikimedia.org/r/67588 needs love.
Less than 2 days left! Thanks in advance.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Please help a patch not to become two years old

2015-06-06 Thread Ricordisamoa

https://gerrit.wikimedia.org/r/67588 needs love.
Less than 2 days left! Thanks in advance.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Use Supercounter as an API

2015-06-01 Thread Ricordisamoa

Hi Nasir,
apparently SuperCount's author is not new to preventing others from 
accessing his code:


 * https://github.com/x-Tools/SuperCount/issues/31
 * https://meta.wikimedia.org/wiki/?diff=5767488&oldid=5766389

He may have counter-arguments though. Try asking him!

Il 01/06/2015 07:18, Nasir Khan ha scritto:

Hi Ricordisamoa,
the API is working fine.

can you tell me how I can get access to the code of the super counter?
I found a GitHub repo but there is no code there:
https://github.com/x-Tools/SuperCount


--
*Nasir Khan Saikat*
www.nasirkhn.com


On Sun, May 31, 2015 at 12:29 AM, Ricordisamoa ricordisa...@openmailbox.org

wrote:
I saw there's an API at https://tools.wmflabs.org/supercount/api.php


Il 30/05/2015 19:48, Nasir Khan ha scritto:


Hi,
I want to build a tool which will generate statistics for my native
wiki. Before starting to build a tool I was searching for an existing one.
The Super Counter[1] has some of the features I want to implement in my
tool.

So is there any way to use the Super Counter as an API? If so, can anyone
please point me to the documentation?


[1] - https://tools.wmflabs.org/supercount/index.php

--
*Nasir Khan Saikat*
www.nasirkhn.com
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Use Supercounter as an API

2015-05-30 Thread Ricordisamoa

I saw there's an API at https://tools.wmflabs.org/supercount/api.php

Il 30/05/2015 19:48, Nasir Khan ha scritto:

Hi,
I want to build a tool which will generate statistics for my native
wiki. Before starting to build a tool I was searching for an existing one.
The Super Counter[1] has some of the features I want to implement in my
tool.

So is there any way to use the Super Counter as an API? If so, can anyone
please point me to the documentation?


[1] - https://tools.wmflabs.org/supercount/index.php

--
*Nasir Khan Saikat*
www.nasirkhn.com
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Simplifying the WMF deployment cadence

2015-05-27 Thread Ricordisamoa

Il 27/05/2015 22:19, Greg Grossmeier ha scritto:

Hi all,

Starting the week of June 8th we'll be transitioning our MediaWiki +
Extensions deployment cadence to a shorter/simpler one. This will begin
with 1.26wmf9.

New cadence:
Tuesday: New branch cut, deployed to test wikis
Wednesday: deployed to non-wikipedias
Thursday: deployed to Wikipedias

This is not only a lot simpler to understand (wait, we deploy twice on
Wednesday?) but it also shortens the time to get code to everyone (2 or
3 days from branch cut, depending on how you count).


Two days... this is awesome.



== Transition ==
Transitions from one cadence to another are hard. Here's how we'll be
doing this transition:

Week of June 1st (next week):
* We'll complete the wmf8 rollout on June 3rd
* However, we won't be cutting wmf9 on June 3rd

Week of June 8th (in two weeks):
* We'll begin the new cadence with wmf9 on Tuesday June 9th


I hope this helps our users and developers get great new features and
fixes faster.

Greg

endnotes:
* The task: https://phabricator.wikimedia.org/T97553
* I'll be updating the relevant documentation before the transition




___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] MediaWiki, as seen on TV

2015-05-25 Thread Ricordisamoa

It's a well-known fact that MediaWiki can blow up printers.

Il 24/05/2015 22:06, Steven Walling ha scritto:

There's a pretty hilarious American police procedural TV show in 2015
called CSI: Cyber, featuring mostly cybercrime. Obviously they have to
dredge up snippets of code from places for screenshots on the show.

Episode 4 happened to include a tidbit from MediaWiki 1.25/wmf3. Supposedly
the code was a hack to make your printer blow up.

Original lulz and screenshots via
http://moviecode.tumblr.com/post/114815574587/this-is-from-csi-cyber-s01e04-according-the-the
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] First impression with Differential

2015-05-22 Thread Ricordisamoa

Il 21/05/2015 10:43, Quim Gil ha scritto:

Hi, thank you for this short and fresh review. Your help is welcome at
https://phabricator.wikimedia.org/T597, where we are trying to identify
blockers for using Arcanist, so we can discuss them and address them
effectively.

Meanwhile, some comments here.

On Thu, May 21, 2015 at 9:01 AM, Ricordisamoaricordisa...@openmailbox.org
wrote:


review
rant
Arcanist has to be manually cloned from Git and added to $PATH. Really?


Having seen how users struggle installing git-review and dependencies in
their environments, I'm not sure this is a bad idea. Plus, I guess it makes
updating to master pretty easy as well?


It's not hard at all. It just seems hackish and half-baked.




Test Plan is required.


Sounds like a good practice to me. Worst case scenario, type "I didn't test
this patch at all".


I believe encouraging and enforcing good practices belongs to people, 
not computer programs.
Moreover, this field could easily scare off new contributors who aren't 
used to such practices yet.





.arcconfig should be automatically detected on git clone.
I can't review my own revisions.


Neither should you; that is the point of code review. Then again, if there
is no workaround for this, it might be a blocker for urgent Ops deployments
(where we see many self-merged patches) and one-person projects. If this is
the case, please create a blocker for T597 so we can discuss it in detail.


Again, I don't see why software should ever intrude on community processes.
For example, I've never merged a patch of mine into Pywikibot, but I 
reserve the right to do so in exceptional cases.





Lint and Unit are shown as completely different processes.
Diffs all over the page clutter the UI.
No powerful plain-text Gerrit-like queries.
I have to click Edit Revision to add reviewers.
No -2/-1/+1/+2. WTF?


See Tokens! below. :) Discuss: https://phabricator.wikimedia.org/T138



/rant
yay
Tokens!
Comment preview!
Can paste raw diffs!
/yay
summary
Some nice features aren't worth a change of workflow.
/summary
/review


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Publish your source!

2015-05-21 Thread Ricordisamoa

It was meant to sound playful. Please do not take it seriously!

Il 20/05/2015 23:57, Ricordisamoa ha scritto:
219 tools https://tools.wmflabs.org/hay/directory/ and only 146 with 
source available 
https://tools.wmflabs.org/hay/directory/#/keyword/source%20available?

WTF? Publish your f***ing source! ;-)
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Publish your source!

2015-05-21 Thread Ricordisamoa

Il 21/05/2015 00:32, Jacek Wielemborek ha scritto:

W dniu 20.05.2015 o 23:57, Ricordisamoa pisze:

219 tools https://tools.wmflabs.org/hay/directory/ and only 146 with
source available
https://tools.wmflabs.org/hay/directory/#/keyword/source%20available?
WTF? Publish your f***ing source! ;-)

Here's my answer: I'm too lazy to write the json and publish it on my
webserver. Should you want to add it somewhere, code is here:
https://github.com/d33tah/wikispy



Thanks for your kindness.
However, I was specifically referring to tools with an already published 
toolinfo but with no source specified.


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] First impression with Differential

2015-05-21 Thread Ricordisamoa

review
rant
Arcanist has to be manually cloned from Git and added to $PATH. Really?
Test Plan is required.
.arcconfig should be automatically detected on git clone.
I can't review my own revisions.
Lint and Unit are shown as completely different processes.
Diffs all over the page clutter the UI.
No powerful plain-text Gerrit-like queries.
I have to click Edit Revision to add reviewers.
No -2/-1/+1/+2. WTF?
/rant
yay
Tokens!
Comment preview!
Can paste raw diffs!
/yay
summary
Some nice features aren't worth a change of workflow.
/summary
/review
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Publish your source!

2015-05-20 Thread Ricordisamoa
219 tools https://tools.wmflabs.org/hay/directory/ and only 146 with 
source available 
https://tools.wmflabs.org/hay/directory/#/keyword/source%20available?

WTF? Publish your f***ing source! ;-)
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Tech Talk: Graphs! Visualize maps and data graphs live on Wikipedia - May 14

2015-05-14 Thread Ricordisamoa

Il 13/05/2015 02:53, Rachel Farrand ha scritto:

Please join us for the following tech talk:

*Tech Talk**:* Graphs! Visualize maps and data graphs live on Wikipedia
*Presenter:* Yuri Astrakhan and Dan Andreescu
*Date:* May 14th
*Time:* 2100 UTC
http://www.timeanddate.com/worldclock/fixedtime.html?msg=Tech+Talk%3A+Graphs&iso=20150514T21&p1=3915&ah=1


This link shows 21:00 UTC-1. I guess the correct one is 
http://www.timeanddate.com/worldclock/fixedtime.html?msg=Tech+Talk%3A+Graphs&iso=20150514T21, 
isn't it?



Link to live YouTube stream http://www.youtube.com/watch?v=j7DTn9jHnI0
*IRC channel for questions/discussion:* #wikimedia-office
Google+ page
https://plus.google.com/u/0/b/103470172168784626509/events/cjm55bm8ifohmbpubdbvnvlb1n4,
another
place for questions

*Talk description: *Thanks to many great contributors, we are proud to
present the Graphs https://www.mediawiki.org/wiki/Extension:Graph/Demo...
because wiki pages with SVG and PNG images are so last century. Graph
extension https://www.mediawiki.org/wiki/Extension:Graph allows content
authors to insert a data-defined graph or a map in a wiki page. Graph is
described using Vega visualization grammar http://trifacta.github.io/vega/
  ( demo http://trifacta.github.io/vega/editor/), and allows very complex
data transformations, filtering, and soon even animation & interactivity.
Combine that with the power of wiki template parameters and Lua scripting,
and the results could be stellar. Up to you really. Vega+d3 gives us a huge
list of charting options and maps with numerous projections and ability to
highlight individual regions. Lastly, graphs could be rendered either in a
browser (more interactivity), or on the server (faster load). Demo will be
served.

Special thanks goes to milimetric, krinkle, Brion, and gwicke, without
whose help this project would have been a lot harder, and to Vega and other
open source teams who build such great libraries.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Please welcome Jaime Crespo

2015-05-14 Thread Ricordisamoa

Il 14/05/2015 14:46, Mark Bergsma ha scritto:

Hi all,

I'm very pleased to announce that we've recently hired Jaime Crespo as
Sr. Database Administrator. Jaime has joined the Technical Operations
team to strengthen our DBA capacity. He will be working closely with
Sean, and will share responsibility for our production database
infrastructure, the Wikimedia Labs replicas and the Analytics/research
databases. His addition to the team will also allow us to support our
developers better with code review and advice about database queries
and schema tuning.

Before he joined us Jaime has been a MySQL/MariaDB DBA consultant,
both at Percona and later as an independent contractor. In that role
he has supported many database environments, large and small. Being a
fan of the free software and open data movements, Jaime is excited to
be employing his experience in such an environment.

Jaime lives in the Zaragoza area in Spain, and will be working with us
remotely from home. Outside of work, he is an active contributor to
the Spanish Wikipedia and the OpenStreetMap projects as well. His
other hobbies include photography, cycling, astronomy, reading and
acting in theater.

Jaime can be found on IRC under the nickname 'jynus'.

Please join me in welcoming him!



Yes, TechOps needs more manpower!

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Tracking Antifeature?

2015-05-14 Thread Ricordisamoa

FYI
https://f-droid.org/wiki/index.php?title=org.wikipedia&diff=54674
https://f-droid.org/wiki/page/Antifeature:Tracking

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] new extension

2015-04-18 Thread Ricordisamoa

For the record: this issue was discussed yesterday on #wikimedia-dev.
Logs are available here 
http://bots.wmflabs.org/%7Ewm-bot/logs/%23wikimedia-dev/20150417.txt 
(starting at 22:48:15).


Il 26/02/2015 10:40, Ricordisamoa ha scritto:
I was assuming that the Wikimedia Foundation would be willing to run 
the W3C Validator on its servers.

Now I ask: is it feasible?

Il 17/12/2014 17:57, Ricordisamoa ha scritto:
I've written a simple MediaWiki extension that uses an instance of 
the W3C Validator service (via the Services_W3C_HTMLValidator 
http://pear.php.net/package/Services_W3C_HTMLValidator PEAR 
package) to validate SVG images hosted on a wiki. It is meant to 
replace the current system on Commons, that relies on individual 
contributors adding templates (e.g. InvalidSVG 
https://commons.wikimedia.org/wiki/Template:InvalidSVG) by hand to 
file description pages.
It exposes a simple API (and a Scribunto module as well) to get the 
validation status of existing SVG files, can emit warnings when 
trying to upload invalid ones, and is well integrated with 
MediaWiki's native ObjectCache mechanism.
I'm in the process of publishing the code, but have some questions I 
think the community could help me answer.


 * Given that the W3C Validator can also parse HTML files, would it be
   useful to validate wiki pages as well? Even if sometimes the
   validation errors appear to be caused by MediaWiki itself, they can
   also depend on malformed templates.
 * Does storing the validation status of old revisions of images
   (and/or articles) make sense?
 * Do you think the extension should use the extmetadata property of
   ApiQueryImageInfo instead of its own module?
 * Is it advisable to store validation data permanently in the database?

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Lua module for simple molar mass calculations

2015-04-01 Thread Ricordisamoa
To get parameters passed to the template, just call :getParent() in the 
module: example https://test2.wikipedia.org/wiki/Special:Diff/155912
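
For instance, a minimal sketch of the pattern (the function and parameter
names here are placeholders, not the actual code from the linked diff):

local p = {}

function p.calc(frame)
    -- frame.args holds the arguments given to {{#invoke:...}} itself;
    -- frame:getParent().args holds the arguments given to the template
    -- that contains the #invoke, e.g. {{Molar mass calculator|C=2|H=6}}.
    local args = frame:getParent().args
    local carbon = tonumber(args.C) or 0
    local hydrogen = tonumber(args.H) or 0
    return carbon * 12.01 + hydrogen * 1.001
end

return p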


Il 01/04/2015 11:13, Brenton Horne ha scritto:
Thanks for this. What should /Template:Molar mass calculator/ look 
like? Currently I have:


{{#invoke:Molar mass calculator|calc}}

but this returns a null value when variables are provided.

On 1/04/2015 12:44 PM, Ori Livneh wrote:
On Tue, Mar 31, 2015 at 3:25 PM, Brenton Horne 
brentonhorn...@gmail.com

wrote:


Thanks, I have also posted this question on Stackoverflow
(http://stackoverflow.com/questions/29377097/lua-module-for-calculating-the-molar-mass-of-chemical-compounds);
someone with Lua skills but not so much with MediaWiki Lua templating
gave this code:

local AtomicWeightLookup = {
  C = 12.01,
  H = 1.001,
  O = 16
}

local function Calculate(Input)
  -- Input Example: {C = 2, H = 6, O = 1}
  local Result = 0
  -- Iterate through Input table
  for Element, Quantity in next, Input do
    -- If element is not found in table, assume 0 weight.
    local AtomicWeight = AtomicWeightLookup[Element] or 0
    -- Multiply
    Result = Result + Quantity * AtomicWeight
  end
  return Result
end

-- EXAMPLE
print(Calculate({C = 2, H = 6, O = 1}))

but as you can see there are no variables in here that are set by 
MediaWiki

templates, but it seems like a decent starting place.


Here you go:
https://test2.wikipedia.org/wiki/Module:Standard_atomic_weight
https://test2.wikipedia.org/wiki/Module:Molar_mass_calculator
demo: https://test2.wikipedia.org/wiki/Module_talk:Molar_mass_calculator

Note that your table had an error -- the atomic weight of Fluorine is
assigned to symbol 'C' rather than 'F'.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Lua module for simple molar mass calculations

2015-04-01 Thread Ricordisamoa



Il 01/04/2015 11:52, Brenton Horne ha scritto:

I have this template within my Chembox
{{Molar mass calculator|
|Ag ={{{Ag|}}}
|As ={{{As|}}}
|Au ={{{Au|}}}
|B={{{B|}}}
|Ba ={{{Ba|}}}
|Bi ={{{Bi|}}}
|Br ={{{Br|}}}
|C={{{C|}}}
|Ca ={{{Ca|}}}
|Cl ={{{Cl|}}}
|Co ={{{Co|}}}
|Cu ={{{Cu|}}}
|F={{{F|}}}
|Fe ={{{Fe|}}}
|H={{{H|}}}
|I={{{I|}}}
|K={{{K|}}}
|Li ={{{Li|}}}
|N={{{N|}}}
|Na ={{{Na|}}}
|Ni ={{{Ni|}}}
|O={{{O|}}}
|P={{{P|}}}
|Pt ={{{Pt|}}}
|S={{{S|}}}
|Se ={{{Se|}}}
|Sr ={{{Sr|}}}
|Zn ={{{Zn|}}}
}}
as each of these element parameters is optional, they can be null, and 
even when some parameters are provided I get the script error:


Lua error in Module:Molar_mass_calculator at line 8: attempt to 
perform arithmetic on a nil value.


Backtrace:

1. *(tail call)*: ?
2. *Module:Molar_mass_calculator:8
http://localhost/mediawiki/index.php?title=Module:Molar_mass_calculator&action=edit#mw-ce-l8*:
   in function chunk
3. *mw.lua:497*: ?
4. *(tail call)*: ?
5. *[C]*: in function xpcall
6. *MWServer.lua:87*: in function handleCall
7. *MWServer.lua:301*: in function dispatch
8. *MWServer.lua:58*: ?
9. *(tail call)*: ?
10. *mw.lua:141*: ?
11. *Module:Infobox:320
http://localhost/mediawiki/index.php?title=Module:Infobox&action=edit#mw-ce-l320*:
   in function preprocessArgs
12. *Module:Infobox:373
http://localhost/mediawiki/index.php?title=Module:Infobox&action=edit#mw-ce-l373*:
   in function chunk
13. *mw.lua:497*: ?
14. *(tail call)*: ?
15. *[C]*: in function xpcall
16. *MWServer.lua:87*: in function handleCall
17. *MWServer.lua:301*: in function dispatch
18. *MWServer.lua:40*: in function execute
19. *mw_main.lua:7*: in main chunk
20. *[C]*: ?

any ideas of how to overcome this error?


Have you tried putting {{#invoke:Molar mass calculator | calc}} in 
your Chembox instead of the template?
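
For what it's worth, the error suggests the module is doing arithmetic on
raw template arguments: empty Chembox parameters such as {{{Ag|}}} arrive
as empty strings, absent ones as nil, and multiplying either fails. A
minimal sketch of one way to guard against that (a hypothetical helper,
not the actual Module:Molar_mass_calculator code):

-- Normalize a template argument: nil or the empty string becomes 0,
-- anything else is converted with tonumber().
local function toQuantity(value)
    if value == nil or value == '' then
        return 0
    end
    return tonumber(value) or 0
end

The calculation loop can then use
Result = Result + toQuantity(Quantity) * AtomicWeight
no matter which parameters the Chembox passes through.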


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Org changes in WMF's Platform Engineering group

2015-03-25 Thread Ricordisamoa

Il 25/03/2015 02:45, Rob Lanphier ha scritto:

Hi folks,

First things first:  I'm not burying the lede in this email, so if you
aren't interested in the inner workings of WMF's Platform Engineering team,
feel free to ignore the rest of this.  :-)

We're making a few changes effective in April for Platform Engineering,
which you all care deeply about because you're still reading.

We're looking to give the teams a little more clarity of scope.
Previously, among other teams in Platform Engineering, we had a large
MediaWiki Core team, and a smaller Multimedia team.  We played a big game
of musical chairs, and everyone from those teams is part of a new team.
Additionally, the Parsoid team got into the fun, getting a new member as a
result.


-

Performance - This team is shooting for all page views in under 1000ms

https://docs.google.com/presentation/d/1MtDBNTH1g7CZzhwlJ1raEJagA8qM3uoV7ta6i66bO2M/present?slide=id.g3eb97ca8f_10.
The team plans to establish key frontend and backend performance metrics
and assume responsibility for their curation and upkeep, and get a handle
on web page rendering performance. Right now, it's all about VisualEditor,
but over time, this is going to be a more generalized function.
-

   Members: Ori Livneh, Gilles Dubuc (soon!), now hiring!
   -

Availability - Make MediaWiki backend failures diminishingly infrequent,
and prevent end users from noticing the ones that do by making recovery as
easy and automated as possible. This team does ops facing work that
contributes to the overall stability and maintainability of the system.
Things like multi-datacenter support, and migrating off of outdated
technology to newer, more reliable tech.


Labs included?


-

   Members: Aaron Schulz and Gilles Dubuc (for now, until he wraps up
   work on multi datacenter)
   -

MediaWiki API -  This team's goal will be to make user interface
innovation+evolution easier and make life easier for our sites' robot
overlords by making all business logic for MediaWiki available via well
specified API. Some APIs will be in PHP and some external over HTTP
depending on the needs of other teams.
-

   Members: Brad Jorsch, Kunal Mehta, Gergo Tisza, Mark Holmquist. Stas
   Malyshev plans to join this team when his work on Wikidata Query
wraps up. Bryan
   Davis plans to join as soon as his role as interim Product Manager for
   Platform wraps up.
   -

Search -  Provide unique and highly relevant search results on Wikimedia
sites, increasing the value of our content to readers and providing tools
that help editors make our content better. The team will continue working
on existing backlog of the CirrusSearch/Elasticsearch bugs and
improvements, plus Wikidata Query
-

   Members: Nik Everett, Stas Malyshev (for now...), James Douglas, also
   now hiring!
   -

Security - making life hard for the people that want to do harm to our
sites or the people that use them.
-

   Members: Chris Steipp, also now hiring!
   -

Programs support - support our non-tech programs with tools that delight
our users and maintain the privacy and security of our community, providing
infrastructure for things like Wikimania scholarships, grant program
applications, and ContactForm.
-

   Members: Niharika Kohli, Bryan Davis(20%)
   -

Parsing (renamed from Parsoid) - There are a number of changes to our
PHP parser that would make things easier for VisualEditor and Parsoid,
while at the same time offering a more powerful and easy-to-use authoring
environment for our editors (even those using wikitext)


The latter are the vast majority of the former, AFAICS.


.  Having Tim on a
rebranded “Parsing” team gives that team agency to start evolving wikitext
again, in a way that is supported by Parsoid HTML from day one.
-

   Members: Existing Parsoid team (Subbu Sastry, Marc Ordinas i Llopis,
   Arlo Brenault, and C. Scott Ananian), plus (new) Tim Starling


Breault?



You'll notice that some of these teams are pretty small, especially given
their scope.  This is likely to be at least a little fluid for a while as
we make sure we have the balance of work right and as we figure out the FY
2015-16 budget.

Let us know if you have any questions about this.  I say "us" because I'll
actually be traveling shortly.  Feel free to ask the individual members of
the teams what's up, or if you don't know who to go to, Bryan Davis will be
filling in for my duties while I'm out.

Thanks
Rob
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Starting conversion of LiquidThreads to Flow at mediawiki.org

2015-03-18 Thread Ricordisamoa

Also, some nice-to-have features:

 * provide View Source (showing the wikitext) of someone's post
   https://phabricator.wikimedia.org/T62465
 * post-edit diffs need a Thank link
   https://phabricator.wikimedia.org/T85846
 * Have WhatLinksHere show full information for Flow content
   https://phabricator.wikimedia.org/T92571


Il 17/03/2015 11:05, Ricordisamoa ha scritto:

Hi Nick,
I'm glad the Foundation is finally valuing a usable discussion system.

Unfortunately, there are some serious issues with Flow which will 
prevent my use of it in production if not addressed in full:


 * Administrators *must* be able to see a deleted Flow board without
   undeleting it (T90972 https://phabricator.wikimedia.org/T90972)
 * Ordinary users *must* be able to move topics between boards (T88140
https://phabricator.wikimedia.org/T88140)
 * Ordinary users *must* be able to edit AND move AND indent AND dedent
   other users' comments (T78253
https://phabricator.wikimedia.org/T78253)
 * An arbitrary indentation level *must* be allowed, with optional
   shortcuts for adding an {{outdent}}-like marker
 * Every basic functionality (including but not limited to the
   preview button) *must* work without relying on JavaScript (T60019
https://phabricator.wikimedia.org/T60019)

I see that the implementation of many features was delayed at the 
initial stage of development, but they can't be ignored when trying to 
deploy such software in production. Thank you.


Il 17/03/2015 01:51, Nick Wilson (Quiddity) ha scritto:

LiquidThreads (LQT) has not been well-supported in a long time. Flow
is in active development, and more real-world use-cases will help
focus attention on the higher-priority features that are needed. To
that end, LQT pages at mediawiki.org will start being converted to
Flow in the next couple of weeks.

There are about 1,600 existing LQT pages on Mediawiki, and the three
most active pages are VisualEditor/Feedback, Project:Support_desk, and
Help_talk:CirrusSearch.[1] The Collaboration team has been running
test conversions of those three pages, and fixing issues that have
come up. Those fixes are almost complete, and the team will be ready
to start converting LQT threads to Flow topics soon. (If you’re
interested in the progress, check out phab:T90788[2] and linked
tasks.) The latest set is visible at a labs test server.[3] See an
example topic comparison here: Flow vs LQT.[4])

The VisualEditor/Feedback page will be converted first (per James'
request), around the middle of next week. We’ll pause to assess any
high-priority changes required. After that, we will start converting
more pages. This process may take a couple of weeks to fully run.

The last page to be converted will be Project:Support_desk, as that is
the largest and most active LQT Board.

LQT Threads that are currently on your watchlist, will still be
watchlisted as Flow Topics. New Topics created at Flow Boards on your
watchlist will appear in your Echo notifications, and you can choose
whether or not to watchlist them.

The LQT namespaces will continue to exist. Links to posts/topics will
redirect appropriately, and the LQT history will remain available at
the original location, as well as being mirrored in the Flow history.

There’s a queue of new features in Flow that will be shipped over the
next month or so:

* Table of Contents is done
* Category support for Flow Header and Topics is done
* VE with editing toolbar coming last week of March (phab:T90763) [5]
* Editing other people’s comments coming last week of March 
(phab:T91086)

* Ability to change the width & side rail in progress, probably out in
April (phab:T88114)
* Search is in progress (no ETA yet) (phab:T76823)
* The ability to choose which Flow notifications end up in Echo,
watchlist, or both, and other more powerful options, will be coming up
next (no ETA yet)

That being said -- there are some LiquidThreads features that don’t
exist in Flow yet.
We’d like to hear which features you use on the current LQT boards,
and that you’re concerned about losing in the Flow conversion. At the
same time, we’d like further suggestions on how we could improve upon
that (or other) features from LQT.

Please give us feedback at
https://www.mediawiki.org/wiki/Topic:Sdoatsbslsafx6lw to keep it
centralized, and test freely at the sandbox.[6]

Much thanks, on behalf of the Collaboration Team,
Quiddity (WMF)

[1] https://www.mediawiki.org/wiki/VisualEditor/Feedback and
https://www.mediawiki.org/wiki/Help_talk:CirrusSearch and
https://www.mediawiki.org/wiki/Project:Support_desk
[2] https://phabricator.wikimedia.org/T90788
[3] http://flow-tests.wmflabs.org/wiki/Testwiki:Support_desk and
http://flow-tests.wmflabs.org/wiki/VisualEditor/Feedback
[4] http://flow-tests.wmflabs.org/wiki/Topic:Qmkwqmp0wfcazy9c and
https://www.mediawiki.org/wiki/Thread:Project:Support_desk/Error_creating_thumbnail:_Unable_to_save_thumbnail_to_destination 


[5] https://phabricator.wikimedia.org/T90763 ,
https://phabricator.wikimedia.org/T91086 ,
https://phabricator.wikimedia.org/T88114 ,
https://phabricator.wikimedia.org/T76823
[6] https://www.mediawiki.org/wiki/Talk:Sandbox

Re: [Wikitech-l] looking for unit testing resources for bot development

2015-03-18 Thread Ricordisamoa

Il 17/03/2015 20:45, Frances Hocutt ha scritto:

I'm working on cleaning up the code[1] for GrantsBot[2] and generally
getting it into better and more robust shape. I've started writing basic
unit tests to assist in the refactoring process. Since it interacts so
heavily with the MediaWiki API, however, this isn't a straightforward
process, and it's even more complicated because I'll be rewriting it to use
a different client library (so any mocks/stubs I include will need to be
rewritten). Does anyone have thoughts on the best strategy for this, or,
more generally, pointers to good resources for writing unit tests for API
clients?

-Frances

[1] dev branch: https://github.com/fhocutt/grantsbot/tree/dev
[2] a bot run by Community Resources to maintain the IdeaLab on MetaWiki:
https://meta.wikimedia.org/wiki/User:GrantsBot
If you're using Pywikibot, you may want to have a look at its extensive 
unit tests :)


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Starting conversion of LiquidThreads to Flow at mediawiki.org

2015-03-17 Thread Ricordisamoa

Il 18/03/2015 05:08, Danny Horn ha scritto:

Yes, the plan is for editing posts to go everywhere. We want to go a little
bit extra slow with deploying that feature just to make sure that the
pieces we've put in place actually work properly. So it's rolling out to
Mediawiki.org next week, and then English and Russian WP the week after.
(The people at Russian WP that we've been talking to said that they weren't
even interested in test pages until we had editing posts, because Russian
is hardcore.)

Having the ability to edit other people's posts can be very useful, but
there's also a strong cultural tradition that says that we basically don't
do it, except under certain circumstances. People using Flow won't
necessarily come to it with that same tradition, so we want to see that the
feature set encourages the useful editing, and doesn't encourage people to
mess with the wording or intent of someone else's post. Once we've seen it
in action for a little while, it'll go live to all the other languages.


Of course it should be used carefully and for minor changes only. I 
agree with T91086 https://phabricator.wikimedia.org/T91086.




And I'm glad to hear that this thread has come close to almost inspiring
optimism. That's what I'm here for.


On Tue, Mar 17, 2015 at 8:00 PM, MZMcBride z...@mzmcbride.com wrote:


Ricordisamoa wrote:

Il 17/03/2015 23:29, Danny Horn ha scritto:

-- The ability to edit other people's posts will be out on Mediawiki by
   the end of next week. We’ve made a few interface changes to support
   that. Posts that have been edited by someone that isn’t the original
   poster now say “Edited by Username 3 minutes ago”, so that it’s easy
   for everyone to see what’s happened. When someone edits an existing
   post, we fixed the diff pages so that you can browse between previous
   and next changes. [1]

By Mediawiki, do you mean www.mediawiki.org?
I would like to stress the importance of such ability for *all* wikis.
On Wikimedia, unlike most other sites, nothing is 'owned' by someone,
and protection is only a precautionary measure. Flow is supposed to work
with this model.

Agreed. The ability of anyone to revert bad edits also acts as a major
anti-abuse feature. As has been pointed out many times, there's a very
real spam concern if only the author and admins can edit posts.


-- Make the links to threads look nicer -- Yeah, this is annoying. It’s
   not in our top five list of annoyances at the moment, but we’ll keep
   checking off annoying items. Nicer links will get their turn. [5]
[5] Less ugly topic page links: https://phabricator.wikimedia.org/T59154

Erik B. says on that task that the deployment of ContentHandler should
help. This is excellent news.


Thanks for the detailed and timely reply.

My thanks as well! This e-mail really helped alleviate some of my concerns
with Flow's development. I have too much experience with Flow's
predecessor LiquidThreads to say that I'm optimistic, but I'm definitely
less concerned now about Flow's future than I was when I started the day.

MZMcBride



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Our CAPTCHA is very unfriendly

2015-03-17 Thread Ricordisamoa

Il 18/03/2015 04:30, MZMcBride ha scritto:

Ricordisamoa wrote:

Il 09/11/2014 18:33, MZMcBride ha scritto:

Marc A. Pelletier wrote:

But there is also a great heap of anecdotal data that shows that having
to provide an email account increases the barrier of entry to users
signing up.  So, there's a tradeoff.

Eh, I think the anecdotal data (such as Facebook's and Google's hundreds
of millions of account registrations) suggests that e-mail confirmation is
not a huge barrier to entry for legitimate users.

I think both Facebook and Google have enough staff resources to deal
with spam, and they could even let bots create fake accounts as long as
they don't harass other users, just to let the account counter increase.
We can't afford that.

I'm not sure what you mean by "can't afford that". What specific behaviors
are we trying to prevent? Account registration alone isn't really a
problem on MediaWiki wikis, just as it isn't a problem on Facebook or
Google. The system scales. But if the accounts are registering and then
spamming (creating new pages, making bad edits to existing pages, etc.),
that's a real problem that we should try to solve as efficiently and
cleanly as possible. Volunteer time is definitely precious.


If a bot creates 10,000 Facebook profiles and fills them with bogus 
content, that's fine for them. More users, more ads, more money.
But if it creates 10,000 Wikimedia accounts with bogus user pages, it 
isn't fine for us. Less trust between Wikimedians.





I think calling this issue a sacred cow is a bit overblown, but requiring
an e-mail address would be a violation of our shared values. We strive to
be as open and independent as possible and requiring an e-mail address is
antithetical to that. If anything, we could provide e-mail address
aliases (e.g., mzmcbr...@en.wikipedia.org) for our users as a side
benefit.

What about case-sensitivity of user names vs email addresses then?

This is tangential, but... we should fix usernames to be case-insensitive.
And we should support login via e-mail address. And we should (properly)
support a display name field, in my opinion. Hopefully, in time. :-)

In addition to better heuristics, as Robert suggested, we could also focus
on tasks such as https://phabricator.wikimedia.org/T20110, maybe. Using
AbuseFilter to trigger CAPTCHAs seems like it would either be a really
great or a really terrible idea. At least making this functionality
available as an option to potentially try seems worthwhile.


Definitely.



MZMcBride



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Starting conversion of LiquidThreads to Flow at mediawiki.org

2015-03-17 Thread Ricordisamoa

Hi Nick,
I'm glad the Foundation is finally valuing a usable discussion system.

Unfortunately, there are some serious issues with Flow which will 
prevent my use of it in production if not addressed in full:


 * Administrators *must* be able to see a deleted Flow board without
   undeleting it (T90972 https://phabricator.wikimedia.org/T90972)
 * Ordinary users *must* be able to move topics between boards (T88140
   https://phabricator.wikimedia.org/T88140)
 * Ordinary users *must* be able to edit AND move AND indent AND dedent
   other users' comments (T78253
   https://phabricator.wikimedia.org/T78253)
 * An arbitrary indentation level *must* be allowed, with optional
   shortcuts for adding an {{outdent}}-like marker
 * Every basic functionality (including but not limited to the
   preview button) *must* work without relying on JavaScript (T60019
   https://phabricator.wikimedia.org/T60019)

I see that the implementation of many features was delayed at the 
initial stage of development, but they can't be ignored when trying to 
deploy such software in production. Thank you.


Il 17/03/2015 01:51, Nick Wilson (Quiddity) ha scritto:

LiquidThreads (LQT) has not been well-supported in a long time. Flow
is in active development, and more real-world use-cases will help
focus attention on the higher-priority features that are needed. To
that end, LQT pages at mediawiki.org will start being converted to
Flow in the next couple of weeks.

There are about 1,600 existing LQT pages on Mediawiki, and the three
most active pages are VisualEditor/Feedback, Project:Support_desk, and
Help_talk:CirrusSearch.[1] The Collaboration team has been running
test conversions of those three pages, and fixing issues that have
come up. Those fixes are almost complete, and the team will be ready
to start converting LQT threads to Flow topics soon. (If you’re
interested in the progress, check out phab:T90788[2] and linked
tasks.) The latest set is visible at a labs test server.[3] See an
example topic comparison here: Flow vs LQT.[4])

The VisualEditor/Feedback page will be converted first (per James'
request), around the middle of next week. We’ll pause to assess any
high-priority changes required. After that, we will start converting
more pages. This process may take a couple of weeks to fully run.

The last page to be converted will be Project:Support_desk, as that is
the largest and most active LQT Board.

LQT Threads that are currently on your watchlist, will still be
watchlisted as Flow Topics. New Topics created at Flow Boards on your
watchlist will appear in your Echo notifications, and you can choose
whether or not to watchlist them.

The LQT namespaces will continue to exist. Links to posts/topics will
redirect appropriately, and the LQT history will remain available at
the original location, as well as being mirrored in the Flow history.

There’s a queue of new features in Flow that will be shipped over the
next month or so:

* Table of Contents is done
* Category support for Flow Header and Topics is done
* VE with editing toolbar coming last week of March (phab:T90763) [5]
* Editing other people’s comments coming last week of March (phab:T91086)
* Ability to change the width & side rail in progress, probably out in
April (phab:T88114)
* Search is in progress (no ETA yet) (phab:T76823)
* The ability to choose which Flow notifications end up in Echo,
watchlist, or both, and other more powerful options, will be coming up
next (no ETA yet)

That being said -- there are some LiquidThreads features that don’t
exist in Flow yet.
We’d like to hear which features you use on the current LQT boards,
and that you’re concerned about losing in the Flow conversion. At the
same time, we’d like further suggestions on how we could improve upon
that (or other) features from LQT.

Please give us feedback at
https://www.mediawiki.org/wiki/Topic:Sdoatsbslsafx6lw to keep it
centralized, and test freely at the sandbox.[6]

Much thanks, on behalf of the Collaboration Team,
Quiddity (WMF)

[1] https://www.mediawiki.org/wiki/VisualEditor/Feedback and
https://www.mediawiki.org/wiki/Help_talk:CirrusSearch and
https://www.mediawiki.org/wiki/Project:Support_desk
[2] https://phabricator.wikimedia.org/T90788
[3] http://flow-tests.wmflabs.org/wiki/Testwiki:Support_desk and
http://flow-tests.wmflabs.org/wiki/VisualEditor/Feedback
[4] http://flow-tests.wmflabs.org/wiki/Topic:Qmkwqmp0wfcazy9c and
https://www.mediawiki.org/wiki/Thread:Project:Support_desk/Error_creating_thumbnail:_Unable_to_save_thumbnail_to_destination
[5] https://phabricator.wikimedia.org/T90763 ,
https://phabricator.wikimedia.org/T91086 ,
https://phabricator.wikimedia.org/T88114 ,
https://phabricator.wikimedia.org/T76823
[6] https://www.mediawiki.org/wiki/Talk:Sandbox




___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Our CAPTCHA is very unfriendly

2015-03-17 Thread Ricordisamoa

Il 09/11/2014 18:33, MZMcBride ha scritto:

Marc A. Pelletier wrote:

But there is also a great heap of anecdotal data that shows that having
to provide an email account increases the barrier of entry to users
signing up.  So, there's a tradeoff.

Eh, I think the anecdotal data (such as Facebook's and Google's hundreds
of millions of account registrations) suggests that e-mail confirmation is
not a huge barrier to entry for legitimate users.


I think both Facebook and Google have enough staff resources to deal 
with spam, and they could even let bots create fake accounts as long as 
they don't harass other users, just to let the account counter increase.
We can't afford that. CAPTCHA solutions (not necessarily text-based 
ones) should try to free our users from the day-to-day fight.





Spambots (of which there are a multitude, and that hammer any mediawiki
site constantly) have gotten pretty good at bypassing captchas but have
yet to respond properly to email loops (and that's a more complicated
obstacle than first appears; throwaway accounts are cheap but any
process that requires a delay - however small - means that spambot must
now maintain state and interact rather than fire-and-forget).

Hmmm, I imagine many spambots have already made this investment if they're
dealing with popular systems that require e-mail address confirmation.

Wikimedia is different. You shouldn't even need an account to edit, much
less an e-mail address. But this is a philosophical and principle-based
(principled, if you will!) decision, not really a user experience or
technical decision, in my opinion.

I think calling this issue a sacred cow is a bit overblown, but requiring
an e-mail address would be a violation of our shared values. We strive to
be as open and independent as possible and requiring an e-mail address is
antithetical to that. If anything, we could provide e-mail address aliases
(e.g., mzmcbr...@en.wikipedia.org) for our users as a side benefit.


What about case-sensitivity of user names vs email addresses then?


MZMcBride



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
