Re: [Wikitech-l] [Huggle] Huggle 3 released / Mac people needed

2014-06-02 Thread Bryan Davis
On Sun, Jun 1, 2014 at 2:57 PM, Petr Bena  wrote:
> Yes there has been an update recently, we released 3.0.0 :)
>
> I totally agree; actually Ubuntu users can install & run Huggle using
> one line in the terminal, and a goal is to make it as accessible as
> possible for everyone (unfortunately there is no apt-get for Windows or Mac).
> If you are able to build it (the latest version) please provide
> details. Thanks

I updated the build steps at
https://en.wikipedia.org/wiki/Wikipedia:Huggle/Huggle3_Beta#Mac_OS
with the steps I needed to build the binary.

Bryan
-- 
Bryan Davis  Wikimedia Foundation
[[m:User:BDavis_(WMF)]]  Sr Software Engineer  Boise, ID USA
irc: bd808  v: 415.839.6885 x6855

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Using Composer to manage libraries for mediawiki/core on Jenkins and Foundation cluster

2014-06-02 Thread Gergo Tisza
On Fri, May 30, 2014 at 3:56 PM, Bryan Davis  wrote:
>
> There is still some ongoing internal discussion about the best way to
> verify that included libraries are needed and that security patches
> are watched for and applied from upstream. Chris Steipp is awesome,
> but it would be quite an additional burden to hang these thousands of
> new lines of code around his neck. One
> current theory is that need should be determined by the RFC process
> and security support would need to be provided by a "sponsor" of the
> library.
>

As long as those libraries are installed via Composer, and well-maintained,
something like VersionEye  could take on a big
part of that burden.

Re: [Wikitech-l] Using Composer to manage libraries for mediawiki/core on Jenkins and Foundation cluster

2014-06-02 Thread Niklas Laxström
2014-05-30 0:57 GMT+03:00 Bryan Davis :
> I think bug 65188 [0] is the solution suggested by Ori that you are
> referring to. Would this alone be enough to fix the problems for
> translatewiki.net? More directly, is translatewiki.net using the top
> level composer.json file to manage anything other than extensions? In
> the near term is there any better workaround for you (and others in a
> similar position) other than running `git update-index
> --assume-unchanged composer.json` in your local checkout to make it
> ignore anything that happens to composer.json?

Now that composer.json also includes dependencies for core, ignoring
changes to it would also break things.

To make things worse, I noticed on my development environment that our
own scap-equivalent will just go on to run composer update even if the
file conflicted. This causes it to remove the extensions and libraries
we currently install via composer, also breaking the site.

  -Niklas
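The `git update-index --assume-unchanged` workaround quoted above can be sketched as follows. This is a hedged illustration using a throwaway repository created on the fly; the repository, file contents, and helper function are all invented for the demo:

```python
import os
import subprocess
import tempfile

# Throwaway repository for the demo (everything here is illustrative).
repo = tempfile.mkdtemp()

def git(*args):
    """Run a git command in the demo repo and return its stdout."""
    return subprocess.run(["git", *args], cwd=repo, check=True,
                          capture_output=True, text=True).stdout

git("init", "-q", ".")
git("config", "user.email", "demo@example.org")
git("config", "user.name", "demo")

with open(os.path.join(repo, "composer.json"), "w") as f:
    f.write("{}\n")
git("add", "composer.json")
git("commit", "-q", "-m", "add composer.json")

# The workaround: tell git to pretend the file is unchanged locally.
git("update-index", "--assume-unchanged", "composer.json")
with open(os.path.join(repo, "composer.json"), "w") as f:
    f.write('{"require": {}}\n')

status = git("status", "--porcelain", "composer.json")
print(repr(status))  # '' -- the local modification is invisible to git status

# Reverse it later with:
#   git update-index --no-assume-unchanged composer.json
```

The flip side, as the message above notes, is that hiding local changes this way also hides real conflicts with upstream edits to the file.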


Re: [Wikitech-l] Unclear Meaning of $baseRevId in WikiPage::doEditContent

2014-06-02 Thread Daniel Kinzler
Am 30.05.2014 15:38, schrieb Brad Jorsch (Anomie):
> I think you need to look again into how FlaggedRevs uses it, without the
> preconceptions you're bringing in from the way you first interpreted the
> name of the variable. The current behavior makes perfect sense for that
> specific use case. Neither of your proposals would work for FlaggedRevs.

As far as I understand the rather complex FlaggedRevs.hooks.php code, it assumes
that

a) if $newRev === $baseRevId, it's a null edit. As far as I can see, this does
not work, since $baseRevId will be null for a null edit (and all other regular
edits).

b) if $newRev !== $baseRevId but the new rev's hash is the same as the base
rev's hash, it's a rollback. This works with the current implementation of
commitRollback(), but does not for manual reverts or trivial undos.
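In other words, the two assumptions amount to something like the following. This is a heavily simplified, hypothetical sketch; all names are invented, and the real logic lives in FlaggedRevs.hooks.php operating on MediaWiki revision objects:

```python
# Simplified sketch of the two FlaggedRevs assumptions described above.
# All names are invented for illustration only.

def classify_edit(new_rev_id, base_rev_id, rev_hashes):
    """rev_hashes maps revision id -> content hash."""
    # (a) same id for the new and base revision => treated as a null edit
    if new_rev_id == base_rev_id:
        return "null edit"
    # (b) different ids but identical content => treated as a rollback
    if (base_rev_id is not None
            and rev_hashes[new_rev_id] == rev_hashes[base_rev_id]):
        return "rollback"
    return "regular edit"

print(classify_edit(10, 10, {10: "abc"}))             # null edit
print(classify_edit(11, 9, {11: "abc", 9: "abc"}))    # rollback
print(classify_edit(11, 10, {11: "abc", 10: "xyz"}))  # regular edit
```

The problem described in (a) is that in practice $baseRevId is null for null edits, so the first branch never fires.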

So, FlaggedRevs assumes that EditPage (resp. WikiPage) sets $baseRevId to the
edit's logical parent (basically, the revision the user loaded when starting to edit).
That's what I described as option (3) in my earlier mail, except for the
rollback case; it would be fine with me to use the target rev as the base for
rollbacks, as is currently done.

FlaggedRevs.hooks.php also injects a baseRevId form field and uses it in some
cases, adding to the confusion.

In order to handle manual reverts and null edits consistently, EditPage should
probably have a base revision as a form field, and pass it on to doEditContent.
As far as I can tell, this would work with the current code in FlaggedRevs.

> As for the EditPage code path, note that it has already done edit conflict
> resolution so "base revision = current revision of the page". Which is
> probably the intended meaning of false.

Right. If that's the case though, WikiPage::doEditContent should probably set
$baseRevId = $oldid, before passing it to the hooks.

Without changing core, it seems that there is no way to implement a late/strict
conflict check based on the base rev id. That would need an additional "anchor
revision" for checking.

The easiest solution for the current situation is to simply drop the strict
conflict check in Wikibase and accept a race condition that may cause a revision
to be silently overwritten, as is currently the case in core.

-- daniel



Re: [Wikitech-l] Using Composer to manage libraries for mediawiki/core on Jenkins and Foundation cluster

2014-06-02 Thread James HK
> To make things worse, I noticed on my development environment that our
> own scap-equivalent will just go on to run composer update even if the
> file conflicted. This causes it to remove the extensions and libraries
> we currently install via composer, also breaking the site.

I hope, for the sake of all non-WMF users that already use Composer to
install extensions, that the proposed changes are not making things
worse (well, it doesn't seem so).

Cheers

On 6/2/14, Niklas Laxström  wrote:
> 2014-05-30 0:57 GMT+03:00 Bryan Davis :
>> I think bug 65188 [0] is the solution suggested by Ori that you are
>> referring to. Would this alone be enough to fix the problems for
>> translatewiki.net? More directly, is translatewiki.net using the top
>> level composer.json file to manage anything other than extensions? In
>> the near term is there any better workaround for you (and others in a
>> similar position) other than running `git update-index
>> --assume-unchanged composer.json` in your local checkout to make it
>> ignore anything that happens to composer.json?
>
> Now that composer.json also includes dependencies for core, ignoring
> changes to it would also break things.
>
> To make things worse, I noticed on my development environment that our
> own scap-equivalent will just go on to run composer update even if the
> file conflicted. This causes it to remove the extensions and libraries
> we currently install via composer, also breaking the site.
>
>   -Niklas
>


Re: [Wikitech-l] An unhappy "resolution" on skin naming

2014-06-02 Thread Stephan Gambke
On 1 June 2014 22:45, Daniel Friesen  wrote:
> What kind of decoupling did you have in mind?

Not specifying that each skin has to have exactly one lowercase (lc)
identifier and then starting to rely on this requirement to generate
all sorts of secondary names, identifiers, paths, class names, etc.
from it. E.g. why not just ask the skin for its localized name?

I know there is loads of legacy code to deal with here, and this
business with the message identifiers for the skin names in particular
is not the object of the ongoing changes. It's just that I'd rather
not have an explicit requirement introduced specifying that there must
be exactly one all-purpose lower-case id per skin.


Re: [Wikitech-l] An unhappy "resolution" on skin naming

2014-06-02 Thread James HK
>> What kind of decoupling did you have in mind?
>
> Not specifying that each skin has to have exactly one lc identifier
> and then starting to rely on this requirement and generate all sorts
> of secondary names, identifiers, paths, class names, etc. from that.
> E.g. why not just ask the skin for its localized name?

I second this: code (skin or extension) should be expressive and,
where possible, decoupled. Doing all sorts of magic behind a curtain
may save some lines of code, but it certainly does not improve
readability or expressiveness, and it makes the code prone to breakage
if some of the "magic" disappears.

On 6/2/14, Stephan Gambke  wrote:
> On 1 June 2014 22:45, Daniel Friesen  wrote:
>> What kind of decoupling did you have in mind?
>
> Not specifying that each skin has to have exactly one lc identifier
> and then starting to rely on this requirement and generate all sorts
> of secondary names, identifiers, paths, class names, etc. from that.
> E.g. why not just ask the skin for its localized name?
>
> I know there is loads of legacy code to deal with here and this
> business with the message identifiers for the skin names in particular
> is not the object of the on-going changes. It's just that I'd rather
> not have an explicit requirement introduced specifying that there must
> be exactly one all-purpose lower-case id per skin.
>


[Wikitech-l] Jenkins can now validate composer.json files

2014-06-02 Thread Antoine Musso
Hello,

I have created a new Jenkins job, 'php-composer-validate', which runs
`composer validate` against a proposed change and bails out whenever its
composer.json is faulty.

The job is only triggered on changes that modify composer.json.
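As a rough, hedged stand-in for what the job checks (the job itself runs Composer's own validation, which also verifies the schema), here is a minimal manifest and a plain JSON well-formedness check; the package name is invented for illustration:

```python
import json

# Minimal, invented composer.json manifest.
manifest = """
{
    "name": "example/demo",
    "description": "Minimal manifest for illustration",
    "license": "GPL-2.0-or-later"
}
"""

# The Jenkins job effectively runs `composer validate` on the change;
# parsing the JSON here only demonstrates the most basic failure mode
# the job would catch (a syntactically broken composer.json).
data = json.loads(manifest)
print(sorted(data))  # ['description', 'license', 'name']
```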


To have the job triggered on your repository, you would have to edit
Zuul configuration in integration/zuul-config.git and add an entry for
the job php-composer-validate

Example for MediaWiki core:

 https://gerrit.wikimedia.org/r/#/c/136732/1/layout.yaml

Then ask for review :)

-- 
Antoine "hashar" Musso



Re: [Wikitech-l] Page view stats

2014-06-02 Thread Andre Klapper
Hi,

On Mon, 2014-06-02 at 11:36 +0900, ikuyamada wrote:
> It seems that the page view stats have not been uploaded for several days.
> http://dumps.wikimedia.org/other/pagecounts-raw/2014/
> 
> Are there any plans to fix this?

See https://bugzilla.wikimedia.org/show_bug.cgi?id=65978

andre
-- 
Andre Klapper | Wikimedia Bugwrangler
http://blogs.gnome.org/aklapper/



Re: [Wikitech-l] [Engineering] 2300 UTC Monday: grid system discussion

2014-06-02 Thread C. Scott Ananian
A grid system for content authors would be useful as well.  There have
been some discussions about better semantic markup for image widths
and placement as part of the change to image thumbnail sizing which
landed last week and was reverted yesterday.

This doesn't seem to be included in the current RfC (which seems to
concentrate on features for skin authors) -- do the "grid experts"
think that discussion of how this might be used in content would be
on-topic for the discussion tonight?  [7pm Boston time, somehow Sumana
left out the east coast from her list! ;) ]
  --scott

On Mon, Jun 2, 2014 at 2:14 AM, Sumana Harihareswara
 wrote:
> Later today, at 2300 UTC, we'll be in #wikimedia-office discussing Pau
> Giner's grid system RfC.
> https://www.mediawiki.org/wiki/Requests_for_comment/Grid_system "to simplify
> the creation of user interfaces and make them ready for multiple screen
> sizes."
> Check out the new patchset https://gerrit.wikimedia.org/r/#/c/133683/
>
> "It makes the log-in form responsive (adjusting layout, typography and
> visibility to the current screen size). It does not leverage all the
> potential of responsive design, but may be useful as a demo to help the
> reviewers."
>
> This is at a time meant to make it easier for Australia and China to
> participate.
>
> http://www.timeanddate.com/worldclock/fixedtime.html?hour=23&min=00&sec=0&day=02&month=06&year=2014
>
> Sydney: Tuesday 9am
> Beijing: Tuesday 7am
> San Francisco: Monday 4pm
>
> I'm sorry for the late announcement of this one; for the next several weeks
> I'll be haranguing authors to help me get discussions set up a few weeks in
> advance. :)
>
> More:
> https://www.mediawiki.org/wiki/Architecture_meetings/RFC_review_2014-06-02
>
> Sumana Harihareswara
> Engineering Community Manager
> Wikimedia Foundation
>
> ___
> Engineering mailing list
> engineer...@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/engineering
>



-- 
(http://cscott.net)


[Wikitech-l] [ImageMap] Looking for a reviewer: Link directly to media file when [[Media:...]] namespace is used within an imagemap.

2014-06-02 Thread Remco de Boer
Hi,

I'm looking for someone to review and (hopefully) accept a very small
(3-line) patch I wrote for the ImageMap extension. The patch improves the
behavior of image maps with links to [[Media:...]] files. Instead of
linking to the image page, these links should go directly to the media file
(comparable to how regular [[Media:...]] links in wiki text work).

My patch has been open for over three months now, and there is no
registered maintainer of the ImageMap extension (
https://www.mediawiki.org/wiki/Developers/Maintainers#MediaWiki_extensions_deployed_by_WMF).
I've tried to ping some folks whom I thought might be able to review my
patch, but none of them has responded.

The patch is in Gerrit, and can be viewed at
https://gerrit.wikimedia.org/r/#/c/114439/

Kind regards,

Remco de Boer

Re: [Wikitech-l] Using Composer to manage libraries for mediawiki/core on Jenkins and Foundation cluster

2014-06-02 Thread Jeroen De Dauw
Hey,

> To make things worse, I noticed on my development environment that our
> > own scap-equivalent will just go on to run composer update even if the
> > file conflicted. This causes it to remove the extensions and libraries
> > we currently install via composer, also breaking the site.
>
> I hope for the sake of all-non WMF users that already use Composer to
> install extensions that the proposed changes are not making things
> worse (well it doesn't seem so).
>

I would also like to express my disappointment at third party users being
thrown under the bus once again. Several people have been putting a lot of
effort into supporting third party users, so it really saddens me that this
is dismissed as an irrelevance by some so easily.

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Evil software architect at Wikimedia Germany
~=[,,_,,]:3

[Wikitech-l] php[world] Conference (November 12-14th, 2014)

2014-06-02 Thread Terry Chay
Hey all,

I got this e-mail because of my participation at various PHP and open source 
conferences. However, this new conference tries to focus on the
PHP ecosystem of applications (MediaWiki, WordPress, Drupal, …) which for a
long time have had their own separate conferences (Berlin Hackathon, WordCamp, 
DrupalCon, …) independent of the normal PHP/LAMP ones (php|tek, ZendCon, OSCON, 
FOSDEM, …).

Since I'm now a useless middle manager, haven't ever been a MediaWiki 
developer, and haven't developed on WordPress for at least a couple years, this 
conference doesn't apply to me. But you might be interested in presenting a
paper or attending.

Take care,

terry

terry chay  최태리
Director of Features Engineering
Wikimedia Foundation
“Imagine a world in which every single human being can freely share in the sum 
of all knowledge. That's our commitment.”

p: +1 (415) 839-6885 x6832
m: +1 (408) 480-8902
e: tc...@wikimedia.org
i: http://terrychay.com/
w: http://meta.wikimedia.org/wiki/User:Tychay
aim: terrychay

Begin forwarded message:

> From: php[architect] 
> Subject: Announcing php[world] Conference : Call for Papers Open!
> Date: June 2, 2014 at 11:03:26 AM PDT
> To: Terry 
> Reply-To: 
> 
> Announcing php[world] - Call for Papers
> November 10-14 - Washington, DC, USA
> 
> Bringing the world of PHP one step closer together...
> We are excited to introduce a brand new conference from the team at 
> php[architect]. This conference, php[world], is designed to bring together 
> all of the various PHP communities into one place to share ideas.
> 
> Our php[tek] conference is known as the definitive conference in the US for 
> the core PHP developer. For this new conference, we wanted to create an 
> environment that appealed to all the different communities that exist in the 
> PHP ecosystem. Whether you are a core PHP developer, a heavy framework user 
> (such as Zend Framework or Symfony), or a WordPress, Drupal, Joomla, or 
> Magento developer … this conference will be for you!
> 
> Details are still coming together, but lots of information has already been 
> posted on the php[world] website and more updates will appear soon!
> 
>  
> Call for Papers
> Speaking of which, to make this conference happen and to bring these 
> communities together, we need lots of paper submissions from a broad spectrum 
> of speakers. Simply put ... we want to hear from you!
> 
> Our Call for Papers is open until June 20th, but don't delay, get your 
> submissions in early!
> 
> 
> 
>  
> Registration
> We've opened up registration as well, including some steep Early Bird 
> discounts. If you aren't interested in speaking but want to make sure that 
> you don't miss this event, you should pick up your tickets immediately before 
> the prices rise.
> 
> 
> 
> 


Re: [Wikitech-l] [Engineering] 2300 UTC Monday: grid system discussion

2014-06-02 Thread Peter Coombe
IANA "grid expert", but I think it would be a huge missed opportunity not
to let this be used for content. It could be a great help with pages like
Portals, which are currently reliant on loads of inline styles for layout,
or worse, tables.

Peter


On 2 June 2014 15:39, C. Scott Ananian  wrote:

> A grid system for content authors would be useful as well.  There have
> been some discussions about better semantic markup for image widths
> and placement as part of the change to image thumbnail sizing which
> landed last week and was reverted yesterday.
>
> This doesn't seem to be included in the current RfC (which seems to
> concentrate on features for skin authors) -- do the "grid experts"
> think that discussion of how this might be used in content would be
> on-topic for the discussion tonight?  [7pm Boston time, somehow Sumana
> left out the east coast from her list! ;) ]
>   --scott
>
> On Mon, Jun 2, 2014 at 2:14 AM, Sumana Harihareswara
>  wrote:
> > Later today, at 2300 UTC, we'll be in #wikimedia-office discussing Pau
> > Giner's grid system RfC.
> > https://www.mediawiki.org/wiki/Requests_for_comment/Grid_system "to
> simplify
> > the creation of user interfaces and make them ready for multiple screen
> > sizes."
> > Check out the new patchset https://gerrit.wikimedia.org/r/#/c/133683/
> >
> > "It makes the log-in form responsive (adjusting layout, typography and
> > visibility to the current screen size). It does not leverage all the
> > potential of responsive design, but may be useful as a demo to help the
> > reviewers."
> >
> > This is at a time meant to make it easier for Australia and China to
> > participate.
> >
> >
> http://www.timeanddate.com/worldclock/fixedtime.html?hour=23&min=00&sec=0&day=02&month=06&year=2014
> >
> > Sydney: Tuesday 9am
> > Beijing: Tuesday 7am
> > San Francisco: Monday 4pm
> >
> > I'm sorry for the late announcement of this one; for the next several
> weeks
> > I'll be haranguing authors to help me get discussions set up a few weeks
> in
> > advance. :)
> >
> > More:
> >
> https://www.mediawiki.org/wiki/Architecture_meetings/RFC_review_2014-06-02
> >
> > Sumana Harihareswara
> > Engineering Community Manager
> > Wikimedia Foundation
> >
> >
>
>
>
> --
> (http://cscott.net)
>
>

[Wikitech-l] new auth/debug/Zero RfCs, + adding more office hours

2014-06-02 Thread Sumana Harihareswara
Some newer RfCs that you ought to know about, mostly draft-ish and in
progress. Please comment on their talkpages.

* https://www.mediawiki.org/wiki/Requests_for_comment/SOA_Authentication
"With many more entry points and the need for inter-service
authentication, a service-oriented architecture requires a stronger
authentication system."
*
https://www.mediawiki.org/wiki/Requests_for_comment/Debugging_at_production_server
"Sometimes we have to debug on production wiki, but don't want to show
internal information to normal users..."
*
https://www.mediawiki.org/wiki/Requests_for_comment/Unfragmented_ZERO_design
"In order to significantly reduce varnish fragmentation and reduce the
complexity, Zero team would like to unify HTML served to all Zero
partners's users"

Also: I've heard feedback that the RfC process ought to move faster, and
that we ought to have more people from more disciplines take a look at
changes that users will visually notice. Sounds good to me. So:

* I'm going to add more IRC discussion hours that will be more for
*discussing* specific RfCs and getting general community feedback,
rather than asking for go/no-go decisions. These will be in addition to
the weekly decision-oriented meetings. It seems like people get a lot
out of having these real-time conversations in addition to the option to
comment onwiki and onlist, so this is a way to push the most complex
topics forward. This will also allow us to have more chats that suit
different timezones. We will also sometimes piggyback them onto the
videostreamed Tech Talks, with simultaneous Etherpad notes + IRC.

* I'm going to make more of an effort to invite specific people from
diverse disciplines to RfC discussions, e.g., QA people for debugging
stuff, design/product people for user-visible changes, etc. I've been
doing this but I'll be more systematic. Please help me out by spreading
the word to people whom you think will be interested.


-- 
Sumana Harihareswara
Senior Technical Writer
Wikimedia Foundation


[Wikitech-l] Grid system RfC discussion in <1 hour

2014-06-02 Thread Sumana Harihareswara
This is in about 50 minutes, in #wikimedia-office.
-Sumana

On 06/02/2014 02:14 AM, Sumana Harihareswara wrote:
> Later today, at 2300 UTC, we'll be in #wikimedia-office discussing Pau
> Giner's grid system RfC.
> https://www.mediawiki.org/wiki/Requests_for_comment/Grid_system "to
> simplify the creation of user interfaces and make them ready for multiple
> screen sizes."
> Check out the new patchset https://gerrit.wikimedia.org/r/#/c/133683/
> 
> "It makes the log-in form responsive (adjusting layout, typography and
> visibility to the current screen size). It does not leverage all the
> potential of responsive design, but may be useful as a demo to help the
> reviewers."
> 
> This is at a time meant to make it easier for Australia and China to
> participate.
> 
> http://www.timeanddate.com/worldclock/fixedtime.html?hour=23&min=00&sec=0&day=02&month=06&year=2014
> 
> Sydney: Tuesday 9am
> Beijing: Tuesday 7am
> San Francisco: Monday 4pm
> 
> I'm sorry for the late announcement of this one; for the next several weeks
> I'll be haranguing authors to help me get discussions set up a few weeks in
> advance. :)
> 
> More:
> https://www.mediawiki.org/wiki/Architecture_meetings/RFC_review_2014-06-02



[Wikitech-l] MediaWiki 1.23.0-rc.3 release candidate is available

2014-06-02 Thread Markus Glaser
The third (and hopefully final) release candidate for 1.23.0 (1.23.0-rc.3) is 
now available for download.

The changes since 1.23.0-rc.2 are as follows:

* (bug 65225) jquery.suggestions: Handle CSS ellipsis better for IE.
* (bug 65765) Add ar_text to the list from Revision::selectArchiveFields().
* DerivativeContext::setConfig should take a Config object.
* Make abstract Config class truly implementation-agnostic.
* (bug 65748) Officially deprecate skin autodiscovery.

Full release notes:
https://git.wikimedia.org/blob/mediawiki%2Fcore.git/1.23.0-rc.3/RELEASE-NOTES-1.23
https://www.mediawiki.org/wiki/Release_notes/1.23

**

Download:
http://download.wikimedia.org/mediawiki/1.23/mediawiki-core-1.23.0-rc.3.tar.gz
http://download.wikimedia.org/mediawiki/1.23/mediawiki-1.23.0-rc.3.tar.gz

GPG signatures:
http://download.wikimedia.org/mediawiki/1.23/mediawiki-core-1.23.0-rc.3.tar.gz.sig
http://download.wikimedia.org/mediawiki/1.23/mediawiki-1.23.0-rc.3.tar.gz.sig

Public keys:
https://www.mediawiki.org/keys/keys.html

Markus Glaser
(Release Team)


Re: [Wikitech-l] new auth/debug/Zero RfCs, + adding more office hours

2014-06-02 Thread Tyler Romeo
> * https://www.mediawiki.org/wiki/Requests_for_comment/SOA_Authentication
> "With many more entry points and the need for inter-service
> authentication, a service-oriented architecture requires a stronger
> authentication system."
This is literally the same thing as AuthStack except more generic and without 
any code or plan for implementation.

Also, on the same note, I was told previously that the AuthStack RFC was too 
big and needed to be split up, because I tackled both 
authentication/authorization *and* session management and logout in the same 
proposal. Since this RFC has the same problem, it should be split up 
accordingly.

-- 
Tyler Romeo
0xC86B42DF

Re: [Wikitech-l] Using Composer to manage libraries for mediawiki/core on Jenkins and Foundation cluster

2014-06-02 Thread Tyler Romeo
> I would also like to express my disappointment at third party users being
> thrown under the bus once again. Several people have been putting a lot of
> effort into supporting third party users, so it really saddens me that this
> is dismissed as an irrelevance by some so easily.
Third party users were not thrown under the bus. Unfortunately, the solution
you are looking for in terms of extension installation is not yet achievable
with the current tools. That is just the unfortunate truth. We are not going
to misuse libraries and hack together MediaWiki just so extension
installation can be *slightly* easier.

-- 
Tyler Romeo
0xC86B42DF

Re: [Wikitech-l] Using Composer to manage libraries for mediawiki/core on Jenkins and Foundation cluster

2014-06-02 Thread James HK
> That is just the unfortunate truth. We are
> not going to misuse libraries and hack together MediaWiki just so extension
> installation can be *slightly* easier.

This sort of behaviour towards non-WMF extension developers is
interesting, and if your objective is to alienate volunteer developers
(as with the attitude above), then you're on the right path.

Some people forget that users do not always choose MediaWiki because of
its core software, but because it provides extensions that people
outside of the WMF cluster find useful.

On 6/3/14, Tyler Romeo  wrote:
> I would also like to express my disappointment at third party users being
> thrown under the bus once again. Several people have been putting a lot of
> effort into supporting third party users, so it really saddens me that this
> is dismissed as an irrelevance by some so easily.
> Third party users were not thrown under the bus. Unfortunately, the solution
> you are looking for in terms of extension installation is not yet available
> with the current tools available. That is just the unfortunate truth. We are
> not going to misuse libraries and hack together MediaWiki just so extension
> installation can be *slightly* easier.
>
> --
> Tyler Romeo
> 0xC86B42DF


Re: [Wikitech-l] Using Composer to manage libraries for mediawiki/core on Jenkins and Foundation cluster

2014-06-02 Thread Tyler Romeo
> This sort of behaviour towards non-WMF extension developers is
> interesting and if your objective is to alienate (as with the attitude
> above) volunteer developers then your on the right path.
>
> Some people forget that users not always choose MW because of its
> software but because it provides some extensions that people outside
> of the WMF cluster find useful.
Considering I *am* a non-WMF extension developer, I don't see how your
argument is relevant.

And as I literally just said in my previous email, the goal was not to disadvantage 
third-party extension developers. Composer is simply not meant to be used in 
the way it was shoe-horned into MediaWiki. I’m not going to re-explain this 
every time because it is in multiple places on Bugzilla and in this mailing 
list.

-- 
Tyler Romeo
0xC86B42DF

Re: [Wikitech-l] Using Composer to manage libraries for mediawiki/core on Jenkins and Foundation cluster

2014-06-02 Thread Ori Livneh
On Mon, Jun 2, 2014 at 6:37 PM, James HK 
wrote:

> > That is just the unfortunate truth. We are
> > not going to misuse libraries and hack together MediaWiki just so
> extension
> > installation can be *slightly* easier.
>
> This sort of behaviour towards non-WMF extension developers is
> interesting and if your objective is to alienate (as with the attitude
> above) volunteer developers then your on the right path.
>

You are free to use composer.json to manage extensions, in which case you
should version it in SCM. There's no conflict here. We did not favor one
use-case over another; we went with the path that coheres with the design
of Composer, as explicitly discussed in fantastic detail in its
documentation, bringing MediaWiki in line with every other significant
application or framework that uses Composer that I could find.
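(For the extension-management use case described above, a minimal versioned
composer.json might look like the following -- the package name and version
constraint are purely illustrative:)

```json
{
    "require": {
        "mediawiki/semantic-media-wiki": "~2.0"
    }
}
```

The composer.json (and the generated composer.lock) are what go into SCM;
running `composer update` from the MediaWiki root then fetches the extension
and its dependencies.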

We're not so far down a path that we can't change course, but I've yet to
see you rebut any of the points I raised in my commit message accompanying
change I3e7c668ee[0] or articulate a coherent alternative.

As for the accusation that the current approach favors the WMF, it's almost
not worth responding to. We don't even intend to use Composer in
production; all the momentum behind the recent work around Composer
integration has in mind how MediaWiki fits with the broader open-source
ecosystem.

[0]: https://gerrit.wikimedia.org/r/#/c/132788/

[Wikitech-l] Accessing current template information on wiki commons description page

2014-06-02 Thread james harvey
Given a Wikimedia Commons description page URL - such as:
https://commons.wikimedia.org/wiki/File:Van_Gogh_-_Starry_Night_-_Google_Art_Project.jpg

I would like to be able to programmatically retrieve the information in the
"Summary" header.  (Values for "Artist", "Title", "Date", "Medium",
"Dimensions", "Current location", etc.)

I believe all this information is in "Template:Artwork".  I can't figure
out how to get the wikitext/json-looking template data.

If I use the API and call:
https://commons.wikimedia.org/w/api.php?action=query&format=xml&titles=File:Van%20Gogh%20-%20Starry%20Night%20-%20Google%20Art%20Project.jpg&iilimit=max&iiprop=timestamp|user|comment|url|size|mime&prop=imageinfo|revisions&rvgeneratexml=&rvprop=ids|timestamp|user|comment|content

Then I don't get the information I'm looking for.  This shows the most
recent revision, and its changes.  Unless the most recent revision changed
this data, it doesn't show up.

To see all the information I'm looking for, it seems I'd have to specify
rvlimit=max and go through all the past revisions to figure out which is
most current.  For example, if I do so and I look at revid 79665032, that
includes: "{{Artwork | Artist = {{Creator:Vincent van Gogh}} | . . . | Year
= 1889 | Technique = {{Oil on canvas}} | . . ."

Isn't there a way to get the current version in whatever format you'd call
that - the wikitext/json looking format?

In my API call, I can specify rvexpandtemplates which even with only the
most recent revision gives me the information I need, but it's largely in
HTML tables/divs/etc format rather than wikitext/json/xml/etc.
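(For reference: prop=revisions with rvprop=content returns the *full*
wikitext of each revision, not a delta against earlier revisions, so
rvlimit=1 is enough to get the current page text. A stdlib-only sketch of
building such a request -- the parameter names are the standard MediaWiki
API ones, the helper name is made up:)

```python
from urllib.parse import urlencode

API = "https://commons.wikimedia.org/w/api.php"

def current_wikitext_url(title: str) -> str:
    """Build an API request for the current wikitext of a page.

    rvlimit=1 asks for only the latest revision, and rvprop=content
    makes the API return that revision's complete text (revisions are
    stored whole, not as diffs against earlier revisions).
    """
    params = {
        "action": "query",
        "format": "json",
        "titles": title,
        "prop": "revisions",
        "rvlimit": 1,
        "rvprop": "content",
    }
    return API + "?" + urlencode(params)

url = current_wikitext_url(
    "File:Van Gogh - Starry Night - Google Art Project.jpg")
# Fetching `url` (e.g. with urllib.request.urlopen) yields JSON whose
# query.pages[<pageid>].revisions[0]["*"] member is the full wikitext,
# including the {{Artwork}} call with all its current parameter values.
```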

Re: [Wikitech-l] Accessing current template information on wiki commons description page

2014-06-02 Thread Gerard Meijssen
Hoi,
The good news is that work on the "Wikidatafication" of multimedia files
has started. You just provided an excellent example why it is so needed.

One drawback is that once it is done, your current work will need to be
revisited.
Thanks,
 GerardM


On 3 June 2014 07:59, james harvey  wrote:

> Given a Wikimedia Commons description page URL - such as:
>
> https://commons.wikimedia.org/wiki/File:Van_Gogh_-_Starry_Night_-_Google_Art_Project.jpg
>
> I would like to be able to programmatically retrieve the information in the
> "Summary" header.  (Values for "Artist", "Title", "Date", "Medium",
> "Dimensions", "Current location", etc.)
>
> I believe all this information is in "Template:Artwork".  I can't figure
> out how to get the wikitext/json-looking template data.
>
> If I use the API and call:
>
> https://commons.wikimedia.org/w/api.php?action=query&format=xml&titles=File:Van%20Gogh%20-%20Starry%20Night%20-%20Google%20Art%20Project.jpg&iilimit=max&iiprop=timestamp|user|comment|url|size|mime&prop=imageinfo|revisions&rvgeneratexml=&rvprop=ids|timestamp|user|comment|content
>
> Then I don't get the information I'm looking for.  This shows the most
> recent revision, and its changes.  Unless the most recent revision changed
> this data, it doesn't show up.
>
> To see all the information I'm looking for, it seems I'd have to specify
> rvlimit=max and go through all the past revisions to figure out which is
> most current.  For example, if I do so and I look at revid 79665032, that
> includes: "{{Artwork | Artist = {{Creator:Vincent van Gogh}} | . . . | Year
> = 1889 | Technique = {{Oil on canvas}} | . . ."
>
> Isn't there a way to get the current version in whatever format you'd call
> that - the wikitext/json looking format?
>
> In my API call, I can specify rvexpandtemplates which even with only the
> most recent revision gives me the information I need, but it's largely in
> HTML tables/divs/etc format rather than wikitext/json/xml/etc.

Re: [Wikitech-l] Accessing current template information on wiki commons description page

2014-06-02 Thread james harvey
I may have stumbled upon it.  If I change the API call from
"titles=File:XYZ.jpg" to "titles=Template:XYZ" (note: dropped the .jpg)
then it *appears* to get me what I need.

Is this correct, or did I run across a case where it appears to work but
isn't going to be the right way to go?  (Like, I'm not sure if
"Template:XYZ" directly relates to the Summary information on the
"File:XYZ.jpg" page, or if it's duplicated data that in this case matches.
 And, I'm confused why the .jpg gets dropped switching "File:" to
"Template:")

And will this always get me the full template information? If someone just
updates the "Year" portion, would it only return that part -- since the
revisions seem to return data based on changes from the previous revision,
rather than the complete current state?
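(Whichever page the wikitext comes from, the {{Artwork}} parameters can be
pulled out of it by splitting on top-level pipes only, so that pipes inside
nested templates like {{Creator:...}} are left alone. A rough hand-rolled
sketch -- real pages have more edge cases than this handles, and a proper
template parser would be more robust:)

```python
def extract_template(wikitext: str, name: str) -> str:
    """Return the inner text of the first {{name ...}} call, braces stripped."""
    start = wikitext.find("{{" + name)
    if start == -1:
        raise ValueError(name + " template not found")
    depth, i = 0, start
    while i < len(wikitext):
        if wikitext[i:i + 2] == "{{":
            depth += 1
            i += 2
        elif wikitext[i:i + 2] == "}}":
            depth -= 1
            i += 2
            if depth == 0:
                return wikitext[start + 2:i - 2]
        else:
            i += 1
    raise ValueError("unbalanced braces")

def split_template_params(inner: str) -> dict:
    """Split a template's inner text on top-level '|' only, then collect
    the named name=value parameters into a dict."""
    parts, depth, buf, i = [], 0, [], 0
    while i < len(inner):
        two = inner[i:i + 2]
        if two in ("{{", "[["):
            depth += 1
            buf.append(two)
            i += 2
        elif two in ("}}", "]]"):
            depth -= 1
            buf.append(two)
            i += 2
        elif inner[i] == "|" and depth == 0:
            parts.append("".join(buf))
            buf = []
            i += 1
        else:
            buf.append(inner[i])
            i += 1
    parts.append("".join(buf))
    params = {}
    for part in parts[1:]:  # parts[0] is the template name itself
        if "=" in part:
            key, _, value = part.partition("=")
            params[key.strip()] = value.strip()
    return params

sample = ("{{Artwork | Artist = {{Creator:Vincent van Gogh}} "
          "| Year = 1889 | Technique = {{Oil on canvas}} }}")
params = split_template_params(extract_template(sample, "Artwork"))
# params now maps "Artist", "Year", "Technique" to their raw wikitext values.
```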

On Tue, Jun 3, 2014 at 1:59 AM, james harvey 
wrote:

> Given a Wikimedia Commons description page URL - such as:
> https://commons.wikimedia.org/wiki/File:Van_Gogh_-_Starry_Night_-_Google_Art_Project.jpg
>
> I would like to be able to programmatically retrieve the information in
> the "Summary" header.  (Values for "Artist", "Title", "Date", "Medium",
> "Dimensions", "Current location", etc.)
>
> I believe all this information is in "Template:Artwork".  I can't figure
> out how to get the wikitext/json-looking template data.
>
> If I use the API and call:
> https://commons.wikimedia.org/w/api.php?action=query&format=xml&titles=File:Van%20Gogh%20-%20Starry%20Night%20-%20Google%20Art%20Project.jpg&iilimit=max&iiprop=timestamp|user|comment|url|size|mime&prop=imageinfo|revisions&rvgeneratexml=&rvprop=ids|timestamp|user|comment|content
> 
>
> Then I don't get the information I'm looking for.  This shows the most
> recent revision, and its changes.  Unless the most recent revision changed
> this data, it doesn't show up.
>
> To see all the information I'm looking for, it seems I'd have to specify
> rvlimit=max and go through all the past revisions to figure out which is
> most current.  For example, if I do so and I look at revid 79665032, that
> includes: "{{Artwork | Artist = {{Creator:Vincent van Gogh}} | . . . | Year
> = 1889 | Technique = {{Oil on canvas}} | . . ."
>
> Isn't there a way to get the current version in whatever format you'd call
> that - the wikitext/json looking format?
>
> In my API call, I can specify rvexpandtemplates which even with only the
> most recent revision gives me the information I need, but it's largely in
> HTML tables/divs/etc format rather than wikitext/json/xml/etc.
>