Only a handful of wikis are restrictive about captchas (plus pt.wiki, which is
in permanent emergency mode).
https://meta.wikimedia.org/wiki/Newly_registered_user
For them you could request the confirmed flag at
https://meta.wikimedia.org/wiki/SRP
Personally I found it easier to do the required 10, 50 or
Sounds like a site config issue. All wikis that have NS_TEMPLATE in
$wgFlaggedRevsNamespaces should also have NS_MODULE in there.
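A minimal LocalSettings.php sketch of that fix (illustrative only: the exact namespace list varies per wiki, and the NS_MODULE constant exists only when the Scribunto extension is loaded):

```php
// FlaggedRevs should cover Lua modules wherever it covers templates,
// since modules are transcluded into content pages just like templates.
$wgFlaggedRevsNamespaces = array( NS_MAIN, NS_TEMPLATE, NS_MODULE );
```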
Great news. :)
--
Nasir Khan Saikat
http://profiles.google.com/nasir8891
www.nasirkhn.com
On Thu, Mar 21, 2013 at 10:01 AM, Matthew Flaschen
mflasc...@wikimedia.org wrote:
On 03/18/2013 01:29 PM, Tomasz Finc wrote:
Greetings all,
I'm pleased to announce that the mobile department
On Thu, Mar 21, 2013 at 1:04 AM, Eugene Zelenko
eugene.zele...@gmail.com wrote:
Hi!
I think it would be a good idea to direct some of the Google Summer of Code
participants' energy toward helping Wikidata, which is missing many must-have
features. Some of them, like support for projects other than Wikipedia,
is
Thanks to everybody who showed up!
The IRC log can be found at
https://meta.wikimedia.org/wiki/IRC_office_hours/Office_hours_2013-03-19
andre
--
Andre Klapper | Wikimedia Bugwrangler
http://blogs.gnome.org/aklapper/
___
Wikitech-l mailing list
tl;dr
discussion start: How/whether MediaWiki could use ZendOptimizerPlus
As of recently,
* ZendOptimizerPlus (opcode cache; source [4])
is under the PHP license and is planned to be integrated into PHP 5.5.
The former restrictions of this program (closed source etc.) appear to
be gone,
so I'd
On Thu, Mar 21, 2013 at 10:14 AM, Thomas Gries m...@tgries.de wrote:
tl;dr
discussion start: How/whether MediaWiki could use ZendOptimizerPlus
As of recently,
* ZendOptimizerPlus (opcode cache; source [4])
is under the PHP license and is planned to be integrated into PHP 5.5.
The former
Am 21.03.2013 15:23, schrieb Chad:
You're confusing opcode caching with shared memory caching.
Thanks; as already mentioned, I anticipated that difference.
Having the Zend
Optimizer doesn't prohibit you from using APC's shared memory caching.
But APC has issues with PHP 5.4.
What can we
On Thu, Mar 21, 2013 at 10:42 AM, Thomas Gries m...@tgries.de wrote:
Am 21.03.2013 15:23, schrieb Chad:
You're confusing opcode caching with shared memory caching.
Thanks; as already mentioned, I anticipated that difference.
Having the Zend
Optimizer doesn't prohibit you from using APC's
Am 21.03.2013 15:57, schrieb Chad:
Sure, it'd be an improvement--go ahead and file a bug wherever it
belongs upstream (github?). If and when they decide to implement
it, *then* would be the time to make MW changes :)
Where can I read more (and can then refer to it) about how MediaWiki
uses the
On Thu, Mar 21, 2013 at 4:02 PM, Thomas Gries m...@tgries.de wrote:
Do I have to look for the MediaWiki source module/memcache API? Where?
It's called BagOStuff.
Bryan
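For readers new to it: BagOStuff is MediaWiki's PHP key-value cache abstraction, with concrete backends for memcached, APC shared memory, the database, and more. The interface it standardizes is roughly get / set-with-expiry / delete, returning false on a miss. A toy Python sketch of that shape (not the real API, just an illustration):

```python
import time

class SimpleBagOStuff:
    """Toy in-process stand-in for a BagOStuff-style cache interface.
    Real backends include memcached, APC shared memory, or the DB."""

    def __init__(self):
        self._data = {}

    def set(self, key, value, exptime=0):
        # exptime of 0 means "no expiry", like memcached conventions.
        expires = time.time() + exptime if exptime else None
        self._data[key] = (value, expires)

    def get(self, key):
        entry = self._data.get(key)
        if entry is None:
            return False  # miss is reported as false, not an exception
        value, expires = entry
        if expires is not None and time.time() >= expires:
            del self._data[key]  # lazily expire stale entries
            return False
        return value

    def delete(self, key):
        return self._data.pop(key, None) is not None
```

Because calling code only talks to this interface, the backing store can be swapped out in configuration without touching callers.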
On 03/20/2013 10:43 AM, Jasper Wallace wrote:
On Tue, 19 Mar 2013, MZMcBride wrote:
P.S. mailman: there's a non-ASCII character in the subject line. Attack!
Why? It's correctly encoded:
Because the subject line is displayed 3 different ways on the
archive page:
We've been having a hard time making photo uploads work in
MobileFrontend because of CentralAuth's third-party cookies problem (we
upload them from the Wikipedia web site to the Commons API). Apart from the
newest Firefox [1,2], mobile Safari also doesn't accept third-party
cookies unless the domain has
Ori's advice rings true with me. It's something I need to get better at.
On the email title sidetrack, it should not create a 4th way. Without
verifying them, those all look like valid representations of the same data.
MIME encoded-word syntax only has two possible encodings, quoted-printable
Hey,
as you remembered, we were asking about EasyRDF in order to use it in
Wikidata.
We have now cut off the pieces that we do not need, in order to simplify
the review. Most of the interesting parts of EasyRDF regarding security
issues -- parsing, serving, etc. -- have been removed.
Our code is
On 03/21/2013 11:45 AM, Luke Welling WMF wrote:
On the email title sidetrack, it should not create a 4th way.
The pedant in me says there are at least two more ways -- different
capitalization for UTF-8. But your subject line shows another way.
My client displays all of the subjects the same.
Heh, if clients randomly change character sets then I guess there are a
very large number of possible values.
Given that RFC 2047 came out in 1996, it's reasonable for people to use
non-ASCII characters in titles; the means to do it in a
compatible way has been around for 17 years.
Luke
On Wed, Mar 20, 2013 at 10:43 AM, Jasper Wallace jas...@pointless.net wrote:
On Tue, 19 Mar 2013, MZMcBride wrote:
P.S. mailman: there's a non-ASCII character in the subject line. Attack!
Why? It's correctly encoded:
Subject: Re: [Wikitech-l] Gerrit =?UTF-8?B?V2Fyc+KEog==?=: a
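That encoded word can be checked in a few lines of Python; the same text also has a quoted-printable (Q) form, the other encoding RFC 2047 allows (the Q string below is constructed here for illustration):

```python
import base64
import quopri

# The base64 (B) encoded-word payload from the subject line above.
text = base64.b64decode("V2Fyc+KEog==").decode("utf-8")
print(text)  # Wars™

# The same text in quoted-printable (Q) form, the other RFC 2047 encoding.
assert quopri.decodestring("Wars=E2=84=A2").decode("utf-8") == text
```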
Is sending an email to wikitech-ambassadors enough to unblock it?
Although such an email should contain a timeframe expectation, which
probably only the WMF can give.
On 03/21/2013 02:55 AM, Niklas Laxström wrote:
I've seen a couple of instances where changes to MediaWiki are blocked
until someone informs the community.
Someone is a volunteer.
Community is actually just the Wikimedia project communities. Or at
least the biggest ones which are expected to
Ori, now you can add another point to the concise strategy guide:
Deeper posts tend to generate superficial and tangential replies. The
answer is Silence.
PS: thank you for the post, I enjoyed it.
--
Quim Gil
Technical Contributor Coordinator @ Wikimedia Foundation
Just to be clear, APC will not work in PHP 5.5 at all. It actually
conflicts with Zend Optimizer+, and you cannot use both at the same time.
--
Tyler Romeo
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com
Example:
We are rolling out a fix in category sorting collations. That was a fix for a
bug (introduced by developers, 3rd party software, whatever), not an
enhancement. Anyway, notifying the community and getting its approval was requested.
On Thursday, March 21, 2013, Quim Gil q...@wikimedia.org wrote:
On 03/09/2013 10:00 PM, Brian Cassidy wrote:
Hello,
I'm the co-author of the WWW::Wikipedia Perl module (
https://metacpan.org/release/WWW-Wikipedia). It programmatically parses the
raw source of a Wikipedia page.
Of late, a few changes in behaviour have been reported to me -- all related
Well, as part of the community and a volunteer, I can safely say that I
don't think I (or anybody else) needs notification before bug fixes. :P
--
Tyler Romeo
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com
Tim rolled back wmf12 after a nasty bug last night:
https://bugzilla.wikimedia.org/show_bug.cgi?id=46397
Our fix we deployed to test2 didn't fix it:
https://gerrit.wikimedia.org/r/#/c/55086/
We're still diagnosing/etc.
So, we're staying on wmf11 for now (except on test2, which is running
the
On 2013-03-21 3:08 PM, Tyler Romeo tylerro...@gmail.com wrote:
Well, as part of the community and a volunteer, I can safely say that I
don't think I (or anybody else) needs notification before bug fixes. :P
--
Tyler Romeo
Stevens Institute of Technology, Class of 2015
Major in
Your forwards to potential students / interns / mentors are welcome!
Also out of the Bay Area: we will stream the event and accept questions
via IRC.
GSoC and other open source internship programs
Wikipedia Engineering Meetup (San Francisco)
Thursday, April 11, 2013
5:00 PM
On 03/21/2013 11:05 AM, Paul Selitskas wrote:
Example:
We are rolling out a fix in category sorting collations. That was a fix for a
bug (introduced by developers, 3rd party software, whatever), not an
enhancement. Anyway, notifying the community and getting its approval was requested.
Thank you, having
https://bugzilla.wikimedia.org/show_bug.cgi?id=46428
If one of the PHP-knowledgeable peeps could take a look at this, that
would be great (sadly my PHP-fu is quite weak).
--
Leslie Carr
Wikimedia Foundation
AS 14907, 43821
http://as14907.peeringdb.com/
Another example would be changing default options in core - recently I
tried to push for making the enhanced recentchanges the default, but one
of the blockers was that I'd need to let the Wikimedia communities know
as the change would be applied there as well.
Unfortunately I didn't have any
On Thu, Mar 21, 2013 at 2:13 PM, Brian Wolff bawo...@gmail.com wrote:
That depends on the bug. Some fixes do cause disruption. To pick a random
clear-cut example from a while ago: consider adding the token to the login
API action. It was very important that it got fixed, but it did cause
This is all due to the introduction of Wikidata http://wikidata.org.
On Thu, Mar 21, 2013 at 12:32 PM, Sumana Harihareswara
suma...@wikimedia.org wrote:
On 03/09/2013 10:00 PM, Brian Cassidy wrote:
Hello,
I'm the co-author of the WWW::Wikipedia Perl module (
On Thu, Mar 21, 2013 at 11:46 AM, Leslie Carr lc...@wikimedia.org wrote:
https://bugzilla.wikimedia.org/show_bug.cgi?id=46428
If one of the PHP-knowledgeable peeps could take a look at this, that
would be great (sadly my PHP-fu is quite weak).
Hmm... ok this explains why I couldn't find Wikipedia Zero-related
On 03/21/2013 11:48 AM, Isarra Yos wrote:
Unfortunately I didn't have any idea when such a change could or would
be merged or deployed, so not only did I not have any timeframe to give
said communities, I didn't even know when it would be appropriate to
tell them (if it happens months later,
On 21/03/13 19:15, Quim Gil wrote:
On 03/21/2013 11:48 AM, Isarra Yos wrote:
Unfortunately I didn't have any idea when such a change could or would
be merged or deployed, so not only did I not have any timeframe to give
said communities, I didn't even know when it would be appropriate to
Quim, you seem to be answering the question of how one communicates
changes, but the question of this thread is who is responsible for
doing so. It's quite a difference.
Usually volunteers know the communities better and have fewer problems
with the how than others, but that's not the point.
Thank you for the feedback!
Brion & co., I have tried to distill the essence of your comments and
write it down as generic guidelines at
https://www.mediawiki.org/wiki/Summer_of_Code_2013#Project_ideas
On 03/20/2013 04:53 PM, Luca de Alfaro wrote:
Would there be interest in integrating the
On 03/21/2013 02:12 PM, Federico Leva (Nemo) wrote:
Quim, you seem to be answering the question of how one communicates
changes, but the question of this thread is who is responsible for
doing so. It's quite a difference.
I don't think there is a single name for this responsibility. As it
I'd like to push for a codified set of minimum performance standards that
new mediawiki features must meet before they can be deployed to larger
wikimedia sites such as English Wikipedia, or be considered complete.
These would look like (numbers pulled out of a hat, not actual
suggestions):
-
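Whatever numbers end up in such a standard, checking a target like "p99 latency under N ms" against measured samples is mechanical; a small sketch (threshold and sample values invented for illustration):

```python
import math

def percentile(samples_ms, p):
    """Nearest-rank percentile of a list of latency samples (ms)."""
    ordered = sorted(samples_ms)
    rank = math.ceil(p / 100 * len(ordered))
    return ordered[rank - 1]

# Hypothetical request latencies in milliseconds.
latencies = [42, 55, 61, 70, 75, 78, 79, 81, 85, 200]
p99 = percentile(latencies, 99)
print(p99 <= 80)  # False: one slow outlier blows an 80 ms p99 budget
```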
On Mar 20, 2013, at 11:59 AM, Niklas Laxström niklas.laxst...@gmail.com wrote:
On 1 March 2013 23:46, Chad innocentkil...@gmail.com wrote:
Bug: 1234
Change-Id: Ia90.
So when you do this, you're able to search for bug:1234 via Gerrit.
By doing this, you're also removing it from the
On 21/03/13 20:55, Niklas Laxström wrote:
I've seen a couple of instances where changes to MediaWiki are blocked
until someone informs the community.
Someone is a volunteer.
Community is actually just the Wikimedia project communities. Or at
least the biggest ones which are expected to
Asher,
Do we know what our numbers are now? That's probably a pretty good baseline
to start with as a discussion.
p99 banner request latency of 80ms
Fundraising banners? From start of page load; or is this specifically how
fast our API requests run?
On the topic of APIs: we should set similar
From where would you propose measuring these data points? Obviously
network latency will have a great impact on some of the metrics, and a
consistent location would help to define the pass/fail of each test. I do
think another benchmark for Ops to feature would be a set of
latency-to-datacenter values,
The API is fairly complex to measure and set performance targets for. If a bot
requests 5000 pages in one call, together with all their links and categories,
it might take a very long time (seconds if not tens of seconds). Comparing that
to another API request that gets an HTML section of a page, which takes a