Re: [Wikitech-l] Mass migration to new syntax - PRO or CON?

2016-02-12 Thread bawolff
> Last thoughts on the thread, I got bigger fish to fry than array syntax
> sugar :D
>
> -Chad

+1 to that.

--
bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Developer Relations Weekly Summary

2016-02-02 Thread bawolff
In regards to follow-up on the meeting about code review - That's
partially my bad that there hasn't been any yet. During the meeting I
said that I would pursue exploring the various options discussed, nag
people, start further discussions and try to ensure the ball kept
rolling. However, I haven't had much time recently to actually do any
of that.

(I mean from a social/community perspective only, purely as a
"volunteer" developer. From a WMF planning perspective, that's
somebody else's problem ;) and I have no opinion on what the WMF
should be doing)

--
-bawolff

On Tue, Feb 2, 2016 at 12:33 PM, Pine W  wrote:
> Hi Rob,
>
> Code review seems to be a choke point and the subject of numerous concerns,
> so I'd like to see SMART goals for improvements to code review, and one or
> more individuals (probably WMF individuals) taking leadership for meeting
> those goals. Perhaps the conversation at the Summit will inform the
> creation of these goals in WMF Q4 and in the WMF 16-17 Annual Plan.
>
> Thanks,
> Pine
> On Feb 2, 2016 6:41 AM, "Rob Lanphier"  wrote:
>
>> On Thu, Jan 28, 2016 at 2:57 PM, Pine W  wrote:
>>
>> > Is there, or will there be, a page somewhere that describes the outcomes of
>> > the Developer Summit?
>> >
>>
>> I would love for the working group chairs to make the outcomes available
>> from the main WikiDev16 page:
>> <https://mediawiki.org/wiki/WikiDev16>
>>
>> The meeting notes from most of the meetings are there, but please add
>> anything you feel is missing, and/or ask the chairs of each of the meetings
>> to document their next steps.  I have this on my TODO list to continue
>> following up with various working group chairs (and helping with this), but
>> I also have been busy with other work, so I don't fault them at all for not
>> getting to it.
>>
>> Pine, could you describe what you hope to get out of the information?
>>
>> Thanks
>> Rob


Re: [Wikitech-l] Google Code-in 2015 is over. Congratulations everybody!

2016-02-02 Thread bawolff
On Thu, Jan 28, 2016 at 12:48 PM, Andre Klapper  wrote:
> Google Code-in 2015 has come to an end.
>
> Thanks to our students for resolving 461 Wikimedia tasks. Thanks to our
> 35 mentors for being available, also on weekends & holidays. Thanks to
> everybody on IRC for your friendliness, patience, and help provided to
> new contributors.
>
> Some more achievements, apart from those already mentioned in
> https://lists.wikimedia.org/pipermail/wikitech-l/2015-December/084421.html :
>
>  * The CommonsMetadata extension parses vcards in the src field
>  * The MediaWiki core API exposes "actual watchers" as in "action=info"
>  * MediaWiki image thumbnails are interlaced whenever possible
>  * Kiwix is installable/moveable to the SD card, automatically opens
>the virtual keyboard for "find in page", (re)starts with the last
>open article
>  * imageinfo queries in MultimediaViewer are cached
>  * Twinkle's set of article maintenance tags was audited and its XFD
>module has preview functionality
>  * The RandomRootPage extension got merged into MediaWiki core
>  * One can remove items from Gather collections
>  * A new MediaWiki maintenance script imports content from text files
>  * Pywikibot has action=mergehistory support implemented
>  * Huggle makes a tone when someone writes something
>  * Many i18n issues fixed and strings improved
>  * Namespace aliases added to MediaWiki's export dumps
>  * The Translate extension is compatible with PHP 7
>
> The Grand Prize winners & finalists will be announced on February 8th.
>
> Again congratulations everybody, and thanks for the hard work.
>
> See you around on IRC, mailing lists, Gerrit, and Phabricator!
>
> Cheers,
> andre
>
> --
> Andre Klapper | Wikimedia Bugwrangler
> http://blogs.gnome.org/aklapper/
>
>
>

I'd also like to congratulate Unicornisaurous specifically, who found
a security issue in MediaWiki core [details aren't public yet].
Although technically that accomplishment isn't part of GCI, to my
knowledge it is the first time someone participating in GCI has done
that, and I thought it deserved to be mentioned.

Congratulations to everyone who participated.

Cheers,
Brian


Re: [Wikitech-l] Developer Relations Weekly Summary

2016-02-02 Thread bawolff
On Tue, Feb 2, 2016 at 9:40 AM, Rob Lanphier  wrote:
> On Thu, Jan 28, 2016 at 2:57 PM, Pine W  wrote:
>
>> Is there, or will there be, a page somewhere that describes the outcomes of
>> the Developer Summit?
>>
>
> I would love for the working group chairs to make the outcomes available
> from the main WikiDev16 page:
> <https://mediawiki.org/wiki/WikiDev16>
>
> The meeting notes from most of the meetings are there, but please add
> anything you feel is missing, and/or ask the chairs of each of the meetings
> to document their next steps.  I have this on my TODO list to continue
> following up with various working group chairs (and helping with this), but
> I also have been busy with other work, so I don't fault them at all for not
> getting to it.
>
> Pine, could you describe what you hope to get out of the information?
>
> Thanks
> Rob


Were the videos from the sessions in the main room ever uploaded somewhere?

Thanks,
Brian


Re: [Wikitech-l] Developer Relations Weekly Summary

2015-10-22 Thread bawolff
> There has been a lot of discussion recently about the Education Extension's
> problems. Any chance of getting Developer Relations resources to make
> that code at least secure and maintainable enough to keep it deployed on
> its current wikis in the short term and in order to buy some time for
> development of long term solutions?
>

I can't speak for Quim, but normally I'd expect that to be pretty far
outside what developer relations does.
Based on recent commits, community-tech seems to be the current dumping
ground for please-fix-other-people's-messes type bugs. (Although that might
just be coincidental. I doubt it has anything to do with what the team wants
to do, so much as there is overlap between what they want to do and other
people's messes.)

--
-Bawolff

Re: [Wikitech-l] LDAP extension ownership

2015-09-19 Thread bawolff
"Maintain" is an ambiguous word. The WMF has some responsibility for all the
extensions deployed on the cluster (imo). If Devunt (and any others who
were knowledgeable about the Josa extension) disappeared, the WMF would
default to becoming responsible for security and critical issues
in the extension (however, I wouldn't hold them responsible for
feature requests or minor bugs).

LDAP is used on wikitech, and some similar services. It would be nice
if teams that most directly interact with the extension (I suppose
that's labs, maybe security) help with maintenance [Maybe they already
do]. I don't necessarily think they have a responsibility to (beyond
critical issues, security, etc), but if the teams in question aren't
too busy, it is always nice to give back to projects that we use.

--
-bawolff

On Sat, Sep 19, 2015 at 5:25 AM, Yongmin Hong  wrote:
> Being deployed on the WMF cluster does not necessarily mean WMF has to
> maintain it. A simple example: [[mw:Extension:Josa]]. It's maintained by a
> 3rd-party developer independent from WMF. For example, (IIRC/AFAIK) WMF
> has no staff with Korean knowledge.
>
> [[Extension:Josa]]: https://www.mediawiki.org/wiki/Extension:Josa
>
> --
> revi
> https://revi.me
> -- Sent from Android --
> On Sep 19, 2015 at 5:27 PM, "Thomas Mulhall" wrote:
>
>> Since this is deployed on wikitech, it should be maintained by Wikimedia.
>> It is currently unmaintained, and it's better that Wikimedia maintain it:
>> they have more staff who can review, and more experience
>> doing code review.
>>
>>
>>  On Saturday, 19 September 2015, 8:39, Keegan Peterzell <
>> kpeterz...@wikimedia.org> wrote:
>>
>>
>>  On Sat, Sep 19, 2015 at 2:13 AM, Keegan Peterzell <
>> kpeterz...@wikimedia.org>
>> wrote:
>>
>> >
>> >
>> > On Sat, Sep 19, 2015 at 2:03 AM, Risker  wrote:
>> >
>> >> Well, bluntly put, since LDAP is how most non-WMF staff sign into
>> >> phabricator, I'd say it's become an essential extension.
>> >>
>> >
>> Even more technically, this (LDAP) is for people committing to Gerrit and
>> adding to Wikitech. These LDAP accounts can tie in CentralAuth through
>> OAuth.
>>
>> But again, yes, LDAP should be some sort of "maintained" in the sense that
>> Greg G. describes, and I think it will be.
>>
>> --
>> Keegan Peterzell
>> Community Liaison, Product
>> Wikimedia Foundation


Re: [Wikitech-l] Min php version

2015-07-23 Thread bawolff
>
> (Also, that only catches incompatibilities in code that has unit tests. ;)
> I've ran into 5.3 compat breakages in the past when I added new features
> with tests, that used some existing code without tests.)
>

Just for reference, in this case I'm pretty sure the code in question
has unit tests.

--bawolff


Re: [Wikitech-l] Min php version

2015-07-21 Thread bawolff
On Tue, Jul 21, 2015 at 2:44 AM, Moritz Muhlenhoff
 wrote:
> Hi,
>
> On Tue, Jul 21, 2015 at 8:13 AM, Tyler Romeo  wrote:
>
>> Just as a counter-argument (and, to be clear, I do support raising our
>> minimum version), just because PHP has EOL'ed a version does not mean that
>> some distributions (esp. Debian, Ubuntu) are not providing additional
>> support and security updates.
>>
>> If I remember from the last time we had this discussion, it will still be
>> a couple more months before PHP 5.3 is no longer supported by most major
>> distros, and it will be a while before 5.4 is no longer supported.
>>
>
> That's true, here's the specific EOL dates for common distros with PHP 5.3
>
> Debian 6.0 is supported until February 2016 and has PHP 5.3.3
>
> Ubuntu 12.04 is supported until April 2017 and has PHP 5.3.10
>
> RHEL 6/Centos is supported until June 2017 (and limited supported until
> 2020) and has PHP 5.3.3 (but they also provide officially supported 5.4/5.5
> packages)
>
> Cheers,
> Moritz

https://wikiapiary.com/w/index.php?title=Special:SearchByProperty&limit=500&offset=0&property=Has+PHP+Version&value=5.3.3
is also something to keep in mind

--bawolff


[Wikitech-l] Min php version

2015-07-19 Thread bawolff
According to our docs/internal checks, our minimum PHP version is 5.3.3.
However, as of 6e283d394f31, MediaWiki doesn't work with PHP 5.3.3 (you
aren't allowed to implement an interface using an abstract method on that
version of PHP, so you get "Fatal error: Can't inherit abstract
function IDatabase::getType() (previously declared abstract in
DatabaseBase) in git/includes/db/Database.php on line 32").

Is it time we upped our version requirement (and does anyone know whether
that would affect many third parties), or should that change be reverted?
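For third parties wondering what such a bump means in practice, the gate the docs/internal checks refer to is just a version comparison done at startup. A rough sketch of the idea in Python (MediaWiki's actual check is PHP; this is illustrative only):

```python
# Minimal sketch of a startup version gate. Illustrative only --
# not MediaWiki's actual code.

MIN_VERSION = (5, 3, 3)  # the documented minimum discussed in this thread

def parse_version(version: str) -> tuple:
    """Turn a dotted version string like '5.3.10' into a comparable tuple.

    Real-world strings can carry distro suffixes (e.g. '5.3.3-7+squeeze19');
    those are not handled here.
    """
    return tuple(int(part) for part in version.split(".")[:3])

def meets_minimum(running: str, minimum: tuple = MIN_VERSION) -> bool:
    """True if the running interpreter satisfies the minimum requirement."""
    return parse_version(running) >= minimum
```

Tuple comparison does the right thing here: (5, 3, 10) >= (5, 3, 3) but (5, 2, 17) is not.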

--bawolff


Re: [Wikitech-l] MediaWiki user survey and un-reachable users

2015-07-18 Thread bawolff
>
> Can we add a link to the survey to the top of
> https://www.mediawiki.org/wiki/Download until the end of July?
>

Sounds reasonable. I added a link to the top of that page.

--
bawolff


Re: [Wikitech-l] [ Writing a MediaWiki extension for deployment ]

2015-07-07 Thread bawolff
Yes, you can add things to the Special:Preferences page. See
https://www.mediawiki.org/wiki/Manual:Hooks/GetPreferences

--
bawolff

On Tue, Jul 7, 2015 at 11:43 AM, Paula  wrote:
> I'm talking about an extension for MediaWiki, not Wikimedia :)
> My question is whether I can add a text input on the MediaWiki user page
> under "preferences".
> Something like the example tuenti.png that I attached in the previous email.
>
>
> Cheers,
> Paula.
>
> On Tue, Jul 7, 2015 at 7:14 PM, Chris Steipp  wrote:
>
>> On Tue, Jul 7, 2015 at 9:17 AM, Paula  wrote:
>>
>> > Hello again,
>> > May I have the contact of somebody from the developing team under the
>> OAuth
>> > extension?
>> >
>>
>> Hi Paula, I'm one of the developers on that extension. As bawolff said,
>> feel free to ask here. If you're curious about something, someone else
>> probably is too, so let's keep the conversation in public.


Re: [Wikitech-l] [ Writing a MediaWiki extension for deployment ]

2015-07-07 Thread bawolff
--
- Brian
Caution: The mass of this product contains the energy equivalent of 85
million tons of TNT per net ounce of weight.


On Tue, Jul 7, 2015 at 10:17 AM, Paula  wrote:
> Hello again,
> May I have the contact of somebody from the developing team under the OAuth
> extension?
>
> I have some questions to ask them and as this a maillist on lots of
> different topics regarding wikimedia I think it would be better if I can
> contact them directly.
>
>
> Thanks for your time.
>

For what it's worth, technical discussion about implementing extensions
is in scope for this mailing list (imo), so it's perfectly fine if you
ask here. (Of course you should also feel free to email people
directly if you prefer that.)


>So does this mean that there is no chance my implementation with Latch will
>also be available for MediaWiki users?

Just to be clear, do you mean MediaWiki or Wikimedia users (Wikimedia
= Wikipedia and related sites. MediaWiki = anyone who uses the
MediaWiki software)?

You can make any sort of MediaWiki extension you want, even if it
duplicates another extension, and people may use it. For Wikimedia,
probably only one extension would be chosen. Wikimedia also generally
has very picky standards for using extensions, and it can be a lot of
work (and politics!) to get an extension used on Wikimedia wikis. The
existing 2 factor auth extension is (AFAIK) only used on wikitech,
which is kind of different from being used on other Wikimedia wikis,
so most wikis have no 2 factor auth solution deployed to them.

I would definitely recommend coordinating with people working on
similar extensions.

Pine said:
>I for one would be interested in moving that extension out of beta and
>having an option in site configuration to require some or all users to have
>two factor authentication enabled.

I think you misunderstand the extension status bar on mw.org. That is
self-reported by the author, and the requirements are not consistently
enforced. The contents of this status really has nothing to do with if
the extension is actually used on Wikimedia wikis, so "taking it out
of beta" would basically mean making a (rather meaningless) edit to
the extension description page.

--bawolff


Re: [Wikitech-l] [Engineering] The end of the Roadmap Phabricator project

2015-07-04 Thread bawolff
>
> From what I read here, the current roadmap software is difficult to use and
> is not being used consistently;

In fact, until this thread, I didn't even know that people were making
a roadmap in phab...

--
bawolff


Re: [Wikitech-l] Why doesn't en.m.wikipedia.org allow framing?

2015-05-15 Thread bawolff
On Fri, May 15, 2015 at 3:14 PM, Jacek Wielemborek  wrote:
> Hello,
>
> I tried to discuss this on #wikimedia-mobile on Freenode, but nobody
> could explain this to me:
>
> I'm building a website that allows the users to view Wikipedia changes
> correlated to rDNS names of their editors and I wanted to implement a
> "random mode" that allows them to see all edits made by a given rDNS
> domain - the user would just press F5 and see the editor in context like
> this:
>
> http://wikispy.wmflabs.org/by_rdns_random/plwiki/.gov.pl
>
> I would definitely prefer to use the mobile version of Wikipedia though
> or at least Special:MobileEdit, but both disallow framing. Is there any
> specific reason for that? I would guess that this is for security, but I
> have to admit I don't know what could be gained by showing the
> MobileDiff in a frame.
>
> Cheers,
> d33tah
>
>

I don't know about normal mobile page views, but edit views are not
allowed to be framed, to prevent clickjacking attacks [1].

--bawolff

[1] https://en.wikipedia.org/wiki/Clickjacking
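The restriction described above is conceptually an X-Frame-Options response header sent on frame-sensitive views. A minimal sketch of the idea; the view names and policy here are illustrative assumptions, not MediaWiki's actual code:

```python
# Sketch of the anti-clickjacking rule: edit views refuse to be framed
# via the X-Frame-Options header. View names are hypothetical.

FRAME_SENSITIVE_VIEWS = {"edit", "mobilediff"}  # illustrative names

def response_headers(view: str) -> dict:
    """Return HTTP response headers for a given view name."""
    headers = {"Content-Type": "text/html; charset=utf-8"}
    if view in FRAME_SENSITIVE_VIEWS:
        # DENY tells the browser never to render this response inside a
        # frame, so a hostile page can't overlay invisible UI on top of
        # it and trick a logged-in user into clicking "save".
        headers["X-Frame-Options"] = "DENY"
    return headers
```

A view that allows framing simply omits the header, which is why embedding plain pages can work while the protected views refuse to load in a frame.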


Re: [Wikitech-l] MediaWiki Security and Maintenance Releases: 1.19.24, 1.23.9, and 1.24.2

2015-04-01 Thread bawolff
On Tue, Mar 31, 2015 at 6:20 PM, Chris Steipp  wrote:
> I would like to announce the release of MediaWiki 1.24.2, 1.23.9 and
> 1.19.24. These releases fix 10 security issues, in addition to other bug
> fixes. Download links are given at the end of this email.
>
>
> == Security fixes ==
>
> * iSEC Partners discovered a way to circumvent the SVG MIME blacklist for
> embedded resources (iSEC-WMF1214-11). This allowed an attacker to embed
> JavaScript in the SVG. The issue was additionally identified by Mario
> Heiderich / Cure53. MIME types are now whitelisted.
> <https://phabricator.wikimedia.org/T85850>
>
> * MediaWiki user Bawolff pointed out that the SVG filter to prevent
> injecting JavaScript using animate elements was incorrect.
> <https://phabricator.wikimedia.org/T86711>
>
> * MediaWiki user Bawolff reported a stored XSS vulnerability due to the way
> attributes were expanded in MediaWiki's Html class, in combination with
> LanguageConverter substitutions.
> <https://phabricator.wikimedia.org/T73394>
>
> * Internal review discovered that MediaWiki's SVG filtering could be
> bypassed with entity encoding under the Zend interpreter. This could be
> used to inject JavaScript. This issue was also discovered by Mario Gomes
> from Beyond Security.
> <https://phabricator.wikimedia.org/T88310>
>
> * iSEC Partners discovered a XSS vulnerability in the way api errors were
> reflected when running under HHVM versions before 3.6.1 (iSEC-WMF1214-8).
> MediaWiki now detects and mitigates this issue on older versions of HHVM.
> <https://phabricator.wikimedia.org/T85851>
>
> * Internal review and iSEC Partners discovered (iSEC-WMF1214-1) that
> MediaWiki versions using PBKDF2 for password hashing (the default since
> 1.24) are vulnerable to DoS attacks using extremely long passwords.
> <https://phabricator.wikimedia.org/T64685>
>
> * iSEC Partners discovered that MediaWiki's SVG and XMP parsing, running
> under HHVM, was susceptible to "Billion Laughs" DoS attacks
> (iSEC-WMF1214-13).
> <https://phabricator.wikimedia.org/T85848>
>
> * Internal review found that MediaWiki is vulnerable to "Quadratic Blowup"
> DoS attacks, under both HHVM and Zend PHP.
> <https://phabricator.wikimedia.org/T71210>
>
> * iSEC Partners discovered a way to bypass the style filtering for SVG
> files (iSEC-WMF1214-3). This could violate the anonymity of users viewing
> the SVG.
> <https://phabricator.wikimedia.org/T85349>
>
> * iSEC Partners reported that the MediaWiki feature allowing a user to
> preview another user's custom JavaScript could be abused for privilege
> escalation (iSEC-WMF1214-10). This feature has been removed.
> <https://phabricator.wikimedia.org/T85855>
>
>
> Additionally, the following extensions have been updated to fix security
> issues:
>
> * Extension:Scribunto - MediaWiki user Jackmcbarn discovered that function
> names were not sanitized in Lua error backtraces, which could lead to XSS.
> <https://phabricator.wikimedia.org/T85113>
>
> * Extension:CheckUser - iSEC Partners discovered that the CheckUser
> extension did not prevent CSRF attacks on the form allowing checkusers to
> look up sensitive information about other users (iSEC-WMF1214-6). Since the
> use of CheckUser is logged, the CSRF could be abused to defame a trusted
> user or flood the logs with noise.
> <https://phabricator.wikimedia.org/T85858>
>
[..]
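A side note on the PBKDF2 item above: the usual mitigation for that class of DoS is to cap password length before hashing, since the cost of a PBKDF2 call can grow with both the iteration count and (in many implementations) the input length. A sketch of the mitigation; the cap value is an assumption, not what MediaWiki actually uses:

```python
import hashlib

MAX_PASSWORD_BYTES = 4096  # hypothetical cap: generous for humans

def hash_password(password: str, salt: bytes, iterations: int = 10_000) -> bytes:
    """PBKDF2-SHA256 with a length cap, so a client can't submit a
    multi-megabyte "password" and burn server CPU on every login attempt."""
    data = password.encode("utf-8")
    if len(data) > MAX_PASSWORD_BYTES:
        # Reject before doing any expensive work.
        raise ValueError("password too long")
    return hashlib.pbkdf2_hmac("sha256", data, salt, iterations)
```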

Sounds like MediaWiki came through the security audit better than I
expected. The most serious issue they found seems to be the JS preview
one (imho; I'm assuming you can't really do much with the SVG exploits
beyond phishing, since they live on upload.wikimedia.org and only have
access to the GeoIP cookies). So congratulations, all!

--bawolff


Re: [Wikitech-l] What does it take to have a project hosted on the Wikimedia git server?

2015-03-13 Thread bawolff
>
> I am personally a bit worried about the complexity of the process on
> gerrit, but I hope that as long as we don't require formal code review
> it should be as simple as git pull/git push, right?

The code review bit is optional, when asking for your repo you can ask
for a "straight push model".

Or you can do the code review thing, but then decide that everyone is
allowed to review their own patch and essentially ignore that aspect.

Gerrit may be a usability trainwreck, but it's not that bad once you
get used to it.

--bawolff


Re: [Wikitech-l] Tor proxy with blinded tokens

2015-03-10 Thread bawolff
On Tue, Mar 10, 2015 at 11:23 AM, Chris Steipp  wrote:
> Jacob Applebaum made another remark about editing Wikipedia via tor this
> morning. Since it's been a couple months since the last tor bashing thread,
> I wanted to throw out a slightly more modest proposal to see what people
> think.
[..]

If enwiki doesn't like this, let's start with other wikis. We run
something like 700 wikis; I'm sure at least some of them would like
the idea. Having some other wiki than enwiki go first and demonstrate
that this is workable without vandals taking over may help alleviate
enwiki fears.

--bawolff


Re: [Wikitech-l] New tabs for images

2014-06-01 Thread bawolff
On Jun 1, 2014 5:12 AM, "ENWP Pine"  wrote:
>
> So here I am working late night / early morning trying to get the
Signpost published, and I see something new at the top of image pages on
English Wikipedia such as https://en.wikipedia.org/wiki/File:Wikidata.png
>
> We now have "View on Wikimedia Commons" and "Add local description" tabs.
Cool!

I agree. Thank you This,_that_and_the_other for making that feature happen.

>
> Can we get the local description tab to appear on Commons also?

Which project would that link to? Commons files are used on over 300
projects.

> Also, because the local description tab opens a free-text entry space,
can that tab be split into "Add location" and "Add description" with the
former opening up places to enter geolocation data, nearby landmarks, or an
address?

That is actually the type of info that is supposed to go on Commons, not
locally. There are templates for that sort of thing. Direct entry is a bit
harder; maybe the use of Wikidata on Commons that is planned for the
nearish future will make entry of such discrete things easier. Otherwise
maybe we could have a "HotCat"-style thing for amending file information.
It was recently suggested to me that we would benefit from something
similar for adding translations to file descriptions.

--bawolff

Re: [Wikitech-l] [Multimedia] ogv.js media player update: Flash and GPU acceleration

2014-03-30 Thread bawolff
>
>
> I've also compared performance to the Cortado Java applet we currently
use -- Cortado is still a little faster in terms of CPU usage, but getting
the applet to actually *run* with the current version of Java is a
nightmare -- even the signed version of the applet from theora.org requires
adding a security exception -- whereas JS or Flash "just works".

For the curious, I looked into the Java applet a while back. The reason the
Cortado applet from theora.org doesn't work without a security exception is
that they signed it wrong. It needs to be signed with an attribute set
saying that it doesn't need extra privs; that attribute isn't set on the
theora.org version. Even then it still needs a click-through prompt.

Anyways, great work on all this.

--bawolff

Re: [Wikitech-l] [Design] Going full-screen on image click

2014-03-29 Thread bawolff
>
> The WMF Engineering Team is kindly working on Media Viewer, which would
show a pop-up of some sort when you click an image.

Sorry for being overly pedantic, but it's probably better to say the WMF
multimedia team is working on Media Viewer. WMF engineering is a very big
group of people, the vast majority of whom are not working on Media Viewer.



Cheers,
Bawolff

Re: [Wikitech-l] [Design] Preview of the proposal for MediaWiki Homepage

2014-02-24 Thread bawolff
> -
> > >
> > > What's the Publish/Discuss/Translate/etc blocks for? They look like
> > > navigation but don't seem to go anywhere or correspond to anything.
> >
> > They attempt to summarize the best features that MediaWiki can offer.
> > Indeed, there are no detailed product descriptions to link to, but this
> > is because we don't have them. I would say this is better than nothing.
> > Currently you either know what MediaWiki plus selected extensions can
> > offer, or you guess it by becoming a Wikipedia power user, or you need
> > to connect many pages in mediawiki.org.
>

When I read that page it makes it seem like these features are available
out of the box, which is kind of misleading. In particular, claiming we have
WYSIWYG editing without mentioning that VisualEditor is extremely difficult
to install doesn't seem like a good idea.

Personally, I thought the translate box meant that the MediaWiki interface
is translated into many languages (something we can certainly brag about).

Overall I like the idea of the redesign that you have in the uploaded file.

> > > Bugzilla should have a prominent link. Sysadmins and other users who
> > > found bugs are not necessarily looking to 'get involved'. They found
> > > bugs and want to report them or find fixes. They're looking for a bug
> > > thing. Where is that?
>

+1. Bugzilla is very important for downstream users who have bugs.

> > "Support" links to https://www.mediawiki.org/wiki/Project:Support_desk ,
> > which is where many MediaWiki sysadmins go when they have/find problems.
> > Bugzilla is not visibly featured there, and it probably should be.

Getting support is different from filing bugs.

-bawolff

Re: [Wikitech-l] Lower-resolution .ogv video transcodes coming

2014-02-21 Thread bawolff
On Thu, Feb 20, 2014 at 7:07 PM, Brion Vibber  wrote:
[..]
>
>
> Files should gradually populate at the smaller sizes as they get referenced
> and the new sizes are automatically added to the transcoding queue.
>
> Please give a shout if there's any problems.
>
> -- brion

Looks like this actually adds them to the queue all at once - 23,386
160p videos queued, 11,745 160p transcodes already done (!), which
means about 85% of all videos are either already transcoded to 160p,
or in the queue.

This might cause some delays in transcoding newly uploaded files, but
given that in a single day there's already been almost 20,000 new
transcodes, it looks like it won't take that long to be done with all
of them. I'm really quite surprised how fast the transcoding is
proceeding.

--bawolff


Re: [Wikitech-l] Template transclusion on Wikimedia Commons

2014-01-27 Thread bawolff
On Mon, Jan 27, 2014 at 2:12 PM, Tuszynski, Jaroslaw W.
 wrote:
> I lately switched Commons coordinate template
> https://commons.wikimedia.org/wiki/Template:Location from using dozens
> of subtemplates to much cleaner Lua code. Everything works great, except
> for Special:WhatLinksHere links. The no longer used subtemplates like
> https://commons.wikimedia.org/wiki/Template:Location/layout still show
> over 3M files transcluding it and files, like
> https://commons.wikimedia.org/wiki/File:Estadio_Pacaembu.jpg  which
> still claims to transclude [[Template:Location/layout]] ("Templates used
> on this page:" section in edit mode). The transclusion dependencies do
> not correct themselves even with a Purge, but page edit fixes it. Also
> several times I got some sort of internal database errors when using
> Special:WhatLinksHere.
>
>
>
> In the past when I ran into this kind of trouble I waited, but in some
> cases it took over half a year for the database to update itself. In
> other cases those issues never clear (or I did not wait long enough),
> like with Category:Pages with malformed coordinate tags
> <https://commons.wikimedia.org/wiki/Category:Pages_with_malformed_coordi
> nate_tags>  , which due to a long fixed bug in  {{#coordinates:
> <https://www.mediawiki.org/wiki/Extension:GeoData> }} was filled with
> 40k files which did not have any issues. Then the only solution I know
> is  Pywikibot/touch.py
> (https://www.mediawiki.org/wiki/Manual:Pywikipediabot/touch.py ), which
> I recently used to clear {{#coordinates:
> <https://www.mediawiki.org/wiki/Extension:GeoData> }} issues.
>
>
>
> Is there a better way to synchronize database with reality?
>
>
>
>
>
> Jarek T.
>
> (user:jarekt <http://commons.wikimedia.org/wiki/User:Jarekt> )
>
>
>
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

It takes a lot of work to reparse 4 million pages. If it makes you
feel better, the toolserver transclusion count you linked to is about
2 pages higher than the real number of transclusions.

>but in some
> cases it took over half a year for the database to update itself

If something is taking 6 months, it probably means that some sort of
bug happened, and the pages aren't being refreshed at all.

> The transclusion dependencies do
> not correct themselves even with a Purge, but page edit fixes it.

Yes, null edits update links tables (e.g. categories, whatlinkshere,
Special:LinkSearch, etc.); purges do not.
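The distinction can be modelled with a toy sketch (class and method names here are invented for illustration; this is not MediaWiki's actual implementation):

```python
# Toy model of the purge vs. null-edit distinction.
# Hypothetical names; MediaWiki's real code is structured differently.

class Page:
    def __init__(self, wikitext):
        self.wikitext = wikitext
        self.parser_cache = None   # cached rendered output
        self.links_table = set()   # rows kept in e.g. templatelinks

    def _parse(self):
        # Pretend every whitespace-separated token starting with
        # "Template:" is a transclusion.
        links = {w for w in self.wikitext.split() if w.startswith("Template:")}
        return "<html>", links

    def purge(self):
        # A purge only discards the cached rendering; the links
        # tables are left alone, so stale rows survive.
        self.parser_cache = None

    def null_edit(self):
        # A null edit (saving with no change) re-parses the page AND
        # rewrites the links tables from the fresh parse.
        self.parser_cache, self.links_table = self._parse()

page = Page("Template:Location/layout plus text")
page.null_edit()
print(page.links_table)  # -> {'Template:Location/layout'}

page.wikitext = "now uses Lua instead"
page.purge()
print(page.links_table)  # still stale: {'Template:Location/layout'}
page.null_edit()
print(page.links_table)  # -> set()
```

This is why the stale [[Template:Location/layout]] rows survive a purge but disappear after an edit (or a bot "touch", which is effectively a null edit).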

> Also
> several times I got some sort of internal database errors when using
> Special:WhatLinksHere.

That shouldn't happen. What sort of things were you doing when the
error happened?


--bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Review Milestone reached

2014-01-17 Thread bawolff
Aaron Schulz has become the first person to have approved (+2'ed) at
least 1000 patchsets to MediaWiki core [1]. I thought this nice round
number deserved a note and a "good job". Thank you, Aaron, for all your
hard work reviewing things.

--bawolff


[1] https://toolserver.org/~nemobis/crstats/core.txt

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] trace gitblit, was: Re: Wikimedia's anti-surveillance plans: site hardening

2013-08-17 Thread bawolff
> yes ken, you are right, lets stick to the issues at hand:
> (1) by when you will finally decide to invest the 10 minutes and
> properly trace the gitblit application? you have the commands in the
> ticket:
> https://bugzilla.wikimedia.org/show_bug.cgi?id=51769
>
> (2) by when you will adjust your operating guideline, so it is clear
> to faidon, ariel and others that 10 minutes tracing of an application
> and getting a holistic view is mandatory _before_ restoring the
> service, if it goes down for so often, and for days every time. the 10
> minutes more can not be noticed if it is gone for more than a day.


What information are you hoping to get from a trace that isn't currently known?

I'm not involved with the issue, and don't know specifics, but reading
the bug it sounds like there isn't any information we need from a
trace at the moment. (It sounds like there was a point in the
debugging stage where that would be useful, but that's in the past)

If you want people to trace something, you should justify why it will
help you fix the issue, discover what's going on, etc. I would consider
it inappropriate for ops to wait 10 minutes before restarting
something in order to get a stack trace if we didn't need the info
in the trace.

--bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Remove 'visualeditor-enable' from $wgHiddenPrefs

2013-07-22 Thread bawolff
On Mon, Jul 22, 2013 at 10:40 AM, Bartosz Dziewoński
 wrote:
> This isn't an appropriate list for this, but MaxSem and hashar told me to
> post it here anyway, so here goes.
>
> There's a patch[1] to remove 'visualeditor-enable' from $wgHiddenPrefs,
> essentially allowing for disabling VE on a per-user basis again. It has
> overwhelming community support, but the VisualEditor team is refusing to
> acknowledge it, and ops say it's "none of their business".
>
> Can something be done about it?
>
> [1] https://gerrit.wikimedia.org/r/#/c/73565/
>
>
> --
> Matma Rex
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

I'm confused: did we really disable the option to opt out of visual
editor solely because that was easier than updating the description
for the preference?

--bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] OPW intern looking for feedback!

2013-03-27 Thread bawolff
On Wed, Mar 27, 2013 at 9:31 AM, Daniel Friesen
 wrote:
> On Wed, 27 Mar 2013 00:19:53 -0700, Brian Wolff  wrote:

>
>
> Please don't. I've been trying to slowly move us away from depending on
> wgSecretKey's secrecy for security. Eventually I hope to try an eliminate
> dependence on it from extensions too. And in an ideal case, eventually stop
> setting it in the installer (unless you have an edge case where a little
> more entropy for CryptRand could be useful; Or maybe not, I need to double
> check which case that was, but it might not even exist anymore with our
> version requirements).
>
> I see people over and over asking for help and inadvertently handing that
> information which is supposed to remain secret right over in public.
>
> Instead of trying to make the paths a secret just don't put that data inside
> of public /tmp directories.
> I recommend setting your git director config to false and in an extension
> setup function set it to some path based on the upload directory.
> This is basically what we used to do with $wgTmpDirectory which was used by
> CACHE_DBA.
>
>
>
> --
> ~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]
>
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Getting slightly off-topic, but a world where people stop spamming us
with $wgSecretKey would be nice ;)

However, you're still going to have $wgUpgradeKey and $wgDBpass ...
Perhaps it'd be cool to split LocalSettings.php into LocalSettings.php
and PrivateSettings.php


> I recommend setting your git director config to false and in an extension
> setup function set it to some path based on the upload directory

Given that the upload directory is web accessible (and many people
don't even turn off php_engine in that directory [speaking of which,
why don't we add that to the default .htaccess for that directory]),
having arbitrary git checkouts in such a directory seems kind of scary
too.


--bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] GSOC 2013

2013-03-27 Thread bawolff
On Wed, Mar 27, 2013 at 11:50 AM, Nadeem, IIT Kgp
 wrote:
> Hello,
>
> I am Nadeem Anjum, a third year bachelor's student of the Department of
> Computer Science and Engineering, IIT Kharagpur.
>
> I am really interested in becoming a part of MediaWiki for GSoC 2013. I
> have browsed through the project ideas and got interested in Automatic
> Category Redirects.
>
> As far my skill set, I am well versed in PHP, MySQL, JavaScript, jQuery,
> HTML, CSS, Java, C, C++ and Python.
>
> I have been an active contributor to numerous development and open-source
> projects: http://cse.iitkgp.ac.in/~nanjum/OpenSourceProjects.html
>
> Please guide me on how I should proceed towards my proposal for GSoC.
>
> Thanking you,
> Nadeem Anjum.
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Awesome.

One thing you could do is read up on the area your project would
cover. I'm not sure how familiar you are with MediaWiki, but if you're
not too familiar, as a first step start playing with categories in
MediaWiki (both the normal web interface and the list=categorymembers
API module) just to get an idea of how categories work. Once you've
got the basic idea, read about how categories are stored (
http://www.mediawiki.org/wiki/Manual:Categorylinks_table ). See if you
can figure out what type of queries are used to get lists of
categories, etc. You could also try skimming through some of the
relevant PHP files (in this case, CategoryViewer.php would be a good
place to start; Title.php is another good thing to skim regardless of
the chosen project). Some of these PHP files will be confusing at
first, so don't worry if you don't understand everything; the point is
just to explore MediaWiki so you're exposed to the code.
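For a feel of what those queries involve, here is a minimal sqlite sketch of the kind of lookup a category listing boils down to (columns simplified; the real categorylinks table has more fields, such as cl_type, and different key types):

```python
import sqlite3

# Simplified stand-in for MediaWiki's categorylinks table.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE categorylinks (
    cl_from INTEGER,   -- page_id of the member page
    cl_to TEXT,        -- category name, without the "Category:" prefix
    cl_sortkey TEXT    -- key used to order the category listing
)""")
conn.executemany(
    "INSERT INTO categorylinks VALUES (?, ?, ?)",
    [(1, "Physics", "Quantum mechanics"),
     (2, "Physics", "Classical mechanics"),
     (3, "Biology", "Cells")],
)

# Roughly what "list all members of Category:Physics" comes down to.
members = conn.execute(
    "SELECT cl_from FROM categorylinks "
    "WHERE cl_to = ? ORDER BY cl_sortkey",
    ("Physics",),
).fetchall()
print(members)  # -> [(2,), (1,)]
```

Playing with a sketch like this makes CategoryViewer.php's queries much easier to recognise when you read the real code.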

Other things you can do: follow the instructions on the "How to become
a MediaWiki hacker" page, get a labs account, make a git clone,
install a copy of MediaWiki locally, etc. If you can manage to find a
simple bug to fix (try looking through the list
https://bugzilla.wikimedia.org/buglist.cgi?keywords=easy&keywords_type=allwords&list_id=166192&bug_severity=normal&bug_severity=minor&bug_severity=trivial&bug_severity=enhancement&query_format=advanced&bug_status=NEW&bug_status=ASSIGNED&bug_status=REOPENED
) and submit a patch for it, that will probably give you major bonus
points on your GSoC proposal.

So basically, try reading docs and code and see how far you get. It's
not expected that you'll be able to start reading the PHP code and
have everything make sense instantly, so don't be discouraged if
nothing clicks at first; seeing how much of it you can make sense of
can be quite beneficial.

Last of all, don't be afraid to ask questions. There are lots of
people on IRC who can probably answer any questions you may have.

--
-Bawolff (Brian Wolff)

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] job queue

2013-03-25 Thread bawolff
On Mon, Mar 25, 2013 at 1:34 PM, dan entous  wrote:
> context
> ---
> i’m working on a mediawiki extension, 
> http://www.mediawiki.org/wiki/Extension:GWToolset, which has as one of its 
> goals, the ability to upload media files to a wiki. the extension, among 
> other tasks, will process an xml file that has a list of urls to media files 
> and upload those media files to the wiki. our ideal goal is to have this 
> extension run on http://commons.wikimedia.org/.
>
>
> job queue goals
> ---
> 1. setup the processing of the xml file as a job queue job
> 2. each media file upload to be setup as a job queue job
>
>
> current implementation
> --
> i have been able to achieve goal 2 and will sort out goal 1 shortly.
>
>
> issues/questions
> 
> 1. each of these jobs can take several seconds to complete. i have noticed in 
> my local wiki that each of these jobs is picked up with each wiki visit and 
> slows down the response of the wiki by however many seconds the job takes to 
> run, a sleep in the job shows that if the job takes 15 seconds to run the 
> wiki will be slowed down by that amount of time; i don't want this to happen 
> on my local wiki or on commons.
>
>a. are jobs on commons run as part of each wiki visit?
>b. is there a cron job that takes care of the job queue on commons instead 
> of using each wiki visit?
>c. if not, is there a way to indicate that the job should only be run as a 
> cron job and not with a wiki visit?
>
> 2. if there's no solution to running the job with each wiki visit and slowing 
> down the site, what other suggestions are there on processing the xml file 
> and individual media file uploads?
>
>
> thanks in advance!
> dan
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Given your use case, it seems like making a maintenance script that
reads the XML file and imports the images might make more sense. Mass
uploads are rare enough events that I imagine the process of somebody
running the maintenance script would not be prohibitive (but that's
just a guess).
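For illustration, the core loop of such a script might look roughly like this (the XML layout and the upload() stub are hypothetical; the real GWToolset schema and upload path will differ):

```python
import xml.etree.ElementTree as ET

def media_urls(xml_text):
    """Yield media-file URLs from a record-style XML document."""
    root = ET.fromstring(xml_text)
    for record in root.iter("record"):
        url = record.findtext("url")
        if url:
            yield url

def upload(url):
    # Stand-in for the real upload step, which would go through the
    # wiki's upload machinery (e.g. via the action API) per file.
    print("uploading", url)

sample = """
<records>
  <record><url>http://example.org/a.jpg</url></record>
  <record><url>http://example.org/b.jpg</url></record>
</records>
"""

for u in media_urls(sample):
    upload(u)
```

A maintenance script like this runs from the command line, so long-running uploads never block a web request the way per-visit job execution does.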

>a. are jobs on commons run as part of each wiki visit?
>b. is there a cron job that takes care of the job queue on commons instead 
> of using each wiki visit?

No, Commons uses runJobs.php (I believe; at the very least, jobs are
not run from web requests). I doubt cron is used; I imagine there are
enough jobs that there is a server (probably more than one)
dedicated solely to dealing with the job queue at all times.

--bawolff

p.s. As an aside, looking for this source code I noticed it's developed
on GitHub "until it is ready for full review". You do realize that if
you wanted to, you could put the code in Wikimedia's Gerrit and ignore
the review features until you're ready for the full review.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Wikiversity: Request for MediaWiki extension

2013-03-15 Thread bawolff
On Fri, Mar 15, 2013 at 12:15 PM, Hendrik Weimer
 wrote:
> Hello,
>
> I am going to teach a graduate-level course on quantum physics (quantum
> simulation) at the University of Hannover, Germany. At the moment, I am
> evaluating Wikiversity for collaboration with other lecturers and
> students on the creation of lecture notes.
>
> As the notes will contain a significant amount of mathematical content,
> it would be extremely useful to export into LaTeX format for
> distributing printed copies. As far as I know, the Wiki2LaTeX extension
> to MediaWiki [1] provides this function. Would it be possible to include
> this extension into Wikiversity?
>
> Best regards,
>
> Hendrik
>
> [1] <http://www.mediawiki.org/wiki/Extension:Wiki2LaTeX>
>
> --
> Dr. Hendrik WeimerPhone:  +49-511-762-4836
> Institut für Theoretische Physik  Fax:+49-511-762-3023
> Leibniz Universität Hannover  E-Mail: hwei...@itp.uni-hannover.de
> Appelstr. 2, 30167 Hannover, GERMANY  http://www.itp.uni-hannover.de/~weimer/
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

That's a large extension, which means the review effort would be
non-trivial (although just looking at it, I noticed a couple of lines
which look suspicious). It's probably unlikely to be deployed unless
people _really_ wanted it, due to the amount of review required. If it
were to be reviewed, the process would probably take a long time (so
probably outside your time frame).

Thus I don't think it's likely to be installed :(


--bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Indexing structures for Wikidata

2013-03-08 Thread bawolff
On Thu, Mar 7, 2013 at 12:50 PM, Denny Vrandečić
 wrote:
> As you probably know, the search in Wikidata sucks big time.
>
> Until we have created a proper Solr-based search and deployed on that
> infrastructure, we would like to implement and set up a reasonable stopgap
> solution.
>
> The simplest and most obvious signal for sorting the items would be to
> 1) make a prefix search
> 2) weight all results by the number of Wikipedias it links to
>
> This should usually provide the item you are looking for. Currently, the
> search order is random. Good luck with finding items like California,
> Wellington, or Berlin.
>
> Now, what I want to ask is, what would be the appropriate index structure
> for that table. The data is saved in the wb_terms table, which would need
> to be extended by a "weight" field. There is already a suggestion (based on
> discussions between Tim and Daniel K if I understood correctly) to change
> the wb_terms table index structure (see here <
> https://bugzilla.wikimedia.org/show_bug.cgi?id=45529> ), but since we are
> changing the index structure anyway it would be great to get it right this
> time.
>
> Anyone who can jump in? (Looking especially at Asher and Tim)
>
> Any help would be appreciated.
>
> Cheers,
> Denny
>
> --
> Project director Wikidata
> Wikimedia Deutschland e.V. | Obentrautstr. 72 | 10963 Berlin
> Tel. +49-30-219 158 26-0 | http://wikimedia.de
>
> Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V.
> Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg unter
> der Nummer 23855 B. Als gemeinnützig anerkannt durch das Finanzamt für
> Körperschaften I Berlin, Steuernummer 27/681/51985.
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

AFAIK SQL isn't particularly good at indexing that type of query.

You could maybe have a bunch of indexes for the first couple of
letters of a term, and then past some point hope that things are
narrowed down enough that just doing a prefix search is acceptable.
For example, you might have indexes on (wb_term(1), wb_weight),
(wb_term(2), wb_weight), ..., (wb_term(7), wb_weight) and one on just
wb_term. That way (I believe) you would be able to do efficient
searches for a prefix ordered by weight, provided the prefix is less
than 7 characters. (7 was chosen arbitrarily out of a hat; performance
goes down as you add more indexes, from what I understand, and I'm not
sure how far you could take this scheme before that becomes an issue.
You could maybe enhance this by only showing search suggestion updates
for every 2 characters the user enters, or something.)
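To make the idea concrete, here is a quick pure-Python simulation of the lookup pattern (dictionaries standing in for the MySQL prefix indexes; the terms and weights are made up):

```python
from collections import defaultdict

# Simulate prefix indexes (wb_term(1..7), weight): one dict per prefix
# length k, each bucket pre-sorted by descending weight (sitelink count).
MAX_PREFIX = 7
terms = [("berlin", 300), ("berlingo", 2), ("bern", 80), ("california", 250)]

indexes = {k: defaultdict(list) for k in range(1, MAX_PREFIX + 1)}
for term, weight in terms:
    for k in range(1, MAX_PREFIX + 1):
        indexes[k][term[:k]].append((weight, term))
for idx in indexes.values():
    for bucket in idx.values():
        bucket.sort(reverse=True)  # heaviest first

def prefix_search(prefix, limit=5):
    """Return up to `limit` terms matching `prefix`, best-weighted first."""
    k = min(len(prefix), MAX_PREFIX)
    bucket = indexes[k].get(prefix[:k], [])
    # For prefixes longer than MAX_PREFIX, filter the coarser bucket.
    hits = [t for w, t in bucket if t.startswith(prefix)]
    return hits[:limit]

print(prefix_search("ber"))  # -> ['berlin', 'bern', 'berlingo']
```

Note how "Berlin" sorts ahead of "Berlingo" purely on weight, which is the point of the exercise: a plain prefix index can't return results pre-ordered by weight, but one index per prefix length can.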

--bawolff

p.s. Have not tested this, and talking a bit outside my knowledge area, so ymmv

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Identifying pages that are slow to render

2013-03-07 Thread bawolff
On 2013-03-07 4:06 PM, "Matthew Flaschen"  wrote:
>
> On 03/07/2013 12:00 PM, Antoine Musso wrote:
> > Le 06/03/13 23:58, Federico Leva (Nemo) a écrit :
> >> There's slow-parse.log, but it's private unless a solution is found for
> >> https://gerrit.wikimedia.org/r/#/c/49678/
> >> https://wikitech.wikimedia.org/wiki/Logs
> >
> > And slow-parse.log is probably going to be kept private unless proven it
> > is not harmful =)
>
> Why would it be harmful for public wikis?  Anyone can do this on an
> article-by-article basis by copying the source their own MediaWiki
> instances.
>
> But it ends up being repeated work.
>
> Matt
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

+1. I have trouble imagining how making this public could be harmful.
There are plenty of well-known slow-to-parse pages already. There are
also more than a couple of ways to convince MW to make slow queries
(longer than the PHP time limit), we publicly release detailed
profiling data, etc. While that sort of thing isn't exactly proclaimed
to the world, it's also not a secret. If someone wanted to find slow
points in MediaWiki, there are far worse things floating around the
internet than a list of slow-to-parse pages.

-bawolff
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] linking to wikidata pages

2013-02-22 Thread bawolff
On Fri, Feb 22, 2013 at 10:46 AM, Lydia Pintscher
 wrote:
> On Fri, Feb 22, 2013 at 3:43 PM, Tuszynski, Jaroslaw W.
>  wrote:
>> Two separate users on Commons complained that interproject links from 
>> Wikimedia Commons to Wikidata, like [[d:Q7186]] produce hyperlinks in form 
>> "http://wikidata.org/wiki/Q35548"; or " http://en.wikidata.org/wiki/Q35548"; 
>> which do not allow editing of wikidata. Only external link to 
>> http://www.wikidata.org/wiki/Q35548 leads to editable page. See 
>> http://commons.wikimedia.org/wiki/User_talk:Jarekt#a_small_modification_to_.7B.7BCreator.7D.7D
>>  and
>> http://commons.wikimedia.org/wiki/Template_talk:Creator#Wikidata_hyperlink .
>>
>> I cannot reproduce those problems, since all 3 types of links seems to lead 
>> to editable pages for me. Did anybody on this list, know what might cause 
>> issues for those users. One clue might be that both use "French" as their 
>> native language.
>>
>> Any ideas?
>
> It's a known problem, yes. The pages are editable but the edit doesn't
> actually save. It's one of the main pain points we have atm. Fix
> waiting for review and deployment by ops:
> https://gerrit.wikimedia.org/r/#/c/49069/
> https://bugzilla.wikimedia.org/show_bug.cgi?id=45005
>
>
> Cheers
> Lydia
>
> --
> Lydia Pintscher - http://about.me/lydia.pintscher
> Community Communications for Wikidata
>
> Wikimedia Deutschland e.V.
> Obentrautstr. 72
> 10963 Berlin
> www.wikimedia.de
>
> Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
>
> Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
> unter der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das
> Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

It seems like there's more than one problem here. Besides the fact
that these links should redirect to the canonical URL:
*Why are interwikis going to the non-canonical URL? That seems like
the interwiki map is misconfigured. We can make both commons: and
wikidata: link to the proper URL, so why does "d:" have a lang subdomain?
*Why do edits not work on the non-canonical URL? (Is it because the
API requests go to a fully qualified URL based on $wgServer instead of
a relative URL? Is there a reason for doing this?)
--
- bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Who is responsible for accepting backported patch sets for maintained versions?

2013-02-21 Thread bawolff
>
> * Work to get bugfixes backported to 1.19.  I don't have Gerrit
>   rights to commit to the REL1_19 branch, but that will keep me from
>   fixing "bugs" by fiat.

I think we should give you such rights.

--bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Corrupt pages on English Wikipedia

2013-02-20 Thread bawolff
On Wed, Feb 20, 2013 at 9:59 AM, Sumana Harihareswara
 wrote:
> Brian, would you take a look at
> https://www.mediawiki.org/wiki/How_to_report_a_bug and maybe update it
> to clarify what sorts of information to try to hold on to for debugging
> purposes?

What sort of debugging information is useful depends on the situation.
In most cases the type of information I mentioned would be overkill.

>>
>> A little gratitude to someone trying to help you fix a problem
>> wouldn't go amiss...

We appreciate the bug report, we just can't do anything about it
without more information. To give a (not entirely fair) comparison,
imagine someone posted on your talk page that there was a spelling
error on Wikipedia. I assume you would respond to such a report with
"where?"; it wouldn't be because you're ungrateful that you respond
like that, but simply because you cannot fix the issue without more
information (Wikipedia is a big place). The situation here is somewhat
similar: we're grateful for the report, but would need more
information before we can do anything about it.

--bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Mediawiki's access points and mw-config

2013-02-18 Thread bawolff
On Mon, Feb 18, 2013 at 1:53 PM, Waldir Pimenta  wrote:
> On Mon, Feb 18, 2013 at 5:17 PM, Krinkle  wrote:
>
>> But before more bike shedding (have we had enough these last 2 months
>> yet?), is there a problem with having a directory?
>>
>
> It somewhat breaks the pattern, considering that all the other access
> points (and their corresponding php5 files) are located in the root. So
> that leaves only overrides.php, which I'm not sure why it was kept in
> mw-config, considering that (quoting Platonides) "the installer used to be
> in the config folder, until the rewrite, which *moved the classes* to
> includes/installer" (emphasis mine). If the classes were moved to
> includes/installer, why did those of overrides.php's remain? So they can be
> easily deleted? I don't think they're the only files that are kept around
> that people won't use, so the convenience of an easy deletion doesn't seem
> that much of a big advantage, especially at the expense of a logical
> organization of the code.
>
> By the way, there's also an INSTALL file in the root, so even if
> overrides.php was moved there, it wouldn't be the only installation-related
> non-access-point to be kept there.
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Moving around files causes confusion as all docs have to be updated,
and some docs will be missed. Sometimes that sort of confusion can't
be avoided, but if there's no problem with where the installer is, I
would be opposed to moving.

Some very paranoid people may still want to delete the installer entry
point after installing, as one can still use it to do maintenance
tasks (DB upgrades) if they have the right password. Putting it in
/mw-config also provides a nice separation from MediaWiki proper, IMO.

--bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Maria DB

2013-02-13 Thread bawolff
Umm, there was a thread several months ago about how it is already
used on several of the slave DBs, if I recall correctly.

-bawolff
On 2013-02-13 8:28 AM, "Petr Bena"  wrote:

> Hi,
>
> I have installed Maria DB to all my servers, including production servers
> few weeks ago, and I found it quite stable and I like it (even the command
> line tool for working with sql is far better than the one included in mysql
> pack)
>
> It's supported on all latest ubuntu versions from 10.04 UP (maybe even
> older) - so my question is, are we going to use it on wikimedia production?
>
> I think we could migrate beta cluster for now - it has terrible performance
> and this could help. It could be a first step to migrate wikimedia
> production cluster.
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Wmfall] Announcement: Ed Sanders joins Wikimedia as Visual Editor Software Engineer

2013-02-12 Thread bawolff
>
> s/11/9/
>
> *Sumana slinks off quietly*
>
> --
> Sumana Harihareswara
> Engineering Community Manager
> Wikimedia Foundation

9 is still an extremely impressive number. When did sysops get
introduced? I have a feeling that 11 years ago there was no such
thing.

Welcome Ed!

--bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Stable PHP API for MediaWiki ?

2013-02-11 Thread bawolff
I don't think she means what we call the API, but rather the methods
random extensions are likely to use.

We could start documenting certain stable methods with a special code
comment (say, @stable), which would mean something like: this method
will not be removed, and if the need arises we'll remove the @stable
tag and wait 2 versions before removing the method. Key candidates for
this would include Title::newFromText, Parser::recursiveTagParse, etc.
Ideally one would wait for a method to stand the test of time before
tagging it.
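To illustrate how such a tag might be consumed, here is a tiny hypothetical checker that pulls out the names of methods whose doc comment carries @stable (not an existing MediaWiki tool; the sample source is made up):

```python
import re

# Hypothetical @stable checker: find method names whose preceding
# doc comment contains an @stable tag.
SOURCE = """
/**
 * @stable
 */
public static function newFromText( $text ) {}

/**
 * @internal
 */
protected function secretHelper() {}
"""

# Match a /** ... */ block containing @stable, then the next function name.
STABLE_RE = re.compile(
    r"/\*\*(?:[^*]|\*(?!/))*@stable(?:[^*]|\*(?!/))*\*/\s*"
    r"[\w\s]*?function\s+(\w+)"
)

def stable_methods(source):
    return STABLE_RE.findall(source)

print(stable_methods(SOURCE))  # -> ['newFromText']
```

A script like this could run in CI to produce the list of methods covered by the stability promise, and to flag when a tagged method disappears.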

> I am sure WMF developers are facing similar issues especially

I don't think that's the case. It used to be the responsibility of the
person making the breaking change to fix all callers in the Wikimedia
extension repo. I'm not sure if that's still the case, but nonetheless
I do not feel this is a significant problem for deployed extensions.
(I'm sure someone will correct me if I'm wrong.)

-bawolff
On 2013-02-11 9:14 AM, "Yuri Astrakhan"  wrote:

> Mariya,
>
> Could you be more specific? What types of changes caused extensions to
> break? I might be mistaken but the vast majority of the API framework
> classes have been established over 5 years ago, with very few breaking
> changes since. Most changes were related to adding new functionality (new
> actions, new query submodules, new parameters), but that shouldn't have
> significantly affected extension development.
>
> I do plan to introduce a few breaking changes (majority of the extensions
> should not be affected) in 1.21, such as the introduction of versioning,
> further modularization to allow pageset reuse, etc.
> See http://www.mediawiki.org/wiki/Requests_for_comment/API_Future
>
> Please note that in a rare case some features might be purposefully removed
> due to the security or scalability issues.
>
> --Yuri
>
>
> On Mon, Feb 11, 2013 at 7:58 AM, Mariya Nedelcheva Miteva <
> mariya.mit...@gmail.com> wrote:
>
> > Hi all,
> >
> > I have been talking to many third-party yas part of my WMF internship in
> > the last few weeks and one the main concerns they express is the lack of
> > stability in the PHP classes MediaWiki exposes from version to version.
> The
> > frequent changes in what I would call the PHP-API makes extentions
> > developement and maintenance much more difficult with compatibility from
> > version to version becoming a problem. Solving the problem would probably
> > result in the development of more extensions, easier maintenance, less
> > hacks to core and more users upgrading to the latest MediaWiki version. I
> > am sure WMF developers are facing similar issues especially with projects
> > like WikiData going on.
> >
> > My question is: With the given technical heritage that MediaWiki carries,
> > is it possible to have a (relatively) stable set of PHP classes defined,
> > with a pledge not to change them in the next X releases or at least with
> > some longer deprecation time? What would maintaining such a PHP-API
> entail?
> > How difficult is it given the vast number of dependancies in MediaWiki
> > code? Does it require restructuring a lot of the current core code? Do
> you
> > think it should be a definite goal for WMF?
> >
> > Thank you.
> >
> > Mariya
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Corporate needs are different (RE: How can we help Corporations use MW?)

2013-02-08 Thread bawolff
On 2013-02-08 2:28 PM, "Sumana Harihareswara"  wrote:
>
> On 02/08/2013 01:23 PM, Daniel Barrett wrote:
> >> also we have SemanticMediaWiki.
> >
> > We started looking into Semantic MediaWiki - it has impressive features.
> > But we got scared off by stories that it slows down the
> > wiki too much. Maybe we should give it another look.
> >
> > DanB
>
> A recent improvement in SMW is the new database structure for Semantic
> MediaWiki, SMWSQLStore3 -- this makes SMW faster and more efficient.
>
> http://semantic-mediawiki.org/wiki/Semantic_MediaWiki_1.8
>
> It got released 2 December 2012.  So yeah, check it out.
>
> (Shout-out to Nischay Nahata who led that work as his 2012 Summer of
> Code project.)
>
> --
> Sumana Harihareswara
> Engineering Community Manager
> Wikimedia Foundation
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

I know nothing of SMW, but surely using an RDF store backend (which,
from what I understand, has been supported for quite some time) would
be more efficient than a relational DB backend, no matter how
optimized that backend might be.

-bawolff
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] URLs for autogenerated documentation

2013-02-08 Thread bawolff
Whichever way we chose, could we have http redirects from the old
svn.wikimedia.org? There's a lot of urls that link there.

I prefer doc.mediawiki.org/// (aka doc.mediawiki.org/core/master/php),
as in my mind the hierarchy makes more sense that way: the type of
code is more fine-grained than the version, and in a sense belongs to
the version number. I also like keeping the names MediaWiki and
Wikimedia separate. At the end of the day, though, it doesn't really
matter which way we go.

It would also be cool if the puppet docs were on doc.wikimedia.org,
but if you used doc.mediawiki.org in the URL, things auto-redirected
(and vice versa: if you went to doc.wikimedia.org/core/master/php,
things redirected to doc.mediawiki.org/core/master/php).
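(As an illustration only, not a worked-out proposal: the cross-domain redirects described above could be just a couple of rewrite rules. Apart from the two hostnames, everything in this sketch is hypothetical -- the path patterns would depend on whichever URL scheme wins the bikeshed.)

```apache
# Hypothetical Apache sketch: canonicalize MediaWiki docs onto
# doc.mediawiki.org, and puppet docs onto doc.wikimedia.org.
<VirtualHost *:80>
    ServerName doc.wikimedia.org
    RewriteEngine On
    # MediaWiki doc paths redirect to the mediawiki.org host...
    RewriteRule ^/(core|extensions)/(.*)$ http://doc.mediawiki.org/$1/$2 [R=301,L]
</VirtualHost>

<VirtualHost *:80>
    ServerName doc.mediawiki.org
    RewriteEngine On
    # ...and puppet doc paths redirect back to the wikimedia.org host.
    RewriteRule ^/puppet(.*)$ http://doc.wikimedia.org/puppet$1 [R=301,L]
</VirtualHost>
```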

Cheers,
Bawolff


On Fri, Feb 8, 2013 at 11:18 AM, Antoine Musso  wrote:
> Hello,
>
> We have historically generated MediaWiki documentation on the Subversion
> server known as formey:
>
>  https://svn.wikimedia.org/doc/
>
> That's the result of running doxygen once per day against the master
> branch of mediawiki/core.git.
>
> We would like to move the documentation to another host and I think it
> is a good time to change the URL as well to something a bit more
> meaningful than svn.mediawiki.org :-]
>
> We also have auto generated documentation for our puppet manifest at:
> http://doc.wikimedia.org/puppet/
>
>
> Timo and I kind of disagree on the URL scheme to use to publish the
> documentation, so instead of starting a revert war in Gerrit, I thought
> it would be a good idea to ask some more people to participate in.
>
>
> For the context:
>
> We would like to have documentation generated for:
>  - puppet
>  - mediawiki branches and tags (PHP + JS + CSS)
>  - mediawiki extensions (PHP + JS + CSS)
>
> The hosts doc.wikimedia.org and doc.mediawiki.org points to the same
> machine.
>
>
> I guess puppet could land at doc.wikimedia.org/puppet
>
> For MediaWiki we need the following parameters:
>
>  - project (core / extension name)
>  - version (tag / release branch / master)
>  - type of documented source (php / js / css)
>
> What kind of magic ordering can we agree on?
>
>
> A) http://doc.wikimedia.org/mediawiki-core//php
>
> Would bring:
>
>   doc.wikimedia.org/mediawiki-core/master/php
>   doc.wikimedia.org/mediawiki-core/master/js
>   doc.wikimedia.org/mediawiki-core/1.20.2/php
>   doc.wikimedia.org/mediawiki-core/REL1_20/js
>   doc.wikimedia.org/mediawiki-AbuseFilter/1.20.2/php
>
>
> B) doc.mediawiki.org
>
> Would bring:
>
>   doc.mediawiki.org/core/php/master
>   doc.mediawiki.org/core/js/master
>   doc.mediawiki.org/core/php/1.20.2
>   doc.mediawiki.org/core/js/REL1_20
>   doc.mediawiki.org/AbuseFilter/php/REL1_20
>
> Other thoughts?
>
> I would prefer having the MediaWiki doc hosted on the mediawiki.org
> domain.  As for the ordering I guess we can bikeshed for a long time but
> most probably some ordering will seem natural for most people there :-]
>
> Thanks!
>
>
> --
> Antoine "hashar" Musso
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] 1-29 Bug Day Results

2013-02-06 Thread bawolff
On Wed, Feb 6, 2013 at 12:37 PM, bawolff  wrote:
> On Wed, Feb 6, 2013 at 11:37 AM, Valerie Juarez
>  wrote:
>> Last week we had our first Bug Day of the year.
>>
>> *---How it Went*---
>> We looked at bugs (excluding enhancements) that had not seen any changes in
>> over a year, a little over 250 bugs. The bugs came from a number of
>> different products and components. We started looking at the bugs in
>> "General/Unknown." Attendance was lower than what we wanted. Andre,
>> Matanya, and I triaged bugs and had help from developers in #wikimedia-dev.
>
> Probably a good idea to send an email to wikitech-l and mediawiki-l
> about a day before these things happen, and another one immediately
> when they start.


My apologies, you did do that. I didn't notice because I didn't
realize this was already 8 days ago.

--bawolff


Re: [Wikitech-l] 1-29 Bug Day Results

2013-02-06 Thread bawolff
On Wed, Feb 6, 2013 at 11:37 AM, Valerie Juarez
 wrote:
> Last week we had our first Bug Day of the year.
>
> *---How it Went*---
> We looked at bugs (excluding enhancements) that had not seen any changes in
> over a year, a little over 250 bugs. The bugs came from a number of
> different products and components. We started looking at the bugs in
> "General/Unknown." Attendance was lower than what we wanted. Andre,
> Matanya, and I triaged bugs and had help from developers in #wikimedia-dev.

Probably a good idea to send an email to wikitech-l and mediawiki-l
about a day before these things happen, and another one immediately
when they start.


>- Hosting the event in a different IRC channel
>   - We held the event in #wikimedia-dev. We were able to get help from
>   developers on the channel, but it was hard to tell if users joining were
>   there for the Bug Day. We felt greeting each user that joined could have
>   increased noise on the channel, and could have been annoying to other
>   users. We may hold the next event in #wikimedia-office if there
>   are no other meetings scheduled for that time.

Wouldn't that further reduce participation?

>
> Thank you for your participation and support! We hope the coming Bug Days
> will get better and better.
>

Thank you for hosting this. While I couldn't make this one, I hope to
help out in a future bug day!

Cheers,
Bawolff


Re: [Wikitech-l] RFC: Tab as field delimiter in logging format of cache servers

2013-01-25 Thread bawolff
Just to clarify, will this affect the stats at
http://dumps.wikimedia.org/other/pagecounts-raw/? Changing the format
of that will probably break third-party scripts.
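(For anyone unfamiliar with the parsing problem the Analytics team describes in the quoted mail below: when a field value itself contains spaces, a space-delimited line no longer splits into a predictable number of fields, while a tab-delimited one does. A toy illustration -- the field values here are made up, not the real log format:)

```python
# Toy log line with five logical fields; the URL and user agent contain spaces.
line_space = "cp1001 1234 GET /wiki/Main Page Mozilla/5.0 (X11; Linux)"
line_tab = "cp1001\t1234\tGET\t/wiki/Main Page\tMozilla/5.0 (X11; Linux)"

# A naive split on spaces miscounts the fields...
print(len(line_space.split(" ")))   # 8 -- field count is wrong
# ...while splitting on tabs gives exactly one entry per field.
print(len(line_tab.split("\t")))    # 5
```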
--
-bawolff


On Fri, Jan 25, 2013 at 1:41 PM, Diederik van Liere
 wrote:
> Apologies for crossposting
>
> Heya,
>
> The Analytics Team is planning to deploy "tab as field delimiter" to
> replace the current space as fielddelimiter on the varnish/squid/nginx
> servers. We would like to do this on February 1st. The reason for this
> change is that we need to have a consistent number of fields in each
> webrequest log line. Right now, some fields contain spaces and that require
> a lot of post-processing cleanup and slows down the generation of reports.
>
> What is affected and maintained by Analytics
>
> * udp-filter already has support for the tab character
> * webstatscollector: we compiled a new version of filter to add support for
> the tab character
> * wikistats: we will fix the scripts on an ongoing basis.
> * udp2log: we have a patch ready for inserting sequence numbers separated
> by tab.
>
> In particular, I would like to have feedback to three questions:
>
> 1) Are there important reasons not to use tab as field delimiter?
>
> 2) Are there important pieces of logging that expect a space instead of a
> tab and that need to be fixed and that I did not mention in this email?
>
> 3) Is February 1st a good date to deploy this change? (Assuming that all
> preps are finished)
>
>
> Best,
>
> Diederik
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l



Re: [Wikitech-l] Naming our developer events

2013-01-23 Thread bawolff
On Wed, Jan 23, 2013 at 3:13 PM, Quim Gil  wrote:
[snip]
>
> But there is another point here, which is how narrowed / inclusive are we
> with the "MediaWiki" word. You can see it as the name of a CMS. You can see
> it as a name of a wider community. Looking at the content at mediawiki.org
> it is obvious that such community does a lot more things than developing a
> CMS.
>
> Why not calling all those things under the MediaWiki umbrella, and refer to
> the CMS as MediaWiki Core?
>

I don't really like that idea. It may be because I'm just cranky and
dislike change ;) but MediaWiki and the rest of Wikimedia's technical
stuff are fairly orthogonal. I can work on MediaWiki without caring
about Wikipedia and friends. I could also work on non-Wikimedia
technical infrastructure without caring about MediaWiki. (However,
non-MediaWiki Wikimedia tech stuff needs a more concise name (or
names). There's a lot of things in this category including
"wiki-templates", local gadgets, puppet/ops related stuff, some of the
mobile stuff, and there is no good name to describe it.)

We use names to describe things. If the names become too broad they
could lose their usefulness. I would also be concerned that by making
MediaWiki (the CMS) subordinate to general wikimedia technical
activities by renaming MediaWiki to MediaWiki-core [and having
mediawiki=wikimedia-technical-thingies] it could alienate some
contributors who are primarily interested in MediaWiki and not
Wikimedia. On the other hand that could quite possibly be an imagined
problem.

As for the actual hackathon naming (whatever happened to "hackaton"? I
thought that was a cute name), I don't think it really matters. Call
it MediaWiki if it's primarily focused on MediaWiki, call it Wikimedia
if it's more focused on Wikimedia things. Call it Wikipedia if you
really must [as someone who originally came to Wikimedia land via a
non-Wikipedia project, calling Wikimedia things "Wikipedia" makes me
go grrr, but I recognize that Wikipedia is the more recognizable
brand]. To be honest, making naming requirements sounds like a
bikeshed discussion.

-bawolff



Re: [Wikitech-l] An actual bikeshed

2013-01-23 Thread Bawolff Bawolff
On 2013-01-23 3:38 PM, "Jon Robson"  wrote:

> Suggested solution:
> Maybe some kind of voting system might be of use to force some kind of
> consensus rather than leaving problems unsolved. I'm fed up of
> receiving emails about the same problem I discussed weeks before that
> never got solved. It makes my mailbox ill.
>
> I mean if the question is really what colour is the bikeshed it would
> be good for people to propose colours, people to vote on preferred
> colours and at the end of say a week the majority colour wins and gets
> implemented (or in cases where there is no majority we discuss the
> front runners and other possible solutions).
>

What colour should the polling booth be?

I don't think the answer is voting. Perhaps there are some sheds that don't
need to be painted.

-bawolff

P.s. if someone built a bikeshed in the wmf office they would be my hero


Re: [Wikitech-l] Completed ! Re: More Update on Ashburn data center switchover / migration – target date is week of 1/22/13

2013-01-22 Thread Bawolff Bawolff
Good work to everyone involved!

-bawolff
On 2013-01-22 6:53 PM, "Ct Woo"  wrote:

> All,
>
> The switchover work is done.
>
> The site was was available to readers throughout the migration work though
> it was in read-only mode for about 32 minutes, when Asher and Mark had to
> migrate the database masters over from Tampa to Ashburn.
>
> We will cancel the reminding maintenance windows.
>
> Thank you all for your patience and understanding.
>
> Regards,
> CT Woo
>
> On Sat, Jan 19, 2013 at 10:57 AM, Ct Woo  wrote:
>
> >  All,
> >
> > We will be proceeding with the datacenter switchover plan this coming
> > Tuesday (Jan 22, 2013), unless we discover some unexpected and
> > insurmountable issues in our tests between now and then.
> >
> > During the 8-hour migration window on the 22nd, 23rd and 24th (from 17:00
> > UTC to 01:00 UTC hours  / 9am to 5pm PST),  there would be times (lasting
> > about 30 minutes) where the site would be set to "read-only" mode, to
> > facilitate master database switchovers from one datacenter to another.
> > While the site should be available to readers, no new contents could be
> > created, edited or uploaded.
> >
> > We are aware of the inconvenience and we have put together plans to
> > minimize such annoyances, e.g., automating much of the procedures,
> > mitigating known risks,  and performing tests to identify issues prior to
> > deployment. Given the scale and complexity of this migration, we do
> realize
> > not all operational impact is predictable.  Some users could experience
> > intermittent site unavailability and/or performance issues unfortunately.
> >
> > You can follow the migration on chat.freenode.net
> > (and not irc.freenode.org as mentioned in previous email) in the
> > #wikimedia-operations channel.
> >
> > Thanks,
> > CT Woo
> >
> > -- Forwarded message --
> > From: Ct Woo 
> > Date: Fri, Jan 11, 2013 at 12:07 PM
> > Subject: Update on Ashburn data center switchover / migration – target
> > date is week of 1/22/13
> > To: Wikimedia developers , Development
> > and Operations Engineers 
> >
> >
> > All,
> >
> > The Migration team is in the last lap on completing the remaining tasks
> to
> > ready our software stack and Ashburn infrastructure for the big
> > switchover day.
> >
> > Per my last update
> > <http://lists.wikimedia.org/pipermail/wikitech-l/2012-October/063668.html>,
> > with the Fundraising activity behind us now, the team has scheduled
> > the *week of 22nd January*, 2013 to perform the switchover. We are
> > going to block an 8-hour migration window on the *22nd, 23rd and 24th*.
> > During those periods, *17:00 UTC to 01:00 UTC (9am to 5pm PST)*, there
> > will be intermittent blackouts and they will be treated as 'planned'
> > outages.  You can follow the migration on irc.freenode.org in the
> > #wikimedia-operations channel.
> >
> > The team is putting the finishing touches to the last few tasks and we
> > will make the final Go/No decision on 18th Jan, 2013. An update will send
> > out then. For those interested in tracking the progress, the meeting
> notes
> > are captured on this wikitech page:
> > http://wikitech.wikimedia.org/view/Eqiad_Migration_Planning#Improving_Switchover
> >
> > *Please note that we will be restricting code deployment during that
> > week, allowing only emergency and critical ones only.*
> >
> > Thanks.
> >
> > CT Woo
> >
> >
> >
> >
> >
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>


Re: [Wikitech-l] Html comments into raw wiki code: can they be wrapped into parsed html?

2013-01-22 Thread Bawolff Bawolff
On 2013-01-22 6:03 PM, "Alex Brollo"  wrote:
>
> 2013/1/22 Paul Selitskas 
>
> > What do you mean by
> > >> any wikicode (template call, parameter, link) present into
> > >> the value of infobox parameter breaks the stuff, since it is parsed and
> > >> expanded by parser with unpredictable results.
> >
> > If your {{{author}}} doesn't have anything and it's acceptable, then make
> > it {{{author|}}}, or {{#if:{{{author|}}}| > statement above.
>
>
> Imagine that my infobox had a parameter author=, and imagine a "clean"
> content as this:
>
> author=Alessandro Manzoni
>
> With my template code:
> 
>
> I get into parsed html:
> 
>
> Perfect!
>
> But imagine that my template parameter is:
> author=[[Alessandro Manzoni]]
>
> When I pass the parameter content to  data-author="{{{author}}}">, I don't
> get into the html page what I'd like:
> 
>
> since wikicode [[Alessandro Manzoni]] will be interpreted by the server,
> and parsed/expanded into a html link as usual, resulting into a big mess.
>
> The same occurs for any wikicode and/or html passed into a infobox
template
> parameter.
>
> Alex brollo
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Have you tried {{#tag:nowiki|{{{author|}}}}} to prevent interpretation?

There may still be issues with quotes. I'm not sure.
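(A sketch of the idea, purely illustrative: the div element and data-author attribute here stand in for whatever the infobox actually emits, and as noted, quoting inside the attribute may still break.)

```wikitext
<!-- Hypothetical infobox fragment: wrap the parameter in #tag:nowiki
     so wikicode such as [[Alessandro Manzoni]] is not expanded. -->
<div data-author="{{#tag:nowiki|{{{author|}}}}}">…</div>
```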

-bawolff

Re: [Wikitech-l] Coding conventions

2013-01-22 Thread Bawolff Bawolff
On 2013-01-22 6:05 PM, "Jeroen De Dauw"  wrote:
>
> Hey,
>
> I hereby admit defeat. My thread was clearly not the ultimate bikeshed.
>
> Cheers
>
> --

On this bikeshed, allowing both styles sounds perfectly acceptable to me.

-bawolff


Re: [Wikitech-l] Why are we still using captchas on WMF sites?

2013-01-22 Thread Bawolff Bawolff
On 2013-01-22 3:30 PM, "aude"  wrote:
>
> On Tue, Jan 22, 2013 at 8:18 PM, Luke Welling WMF wrote:
>
> > That was not the end of the problem I was referring to. We know our
> > specific captcha is broken at turning away machines. As far as I am
> > aware we do not know how many humans are being turned away by the
> > difficulty of it.
>
>
> It's at least impossible for blind users to solve the captcha, without an
> audio captcha.  (unless they manage to find the toolserver account
> creation thing and enough motivated to do that)
>
> I am not convinced of the benefits of captcha versus other spam filtering
> techniques.
>
> Cheers,
> Katie
>
>
>

Someone should write a browser addon to automatically decode and fill in
captchas for blind users. (Only half joking)

-bawolff
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> >
>
>
>
> --
> @wikimediadc / @wikidata
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Coding conventions

2013-01-22 Thread bawolff
On 2013-01-22 3:01 PM, "Brad Jorsch"  wrote:
>
> Our coding conventions for PHP are currently ambivalent on whether we
> should write "if (" or "if(". It's probably time to pick one.
>
> Discussion at
>
> https://www.mediawiki.org/wiki/Manual_talk:Coding_conventions/PHP#Control_structures
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

For some reason I thought the spacy version was the convention. Guess
it's never really been specified.

Also, wasn't the stuff about efFuncName that I see in the current
coding conventions doc reverted out a while back for not being a
"real" (recommended) convention?

-bawolff


Re: [Wikitech-l] Why are we still using captchas on WMF sites?

2013-01-21 Thread Bawolff Bawolff
Given that there are algorithms that can solve our captcha, presumably
it is mostly stopping the lazy and those who don't have enough
knowledge to use those algorithms. I would guess that text on an image
without any blurring or manipulation would be just as hard for those
sorts of people to break. (Obviously that's a rather large guess.) As
a compromise, maybe we should have straight text-in-image captchas.

-bawolff
On 2013-01-21 7:40 PM, "Anthony"  wrote:

> On Mon, Jan 21, 2013 at 3:00 AM, David Gerard  wrote:
> > I mean, you could redefine "something that doesn't block all spambots
> > but does hamper a significant proportion of humans" as "successful",
> > but it would be a redefinition.
>
> It's not a definition, it's a judgment.
>
> And whether or not it's a correct judgment depends on how many
> spambots are blocked, and how many productive individuals are
> "hampered", among other things.
>
> After all, reverting spam hampers people too.
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>


Re: [Wikitech-l] Why are we still using captchas on WMF sites?

2013-01-21 Thread Bawolff Bawolff
On 2013-01-21 3:56 AM, "Andre Klapper"  wrote:
>
> On Mon, 2013-01-21 at 07:48 +, David Gerard wrote:
> > On 21 January 2013 05:13, Victor Vasiliev  wrote:
> > > On 01/20/2013 04:22 PM, David Gerard wrote:
> >
> > >> The MediaWiki captcha is literally worse than useless: it doesn't
> > >> keep spambots out, and it does keep some humans out.
> >
> > > I don't see how the spambot statement is true. Do you have evidence
> > > for it?
> >
> >
> > That spambots get through at all.
>
> Evidence is not provided by simply repeating the statement. :)
>

Does http://elie.im/publication/text-based-captcha-strengths-and-weaknesses
count as evidence? (Copied and pasted from the mailing list archives.)

Sure, captchas do prevent some limited attacks - they make spamming more
effort than a 5-minute Perl script. Most spammers are more sophisticated
than that.

-bawolff


Re: [Wikitech-l] Why are we still using captchas on WMF sites?

2013-01-20 Thread Bawolff Bawolff
> This question is something we've also been asking ourselves on the E3
> team, as part of our work on account creation. I think we all agree
> that CAPTCHAs are at best a necessary evil. They are a compromise we
> make in our user experience, in order to combat automated attacks.

That's kind of missing the point of the original poster. The point being
that they are an *un*necessary evil and do not prevent automated attacks
whatsoever.

[Snip]
> To get more numbers on how much taking away the CAPTCHA might gain us in
> terms of human registrations, we have considered a two hour test (to start
> with) of removing the CAPTCHA from the registration page:
> https://meta.wikimedia.org/wiki/Research:Account_creation_UX/CAPTCHA That
> kind of test would probably not be an accurate measurement of what kind of
> spam would be unleashed if we permanently removed it, but the hourly
> volume of registrations on enwiki is enough to tell us the human impact.

That would be interesting. Remember that captchas aren't just on the
user registration page, though.

-bawolff


[Wikitech-l] Caching of images in varnish

2013-01-19 Thread Bawolff Bawolff
There are reports everywhere of uploading new versions of images failing
(upload works but new version does not show up).

Last time this happened, all that needed to be done was for varnishhtcpd
to be restarted on the image cache servers. [1] Could someone with the
ability to check, see if that needs to be done again? IMHO this type of
issue is a rather serious one which causes lots of frustration and
confusion.

[1] https://bugzilla.wikimedia.org/show_bug.cgi?id=41130

-bawolff


Re: [Wikitech-l] Huggle is now in git

2013-01-18 Thread Bawolff Bawolff
English is a useless language anyhow. There's not even a compiler for it!

-bawolff
On 2013-01-18 11:49 AM, "Petr Bena"  wrote:

> * in case anyone is interested in *
>
> common mistake done by me. One day I will hopefully master english :)
>
>
> On Fri, Jan 18, 2013 at 4:48 PM, Petr Bena  wrote:
>
> > Hi,
> >
> > I would like to remind even here that we have moved the source code to
> > github this week.
> >
> > In case anyone is interesting in improving huggle or joining the project,
> > you are welcome to do so: https://github.com/benapetr/huggle
> >
> > Please note that branch csharp is the branch containing latest version
> you
> > probably want to work on. Trunk is huggle 2x.
> >
> > Have fun
> >
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>


Re: [Wikitech-l] A testing & bug management wheel

2013-01-17 Thread bawolff
On Thu, Jan 17, 2013 at 6:55 PM, Quim Gil  wrote:
> If we serve paella every week in a timely manner, people will come. If they
> enjoyed it they will repeat another week, bringing more guests.

That is a good point. From what I've seen, these types of events also
tend to attract hangers-on who just happen to be idling in #mediawiki
at the time. This brings more people into doing more MediaWiki things,
which is a good thing.

-bawolff



Re: [Wikitech-l] Google Summer of Code 2013

2013-01-17 Thread Bawolff Bawolff
One thing I would like to see is code from projects being merged into core
on a regular basis instead of just at the end. Obviously that might not be
possible for all projects, depending on what your project is, but many that
modify core can be done in incremental steps. I don't know about last year
specifically, but in other years there have been GSoC projects coded away
happily in branches, getting code review but not held to the same standard
as core. When they tried to merge, the students got a rather rude
awakening, with all sorts of objections to their code they didn't expect.

tl;dr: good, in-depth feedback early and often is critical for success. If
we make people merge their projects in small steps as they complete
independent features (say, once every two weeks), GSoCers get better
feedback and no giant painful merge at the end.

-bawolff
On 2013-01-17 3:47 PM, "Petr Bena"  wrote:

> hi,
>
> Can you explain the roles of mentors and admins? Also what is requirement
> for participants? I suppose it's for students?
>
>
> On Thu, Jan 17, 2013 at 8:32 PM, Quim Gil  wrote:
>
> > Surprised? Me too!
> >
> > Please read / watch / discuss
> > https://www.mediawiki.org/wiki/Summer_of_Code_2013
> >
> > *Nothing* about GSOC 2013 is confirmed at this point, but there is no
> harm
> > in starting collecting ideas and recruiting participants.
> >
> > Your feedback is welcome at the wiki page - or here if you are really
> > really lazy. Reason: potential participants visiting that page in the
> near
> > future will have an easier time following background discussions if they
> > are take place there.
> >
> > Thank you!
> >
> > --
> > Quim Gil
> > Technical Contributor Coordinator @ Wikimedia Foundation
> > http://www.mediawiki.org/wiki/User:Qgil
> >
> >
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>


Re: [Wikitech-l] The ultimate bikeshed: typos in commit messages

2013-01-16 Thread Bawolff Bawolff
On 2013-01-16 7:20 PM, "Chad"  wrote:
>
> On Wed, Jan 16, 2013 at 6:07 PM, Tim Starling  wrote:
> > On 17/01/13 00:14, Chad wrote:
> >> Really, I think the whole thread is moot with the pending upgrade.
> >> Typos should always be fixed before merging (I think we all agree?),
> >> and the new abilities to fix these from the UI means we won't need
> >> to mark people as -1 to do so.
> >
> > I didn't mention commit summaries in my post. My interest is in
> > nitpicking in general. Jeroen calls arguments over commit summaries
> > the /ultimate/ bikeshed, which they may or may not be; there are
> > plenty of other examples which may compete for that title.
> >
>
> Indeed, I had missed that.
>
> > Nitpicking is the minor end of the negative feedback spectrum. By
> > definition, it has the smallest concrete payoff when advice is
> > followed, in exchange for complex, context-dependent social costs. You
> > should think carefully before you do it.
> >
>
> *nod* I agree. And really, nitpicks in code can always be cleaned
> up later (heck, we did it for years with SVN).
>
> It's only nitpicks in commit messages that should always be fixed,
> since they're  immutable after submission. And it's *that* that I think
> won't be a big deal anymore (since any drive-by contributor could
> fix a typo on the spot).
>
>

If we're talking nitpicks in general, I've seen -1s for things like
someFunc($a, $b) instead of someFunc( $a, $b ), which I agree does more
harm than good.

I imagine that how much someone considers spelling issues to be a minor
"nitpick" varies quite a lot between people.
-bawolff


Re: [Wikitech-l] Generating documentation from JavaScript doc comments

2013-01-16 Thread Bawolff Bawolff
> Would it be possible/difficult to get something similar working for
> gadgets on WMF wikis?
>
> Helder

What would be really cool would be if the JS content handler code detected
doc comments and formatted them nicely. Something similar to how, back in
the old days, people used to have things like
/*
== header ==
*/
which would be picked up by MW and formatted as headers, but automatic and
more complete.
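(To make the idea concrete, here is a rough sketch of detecting such headers, in Python for illustration only; the function name and regex are mine, not anything that actually exists in MediaWiki.)

```python
import re

def extract_section_headers(js_source):
    # Look inside /* ... */ block comments for lines of the form
    # == Some header ==  (a sketch, not MediaWiki's actual behaviour).
    headers = []
    for comment in re.findall(r"/\*(.*?)\*/", js_source, re.S):
        for line in comment.splitlines():
            m = re.match(r"\s*==\s*(.+?)\s*==\s*$", line)
            if m:
                headers.append(m.group(1))
    return headers

sample = """
/*
== Configuration ==
*/
var cfg = {};

/* == Helpers == */
function helper() {}
"""
print(extract_section_headers(sample))  # ['Configuration', 'Helpers']
```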

-bawolff


Re: [Wikitech-l] The ultimate bikeshed: typos in commit messages

2013-01-16 Thread bawolff
On 2013-01-16 3:07 PM, "Luke Welling WMF"  wrote:
>
> I don't mind getting dinged for typos.  If I'm being sloppy it's fair to
> point it out.
>
> However, I think the social contract should be that after I fix the typos
> you requested then you owe me a real code review where you look at the
> merits of the code.  Code review is an awesomely useful but time consuming
> thing to provide.  Patrolling for typos is not.
>
> Put it this way, if I concede and agree that the bikeshed can be green,
> the people who spent three hours arguing for green should feel obligated
> to turn up to the working bee to help with the painting.
>
> Luke Welling
>
>

Well put. That sounds entirely fair to me.

-bawolff


Re: [Wikitech-l] Fwd: [Wikimedia-l] [Wikimedia Announcements] Announcing the Individual Engagement Grants program

2013-01-16 Thread bawolff
Interesting. Could this possibly work like GSoC but aimed at experienced
devs instead of newbies? For example, if I (in theory) had an idea for a
new feature or extension that could have a large impact on how Wikimedians
use MediaWiki, could I potentially get a grant to work on such an idea
over the summer?

-bawolff
On 2013-01-16 1:48 PM, "Quim Gil"  wrote:

> Hi, next week I will have a casual chat with Siko about the new Wikimedia
> Individual Engagement Grants and how MediaWiki contributors could
> theoretically benefit from them.
>
> If you have specific questions or feedback there is nothing stopping you
> from contacting her directly, but maybe it's more useful to start sharing
> here. This way I can go with more consolidated questions and feedback.
>
>
>  Original Message 
> Subject: [Wikimedia-l] [Wikimedia Announcements] Announcing the Individual
> Engagement   Grants program
> Date: Tue, 15 Jan 2013 18:03:42 -0800
> From: Siko Bouterse 
> Reply-To: wikimedia-l@lists.wikimedia.org
> To: wikimediaannounce-l@lists.wikimedia.org
>
> *Hi all,
> I’m pleased to announce the launch of a new grantmaking program at the
> Wikimedia Foundation: Individual Engagement Grants. These grants will
> support Wikimedians as individuals or small teams to complete projects that
> benefit the Wikimedia movement, lead to online impact, and serve the
> mission, community, and strategic priorities.  This new program is intended
> to complement WMF’s other grantmaking programs as well as the grants that
> chapters and affiliate organizations provide.
>
> The first round of proposals will be accepted from now until 15 February
> 2013.  We’re also seeking committee members to help select the first round
> of grantees.  Please help spread the word to other lists!
>
> To get involved, share your thoughts, submit a proposal, or join the
> committee:
> https://meta.wikimedia.org/wiki/Grants:IEG
>
> For more information on all of WMF’s grantmaking programs:
> https://meta.wikimedia.org/wiki/Grants:Start
>
> Best wishes,
> Siko
>
> --
> Siko Bouterse
> Head of Individual Engagement Grants
> Wikimedia Foundation, Inc.
>
> sboute...@wikimedia.org
>
> *Imagine a world in which every single human being can freely share in the
> sum of all knowledge. *
> *Donate <https://donate.wikimedia.org> or click the "edit" button today,
> and help us make it a reality!*
>
>
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Somebody Explain File Repositories

2013-01-15 Thread bawolff
On Tue, Jan 15, 2013 at 10:27 AM, Tyler Romeo  wrote:
> Hey,
>
> Are there any resources that explain how MediaWiki's file repositories
> work? I've been going through the code, but between the various FileRepo
> classes and their corresponding File classes, it's way too confusing. I'm
> trying to make a new FileRepo/File class to allow storage of uploads on a
> different device.
>
> *--*
> *Tyler Romeo*
> Stevens Institute of Technology, Class of 2015
> Major in Computer Science
> www.whizkidztech.com
>
>  | tylerro...@gmail.com
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Have you read includes/filerepo/README ?

Very broadly, the FileRepo class is for stuff generic to the file
repository, and File objects contain info on individual files. Methods on
the File objects get called to get info about a specific file, and often
those methods call more general methods in the FileRepo class to get the
needed information out of the repository.
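To make that concrete, here is a rough sketch of the division of labour. This is illustrative pseudo-MediaWiki code only; the class names and method bodies are invented, and the real subclasses (e.g. LocalRepo/LocalFile, ForeignAPIRepo/ForeignAPIFile) are considerably more involved.

```php
// Illustrative sketch only -- not a working FileRepo implementation.
class MyDeviceRepo extends FileRepo {
	// Repo level: knowledge generic to the whole repository,
	// e.g. where a storage zone lives on the backend device.
	public function getZonePath( $zone ) {
		return $this->basePath . '/' . $zone; // 'public', 'thumb', 'temp', ...
	}
}

class MyDeviceFile extends File {
	// File level: information about one specific file. Storage
	// questions are delegated back to the owning repo.
	public function getPath() {
		return $this->repo->getZonePath( 'public' ) . '/' . $this->getRel();
	}
}
```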

-bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Problem to get an article's content in MW 1.20.2

2013-01-15 Thread bawolff
While it may be true that there are better methods to call for this
purpose, an article's ID should be 0 if and only if it does not exist
(or perhaps if it's in a fake namespace like Special).
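For example (a minimal sketch against a 1.20-era MediaWiki, with error handling omitted), checking Title::exists() avoids having to interpret the raw ID at all:

```php
// Sketch: guard on existence instead of interpreting getArticleID() == 0.
$title = Title::newFromText( 'Existing page' );
if ( $title && $title->exists() ) {
	$page = WikiPage::factory( $title );
	$text = $page->getRawText();
} else {
	// The page does not exist (getArticleID() would return 0 here).
}
```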

-bawolff

On Tue, Jan 15, 2013 at 10:15 AM, Tyler Romeo  wrote:
> I think the best thing to do would be to just avoid getting the article ID
> in the first place. If you have a Title object, you can just pass that
> object directly to either Article::newFromTitle or to WikiPage::factory.
>
> *--*
> *Tyler Romeo*
> Stevens Institute of Technology, Class of 2015
> Major in Computer Science
> www.whizkidztech.com | tylerro...@gmail.com
>
>
> On Tue, Jan 15, 2013 at 9:11 AM, Harsh Kothari 
> wrote:
>
>> Hi Andreas
>>
>> Try this
>>
>> $someobj = WikiPage::newFromId(  $ID );
>>
>> if(is_object( $someobj ) ){
>> $text = $someobj->getRawText(); // or: $text = $someobj->getText();
>>
>> }
>> else{
>>
>> return true;
>> }
>>
>> Thanks
>> Harsh
>> ---
>> Harsh Kothari
>> Research Fellow,
>> Physical Research Laboratory(PRL).
>> Ahmedabad.
>>
>>
>> On 15-Jan-2013, at 7:14 PM, Andreas Plank wrote:
>>
>> > Hi,
>> >
>> > I'm using MW 1.20.2  and I want to get the content of a page for
>> > further parsing in a PHP application. The PHP application is triggered
>> > via a special page (Special:MobileKeyV1) and parses nature guides for
>> > mobile devices.
>> >
>> > I tried to get the content via getArticleID() ...
>> > $titleObj=Title::newFromText("Existing page");
>> > $articleID=$titleObj->getArticleID();
>> > Article::newFromID($articleID)->fetchContent();
>> > etc.
>> > ... but it returns $articleID=0 although the page exits. With MW 1.18
>> > this approach worked fine, but after upgrade to MW 1.20.2 it does not
>> > any more.
>> >
>> > How do I get the page content correctly?
>> > Article::newFromID($titleObj->getArticleID())->fetchContent(); does
>> > not work because getArticleID() returns 0 or -1 although the page
>> > exits
>> > Or can sombody post a hint, what I'm doing wrong? Is there any context
>> > class needed?
>> > Or where there some big changes (MW 1.18 → 1.20) that are not
>> > described yet on http://www.mediawiki.org/wiki/Manual:Title.php ?
>> >
>> > I did also a
>> > sudo php ./maintenance/rebuildall.php --conf ./LocalSettings.php
>> > But it did not help either
>> >
>> > Thanks for your help!
>> >
>> > Kind regards
>> > Andreas
>> >
>> > ___
>> > Wikitech-l mailing list
>> > Wikitech-l@lists.wikimedia.org
>> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>>
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] The ultimate bikeshed: typos in commit messages

2013-01-15 Thread bawolff
--
- Brian
Caution: The mass of this product contains the energy equivalent of 85
million tons of TNT per net ounce of weight.


On Tue, Jan 15, 2013 at 10:57 AM, Daniel Kinzler  wrote:
> On 15.01.2013 15:06, Tyler Romeo wrote:
>> I agree with Antoine. Commit messages are part of the permanent history of
>> this project. From now until MediaWiki doesn't exist anymore, anybody can
>> come and look at the change history and the commit messages that go with
>> them. Now you might ask what the possibility is of somebody ever coming
>> across a single commit message that has a typo in it, but when you're using
>> git-blame, git-bisect, or other similar tools, it's very possible.
>
> And then they see a typo. So what? If you look through a mailing list
> archive or Wikipedia edit comments, you will also see typos.
>
> I'm much more concerned about scaring away new contributors with such 
> nitpicking.

On the other hand, new users may be attracted to the fact that we have
high standards.

I agree that spelling is a valid reason for a -1. After all, a -1 is not
the same as a revert in the SVN days; it simply means that the commit
is not yet "perfect". (Even in SVN, a revert was supposed to be no big
deal.)

-bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] MediaWiki Extension Bundles and Template Bundles

2013-01-14 Thread bawolff
On Mon, Jan 14, 2013 at 12:12 PM, Mark A. Hershberger  
wrote:
> On 01/14/2013 10:20 AM, Yuvi Panda wrote:
>
> ParserFunctions is already included, but I was going to bundle the
> following extensions in 1.21:
>
> * LocalisationUpdate

How do you plan to get users to set it up? AFAIK it requires setting
up a cron job, so it's not exactly something that can automatically be
set up by the installer.

>
> As far as template bundles, I think this would be better served with an
> extension like LocalisationUpdate that could fetch a new copy of the
> desired templates from your chosen Wikipedia.
>
> Another area that could benefit most non-WMF wikis is a way to import
> some documentation for how to *use* a wiki.

Both of those things would probably be solved if we had good
interwiki transclusion support.

--bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Can we help Tor users make legitimate edits?

2013-01-04 Thread bawolff
On Fri, Jan 4, 2013 at 9:53 AM, Tyler Romeo  wrote:
[..]
> As far as a solution goes, I have a complete codebase for
> Extension:TokenAuth, which allows users to have MediaWiki sign a blinded
> token, which can then be used to bypass a specific IP block in order to log
> in and edit. It is almost ready; there are just a few functionality
> problems with the JavaScript crypto library.

That sounds really cool. However, I'm not sure how it solves the
problem. If we allow people to get tokens signed that let them bypass
the Tor blocks, we may as well just not hand out Tor blocks in the
first place (if everyone can get a blinded token), or hand out the
overrides via the IP-block-exempt group (if we limit who can get such
tokens).

-bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Html comments into raw wiki code: can they be wrapped into parsed html?

2012-12-29 Thread bawolff
On Sat, Dec 29, 2012 at 6:59 PM, Platonides  wrote:
>> Is there any sound reason to strip html comments away? If there is no sound
>> reason, could such a stripping be avoided?

Comments can sometimes be used to get XSS in unexpected ways (like
conditional comments for IE). I think they're stripped because that
was easier than writing a sanitizer for them, and they're pretty
useless.

If all else fails, you can do the hacky thing of stuffing information
into either a class attribute or title attribute of an element. (A
data-* attribute would be even better, but I don't know whether that's
allowed in wikitext or not.)

--bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Fwd: Re: [Wikimedia-l] No access to the Uzbek Wikipedia in Uzbekistan

2012-12-29 Thread bawolff
On Thu, Dec 27, 2012 at 7:35 PM, Marco Fleckinger
 wrote:
>
>
>
> Do we have one extra machine left? Then we could set it up as a NAT
> router. This would replace another machine if we do not have one extra IP
> left. The original ports would need to be forwarded to it then.
>
> Cheers
>
> Marco
>
>  Original Message 
> From: Leslie Carr 
> Sent: Fri Dec 28 00:03:33 MEZ 2012
> To: Wikimedia Mailing List 
> Subject: Re: [Wikimedia-l] No access to the Uzbek Wikipedia in Uzbekistan
>
> On Thu, Dec 27, 2012 at 2:37 PM, Marco Fleckinger
>  wrote:
>>
>>
>>
>>
>> Leslie Carr  schrieb:
>>
>>>On Thu, Dec 27, 2012 at 1:39 PM, Marco Fleckinger
>>> wrote:
>>
>>>> Just an idea, which is not very beautiful: What about a router
>>>forwarding ports to the correct machine by using iptables? Would that
>>>also work in connection with search engines?
>>>
>>>Are you suggesting we use different nonstandard ports for each
>>>different wiki/language combo that resides on the same IP ?
>>>
>> Yes exactly!
>>
>
> I guess that is theoretically possible with a more intrusive load
> balancer in the middle. We need the HOST information from the http
> header to be added as we have our varnish caches serving multiple
> services, not one(or more) per language/project combo.  I'm pretty
> sure that lvs doesn't have this ability (which we use).  Some large
> commercial load balancers have the ability to rewrite some headers,
> but that would be a pretty intensive operation (think lots of cpu
> needed, since it needs to terminate SSL and then rewrite headers) and
> would probably be expensive.  If you have another way you think we can
> do this, I am all ears!
>
> We may want to move this discussion to wikitech-l as all the technical
> discussions probably bore most of the people on wikimedia-l
>
> Leslie
>


Wikimedia is a pretty big player. Has anyone from the Foundation with
some fancy-sounding title called up the ISP in question and asked
"wtf?"? The original email on wikimedia-l made it sound like the
issue is unintentional.

-bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Unit tests scream for attention

2012-12-28 Thread bawolff
On Thu, Dec 27, 2012 at 11:48 PM, Sumana Harihareswara
 wrote:
> On 12/07/2012 01:13 PM, Niklas Laxström wrote:
>> Now that tests need +2 to be run, at least temporarily, I'm going to
>> point out that I've not been able to run tests on my development
>> environment in ages. I mentioned broken unit tests in Oct 4 on this
>> list. 
>> http://article.gmane.org/gmane.science.linguistics.wikipedia.technical/64390
>
> Niklas, are you still having these problems, or are they mostly resolved?
>
> The ideal is for all the regular developers to automatically run the
> test suite locally before submitting changesets, so if anyone's had
> problems that stop them from doing that, we ought to learn why, and fix
> those obstacles.


I used to run unit tests at regular intervals. I stopped because it is
such a pain to install a version of PHPUnit that actually works (also,
it's less critical to do it yourself now that Jenkins does it for you).

When I used to run unit tests, there were regularly issues where the
unit tests assumed you had the default configuration, which they
really should not assume. (That was of course a while ago, so things
may have changed.)

--bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] HTTPS Wikipedia search for Firefox - update?

2012-12-28 Thread bawolff
On Fri, Dec 28, 2012 at 1:50 PM, Ryan Lane  wrote:
> On Fri, Dec 28, 2012 at 9:37 AM, Amir E. Aharoni <
> amir.ahar...@mail.huji.ac.il> wrote:
>
>> Hi,
>>
>> Mozilla are asking for update about bug
>> https://bugzilla.mozilla.org/show_bug.cgi?id=758857
>>
>> Can anybody help?
>>
>>
> There's no change. We're still waiting on MediaWiki changes to occur before
> we switch logged-in users to HTTPS by default.
>
> - Ryan

Which changes would those be? (You said MediaWiki, not Wikimedia, so I
assume there are bugs filed for them.)

Furthermore, what does "making the Firefox search box use HTTPS" have
to do with having users log in over HTTPS by default? I suppose we
might not want people to lose their login if they're logged in over
the insecure site and search via Firefox with secure login - is that
what you're concerned about, or is it something else?

-bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Can we help Tor users make legitimate edits?

2012-12-28 Thread bawolff
>IP block exemption is rarely given because it allows someone to keep
>editing on their main account when a sock is blocked.
>
>Tor exemption should be separate from IP block exemption.

Note - that's just a config setting away. The rights are already
separate rights; they just happen to be in the same group on
Wikimedia.
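For instance (a sketch for LocalSettings.php, assuming the TorBlock extension, which defines the 'torunblocked' right; the group name here is made up), a wiki could grant the Tor exemption through its own group:

```php
// Hypothetical group name; the right itself comes from Extension:TorBlock.
$wgGroupPermissions['tor-exempt']['torunblocked'] = true;
// The general 'ipblock-exempt' right can stay in its existing group.
```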

-bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Distinguishing disambiguation pages

2012-12-28 Thread bawolff
>>
>> Then we may want to get rid of tracking categories generated by code
>> and use pageprops instead?
>
> I have no opinion on that one.
>

Tracking categories are nice for certain errors. They more explicitly
show there is an error, because there is a category at the bottom of
the page. Users can edit the category pages to give helpful
information, and even change the category name to localize it.

On the other hand, I suppose such a special page could also have a
system message to provide a useful introduction to what the property
in question means.

-bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Distinguishing disambiguation pages

2012-12-24 Thread bawolff
You could always have a magic word that disables the features, and let
the users put it in the relevant templates. Then the users could
disable it on all the pages they feel it would be inappropriate for.

--bawolff


On Mon, Dec 24, 2012 at 2:34 PM, Bináris  wrote:
> Perhaps this helps something if you are familiar with javascript:
>
> http://de.wikipedia.org/wiki/MediaWiki:Gadget-bkl-check.js
>
> We in huwiki use an adopted version of this script to display links to
> disambpages with a different colour.
>
> --
> Bináris
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] question about UNIQ-QINU exception and parser

2012-12-24 Thread bawolff
The issue is you are not allowed to call the parse method while
already parsing something. In 99.9% of cases, using the
recursiveTagParse method is what you want. (There are a couple of
cases - like in the Babel extension, when auto-creating categories -
where you have to do something more complicated, but they are few and
far between.)

It's hard to say more without being familiar with what exactly you are coding.
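Still, a minimal tag-extension sketch of the recursiveTagParse() approach (the hook registration style and the tag name are illustrative, not from any real extension):

```php
// Register the tag hook, e.g. via $wgHooks['ParserFirstCallInit'].
function onParserFirstCallInit( Parser $parser ) {
	$parser->setHook( 'mytag', 'renderMyTag' );
	return true;
}

function renderMyTag( $input, array $args, Parser $parser, PPFrame $frame ) {
	// Safe inside a parse: expands wikitext without starting a new
	// top-level parse, so no stray UNIQ...QINU strip markers leak out.
	return '<div class="mytag">' .
		$parser->recursiveTagParse( $input, $frame ) . '</div>';
}
```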

--bawolff


On Mon, Dec 24, 2012 at 4:38 PM, Sumana Harihareswara
 wrote:
> By the way, Yury I just wanted to ask you to list any UNIQ issues you
> run into here: https://bugzilla.wikimedia.org/show_bug.cgi?id=26213
> (tracking bug)
>
> thanks,
> Sumana
>
> On 12/24/2012 10:24 AM, Yury Katkov wrote:
>> Hi guys!
>>
>> During writing my tag extension I've face with the problem: sometimes the
>> UNIQ-QINU MWException has been throwed after my code is finished executing.
>> I've found the fix
>>
>> http://www.mediawiki.org/wiki/QINU_fix
>>
>> that recommends using
>>
>> $parser->parse($input, $parser->mTitle, $parser->mOptions, false, false);
>> //in my case $parser is from the function argument.
>>
>> Does anyone know more about this QINU exception, or where I can read more
>> about it? And also what are the last three parameters of the
>> Parser::parse() function? [1]
>>
>> [1]
>> http://svn.wikimedia.org/doc/classParser.html#a0e3f2edd4bd47376953dfad4dcfc9c74
>>
>> Cheers,
>> -
>> Yury Katkov, WikiVote
>
>
> --
> Sumana Harihareswara
> Engineering Community Manager
> Wikimedia Foundation
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] jenkins: unit test whitelist

2012-12-20 Thread bawolff
On Thu, Dec 20, 2012 at 8:10 PM, Krenair  wrote:
> Security concerns when running tests with arbitrary code (from anyone since
> labsconsole account registration opened...) from what I understand.
>
> I have requested that I am added to the whitelist
> here: https://gerrit.wikimedia.org/r/#/c/39712/2
> Hoo also did so: https://gerrit.wikimedia.org/r/#/c/39711 (yay, merge
> conflicts)
>
> I really think that everyone who has +2 on a WMF-deployed extension should
> be on this whitelist. Hashar told me in #mediawiki that he thinks this as
> well.
>
> Alex
>


Seems kind of odd to only run unit tests for code from experienced
people. While everyone benefits from the unit tests, I imagine
inexperienced new developers would benefit the most. (One assumes
people with @wikimedia.org emails ought to be experienced ;)

-bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Anonymous user id on wikipedia?

2012-12-18 Thread bawolff
On Tue, Dec 18, 2012 at 5:41 PM, Kevin Israel  wrote:

>
> Even if you do not check "Remember my login on this browser", the
> username is saved for 180 days (which, by the way, is four times the
> duration set out in the WMF privacy policy). As far as I can tell, this
> "feature" has existed at least since the phase3 reorg in 2003, if not
> before then.

Not really. The cookie expiration was bumped to 180 days back in
August of 2011.  Before that we had a shorter expiry. See
https://www.mediawiki.org/wiki/Special:Code/MediaWiki/94430 . Given
that the user has to agree to the remember me function, I do not feel
this is a privacy concern.

> Ideally, an anonymous user, whether or not they have ever been logged in
> as a registered user, will not transmit any personally identifying
> information in their requests.

I'm not sure, but I thought I heard somewhere that we give logged-out
users cookies to ensure that some local caching is invalidated.
-bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Bugzilla: "Waiting for merge" status when patch is in Gerrit?

2012-12-13 Thread bawolff
In my opinion Patch-in-gerrit is a distinct stage in the life cycle of
a bug, and deserves its own status.

A patch-in-gerrit does not mean the same thing as assigned. Assigned
bugs are being worked on by someone. There work may or may not be
publically visible yet. They are probably not at the stage where they
want review of their work so far on the bug (obviously there are
exceptions to that for complex bugs), etc.

A patch-in-gerrit does mean that there is a fix for the bug available.
It has not been reviewed yet. It needs people to test the patch,
review, and comment. It does not mean the bug is fixed (and
definitely not deployed, but I agree that is a different discussion).
If I downloaded a nightly version of MediaWiki, the patch would not be there.
Some people may want to look for bugs with pending patches. At the
very least, many people would want to know that there's a pending
patch when bugzilla is displaying the list of bugs in the search
window.

In different life stages of a bug, different types of love need to be
given to a bug. Thus the different stages should get different
statuses.

tl;dr /me really likes Andre's plan.

p.s. This is not the first time this has come up -
https://www.mediawiki.org/wiki/Thread:Talk:Git/Workflow/Bugzilla

--
-Bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Bikol Wikipedia logo issue

2012-12-13 Thread bawolff
On Thu, Dec 13, 2012 at 11:35 AM, Butch Bustria  wrote:
> Hello,
>
> Can you help us fix the Wikipedia Logo on bcl.wikipedia.org ?
>
> The local community is asking for assistance.

Updating the file at [[bcl:File:Wiki.png]] is indeed the right way to
change the logo. However the version recently uploaded has two
problems:
*It's too big, which means it will mostly be cut off. The uploaded
image must have dimensions of 135x155 pixels.
*There's no alpha channel. This is a minor issue, but it will make the
background not show through. How to add an alpha channel varies with
the software used, and I'm sure someone at Commons would be willing to
help with that aspect if needed.

There's a second problem, in that the old logo is still showing up.
This appears to be due to issues with the squid cache not being purged.
(I've noticed there have been quite a few reports of this for re-uploads
at Commons. Additionally, it seems like redirects really aren't getting
their squid cache purged either. I don't know if anyone is currently
investigating this or not...)

Someone from ops might be able to manually purge the URL
//upload.wikimedia.org/wikipedia/bcl/b/bc/Wiki.png - or, if all else
fails, $wgLogo could be changed to a different URL to get around the
caching issue.
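For example (illustrative only; the exact URL would depend on the wiki's setup), a cache-busting query string in LocalSettings.php would sidestep the stale cached copy:

```php
// Any change to the URL defeats the previously cached object.
$wgLogo = "//upload.wikimedia.org/wikipedia/bcl/b/bc/Wiki.png?v=2";
```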

-bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] MediaWiki Groups are official: start your own!

2012-12-11 Thread bawolff
On Tue, Dec 11, 2012 at 8:49 PM, Quim Gil  wrote:
> On 12/11/2012 03:44 PM, bawolff wrote:
[..]
>
> One starting point in your city / region would be to check
> http://en.wikipedia.org/wiki/Wikipedia:Meetup , attend the next meetup and
> start infiltrating the MediaWiki / tech agenda there. There is no point in
> keeping the traditional divide between readers/editors and tech/coders
> forever.

Hey, I would if there were one in either of the two cities I currently
live in (university and "home" are in different cities). Heck, no city
from either of the two provinces I live in even makes it anywhere on
that page. [Before I walk into the whole "you should start one
yourself" - I'm much too lazy ;)]


>
>> To be honest though, I kind of feel that if such "groups" were going
>> to form, they probably would have already. Formality rarely makes
>> people come together that wouldn't by themselves.
>
>
> Time will tell. We are not attempting to convince you.  :)  As long as you
> point to the right URL anybody interested in forming a group and you attend
> the activities happening near you, it's all fine.

By all means, if stuff actually happens I'll be just as happy as the
rest of you :)


--bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] MediaWiki Groups are official: start your own!

2012-12-11 Thread bawolff
On Tue, Dec 11, 2012 at 1:14 PM, Quim Gil  wrote:
> MediaWiki Groups are now official - and recognized by the Wikimedia
> Affiliations Committee:
>
> https://www.mediawiki.org/wiki/Groups/
>
> Who wants to start one?
>
> I just created
>
> https://www.mediawiki.org/wiki/Groups/Proposals/San_Francisco
>

I'm actually quite curious to see if there are actually enough MW devs
in a single city (other than the WMF's home town) to form a group.

To be honest though, I kind of feel that if such "groups" were going
to form, they probably would have already. Formality rarely makes
people come together who wouldn't do so by themselves.

-bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Welcome our six OPW interns!

2012-12-11 Thread bawolff
On Tue, Dec 11, 2012 at 4:04 PM, Quim Gil  wrote:
> I’m glad to announce that Kim Schoonover, Mariya Miteva, Priyanka Nag,
> Sucheta Ghoshal, Teresa Cho and Valerie Juarez will join the MediaWiki
> community as full-time interns between January and March 2013. They have
> been selected as part of the FLOSS Outreach Program for Women.
>
> Check the details at
>
> http://blog.wikimedia.org/2012/12/11/welcome-to-floss-outreach-program-for-women-interns/
>
> We wish a happy landing to our new interns and the best luck in their
> projects! You’ll be hearing more from them over the next few months.
>
> --
> Quim Gil
> Technical Contributor Coordinator @ Wikimedia Foundation
> http://www.mediawiki.org/wiki/User:Qgil
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Good luck and congrats to everyone accepted.

-bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] "Nobody" & "Wikidata bugs": notify when you start working on a bug

2012-12-07 Thread bawolff
On Fri, Dec 7, 2012 at 12:54 PM, Quim Gil  wrote:
> Let's see. No mater how quick you fix is: isn't there a moment when you are
> sitting in front of the bug report to see what needs to be fixed?

Well yes - but just because I know how to fix a bug in theory, doesn't
mean I'm actually going to fix the bug :P



So, a hypothetical situation:

*I'm bored one day and decide to fix a bug. For the sake of argument,
let's say I'm somewhere without internet access.
*I go somewhere to get wifi and upload my patch to Gerrit.
*I comment on the bug that there's a patch in Gerrit. I also assign
the bug to myself.

In this situation I'm really unclear on what the point of taking the
bug is. By the time I have the ability to mark the bug assigned, I've
already done the work. If someone happens to fix the bug in the
meantime - well, that happens sometimes. For a quick-fix bug, I'm not
going to be upset about it.

-bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] the skin change in 1.21wmf5, display breakage, & fix retrospective

2012-11-29 Thread bawolff
>"Don't change the HTML in incompatible ways" is probably a good
>general rule to live by

Not necessarily - I've only read your summary and not looked into what
happened in depth, but the issue seems to have occurred from changing
the CSS [and possibly JS] in a non-backwards-compatible way. Changing
the HTML didn't really matter.

>but having an easy way to say "start purging
>all pages on $theseWikis from Squid/Varnish" would also be nice.

That sounds like something that could hurt the server kitties unless
done rather slowly [at least for enwiki]...

-bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Priorities in Bugzilla

2012-11-28 Thread bawolff
On Wed, Nov 28, 2012 at 9:32 AM, Andre Klapper  wrote:
> On Tue, 2012-11-27 at 22:23 +, Tim Landscheidt wrote:
>> If at the moment the priority field neither necessarily
>> triggers action nor reflects the actual state of affairs,
>> why even bother and not just delete/hide it from view?  This
>> would free more time to fix bugs.
>
> I don't see how dropping it all together helps planning or how it "frees
> more time". Obviously anybody is free to work on anything but some stuff
> simply is more important than other stuff.
> I understand that there are many options and ways to express that
> importance though, and that "severity", "priority", "target milestones",
> "blocker bugs" have some ambiguity to discuss in the long run.
> Right now I'd like to introduce a clear way to mark issues that should
> be handled immediately.
>

I think the priority field is important. And I sometimes use it for
finding bugs to fix (or you know, did back when I had more free time
and went around fixing random bugs).

What I would really like to see is banning users from touching the
priority field. The field is made rather useless by bug reporters who
feel their pet issue is the most important thing to ever happen - and
suddenly there's a whole lot of issues at highest. Of course I would
still like to see triaging people setting the field - just not the
randoms who think by marking their bug highest priority it will
actually be treated like highest priority.

Some of the suggestions mentioned earlier for priority meanings seem a
bit inflated to me. At most times there's not a huge number of high
priority issues (Limited person power = not everything can be fixed
instantly) - Thus the field should be more distinguishing on the lower
end of the spectrum. I personally think that the following would better
reflect reality:
Lowest = nobody cares. If somebody cares they can fix it, and we won't stop them.
Low = not very important. Maybe one day if I'm very bored. If this
issue never got fixed, I wouldn't lose too much sleep.
Normal = your average bug (realizing that your average bug isn't very
important). This should get fixed at some point. It doesn't have to be
by the next release - but if I were browsing Wikipedia 5 years from now,
I hope I wouldn't encounter the issue.
High = this is kind of bad. Unless there is some very good reason we
cannot, this should be fixed by the next release - preferably in the
next month, if it is not a large amount of work.
Highest = this is a major issue. Somebody should be working on it
right now. If somebody sets a bug to highest, they should either be
working on the issue or working to find someone to fix it.

An alternative way of looking at priority, is instead of how long it
should take to fix - look instead at how long it should take before
somebody starts to look into/begin fixing the issue.

-bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] LevelUp (sequel to "What do you want to learn?" & 20% time)

2012-11-21 Thread bawolff
On Wed, Nov 21, 2012 at 8:10 PM, Sumana Harihareswara
 wrote:
> LevelUp is a mentorship program that will start in January 2013 and that
> replaces the "20% time" policy
> https://www.mediawiki.org/wiki/Wikimedia_engineering_20%25_policy for
> Wikimedia Foundation engineers.  Technical contributors, volunteer or
> staff, have the opportunity to participate; see
> https://www.mediawiki.org/wiki/Mentorship_programs/LevelUp for more details.
>
> We started 20% time to ensure that Wikimedia Foundation engineers would
> spend at least 20% of each week on tasks that directly serve the
> Wikimedia developer and user community, including bug triage, code
> review, extension review, documentation, urgent bugfixes, and so on.  It
> had various flaws. 1 day every week, I made people task-switch and it
> got in the way of their deadlines, and it was perceived as a chore that
> always needed doing.
>
> It felt like enforcing a rota to do the dishes.  So instead, let's build
> a dishwasher.  :-)  We can cross-train each other and fill in the empty
> rows on the maintainership table
> https://www.mediawiki.org/wiki/Developers/Maintainers so our whole
> community gains the capacity to get stuff done faster.
>
> If you've been frustrated because of code review delays, I want you to
> sign up for LevelUp -- by March 2013 you could be a comaintainer of a
> codebase and be merging and improving other people's patchsets, which
> will give them more time and incentive to merge yours. :-)
>
> When I asked what people wanted to learn, I got a variety of responses
> -- including "MediaWiki in general", "puppet", "networking", and "JS,
> PHP, HTML, CSS, SQL" -- all of which you can learn through LevelUp.
> When I asked how you wanted to learn, all of you said you wanted
> real-life, hands-on work with mentors who could answer your questions.
> Here you go. :-)
>
> I won't be starting the matchmaking process in earnest till I come back
> from the Thanksgiving break on Monday, but I will reply to talk page
> messages and emails then. :-)
> --
> Sumana Harihareswara
> Engineering Community Manager
> Wikimedia Foundation
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

I know they just came from the bugzilla descriptions (which really
need to be updated in some cases), but some of the component
descriptions are just funny:
*API: RESTful Web-based API that lets people interact with MediaWiki
programmatically
*Job queue (available since 1.21)
(They're funny because the API isn't RESTful and the Job Queue isn't
new in 1.21).

-bawolff



Re: [Wikitech-l] A Newbie!

2012-11-21 Thread bawolff
On Wed, Nov 21, 2012 at 6:18 PM, Sébastien Santoro
 wrote:
> Hello,
>
> Thank you for the interest in this outreach program.
>
> You also need to ask a developer access on the following page:
> https://www.mediawiki.org/wiki/Developer_access
>
> Once you have your creditentials, I will be happy to assist you to
> configure a developer environment (we use Git as source control and
> Gerrit as code review system) on our IRC channel #mediawiki on
> Freenode.
>

As an important note on that, one does not need developer access to
play with the code (although I highly recommend you do get dev
access). One can make anonymous checkouts to start exploring
immediately without having to wait.

-bawolff



Re: [Wikitech-l] Proposed new WMF browser support framework for MediaWiki

2012-11-21 Thread bawolff
>If I want to do things on Wikipedia from Lynx [0] I should
>be able to do as much as Lynx supports, not less because my useragent doesn't
>match something that we support.

Also if we cut lynx support Jidanni will come and kill us in our sleep.
Dead developers = productivity loss
;)

-
On a serious note, I agree with what others have said: making sure
there is a link you can click to reach an edit form that will submit on
essentially all browsers is not hard, and it is something that should
always work. Fancier features don't need to work on all browsers.
Personally, a policy like "support the last two versions of browser X"
doesn't really make sense to me. We should support what people actually
use. If no one uses the latest and greatest browser X, we shouldn't
bother with it. If a significant number of people are still using
Firefox 3.5 for some unknown reason, we should still support it. Thus I
think we should stick to using percentages, perhaps with varying levels
of support for different percentages. I'm really unclear on what
benefit we would get from declaring a list of supported browsers [that
are somewhat unrelated to viewership percentages] instead of directly
looking at the user-agent statistics.

-bawolff



Re: [Wikitech-l] Please take this survey about new contributors

2012-11-19 Thread bawolff
On Mon, Nov 19, 2012 at 1:40 PM, Quim Gil  wrote:
> Hi, if you joined the MediaWiki / Wikimedia tech community in 2010 or
> later please consider taking this survey:
>
> Newcomer experience and contributor behavior in FOSS communities
> https://limesurvey.sim.vuw.ac.nz/index.php?sid=65151&lang=en
>
> The survey is open for sporadic contributors or full time Wikimedia
> employees, developers or any other profile. Anybody is welcome to
> leave their feedback as long as you have started contributing to this
> community in the past 3 years.
>
> 11 mature and well established open source projects are taking part in
> this survey: Debian, FreeBSD, GNOME, Gentoo, KDE, Mozilla, NetBSD,
> OpenSUSE, Python, Ubuntu and Wikimedia. Some of them started some days
> ago and have more than hundred responses by now. The data of this
> survey is anonymous and will be released under a ‘share-alike’ Open
> Data Commons Open Database License (ODbL).
>
> Some background:
>
> From Kevin Carillo, the researcher:
> http://kevincarillo.org/survey-invitation/
>
> From OpenHatch, a non-profit working on the bridge between free
> software projects and new contributors:
> https://openhatch.org/blog/2012/a-research-project-to-understand-what-does-it-take-to-retain-newcomers/
>
>
> PLEA
> If you, like me, became a bit tired of survey requests like this
> please consider filling this one anyway. It focuses in a specific area
> where we don't have much data. As fresh technical contributor
> coordinator at the WMF I'm looking forward to the results of this
> research and the lessons it will bring.
>
> Thank you.  :)
>
> --
> Quim Gil
> Technical Contributor Coordinator
> Wikimedia Foundation
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Hmm, some of the questions were a little unclear - they talk about the
Wikimedia community, but Wikimedia (as a whole) is not a FOSS project.
I became a developer roughly 3 years ago, but I've been a member of
Wikimedia land since roughly 2005. Similarly, does being a GSoC
participant count as being paid to work on MediaWiki? After all, GSoC
students do get money for doing MediaWiki things.

-bawolff



Re: [Wikitech-l] editing channels - "How was this edit made?"

2012-11-17 Thread bawolff
>
>
>
> True, sorry, I didn't look closely enough to realize that tag_summary is
> denormalized change_tag.
>
> However, this doesn't deal with the problem that [[Special:Tags]] will get
> cluttered with this approach. It might work for a "mobileedit" tag, but
> valid_tag cannot grow arbitrarily.
>

True, but that's already a problem with the tag system. The English
Wikipedia's Special:Tags is full of entries whose description reads
"This tag is inactive." In the long run we will probably have to find
some way of managing a very long list of tags.

Another problem, though, is that people might want to track which edits
are mobile (or whatever else) without each one being shown in RC as a
mobile edit (since that adds clutter). Perhaps using the currently
unused ct_params field to make certain tags hidden (filterable, but not
shown on the RC line) would be a solution.
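A minimal sketch of what that could look like, assuming ct_params held a JSON blob with a hidden flag - this convention is purely hypothetical, nothing in core defines it:

```php
<?php
// Entirely hypothetical sketch - core defines no such convention.
// Idea: stash a JSON blob in the currently unused ct_params column and
// let the RC renderer skip tags flagged as hidden, while still letting
// users filter by them.
function isHiddenTag( $ctParams ) {
	$params = FormatJson::decode( $ctParams, true );
	return is_array( $params ) && !empty( $params['hidden'] );
}
```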

-bawolff



Re: [Wikitech-l] editing channels - "How was this edit made?"

2012-11-16 Thread bawolff
>
> I misread this, I didn't realize MZMcBride is talking about 
> RecentChanges.
>
> How unreasonable would it be to call ChangeTag::AddTags('mobile', 
> $rc_id); for mobile edits? On first blush, the only major consequences is 
> extracting the data since it'd be buried in a ts_tags blob?

Why would you look at ts_tags? The change_tag table is much easier to
pull from, as it uses a more normalized layout.
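For illustration, a rough sketch of pulling tagged edits out of change_tag with MediaWiki's database wrapper of the time (the 'mobile' tag name is a hypothetical example):

```php
<?php
// Sketch: fetch the recentchanges IDs of edits carrying a given tag.
// Uses the load-balanced DB wrapper; 'mobile' is a hypothetical tag
// name used only for illustration.
$dbr = wfGetDB( DB_SLAVE );
$res = $dbr->select(
	'change_tag',          // one row per (change, tag) pair
	array( 'ct_rc_id' ),
	array( 'ct_tag' => 'mobile' ),
	__METHOD__
);
foreach ( $res as $row ) {
	// each $row->ct_rc_id joins back to recentchanges.rc_id
}
```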

-bawolff



Re: [Wikitech-l] Enable Page View Statistic

2012-11-13 Thread bawolff
>>
>> > On Tue, Nov 13, 2012 at 4:21 AM, Harsh Kothari
>> >  wrote:
>> >> Hi All
>> >>
>> >> I found that on Gujarati Wikipedia there is no page view statistic. So
>> please enable that features and also give us data of How many hits per day
>> or per month on Gujarati Wikipedia.
>> >>
>> >

Also, you may be interested in http://stats.grok.se/gu/top and
http://stats.grok.se/gu/201012/%E0%AA%AE%E0%AB%81%E0%AA%96%E0%AA%AA%E0%AB%83%E0%AA%B7%E0%AB%8D%E0%AA%A0
(The interface doesn't work well for less popular languages, so you
have to build the URLs by hand: http://stats.grok.se/<language>/<yyyymm>/<page>,
with gu being the language code.) Additionally, a couple of clicks into
the link Chad gave you leads to this page:
http://stats.wikimedia.org/EN/SummaryGU.htm which has a graph of page
views over time near the bottom.

There are raw page view stats available at
http://dumps.wikimedia.org/other/pagecounts-raw/ which one can
filter/visualize/etc. in whatever form is convenient (if you're into
statistical stuff - it requires some effort to get usable information).
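As a rough sketch of that filtering step - the filename here is a hypothetical example, and each line of a pagecounts-raw file has the form "project page_title view_count bytes_transferred":

```php
<?php
// Sketch: filter one hourly pagecounts-raw file down to the lines for
// Gujarati Wikipedia (project code "gu"). Grab a real file from the
// dump directory; this filename is hypothetical.
$fh = fopen( 'pagecounts-20121113-120000', 'r' );
while ( ( $line = fgets( $fh ) ) !== false ) {
	$parts = explode( ' ', trim( $line ) );
	if ( count( $parts ) === 4 && $parts[0] === 'gu' ) {
		echo "{$parts[1]}: {$parts[2]} views\n";
	}
}
fclose( $fh );
```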

Hope that helps,
-Bawolff



Re: [Wikitech-l] Work flow for bugfixing

2012-11-08 Thread bawolff
>
> Maybe we need a Waiting_merge status in bugzilla.
>

I would like that. I find the "patch-in-gerrit" keyword very easy to
miss, and really "patch in gerrit" and "open" are two very different
stages of a bug's lifecycle.

-bawolff



Re: [Wikitech-l] Fwd: [Tech/Product] Engineering/Product org structure

2012-11-07 Thread bawolff
+1. I love the fact that these sorts of things are being discussed
publicly, but I have no idea what the difference between "product" and
"engineering" is. I have a vague idea - ops probably falls under
engineering, and AFT-like things probably fall under product - but I
wouldn't know where general core MediaWiki work would fall, etc.

-bawolff

On Wed, Nov 7, 2012 at 2:00 PM, Quim Gil  wrote:
> Hi, am I the only one having difficulties understanding the proposal and
> what it implies?
>
>
> On 11/05/2012 07:03 PM, Erik Moeller wrote:
>>
>> we need to split the current department into an engineering dept
>> and a product dept in about 6-8 months.
>
>
> It is strange to see "engineering" and "product" side by side, since (as i
> understand them) these words belong to different categories.  :)
>
> Do you mean a "platform" team and "product" team, both filled with engineers
> and other profiles but each one focusing on different things? The MediaWiki
> (platform) team and the Wikimedia (product) teams, so to say?
>
> Or are you indeed referring to the classical separation between "product
> managers + designers" and "developers + testers"? The first ones defining
> requirements and the second ones implementing them?
>
> Or something else? Reading your email +
> http://wikimediafoundation.org/wiki/Staff_and_contractors +
> http://www.mediawiki.org/wiki/Wikimedia_Engineering wasn't enough for me to
> understand.
>
> What is clear from your email is that the current Engineering team is
> underrepresented at a high level and you Erik have too much in your bucket.
> A split and flattening getting more people in the high decision levels makes
> total sense.
>
> What also seems to be clear is that such reorganization should solve the
> slightly schizophrenic tension of priorities between Wikimedia/product and
> MediaWiki/platform, right?
>
> Whatever the result, I hope we end up with teams where software developers,
> sysadmins, product managers, designers etc are well mixed in focused teams
> going after clear common goals.
>
> --
> Quim
>
>
>> To avoid fear and anxiety, and to make sure the plan makes sense, I
>> want to start an open conversation now. If you think any of the below
>> is a terrible idea, or have suggestions on how to improve the plan,
>> I’d love to hear from you. I’ll make myself personally available to
>> anyone who wants to talk more about it. (I'm traveling a bit starting
>> tomorrow, but will be available via email during that time.) We can
>> also discuss it at coming tech lunches and such.
>>
>> There’s also nothing private here, so I’m forwarding this note to
>> wikitech-l@ and wikimedia-l@ as well. That said, there’s no urgency in
>> this note, so feel free to set it aside for later.
>>
>> Here’s why I’m recommending to Sue that we create distinct engineering
>> and product departments:
>>
>> - It’ll give product development and the user experience more
>> visibility at the senior mgmt level, which means we’ll have more
>> conversations at that level about the work that most of the
>> organization actually does. Right now, a single dept of ~70 people is
>> represented by 1 person across both engineering and product functions
>> - me. That was fine when it was half the size. Right now it’s out of
>> whack.
>>
>> - It’ll give us the ability to add Director-level leadership functions
>> as appropriate without making my head explode.
>>
>> - I believe that separating the two functions is consistent with Sue’s
>> recommendation to narrow our focus and develop our identity as an
>> engineering organization. It will allow for more sustained effort in
>> managing product priorities and greater advocacy for core platform
>> issues (APIs, site performance, search, ops improvements, etc.) that
>> are less visible than our feature priorities.
>>
>> A split dept structure wouldn’t affect the way we assemble teams --
>> we’d still pull from required functions (devs, product, UI/UX, etc.),
>> and teams would continue to pursue their objectives fairly
>> autonomously.
>>
>> It’s not all roses -- we might see more conflict between the two
>> functions, more us vs. them thinking, and more communications
>> breakdowns or forum shopping. But net I think the positives would
>> outweigh the negatives, and there are ways to mitigate against the
>> negatives.
>>
>> The way we’d get there:
>>
>> I’m prepared to resign from my engineering management responsibilities
>> and to fo

Re: [Wikitech-l] Unit-testing a tag extension (parser blows up)?

2012-11-01 Thread bawolff
On Thu, Nov 1, 2012 at 6:11 PM, Daniel Barrett  wrote:
> I'm trying to test a parser tag extension with phpunit and have run into a
> strange problem. Whenever my extension calls $parser->recursiveTagParse(), 
> the unit
> test blows up in Parser.php, complaining that $parser->mOptions is a 
> non-object.
>
> The tag callback looks pretty normal:
>
> static function render($input, $argv, $parser, $frame) {
>   // ...
>   $parser->recursiveTagParse("something");
>   // ...
> }
>
> and I have unit tests that call render()directly:
>
> public function testMyTag() {
>   global $wgParser;
>   $this->assertEqual(MyTag::render("some text", array(), $wgParser, false));
> }
>
> (I don't like using $wgParser here, and maybe that's the root of my problems?)
> The tag works perfectly in the browser. Just not when unit-testing on the 
> command
> line.
>
> The blowup occurs in Parser.php::replaceVariables, when it calls
> $this->mOptions->getMaxIncludeSize().
>
> Any advice appreciated!!
> Thanks,
> DanB
>
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

You're calling recursiveTagParse while the parser is not in a
"parsing" state.

The easiest way around this is to have some wikitext that contains your
tag and pass it to $wgParser->parse(), letting the parser itself call
your callback. (If you're going to take that approach, you can even use
the older-style parser-test text files via $wgParserTestFiles, as that
is basically doing the same thing.)

You can also work around this (I believe, anyhow) by calling
Parser::startExternalParse first and then doing what you are doing,
but I've never really used that method and am not sure how it works.
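A rough sketch of the full-parse approach - the tag name and the expected output string are purely hypothetical:

```php
<?php
// Sketch: exercise a tag extension through a full parse, so the
// parser sets up mOptions itself. The <mytag> hook and the expected
// output below are hypothetical placeholders for your extension's.
public function testMyTagViaFullParse() {
	global $wgParser;
	$out = $wgParser->parse(
		'yadda yadda <mytag>foo</mytag>',
		Title::newFromText( 'Test page' ),
		new ParserOptions()  // the parser derives mOptions from this
	);
	$this->assertContains( 'whatever mytag should emit', $out->getText() );
}
```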

Hope that helps,
-bawolff



Re: [Wikitech-l] Please fix the PHP unit tests

2012-10-05 Thread bawolff
It would be nice if Jenkins also did a run with some non-default
options. Obviously we have a lot of options, but doing at least one run
with some of the common variations (say, $wgCapitalLinks) would help
quite a bit, I imagine.
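For example, a hypothetical alternate-settings fragment for such an extra test pass might look like:

```php
<?php
// Hypothetical settings for a second Jenkins configuration pass -
// flip a couple of commonly-varied options before running the suite.
$wgCapitalLinks = false;   // case-sensitive first letters, as on Wiktionary
$wgLanguageCode = 'de';    // exercise a non-English content language
```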


-bawolff



Re: [Wikitech-l] my $wgHooks['SkinTemplateNavigation'] broke again

2012-10-01 Thread bawolff
On Mon, Oct 1, 2012 at 2:59 PM, Antoine Musso  wrote:
> Le 01/10/12 16:50, Daniel Friesen a écrit :
>> Wait, all you've been doing this tim is removing redlinks... why don't
>> you just use css for that?
>
> That will not work on text browsers which is what Jidanni uses.
>
> --
> Antoine "hashar" Musso
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

In which case the links wouldn't be red in the first place :P

-bawolff



Re: [Wikitech-l] request: "average code review wait time" dashboard

2012-09-12 Thread bawolff
On Mon, Sep 10, 2012 at 10:14 PM, MZMcBride  wrote:
> Sumana Harihareswara wrote:
>> I would love for someone to integrate that kind of wait-time indicator
>> into https://toolserver.org/~bawolff/gerrit-stats.htm or
>> http://gerrit-stats.wmflabs.org/ .  My suggested stats: the
>> min/median/average/maximum time between a patchset's submission and its
>> merge or abandonment, and the min/median/average/maximum time between
>> patchset submission and any +1 or -1, divided up by Gerrit repository.
>
> Is there a Gerrit API?
>
> MZMcBride

Yes. It's kind of weird, though. Basically you make an ssh connection
to Gerrit, give it a search query, and it returns the results as JSON.

See 
http://gerrit.googlecode.com/svn-history/r3021/documentation/2.1.4/cmd-query.html
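A minimal sketch of consuming that interface (the host and port are the Wikimedia Gerrit values of the time; adjust for your setup):

```php
<?php
// Sketch: issue a Gerrit query over ssh and decode the JSON stream.
// Gerrit emits one JSON object per line, with a trailing stats object.
$cmd = 'ssh -p 29418 gerrit.wikimedia.org ' .
	escapeshellarg( 'gerrit query --format=JSON status:open limit:25' );
foreach ( explode( "\n", trim( shell_exec( $cmd ) ) ) as $line ) {
	$change = json_decode( $line, true );
	if ( isset( $change['subject'] ) ) {  // skip the final stats line
		echo $change['subject'], "\n";
	}
}
```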

> I would love for someone to integrate that kind of wait-time indicator
>into https://toolserver.org/~bawolff/gerrit-stats.htm ...

Me too! :)

--bawolff



Re: [Wikitech-l] Code review statistics and trends

2012-09-01 Thread bawolff
On Sat, Sep 1, 2012 at 6:17 AM, Bartosz Dziewoński  wrote:
> 2012/9/1 bawolff :
>> Slightly hijacking this thread but is related.
>>
>> I tried my hand at creating some gerrit related statistics:
>> https://toolserver.org/~bawolff/gerrit-stats.htm
>
> The "Wall of Shame" links don't work for me – neither "View gerrit
> query" or number-links. (Using Opera.)
>
> -- Matma Rex

Sorry, I was encoding the URL the wrong way. It should be fixed now. No
idea why it was broken in Opera and not Firefox.

--bawolff


Re: [Wikitech-l] Code review statistics and trends

2012-08-31 Thread bawolff
Slightly hijacking this thread, but it's related.

I tried my hand at creating some gerrit related statistics:
https://toolserver.org/~bawolff/gerrit-stats.htm

Statistics might be the wrong word; it's more the first 25 results of a
saved search in reverse order - but it also includes the total number
of changesets with no review whatsoever, and some other things. There
may be more things on it in the future.

Last of all, it includes a wall of shame, because I missed the old one.

This is still a rather rough version (as can be noted by the lack of
CSS or anything pretty). It will also update rather sporadically (I
don't feel all that good about putting my private key on the toolserver
to turn this into a cron job on that server, so for now it is run by
hand).

Anyhow, feedback appreciated :)

[Also, the source isn't public at the moment, but if anyone wants it
let me know. It is a fairly hacky PHP script for now.]

