Re: [Wikitech-l] Linker::link() rewrite

2016-05-16 Thread Chris Steipp
Is there any way we can default to having the body of the link not be
passed as HTML? It's called $html, well documented that it's raw HTML, and
I've lost track of the number of times people pass unsanitized text to it.
I'd rather it not be something developers have to worry about unless they
know they need to handle the sanitization themselves. Maybe typehint a
Message object, and people can add raw params if they really need to send
it raw HTML?
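
As a quick illustration of the hazard (a sketch; $label and 'some-key' are
made-up names, not anything from the patch):

  // $label comes from user input; Linker::link() inserts $html verbatim:
  Linker::link( $title, $label );                     // XSS risk
  Linker::link( $title, htmlspecialchars( $label ) ); // escaped first, safe
  // or route it through a Message, as suggested:
  Linker::link( $title, wfMessage( 'some-key' )->escaped() );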

On Sun, May 15, 2016 at 4:28 PM, Legoktm 
wrote:

> Hi,
>
> For the past few weeks I've been working[1] on rewriting Linker::link()
> to be non-static, use LinkTarget/TitleValue and some of the other fancy
> new services stuff. Yay!
>
> For the most part, you'd use it in similar ways:
>  Linker::link( $title, $html, $attribs, $query );
> is now:
>  $linkRenderer = MediaWikiServices::getInstance()
>->getHtmlPageLinkRenderer();
>  $linkRenderer->makeLink( $title, $html, $attribs, $query );
>
> And there are makeKnownLink() and makeBrokenLink() entry points as well.
>
> Unlike Linker::link(), there is no $options parameter to pass in every
> time a link needs to be made. Those options are set on the
> HtmlPageLinkRenderer instance, and then applied to all links made using
> it. MediaWikiServices has an instance using the default settings, but
> other classes like Parser will have their own that should be used[2].
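>
> Roughly (see [2] for the exact API; the parser accessor named here is
> illustrative):
>
>  // default instance, standard settings:
>  $linkRenderer = MediaWikiServices::getInstance()
>->getHtmlPageLinkRenderer();
>  // parser code uses the parser's own configured instance:
>  $linkRenderer = $parser->getLinkRenderer();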
>
> I'm also deprecating the two hooks called by Linker::link(), LinkBegin
> and LinkEnd. They are being replaced by the mostly-equivalent
> HtmlPageLinkRendererBegin and HtmlPageLinkRendererEnd hooks. More
> details are in the commit message. [3] is an example conversion for
> Wikibase.
>
> The commit is still a WIP because I haven't gotten around to writing
> specific tests for it (it passes all the pre-existing Linker and parser
> tests though!), and will be doing that in the next few days.
>
> Regardless, reviews / comments / feedback on [1] are appreciated!
>
> [1] https://gerrit.wikimedia.org/r/#/c/284750/
> [2] https://gerrit.wikimedia.org/r/#/c/288572/
> [3] https://gerrit.wikimedia.org/r/#/c/288674/
>
> -- Legoktm
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Reviving SVG client-side rendering task

2016-05-11 Thread Chris Steipp
On Thu, May 5, 2016 at 6:49 AM, Brion Vibber  wrote:

>
> And then there are long term goals of taking more advantage of SVG's dynamic
> nature -- making things animated or interactive. That's a much bigger
> question and has implementation and security issues!


Sorry for the late response (and if this is covered in one of the linked
bugs), but getting back to this-- are you envisioning SVGs inlined into
the HTML? Or would they be served from a domain like upload.wm.o, like we
currently use for uploaded images?
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] REL1_27 branches up

2016-05-05 Thread Chris Steipp
On Thu, May 5, 2016 at 8:50 AM, Chad  wrote:

> On Thu, May 5, 2016 at 8:19 AM Gergo Tisza  wrote:
>
> > On Thu, May 5, 2016 at 4:31 PM, Chad  wrote:
> >
> > > Well then it sounds like it won't make the 1.27 release. We've known
> > > this branching was coming for the last 6 months :)
> > >
> >
> > Is there a way to do a backport before the release candidate gets
> > published? The problem with doing major changes after an LTS release is
> > that both the old and the new version have to be maintained for the full
> > support cycle.
> >
> >
> If it can get done in time, sure. I know there's a strong desire to land it
> prior to the release but we can't hold it up forever :)
>
>
There were very minor changes I suggested to the patches that Gergo was
cleaning up earlier this week (like, can we not call the API module
ResetPassword when the special page is PasswordReset). At this point, I'm
fine if it just gets merged.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Docs, use of, and admin privileges for wikimedia github project?

2016-04-25 Thread Chris Steipp
On Mon, Apr 25, 2016 at 8:34 AM, Bryan Davis  wrote:

> Not that I am aware of. Rights there tend to work a lot like getting
> elevated rights on mediawiki.org: the rights are handed out by
> existing admins when somebody asks for something that will be easily
> solved by giving them rights. I think there was some amount of cleanup
> done a few months ago that got admins to either add 2FA to their
> GitHub account or be removed.
>

Correct, all admins should have two-factor set up. I believe everyone who is
an admin there has +2 in Gerrit, and a reason to have the rights in GitHub.
I'd propose those 3 things as a minimal standard, since I don't think we
ever defined one.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Wikitech two-factor authentication

2016-03-26 Thread Chris Steipp
Hi all,

tl;dr: if you enabled two-factor authentication on your
wikitech.wikimedia.org account this past week (since 23 March, 22:03 UTC),
the second factor may have been removed, and you should re-enable it.

The long version:
Several users in the past few days reported that they had 2FA required for
their wikitech account, but had not enabled it. This was my fault-- we
converted the database token format and didn't account for users who had
previously clicked on "enable two-factor authentication" but never
finished the enabling process. These users (about 400, as best as we can
tell) were unfortunately locked out of their accounts as a result, after we
updated the token format.
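
For future reference, the kind of guard the conversion was missing, as a
sketch (column and function names here are hypothetical):

  // Only migrate tokens for users who completed enrollment by confirming
  // a code; clicking "enable" alone leaves a half-initialized row.
  foreach ( $rows as $row ) {
      if ( !$row->is_validated ) {
          continue; // started, but never finished, enabling 2FA
      }
      migrateTokenFormat( $row );
  }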

I've disabled 2FA for accounts that looked like they were affected by our
update. Unfortunately, this may also have disabled 2FA for users who
recently enabled it. So if you enabled 2FA on your account this past week,
you may need to re-enable it-- please log in to wikitech and verify that
2FA is still enabled for your account.

Apologies to everyone affected.

Chris
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] How to do redirect 'the right way' when OutputPage::prepareErrorPage is triggered

2016-03-07 Thread Chris Steipp
On Mon, Mar 7, 2016 at 10:32 AM, Victor Danilchenko <
vdanilche...@cimpress.com> wrote:

> My simple solution to this is to forcibly invoke OutputPage::Output on the
> spot, right there in the 'BeforeInitialize' hook:
>
> $this->output->redirect($https_url, 301);
> $this->output->output();
>

That's pretty much how we do the https redirect in MediaWiki::main() (and a
few other places). So while it's ugly, that's pretty standard, and should
work for you in the foreseeable future.
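
Roughly, the whole hook would look like this (a sketch; the HTTPS check and
the URL construction are illustrative, not the exact core code):

  public static function onBeforeInitialize(
      &$title, $article, $output, $user, $request, $mediaWiki
  ) {
      if ( $request->getProtocol() !== 'https' ) {
          $url = wfExpandUrl( $request->getRequestURL(), PROTO_HTTPS );
          $output->redirect( $url, 301 );
          $output->output(); // send the redirect right away
          exit; // skip the rest of the page processing
      }
  }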
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Unable to log into phabricator

2016-01-29 Thread Chris Steipp
Hi Devang,

I see from https://phabricator.wikimedia.org/p/dg711/ that the MediaWiki
account you're associated with is
https://www.mediawiki.org/wiki/User:Devang_gaur. Just making sure that's
the account you're logging in with on wiki, right?

Due to issues with SessionManager on wiki, you might try deleting all your
wiki cookies and logging in on wiki again, just to make sure that's not an
issue.

On Fri, Jan 29, 2016 at 12:27 PM, Devang Gaur  wrote:

>  I registered into Phabricator with my MediaWiki account initially. Now
> when I opt for "login or register via mediawiki", it takes me to a new
> account registration page, whereas I have an existing Phabricator account
> (@dg711) already linked with that same MediaWiki account. Is that a bug or
> what?
>
> Please help me resolve this .
>
> Thanks ,
> Devang Gaur
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Tech Talk: Secure Coding For MediaWiki Developers: December 09

2015-12-09 Thread Chris Steipp
Just a reminder this is starting in one hour!

On Thu, Dec 3, 2015 at 1:54 PM, Rachel Farrand 
wrote:

> Please join for the following tech talk:
>
> *Tech Talk**:* Secure Coding For MediaWiki Developers
> *Presenter:* Darian Patrick
> *Date:* December 09, 2015
> *Time: *23:00 UTC
> <http://www.timeanddate.com/worldclock/fixedtime.html?msg=Tech+Talk%3A+Secure+Coding+For+MediaWiki+Developers&iso=20151209T2330&p1=1440&ah=1>
> Link to live YouTube stream 
> *IRC channel for questions/discussion:* #wikimedia-office
> Google+ page
> <https://plus.google.com/u/0/b/103470172168784626509/events/cv74aqvumuu4k4cflh0gcfcv33g>,
> another place for questions
>
> *Summary: *This talk will present material to aid MediaWiki developers in
> secure programming practices. The talk will draw on information from
> https://www.mediawiki.org/wiki/Security_for_developers, the CWE Top 25 (
> https://cwe.mitre.org/top25/), and other resources, in an attempt to
> elucidate topics pertinent to core and extension developers.
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] The case for a MediaWiki LTS release

2015-12-03 Thread Chris Steipp
On Thursday, December 3, 2015, Chad  wrote:

> On Thu, Dec 3, 2015 at 1:25 AM Legoktm wrote:
>
> > I think it would be helpful if other people who use LTS could share
> > their motivations for doing so, and if the release/security teams could
> > share what issues make LTS release support problematic or difficult (a
> > few things have been mentioned on IRC, but I'll let those people speak
> > for themselves).
> >
> >
> The main problem with supporting LTS in security releases is that after
> they start to get old, backporting of those patches becomes a real chore
> for each release. 1.19 especially required a *lot* of wrangling on each
> release to get things back-portable because of the amount of code that
> had changed in the meantime. Most of the comments I've made re:
> LTSes have happened when working on a nasty backport, so take them
> with a grain of salt.
>

When I was doing backports to 1.19, one of the most labor-intensive parts
was having to ensure the patch was PHP 5.2 compatible, when the security
patches included 5.3 syntax in the other branches. So if we do LTSes, I
would vote that we try to keep the PHP version the same for the LTS and the
following 3 non-LTS releases, if possible.
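
For example (illustrative of the kind of rewriting involved):

  // PHP 5.3+ syntax that a PHP 5.2 branch like 1.19 could not parse:
  $double = function ( $x ) { return $x * 2; };
  // the 5.2-compatible form the backport had to use instead:
  $double = create_function( '$x', 'return $x * 2;' );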

I personally like the idea of LTS. The last non-work wiki I set up I
initially installed with 1.23, in the hopes that it would be stable with no
new features. It was for collaboration on a specific project, and the users
were not Wikipedians-- they didn't want new features, they just wanted it to
work, and to be able to add stuff there once they had figured out how to
use it. In the end, they also wanted VE, so I had to install a -wmf branch
that mostly worked with the current Parsoid, and I pretty much didn't touch
it after I got it working except to apply any security patches that
addressed an actual risk.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Peer-to-peer sharing of the content of Wikipedia through WebRTC

2015-11-30 Thread Chris Steipp
On Sat, Nov 28, 2015 at 1:36 PM, Yeongjin Jang 
wrote:

> > *Privacy concerns - Would a malicious person be able to force
> > themselves to be someone's preferred peer, and spy on everything they
> > read, etc.
> >
> > *DOS concerns - Would a malicious peer or peers be able to prevent an
> > honest user from connecting to the website? (I didn't look in detail
> > at how you select peers and handle peer failure, so I'm not sure if
> > this applies)
> >
> >
> Nice points! For privacy, we want to implement a k-anonymity scheme on
> page access. However, it incurs more bandwidth consumption and
> potential performance overhead on the system.
>
> Malicious peers can act as if they hold legitimate content
> (while actually not), or make null requests to the peers.
> We are currently thinking about blacklisting such malicious peers,
> and live migration of mirror/peer servers if they fail,
> but a more fundamental remedy is required.



Those are interesting ideas, although I'm skeptical you're going to be able
to successfully keep malicious peers from tracking users' reading habits,
in the same way that law enforcement tracks BitTorrent downloads. But it
would be great to hear the proposals you come up with.

I haven't looked at the code, but are you also preventing malicious peers
from modifying the content?
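
For illustration, this is the kind of check I mean, sketched in PHP even
though the peer client would presumably be JavaScript (all names made up):

  // Verify peer-served content against a hash published by the origin
  // server over a trusted channel, before rendering it.
  $expected = $trustedManifest['sha256'][$pageId];
  if ( !hash_equals( $expected, hash( 'sha256', $receivedHtml ) ) ) {
      // discard the peer's copy and fall back to the origin server
  }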
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Gerrit +1 now executes the code you reviewed

2015-11-17 Thread Chris Steipp
Just to clarify, this is a +1 from a user who has +2 rights? Whereas a +1
from some random user will not initiate the tests?

On Tue, Nov 17, 2015 at 10:20 AM, Jan Zerebecki 
wrote:

> I just merged and deployed https://gerrit.wikimedia.org/r/#/c/184886/ ,
> which means:
> A +1 in gerrit.w.o didn't have any technical effect until now. Now it
> submits the patch for testing. That means if you +1 a patch from a
> non-whitelisted user that was not yet tested, it will then be tested, just
> as if recheck was issued. This executes the code, so review it carefully
> to make sure it does not steal secrets or compromise security in other
> ways.
>
> --
> Regards,
> Jan Zerebecki
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Random rant

2015-10-28 Thread Chris Steipp
On Wed, Oct 28, 2015 at 9:10 AM, Aaron Halfaker 
wrote:

> Is there a clearly good reason that we need to continue this review
> process?  If not, I find it very frustrating that we're slowing things down
> so much because of imagined boogie-men.  The idea of
> permission-just-in-case-someone-does-a-bad-thing is opposed to the wiki
> model of keeping things as open as possible and addressing problems as they
> happen.  In the meantime, we're encouraging bad behavior by making the
> OAuth system such a pain to work with.  I understand that you're doing this
> in your free time csteipp, but the pain of delays is still inflicted on
> tool developers all the same.  Maybe it is inappropriate that such a key
> infrastructure (and official requirement for Labs-based tools) is left up
> to volunteer time of someone who is apparently overworked.
>
>
I'm very happy for other people to join this process. I believe there's an
open bug about making approvals automatic for non-controversial rights.
Patches welcome.


>1. How long is this transition process supposed to take?
>

Not defined yet.


>2. Should I start making my argument to the Stewards now?
>

About what? If you have something that's not controversial, ping one of the
admins, and I'm sure you can get your Consumer approved today.


>3. Is there a public conversation about this transition that I can
>participate in?
>
>
The RFC is the correct place. The Stewards are just getting back from
travelling so I don't think we've started updating it to account for our
conversations last week, but that is where we will work out the details.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Random rant

2015-10-28 Thread Chris Steipp
On Tue, Oct 27, 2015 at 11:23 PM, Brian Wolff  wrote:

> On 10/27/15, Ricordisamoa  wrote:
> > ALL of my OAuth applications expired without anyone noticing. Whom am I
> > supposed to lobby to get one approved?
> >
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
> I suppose these people:
>
> https://meta.wikimedia.org/w/index.php?title=Special%3AListUsers&username=&group=oauthadmin&limit=50


Yes, bug one of us for now. I talked with the Stewards about taking on the
process last week, and we're in the process of making that transition.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] OAuth issue -- adding new consumer

2015-10-16 Thread Chris Steipp
Ivo,

Can you maybe describe what issues you're having? There are several people
who can help with OAuth, but finding the right person, based on what
language your Consumer is written in, what framework you're using, or the
exact issue you're having, will be easier with more details.

On Fri, Oct 16, 2015 at 11:20 AM, Jon Katz  wrote:

> On Thu, Oct 15, 2015 at 12:48 PM, Jon Katz  wrote:
>
> > haha, awesome.  I'll actually take a look :)
>
>
> Scratch that last comment.  My wires got crossed.  I am not the person to
> talk to about OAuth.  I am, however, a product manager on the reading team
> interested in exploring similar issues: ways for readers to interact with
> content and ask questions.
>
> I played around with the site yesterday and would love to chat with you
> guys about what you're doing and your goals.
> -J
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] LDAP extension ownership

2015-09-21 Thread Chris Steipp
On Sep 19, 2015 11:15 AM, "bawolff"  wrote:
>
> maintain is an ambiguous word. WMF has some responsibility to all the
> extensions deployed on cluster (imo). If Devunt (and any others who
> were knowledgeable of the Josa extension) disappeared, WMF would
> default to becoming responsible for the security and critical issues
> in the extension (However, I wouldn't hold them responsible for
> feature requests or minor bugs).
>
> LDAP is used on wikitech, and some similar services. It would be nice
> if teams that most directly interact with the extension (I suppose
> that's labs, maybe security) help with maintenance [Maybe they already
> do]. I don't necessarily think they have a responsibility to (beyond
> critical issues, security, etc), but if the teams in question aren't
> too busy, it is always nice to give back to projects that we use.

Ideally the security team would take on this extension. Unfortunately we
just don't have the capacity to address anything except significant
security issues right now.

>
> --
> -bawolff
>
> On Sat, Sep 19, 2015 at 5:25 AM, Yongmin Hong  wrote:
> > Deployed on the wmf cluster does not necessarily mean wmf has to maintain
> > it. A simple example: [[mw:Extension:Josa]]. It's maintained by a 3rd-party
> > developer independent from WMF. For example, (IIRC/AFAIK,) WMF has no
> > staff with Korean knowledge.
> >
> > [[Extension:Josa]]: https://www.mediawiki.org/wiki/Extension:Josa
> >
> > --
> > revi
> > https://revi.me
> > -- Sent from Android --
> > 2015. 9. 19. 오후 5:27에 "Thomas Mulhall" 님이
작성:
> >
> >> Since this is deployed on wikitech, it should be maintained by
> >> Wikimedia, since it is unmaintained and it's better that Wikimedia
> >> maintain it because they have more staff that can review, and because
> >> they have more experience in doing reviews and code review.
> >>
> >>
> >>  On Saturday, 19 September 2015, 8:39, Keegan Peterzell <
> >> kpeterz...@wikimedia.org> wrote:
> >>
> >>
> >>  On Sat, Sep 19, 2015 at 2:13 AM, Keegan Peterzell <
> >> kpeterz...@wikimedia.org>
> >> wrote:
> >>
> >> >
> >> >
> >> > On Sat, Sep 19, 2015 at 2:03 AM, Risker  wrote:
> >> >
> >> >> Well, bluntly put, since LDAP is how most non-WMF staff sign into
> >> >> phabricator, I'd say it's become an essential extension.
> >> >>
> >> >
> >> Even more technically, this (LDAP) is for people committing to Gerrit
> >> and adding to Wikitech. These LDAP accounts can tie in CentralAuth
> >> through OAuth.
> >>
> >> But again, yes, LDAP should be some sort of "maintained" in the sense
> >> that Greg G. describes, and I think it will be.
> >>
> >> --
> >> Keegan Peterzell
> >> Community Liaison, Product
> >> Wikimedia Foundation
> >> ___
> >> Wikitech-l mailing list
> >> Wikitech-l@lists.wikimedia.org
> >> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> >>
> >>
> >> ___
> >> Wikitech-l mailing list
> >> Wikitech-l@lists.wikimedia.org
> >> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [ Writing a MediaWiki extension for deployment ]

2015-07-07 Thread Chris Steipp
On Tue, Jul 7, 2015 at 9:17 AM, Paula  wrote:

> Hello again,
> May I have the contact of somebody from the development team behind the
> OAuth extension?
>

Hi Paula, I'm one of the developers on that extension. As bawolff said,
feel free to ask here. If you're curious about something, someone else
probably is too, so let's keep the conversation in public.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] [MediaWiki-announce] MediaWiki bug fix release 1.25.1

2015-05-25 Thread Chris Steipp
Hello everyone,

The ConfirmEdit extension in the 1.25.0 tarball contained a syntax error in
two JSON files. We deeply apologize for this error, and thanks to Paul
Villiger for reporting the issue. A new 1.25.1 tarball has been released
which fixes the issue. Users using git can update to the latest REL1_25
branch.

Full release notes:
https://phabricator.wikimedia.org/diffusion/MW/browse/REL1_25/RELEASE-NOTES-1.25
https://www.mediawiki.org/wiki/Release_notes/1.25

**
Download:
http://download.wikimedia.org/mediawiki/1.25/mediawiki-1.25.1.tar.gz
http://download.wikimedia.org/mediawiki/1.25/mediawiki-core-1.25.1.tar.gz

GPG signatures:
http://download.wikimedia.org/mediawiki/1.25/mediawiki-core-1.25.1.tar.gz.sig
http://download.wikimedia.org/mediawiki/1.25/mediawiki-1.25.1.tar.gz.sig

Public keys:
https://www.mediawiki.org/keys/keys.html
___
MediaWiki announcements mailing list
To unsubscribe, go to:
https://lists.wikimedia.org/mailman/listinfo/mediawiki-announce
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] sshd config: using newer ciphers and protocols

2015-05-22 Thread Chris Steipp
On Fri, May 22, 2015 at 1:37 PM, MZMcBride  wrote:

> Re: , do you know if there's any
> documentation about what has replaced agent forwarding for deployments?
>

It's been replaced by having deployers use a shared SSH agent (accessed
through a proxy to log usage and limit the capabilities). You can look
through modules/keyholder in Puppet for more details.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Welcome Darian Patrick

2015-05-19 Thread Chris Steipp
Hi all,

I'd like to introduce Darian Anthony Patrick, our new Application Security
Engineer for the foundation! Darian joins me as a member of the newly
formed Security Team. He comes from Aspect Security, where he provided
code/architecture reviews and pen testing to large national and
international financial institutions. Darian will be working remotely from
Portland, OR. You can find him on irc as dapatrick. Darian will focus on
maintaining and improving the security of MediaWiki and other software at
the WMF.

In his own words,

"I'm super excited to join the organization, and I look forward to working
with you all."

Welcome Darian!
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Why doesn't en.m.wikipedia.org allow framing?

2015-05-15 Thread Chris Steipp
On May 15, 2015 2:14 PM, "Jacek Wielemborek"  wrote:
>
> Hello,
>
> I tried to discuss this on #wikimedia-mobile on Freenode, but nobody
> could explain this to me:
>
> I'm building a website that allows the users to view Wikipedia changes
> correlated to rDNS names of their editors, and I wanted to implement a
> "random mode" that allows them to see all edits made by a given rDNS
> domain - the user would just press F5 and see the editor in context like
> this:
>
> http://wikispy.wmflabs.org/by_rdns_random/plwiki/.gov.pl
>
> I would definitely prefer to use the mobile version of Wikipedia though
> or at least Special:MobileEdit, but both disallow framing. Is there any
> specific reason for that? I would guess that this is for security, but I
> have to admit I don't know what could be gained by showing the
> MobileDiff in a frame.

We're trying to avoid various clickjacking and UI-redressing attacks. If
you prefill an edit form and position the iframe so it only shows the
submit button below a "comment form" on your website, you can get other
people to submit your vandalism.

It would be great if someone compiled the styles so that you could pull the
HTML via the API and have everything look right. But I don't know if anyone
has done that.
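
Something like this, using the standard action=parse module (the page name
is just an example):

  $url = 'https://en.wikipedia.org/w/api.php'
      . '?action=parse&page=Example&format=json';
  $data = json_decode( file_get_contents( $url ), true );
  // the page body HTML, ready to be wrapped in your own styles:
  $html = $data['parse']['text']['*'];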

>
> Cheers,
> d33tah
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Social-media] Improving the security of our users on Wikimedia sites

2015-04-27 Thread Chris Steipp
On Mon, Apr 27, 2015 at 2:32 PM, Strainu  wrote:

> 2015-04-27 18:51 GMT+03:00 Chris Steipp :
> > Hi Strainu,
>
> Thanks for the additional information Chris!
>
> >
> > We were trying to balance how much data vs summary information to give to
> > people, but you can find the issues vs. resolution table here:
> >
> >
> https://www.mediawiki.org/wiki/Wikimedia_Security_Team/Check/iSEC_Assessment_2014
>
> I was happy to see that some of the issues found by iSEC were
> previously identified in-house. However, I couldn't help noticing that
> https://phabricator.wikimedia.org/T64685 lingered for almost a year
> (or more, I'm not sure if the Bugzilla import kept the dates) and the
> patch was only merged after this report found the bug as well. We all
> know the WMF is slow in merging patches, especially from outsiders,
> but shouldn't you have *some* guidelines (or preferably rules) on the
> time that a security bug can stay open?
>

Hi Strainu,

I specifically addressed this in the "What did we learn?" section of the
blog post. We are hiring to make sure we can address issues faster.

In this particular case, if we had any indication that the issue was known
outside the WMF, or being abused, I would have raised the priority and
addressed it ahead of other issues that I'm working on.



> I'm not trying to start yet another endless fight between WMF staff
> and the community, but the tendency in the last few years has been for
> researchers to release vulnerabilities after a certain time, regardless
> of whether the software has been patched or not. Seeing MW exploits
> in the wild and knowing that developers had a chance to fix them and
> didn't does not help WMF's image.
>
> Regards,
>Strainu
>
>
> >
> > For the issue you pointed out in particular, we have
> > https://phabricator.wikimedia.org/T85856 where you can follow the
> > discussion. The end result was that this was a low-severity issue; we're
> > definitely not going to do away with user JavaScript. Instead we may add
> > a warning if we can find a useful UX experience for the user.
> >
> > On Mon, Apr 27, 2015 at 8:35 AM, Strainu  wrote:
> >
> >> I personally find one of the suggestions in the report worrying:
> >>
> >> "Eliminate custom CSS/JavaScript. iSEC found multiple issues with the
> >> custom JavaScript system.
> >> This system appears to pose significant risk for relatively small
> >> benefit. As such, iSEC recommends
> >> that Wikimedia Foundation deprecate this functionality and allow users
> >> instead to customize their
> >> experience on the client side using browser extensions such as
> >> Greasemonkey or Tampermonkey."
> >>
> >> This is related to one of the problems identified by the team: "Users
> >> can inspect each other's personal JavaScript"
> >>
> >> While the custom JS is used by a relatively small number of users, the
> >> ability to learn and copy another user's scripts has played an
> >> important part in the development(and maintenance) of scripts that are
> >> now considered essential by many Wikimedians (twinkle and wikied come
> >> to mind).
> >>
> >> Furthermore, replacing those script with Greasemonkey scripts would
> >> lead to a "black market" of Wiki-scripts shared through channels
> >> external to our sites. Those scripts would be even more prone to
> >> social engineering attacks and could endanger our user's security.
> >>
> >> I would like to know if the WMF is indeed considering completely
> >> dropping the custom JS feature and if so, what is the timeline for
> >> this change?
> >>
> >> Thanks,
> >>Strainu
> >>
> >> 2015-04-21 4:41 GMT+03:00 Pine W :
> >> > Thanks for your work on this, Chris.
> >> >
> >> > Forwarding to Wikitech-l.
> >> >
> >> > Pine
> >> > On Apr 20, 2015 4:58 PM, "Chris Steipp" 
> wrote:
> >> >
> >> >>
> >> >> On Apr 20, 2015 4:13 PM, "Andrew Sherman" 
> >> wrote:
> >> >> >
> >> >> > Hello Everyone,
> >> >> >
> >> >> > We just published "Improving the security of our users on Wikimedia
> >> >> sites" to the blog. URL:
> >> >> >
> >> >> >
> >> https://blog.wikimedia.org/2015/04/20/improving-security-for-our-users/
> >> >> >

Re: [Wikitech-l] [Social-media] Improving the security of our users on Wikimedia sites

2015-04-27 Thread Chris Steipp
Hi Strainu,

We were trying to balance how much data vs summary information to give to
people, but you can find the issues vs. resolution table here:

https://www.mediawiki.org/wiki/Wikimedia_Security_Team/Check/iSEC_Assessment_2014

For the issue you pointed out in particular, we have
https://phabricator.wikimedia.org/T85856 where you can follow the
discussion. The end result was that this was a low-severity issue; we're
definitely not going to do away with user JavaScript. Instead we may add a
warning if we can find a useful UX experience for the user.

On Mon, Apr 27, 2015 at 8:35 AM, Strainu  wrote:

> I personally find one of the suggestions in the report worrying:
>
> "Eliminate custom CSS/JavaScript. iSEC found multiple issues with the
> custom JavaScript system.
> This system appears to pose significant risk for relatively small
> benefit. As such, iSEC recommends
> that Wikimedia Foundation deprecate this functionality and allow users
> instead to customize their
> experience on the client side using browser extensions such as
> Greasemonkey or Tampermonkey."
>
> This is related to one of the problems identified by the team: "Users
> can inspect each other's personal JavaScript"
>
> While the custom JS is used by a relatively small number of users, the
> ability to learn and copy another user's scripts has played an
> important part in the development(and maintenance) of scripts that are
> now considered essential by many Wikimedians (twinkle and wikied come
> to mind).
>
> Furthermore, replacing those script with Greasemonkey scripts would
> lead to a "black market" of Wiki-scripts shared through channels
> external to our sites. Those scripts would be even more prone to
> social engineering attacks and could endanger our user's security.
>
> I would like to know if the WMF is indeed considering completely
> dropping the custom JS feature and if so, what is the timeline for
> this change?
>
> Thanks,
>Strainu
>
> 2015-04-21 4:41 GMT+03:00 Pine W :
> > Thanks for your work on this, Chris.
> >
> > Forwarding to Wikitech-l.
> >
> > Pine
> > On Apr 20, 2015 4:58 PM, "Chris Steipp"  wrote:
> >
> >>
> >> On Apr 20, 2015 4:13 PM, "Andrew Sherman" 
> wrote:
> >> >
> >> > Hello Everyone,
> >> >
> >> > We just published "Improving the security of our users on Wikimedia
> >> sites" to the blog. URL:
> >> >
> >> >
> https://blog.wikimedia.org/2015/04/20/improving-security-for-our-users/
> >> >
> >> > Thanks to Chris for writing and helping us edit this post.
> >> >
> >> > Below are some proposed social media messages. Tweak as needed.
> >> >
> >> > Twitter
> >> >
> >> We teamed up with @iSECPartners and @OpenTechFund to assess the
> >> security of our sites. Check out the report here [link]
> >> >
> >> > FB/G+
> >> >
> >> We teamed up with iSEC Partners to assess the security of our sites and
> >> protect the privacy of our users. Their engineers developed attacks
> >> against the current version of MediaWiki to identify security flaws, in
> >> a new report sponsored by the Open Technology Fund. [link]
> >>
> >> Maybe just "MediaWiki" instead of "the current version of MediaWiki",
> >> since we did a release to specifically fix issues that they found. Might
> >> confuse some people as is.
> >>
> >> >
> >> > Thanks,
> >> > --
> >> > Andrew Sherman
> >> > Digital Communications | Wikimedia Foundation
> >> >
> >> > E: asher...@wikimedia.org
> >> > WMF: ASherman (WMF)
> >>
> >> ___
> >> Social-media mailing list
> >> social-me...@lists.wikimedia.org
> >> https://lists.wikimedia.org/mailman/listinfo/social-media
> >>
> >>
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] MediaWiki Security and Maintenance Releases: 1.19.24, 1.23.9, and 1.24.2

2015-03-31 Thread Chris Steipp
I would like to announce the release of MediaWiki 1.24.2, 1.23.9 and
1.19.24. These releases fix 10 security issues, in addition to other bug
fixes. Download links are given at the end of this email.


== Security fixes ==

* iSEC Partners discovered a way to circumvent the SVG MIME blacklist for
embedded resources (iSEC-WMF1214-11). This allowed an attacker to embed
JavaScript in the SVG. The issue was additionally identified by Mario
Heiderich / Cure53. MIME types are now whitelisted.


* MediaWiki user Bawolff pointed out that the SVG filter to prevent
injecting JavaScript using animate elements was incorrect.


* MediaWiki user Bawolff reported a stored XSS vulnerability due to the way
attributes were expanded in MediaWiki's Html class, in combination with
LanguageConverter substitutions.


* Internal review discovered that MediaWiki's SVG filtering could be
bypassed with entity encoding under the Zend interpreter. This could be
used to inject JavaScript. This issue was also discovered by Mario Gomes
from Beyond Security.


* iSEC Partners discovered an XSS vulnerability in the way API errors were
reflected when running under HHVM versions before 3.6.1 (iSEC-WMF1214-8).
MediaWiki now detects and mitigates this issue on older versions of HHVM.


* Internal review and iSEC Partners discovered (iSEC-WMF1214-1) that
MediaWiki versions using PBKDF2 for password hashing (the default since
1.24) are vulnerable to DoS attacks using extremely long passwords.


* iSEC Partners discovered that MediaWiki's SVG and XMP parsing, running
under HHVM, was susceptible to "Billion Laughs" DoS attacks
(iSEC-WMF1214-13).


* Internal review found that MediaWiki is vulnerable to "Quadratic Blowup"
DoS attacks, under both HHVM and Zend PHP.


* iSEC Partners discovered a way to bypass the style filtering for SVG
files (iSEC-WMF1214-3). This could violate the anonymity of users viewing
the SVG.


* iSEC Partners reported that the MediaWiki feature allowing a user to
preview another user's custom JavaScript could be abused for privilege
escalation (iSEC-WMF1214-10). This feature has been removed.



Additionally, the following extensions have been updated to fix security
issues:

* Extension:Scribunto - MediaWiki user Jackmcbarn discovered that function
names were not sanitized in Lua error backtraces, which could lead to XSS.


* Extension:CheckUser - iSEC Partners discovered that the CheckUser
extension did not prevent CSRF attacks on the form allowing checkusers to
look up sensitive information about other users (iSEC-WMF1214-6). Since the
use of CheckUser is logged, the CSRF could be abused to defame a trusted
user or flood the logs with noise.



== Bug fixes ==

=== 1.24 ===

* Fix case of SpecialAllPages/SpecialAllMessages in SpecialPageFactory to
fix loading these special pages when $wgAutoloadAttemptLowercase is false.
* (bug T76254) Fix deleting of pages with PostgreSQL. Requires a schema
change and running update.php to fix.

=== 1.23 & 1.24 ===

* (bug T70087) Fix Special:ActiveUsers page for installations using
PostgreSQL.


**

Full release notes:
https://www.mediawiki.org/wiki/Release_notes/1.24
https://www.mediawiki.org/wiki/Release_notes/1.23
https://www.mediawiki.org/wiki/Release_notes/1.19

Download:
http://download.wikimedia.org/mediawiki/1.24/mediawiki-1.24.2.tar.gz
http://download.wikimedia.org/mediawiki/1.23/mediawiki-1.23.9.tar.gz
http://download.wikimedia.org/mediawiki/1.19/mediawiki-1.19.24.tar.gz

Patch to previous version:
http://download.wikimedia.org/mediawiki/1.24/mediawiki-1.24.2.patch.gz
http://download.wikimedia.org/mediawiki/1.23/mediawiki-1.23.9.patch.gz
http://download.wikimedia.org/mediawiki/1.19/mediawiki-1.19.24.patch.gz

GPG signatures:
http://download.wikimedia.org/mediawiki/1.24/mediawiki-1.24.2.tar.gz.sig
http://download.wikimedia.org/mediawiki/1.24/mediawiki-1.24.2.patch.gz.sig
http://download.wikimedia.org/mediawiki/1.23/mediawiki-1.23.9.tar.gz.sig
http://download.wikimedia.org/mediawiki/1.23/mediawiki-1.23.9.patch.gz.sig
http://download.wikimedia.org/mediawiki/1.19/mediawiki-1.19.24.tar.gz.sig
http://download.wikimedia.org/mediawiki/1.19/mediawiki-1.19.24.patch.gz.sig

Extensions:
http://www.mediawiki.org/wiki/Extension:Scribunto
http://www.mediawiki.org/wiki/Extension:CheckUser

Public keys:
https://www.mediawiki.org/keys/keys.html
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Pre-Release Announcement for MediaWiki 1.19.24, 1.23.9, 1.24.2

2015-03-30 Thread Chris Steipp
This is a notice that on Tuesday, March 31st between 21:00-22:00 UTC (2-3pm
PDT) Wikimedia Foundation will release security updates for current and
supported branches of the MediaWiki software. Downloads and patches will be
available at that time.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [GSoC] An enhanced cross-wiki watchlist as an OAuth tool - looking for mentors

2015-03-19 Thread Chris Steipp
If any potential mentors are worried about the OAuth piece, I can help with
that. Although I think OAuth is a pretty small piece of this project.

On Thu, Mar 19, 2015 at 5:21 AM, Quim Gil  wrote:

> (Jan is looking for GSoC mentors, and the deadline for submitting proposals
> with mentors is 27 March, next week. If you are interested, hurry up!
> https://www.mediawiki.org/wiki/Google_Summer_of_Code_2015 )
>
> On Thu, Mar 19, 2015 at 11:30 AM, Jan Lebert  wrote:
>
> > Hey everyone,
> >
> > I want to build a "better" watchlist as an OAuth tool - this would
> > include cross-wiki watchlist & notifications support as well as distinct
> > features like inline diffs. I see the opportunity here to experiment with
> > the design and do things differently without breaking any existing
> > workflows.
> >
> > I've made a basic prototype at https://tools.wmflabs.org/watchr/ which
> > currently orients itself largely around the MW watchlist design. It is
> > currently quite limited and only queries a list of hand-picked projects.
> > One of the things I would like to support is dynamic filtering (as
> > shown by the search box in the prototype).
> >
> > See https://phabricator.wikimedia.org/T92955 for a screenshot and more
> > details. It's built with an AngularJS frontend and a Python backend.
> >
> > I'm looking for possible mentors, anyone interested? Feel free to ping me
> > in IRC under the nick sitic, I idle around in the usual channels.
> >
> > Thanks
> > sitic
> >
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
>
>
>
> --
> Quim Gil
> Engineering Community Manager @ Wikimedia Foundation
> http://www.mediawiki.org/wiki/User:Qgil
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Tor proxy with blinded tokens

2015-03-11 Thread Chris Steipp
On Mar 11, 2015 2:23 AM, "Gergo Tisza"  wrote:
>
> On Tue, Mar 10, 2015 at 5:40 PM, Chris Steipp 
wrote:
>
> > I'm actually envisioning that the user would edit through the third
> > party's proxy (via OAuth, linked to the new, "Special Account"), so no
> > special permissions are needed by the "Special Account", and a standard
> > block on that username can prevent them from editing. Additionally,
> > revoking the OAuth token of the proxy itself would stop all editing by
> > this process, so there's a quick way to "pull the plug" if it looks like
> > the edits are predominantly unproductive.
> >
>
> I'm probably missing the point here but how is this better than a plain
> edit proxy, available as a Tor hidden service, which a 3rd party can set
> up at any time without the need to coordinate with us (apart from getting
> an OAuth key)? Since the user connects to them via Tor, they would not
> learn any private information; they could be authorized to edit via normal
> OAuth web flow (that is not blocked from a Tor IP); the edit would
> seemingly come from the IP address of the proxy so it would not be subject
> to Tor blocking.
>

Setting up a proxy like this is definitely an option I've considered. When
I did, I couldn't think of a good way to limit the types of accounts that
used it, or to come up with acceptable collateral I could keep from the
user, that would prevent enough spammers to keep it from being blocked
while staying open to people who needed it. The blinded token approach lets
the proxy rely on a trusted assertion about the identity, made by the
people whom it will impact if they get it wrong. That seemed like a good
thing to me.

However, we could substitute the entire blinding process with a public page
that the proxy posts to that says, "this user wants to use Tor to edit,
vote yes or no and we'll allow them based on your opinion". The proxy would
then only allow Tor editing by users with a passing vote.

That might be more palatable under enwiki's socking policy, with the risk
that if the user's IP has ever been revealed before (even if they went
through the effort of getting it deleted), there is still data to link them
to their real identity. The blinding breaks that correlation. But maybe it
is a more likely first step to actually getting Tor edits?

> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Tor proxy with blinded tokens

2015-03-10 Thread Chris Steipp
On Tue, Mar 10, 2015 at 5:06 PM, Kevin Wayne Williams <
kwwilli...@kwwilliams.com> wrote:

> Wikipedia isn't worth endangering oneself over, and we shouldn't encourage
> the delusion that any technical measure will change that.


How do you know today what topics are going to endanger you next week?
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Tor proxy with blinded tokens

2015-03-10 Thread Chris Steipp
On Tue, Mar 10, 2015 at 2:58 PM, Risker  wrote:

> >
> > 
> > >
> > > Also, I'm a little unclear about something. If a "Tor-enabled"
> > > account
> > > creates new accounts, will those accounts be able to edit through Tor,
> > > too?
> >
> > The account creation would come from the proxy, so the wiki would have to
> > trust that the proxy is only handing out accounts to users who have been
>
>
Sorry about that, meant to hit save instead of send.

What I was going to say is that no, there shouldn't be a way for the
"Special Account" to even create child accounts through Tor. We can limit
that via OAuth, and we'll also have to trust the proxy to behave correctly.
If it looked like the "Special Accounts" were creating child accounts
through the proxy, I think that would be a reason to block the proxy.

I think we had different ideas about how the user would edit, which I've
addressed below. Happy to clarify if that doesn't make sense.


> > Sorry Chris, I seem to have been unclear.  For the purpose of responding
> to this, let's call the account created by the third party the "Special
> Account".  What I wanted to verify was whether or not child accounts
> created by the Special Account would also be conferred with the privileges
> of the Special Account (i.e., the ability to edit through Tor) or if they
> would be treated as any other newly created account.  Remember that all
> autoconfirmed accounts can create child accounts (I believe on enwiki it is
> throttled to 5 accounts per day, absent special permissions).
>
> To summarize the proposal as I understand it:
>
>- In addition to the existing process for experienced editors to obtain
>IPBE, which may vary from project to project, they could also request the
>creation of a new account, unlinked to their existing accounts, that will
>have the ability to edit via Tor.
>- The community will develop the process for approving which accounts
>will have this ability.  When granted, the user will be given a token
>- The user will take the token to a third party which will create for
>them a new account that has the requisite permissions to edit via Tor
>- The new, unlinked account will edit Wikipedia in the same manner as a
>regular user, subject to the same policies
>- There will be a process by which the token can be "broken" or removed
>from the account (still to be determined)
>

I'm actually envisioning that the user would edit through the third party's
proxy (via OAuth, linked to the new, "Special Account"), so no special
permissions are needed by the "Special Account", and a standard block on
that username can prevent them from editing. Additionally, revoking the
OAuth token of the proxy itself would stop all editing by this process, so
there's a quick way to "pull the plug" if it looks like the edits are
predominantly unproductive.


> In other words, the difference between the existing process and the
> proposed process is the addition of the third party and the deliberate
> separation of the two accounts.  (I'm trying to put this into plain
> language so that it can be explained to a broader audience on a project.)
>
> Do I have this right?
>
>
Almost! The accounts are deliberately separated so they can't be linked,
like you said. My proposal goes a little further by also restricting what
the accounts can do via this third-party proxy. For example, the proxy
could run each edit through the abuse filters, or another spam-scoring
service, before it even submits the edit, if we want to try and push spam
detection further upstream. It could have its own rate limits, and refuse
to serve users it feels might be seen as spammers and could get the whole
system shut down.
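
For instance (entirely hypothetical names, just to illustrate the shape of
it):

  // Proxy-side pre-check before an edit is forwarded to the wiki:
  $score = $spamScorer->score( $editText, $username );
  if ( $score > $threshold ) {
      return refuseEdit( 'rejected by proxy' ); // never reaches the wiki
  }
  $api->submitEdit( $oauthToken, $title, $editText );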

If the user tries to edit using the "Special Account" directly via Tor
(skipping the proxy), Torblock will correctly prevent them from doing
anything, just like it currently does.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Tor proxy with blinded tokens

2015-03-10 Thread Chris Steipp
On Mar 10, 2015 12:05 PM, "Risker"  wrote:
>
> Thanks for your responses, Chris. Regardless of what processes are
> proposed, I suspect that the strongest objections will be socially based
> rather than technically based.  Bawolff has a valid point, that success on
> a smaller wiki may have an effect on the social perception of the use of
> Tor on enwiki - but if it is started on another wiki, please ensure that
> there is actual community agreement and that there are sufficient
> administrators who are willing and able to promptly address any problems.
> We may have 700 wikis, but really only about 50-60 of them have sufficient
> daily activity and editorial community size to be able to manage any
> problems that might arise from this.
>
> In my experience, the majority of experienced editors who are asking for
> IPBE or something similar tend to be editing through VPNs that are
> hard-blocked for various reasons (most commonly spamming and/or heavy-duty
> vandalism - and if it's spamming, it's usually blocked at the global
> level).  There are some exceptions - particularly related to users working
> from countries where there are entirely valid security concerns (we could
> probably all recite the list). And IPBE does permit editing through Tor
> now.  Whether continuing with IPBE or providing an alternative, the user
> would still have to persuade the same administrators/community members of
> the legitimacy of their request.
>
> I cannot speak for the entire enwiki community (let alone any other
> community) about whether or not there would be acceptance for the idea of
> a user having two unlinked accounts, one "regular" account and one "Tor" one
> - given my role as a Checkuser I'm exposed to a much higher frequency of
> socking complaints than most community members - but given it's been darn
> hard to keep the community from flat-out banning multiple unlinked
> accounts,
> I have my doubts it will be greeted with open arms, even if it "works" on
> other wikis. (Pretty much the only exception that has received support is
> "editing in a high risk topic area", so there *may* be some support).
> Unfortunately, there's been plenty of history on enwiki of experienced
> users having multiple accounts that were used inappropriately, including
> administrator accounts, so that raises the bar even higher.
>
> Also, I'm a little unclear about something. If a "Tor-enabled" account
> creates new accounts, will those accounts be able to edit through Tor,
> too?

The account creation would come from the proxy, so the wiki would have to
trust that the proxy is only handing out accounts to users who have been

>
> Risker/Anne
>
> On 10 March 2015 at 14:33, Chris Steipp  wrote:
>
> > On Tue, Mar 10, 2015 at 10:39 AM, Risker  wrote:
> >
> > > A few questions on this:
> > >
> > >
> > >- So, this would result in the creation of a new account, correct?  If
> > >so, most of the security is lost by the enwiki policy of requiring
> > >linking to one's other accounts, and if the user edited in the same
> > >topic area as their other account, they're likely to be blocked for
> > >socking.  (This is a social limitation on the idea, not a technical one.)
> > >
> >
> > Registering a pseudonym through this process implies that you are an
> > existing editor (we could even limit the process to only one pseudonym
> > per existing account, so you know there's a 1-1 mapping), just not
> > linking to which one you are. Do you think enwiki would be open to
> > considering that?
> >
> >
> > >- Why would we permit more than one account?
> > >
> >
> > I was originally thinking that if something happened (forgotten
> > password, etc.), you could start over. But not a hard requirement.
> >
> >
> > >- It's not usually experienced editors who seem to have an issue on
> > >English projects; most of the huffing and puffing about Tor seems to
> > >come from people who are not currently registered/experienced editors,
> > >so the primary "market" is a group of people who wouldn't meet the
> > >proposed criteria.
> >
> >
> > There may not be enough intersection between users who we have some
> > trust in and those who want to edit via Tor. I'm hopeful that we can
> > define "established" to be some group that is large enough that it will
> > include productive editors who also should use Tor, but small enough to
> > preclude spammers.

Re: [Wikitech-l] Tor proxy with blinded tokens

2015-03-10 Thread Chris Steipp
On Tue, Mar 10, 2015 at 10:39 AM, Risker  wrote:

> A few questions on this:
>
>
>- So, this would result in the creation of a new account, correct?  If
>so, most of the security is lost by the enwiki policy of requiring linking
>to one's other accounts, and if the user edited in the same topic area as
>their other account, they're likely to be blocked for socking.  (This is a
>social limitation on the idea, not a technical one.)
>

Registering a pseudonym through this process implies that you are an
existing editor (we could even limit the process to only one pseudonym per
existing account, so you know there's a 1-1 mapping), just not linking to
which one you are. Do you think enwiki would be open to considering that?


>- Why would we permit more than one account?
>

I was originally thinking that if something happened (forgotten password,
etc.), you could start over. But not a hard requirement.


>- It's not usually experienced editors who seem to have an issue on
>English projects; most of the huffing and puffing about Tor seems to come
>from people who are not currently registered/experienced editors, so the
>primary "market" is a group of people who wouldn't meet the proposed
>criteria.


There may not be enough intersection between users who we have some trust
in and those who want to edit via Tor. I'm hopeful that we can define
"established" to be some group that is large enough that it will include
productive editors who also should use Tor, but small enough to preclude
spammers. I'm assuming if we start with some guideline, then we can adjust
up (if there's too much spam) or down (if there aren't enough users)
depending on the results.


>
>- On reading this over carefully, it sounds as though you're proposing
>what is essentially a highly technical IPBE process in which there is even
>less control than the project has now, particularly in the ability to
>address socking and POV/COI editing. Am I missing something?
>

In a way it is, but there are a couple of advantages over IPBE as I see it:
* Neither the WMF nor checkusers can correlate the identities, whereas with
IPBE, it's possible that a checkuser can still see the IP that created the
account requesting the IPBE. This is less control, but also less risk if
the WMF/checkuser is coerced into revealing that information.
* Hopefully it will be a less manual process, since the only manual step
(which could be automated if the right heuristics were found) is confirming
that the requesting user is "established". There are no further rights that
have to be granted and maintained.

It also gives slightly more control in that:
* We're not giving out the IPBE right
* The whole system can be blocked (hopefully temporarily) with a single
block or by revoking the OAuth key, if there is ever a sudden flood of spam

Admittedly, we could do all of this (except making the identities
unlinkable) by having an edit-via-Tor right that is different from IPBE,
but the unlinkability, I think, is important for our users.


>
> Risker/Anne
>
> On 10 March 2015 at 13:16, Giuseppe Lavagetto 
> wrote:
>
> > Hi Chris,
> >
> > I like the idea in general, in particular the fact that only
> > "established" editors can ask for the tokens. What I don't get is why
> > this proxy should be run by someone that is not the WMF, given - I
> > guess - it would be exposed as a Tor hidden service, which will
> > effectively mask the user's IP from us, and will secure their
> > communication from snooping by exit node managers, and so on.
> >
> > I guess the legitimate traffic on such a proxy would be so low (as
> > getting a token is /not/ going to be automated/immediate even for
> > logged in users) that it could work without using up a lot of
> > resources.
> >
> > Cheers,
> >
> > Giuseppe
> >
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> >
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Tor proxy with blinded tokens

2015-03-10 Thread Chris Steipp
On Tue, Mar 10, 2015 at 10:16 AM, Giuseppe Lavagetto <
glavage...@wikimedia.org> wrote:

> Hi Chris,
>
> I like the idea in general, in particular the fact that only
> "established" editors can ask for the tokens. What I don't get is why
> this proxy should be run by someone that is not the WMF, given - I
>

It's due to a known issue with the scheme that Yan suggested-- if the same
person knows both the blinded and unblinded signatures, they can brute
force the blinding and correlate the identities. Splitting the two is
needed to prevent that.


> guess - it would be exposed as a TOR hidden service, which will
> effectively mask the user's IP from us, and will secure his communication
> from snooping by exit node managers, and so on.
>
> I guess the legitimate traffic on such a proxy would be so low (as
> getting a token is /not/ going to be automated/immediate even for
> logged in users) that it could work without using up a lot of
> resources.
>
> Cheers,
>
> Giuseppe
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Tor proxy with blinded tokens

2015-03-10 Thread Chris Steipp
On Tue, Mar 10, 2015 at 7:45 AM, Kevin Wayne Williams <
kwwilli...@kwwilliams.com> wrote:

> Chris Steipp wrote on 2015/03/10 at 7:23:
>
>> Jacob Appelbaum made another remark about editing Wikipedia via tor this
>> morning. Since it's been a couple months since the last tor bashing
>> thread,
>> I wanted to throw out a slightly more modest proposal to see what people
>> think.
>>
>
> The easiest way to prevent a series of Tor bashing threads is to not make
> Tor promoting threads. At least for English Wikipedia, there is no reason
> now or in the conceivable future to permit, much less endorse or formalise,
> editing via Tor.
>
>
I believe there is a strong reason for it.

Even if you use https for every connection to Wikipedia, traffic analysis
currently makes finding out what you're reading fairly easy. From a risk
perspective, if a user wants to edit Wikipedia on a subject and from a
location that could endanger themselves, I would much prefer they edit via
tor than rely on the WMF to protect their identity. We spend a lot of
effort protecting the privacy of our users, but all it would take is
compromising the right server in our cluster, and correlating which IP is
editing as which user becomes very easy. Promoting the use of Tor lets us
push some of the risk onto the Tor team, who are both experts in this and
have a strong motivation to make it work correctly.

So I think there is both a responsibility and a benefit (to the WMF) in
allowing editing via Tor.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Tor proxy with blinded tokens

2015-03-10 Thread Chris Steipp
Jacob Appelbaum made another remark about editing Wikipedia via tor this
morning. Since it's been a couple months since the last tor bashing thread,
I wanted to throw out a slightly more modest proposal to see what people
think.

This is getting some interest from a few people:
https://zyan.scripts.mit.edu/blog/rate-limiting-anonymous-accounts/

It lays out a way for Twitter to use an external, trusted identity
provider to verify identities / throttle requests, and then create an
account in a way that neither Twitter nor the identity provider can link the
account to the request (as long as you mitigate timing attacks).

What if we turn this around a bit and let the wiki establish identity and
throttle, and open up an editing proxy that is accessible via tor which
consumes the identities?

Basically:
* Established wiki user who wants to use tor makes a blinded request (maybe
public, maybe a private queue for some group with appropriate rights) for a
tor-based account creation token.
* User gets that blinded token signed if they're in good standing, and are
limited to some number (3 total, not less than 6 months since the last
request, or something like that).
* User creates an account on the editing proxy via tor, and gives their
unblinded token to the proxy. The proxy creates an account for them, and
allows edits via OAuth token using that new account.

If the user turns out to be a spammer:
* The anonymous account can be blocked like a normal account. The user is
throttled on how many requests for accounts they can make.
* If the proxy generates too much spam, a steward can revoke the key, and we
all go home to think up the next experiment.

To make this happen, we need:
* a light editing proxy (I already have a couple of those as demo OAuth
apps) which is run by a *non-wmf* entity
* something for normal users to download and run that does the blinding for
them (the blinding math is sketched below)
* work out how to address timing attacks, if the volume of requestors is low
enough that a token request can be correlated with the first edit made
through the proxy.
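
For anyone curious what the blinding step actually looks like, here is a
toy RSA example using PHP's GMP extension (demo-sized numbers only; a
real deployment would use a full-size key and sign a hash of the token):

  // Toy parameters -- a real key would be 2048+ bits.
  $p = gmp_init( 61 );
  $q = gmp_init( 53 );
  $n = gmp_mul( $p, $q );                         // public modulus
  $e = gmp_init( 17 );                            // public exponent
  $d = gmp_invert( $e, gmp_mul( gmp_sub( $p, 1 ), gmp_sub( $q, 1 ) ) );

  $m = gmp_init( 42 );                            // token to be signed

  // User: blind the token with a random r coprime to n.
  do {
      $r = gmp_random_range( gmp_init( 2 ), gmp_sub( $n, 1 ) );
  } while ( gmp_cmp( gmp_gcd( $r, $n ), 1 ) !== 0 );
  $blinded = gmp_mod( gmp_mul( $m, gmp_powm( $r, $e, $n ) ), $n );

  // Wiki: sign the blinded value after checking the user is established.
  $blindSig = gmp_powm( $blinded, $d, $n );

  // User: unblind. The wiki never sees $sig tied to the user's identity.
  $sig = gmp_mod( gmp_mul( $blindSig, gmp_invert( $r, $n ) ), $n );

  // Proxy: verify sig^e mod n == m without learning who requested it.
  var_dump( gmp_cmp( gmp_powm( $sig, $e, $n ), $m ) === 0 );  // bool(true)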

Anyone interested in helping?

Is this conservative enough for those worried about the flood of tor spam,
while being simple enough that the average editor would be able to
understand and go through the process?
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] E-mail login to wiki - needs feedback

2015-02-19 Thread Chris Steipp
On Thu, Feb 19, 2015 at 6:44 AM, Marc A. Pelletier  wrote:
> That would be a catastrophe, from a privacy standpoint; even if we restrict
> this to verified email addresses, there is no possible guarantee that the
> person who controled email address x@y in the past is the person who
> controls it today.

Not that precedent makes it right, but this is possible already with
password reset. We assume that if you control x@y, you are entitled to
control any accounts with a confirmed email of x@y.

> It would also have horrid security implication if you allow further creation
> of accounts sharing an email (which would be necessary to make that feature
> useful): create an account with the email of someone you want to find the
> Wikimedia account of, log in, be presented with the accounts.

If it's limited to accounts with a confirmed email, and the passwords
all match, then this isn't an issue (unless I'm misunderstanding your
concern). As an attacker, I can't confirm the email of my victim for
my account, and it's unlikely that I can set the same password
(otherwise I'd just login as them).

But those requirements do require hashing the password per user, which
does leak timing information when we run this in php with our current
password system-- maybe we can find a service to do all the hashing in
parallel. But to start, just not allowing that case would cover the
90% (99.9% probably) use case.
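
To make the timing point concrete, the lookup has to do one salted hash
check per candidate account. A minimal sketch, with PDO and
password_verify() standing in for MediaWiki's database layer and
Password classes:

  function accountsForEmailLogin( PDO $db, $email, $password ) {
      $stmt = $db->prepare(
          'SELECT user_name, user_password FROM user' .
          ' WHERE user_email = ? AND user_email_authenticated IS NOT NULL'
      );
      $stmt->execute( array( $email ) );
      $matches = array();
      foreach ( $stmt as $row ) {
          // Response time grows with the number of accounts sharing the
          // email -- the timing leak described above.
          if ( password_verify( $password, $row['user_password'] ) ) {
              $matches[] = $row['user_name'];
          }
      }
      return $matches;
  }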

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Who moved my cheese?

2015-02-12 Thread Chris Steipp
I don't think we need to announce every change that requires running
update.php-- that's pretty common, and (most importantly, imho) the
error messages you get when that happens make it pretty obvious what
you need to do.

But +1 for standardizing where breaking changes are announced. I hit
the issue yesterday with profiling. It's been updated, the change
wasn't announced on wikitech-l, and the wiki page about it is wrong.
So I'd also like to also suggest that if you make a breaking change:
* please make sure mediawiki.org is updated to reflect the change
* please fail in a way that tells the user what went wrong

I know I'm guilty of making breaking changes that don't comply with
this, so I'm preaching to myself too. But I think that would generally
help.

On Thu, Feb 12, 2015 at 6:18 AM, Amir E. Aharoni
 wrote:
> I do have a lot of respect towards the people who work on modularization
> and librarization and vagrant and all that, but yes - I generally agree.
> There's the API mailing list, and many emails on it are about breaking
> changes, but it has relatively low traffic in general, so it's OK to mix
> it. Wikitech-L has very high traffic, and as Andrew says, such
> announcements can get lost, if they are sent at all. So a separate
> MediaWiki-breaking-changes-L list sounds quite reasonable to me.
>
> And I offer some simple yardsticks for defining a "breaking change":
> * It's definitely a breaking change if your local site stops working after
> running `git pull`.
> * It's definitely a breaking change if it's in core or in an extension used
> by Wikimedia, and it requires running any of the following:
> ** update.php
> ** composer update (not every minor new version of an external library, but
> a MediaWiki feature that depends on that new version)
> * It's definitely a breaking change if it's in core or in an extension used
> by Wikimedia, and it requires changing Git configuration.
>
> Other suggestions are welcome.
>
> A recent example of such change is the series of changes in the way that
> skins' source is managed. It broke my installation several times and I had
> to figure out how to change LocalSettings myself time after time. The
> result was pretty awesome, because modularization is usually a good thing,
> but I don't remember that the changes were announced in a way that was
> convenient, at least to me.
>
>
> --
> Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
> http://aharoni.wordpress.com
> ‪“We're living in pieces,
> I want to live in peace.” – T. Moore‬
>
> 2015-02-12 15:40 GMT+02:00 Andrew Garrett :
>
>> Hey folks,
>>
>> I'd to modestly propose that we talk about managing/announcing breaking
>> changes to core MediaWiki architecture.
>>
>> I want to have this chat because I spent an hour or two yesterday trying to
>> figure out why changing default configuration options for an extension in
>> MyExtension.php wasn't working. Apparently, now, you also have to change it
>> in extension.json for it to work on Vagrant.
>>
>> I understand that this was a change that was announced on wikitech-l, but I
>> don't think that I'm the only one who thinks that reading wikitech-l isn't
>> a good use of time, compared to, say, doing my job (irony upon ironies, I
>> know). If you want to change the way that things have worked for 11 years,
>> then making engineers understand what they need to do differently is your
>> responsibility, not mine.
>>
>> So, besides huffing and puffing, I have two small proposals:
>>
>> 1. We should have a low-volume list/RSS feed/twitter account/something
>> where we announce major breaking changes like this, that doesn't involve
>> reading 20 emails per day of other stuff that doesn't affect the way I do
>> my job.
>>
>> 2. If we make breaking changes, the change should be really obvious so that
>> I can't spend hours trying to find out what changed.
>>
>> For example, when we did the i18n JSON migration (everybody seems to love
>> JSON these days), and I went to change/add a message, I found that the
>> message file was a completely different format, and I was clued in straight
>> away to the fact that something was different.
>>
>> By contrast, in this case, the way I'd done things for years suddenly
>> stopped working. I've heard it justified in this particular case that this
>> is just a transition period; but it's not just a transition period for
>> code, it's a transition period for practices, and therefore it's the time
>> when it should be the LEAST confusing for engineers who don't care about
>> your refactoring, not the MOST confusing.
>>
>>
>> — Andrew Garrett
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Why there is no authentication mechanism for desktop applications

2015-02-11 Thread Chris Steipp
On Wednesday, February 11, 2015, Guillaume Paumier 
wrote:

> Hello,
>
> On Wednesday, 11 February 2015, 16:59:45, Petr Bena wrote:
> >
> > We have OAuth for browser based programs. But nothing for desktop
> > applications that are being used by users. (Like AWB etc).
>
> > It sounds pretty simple to me, so why don't we have anything like that?
>
> The reason currently given at
> https://www.mediawiki.org/wiki/OAuth/For_Developers#Intended_Users
> is:
>
> "... not... Desktop applications (the Consumer Secret needs to be secret!)"
>


That's why we don't use OAuth for these (see my last email on that too). We
can shift our threat model to change this, but it comes at a cost
(vandalism can't be blocked at the app-level, we have to require https for
more pieces of the protocol, etc).

Petr's current request sounds a little more like google's per-application
passwords, except they are also limited in what rights they can use. Petr,
I'm assuming you wouldn't want to do an OAuth-like signature on each
request, but instead use it to login, then use the session cookie for
future requests? Or were you thinking signed api calls like with OAuth?


>
> --
> Guillaume Paumier
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org 
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] New feature: tool edit

2015-02-11 Thread Chris Steipp
On Wed, Feb 11, 2015 at 5:07 AM, This, that and the other
 wrote:
> How does a user prove that they're using a particular tool a way that can't
> be faked? Something like OAuth comes to mind. All edits made via an OAuth
> consumer are already tagged with a unique tag, and I would assume that it is
> not possible to falsely represent an OAuth consumer.

This is usually correct-- right now we discourage what OAuth2 calls
"public consumers": apps where the shared secret we set up with the app
owner can't really be considered private, e.g., it's embedded in code
that is actually running on the end user's device, either a native
application or a rich javascript application. But it's really just a
discouragement, and we leave it up to the app owner whether they want to
set up things like IP whitelisting, for IPs that are allowed to use
their secret.

I've been thinking that we might implement a flag to mark some apps as
public (Petr has been wanting to use it for huggle since the
beginning), but taking the opposite approach and flagging some as
"known private", where we've verified the owner is intending to keep
the secret private, and we've limited it's use to a very small number
of IP's, might make more sense. Then we could flag the ones where this
assumption holds.

> I'm not sure whether this could work for common tools like AWB or Twinkle,
> though:
>
> * I don't know whether OAuth works for client-side downloadable programs
> like AWB.
> * JavaScript tools edit as the user from the user's browser, and as such,
> OAuth is not relevant to them. In any case, anything they do (like adding a
> specific string to edit summaries, adding a tag to their edits, or the like)
> can be easily spoofed or faked by a tech-savvy user.

So like I said, it's just by peer pressure right now. If anyone has
strong opinions about it, let me know.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Changing contentmodel of pages

2015-01-24 Thread Chris Steipp
On Jan 23, 2015 8:43 PM, "Matthew Flaschen"  wrote:
>
> On 01/22/2015 10:00 PM, Legoktm wrote:
>>
>> I disagree that we need a "editcontentmodel" user right. I think all
>> users should be allowed to change the content model of a page (provided
>> they have the right to edit it, etc.).
>
>
> I think that setting a content model different from the namespace's
default only makes sense in certain cases.
>
> E.g. it may not make sense for schema-tized JSON (whether it's Zero
config, EventLogging schema, Wikidata JSON etc.) to be outside its
dedicated namespace.
>
> It probably doesn't make sense for CSS files or JS files to exist outside
of the MediaWiki and User namespaces, and even in those namespaces, the
content model should probably match the end of the title.
>
> Similarly, dedicated namespaces (e.g. Wikidata's main namespace) should
not be able to hold wikitext pages.

From a security perspective, I like limiting the content models per
namespace to a relatively small whitelist. I think it will give more
flexibility to anyone coming up with new content type definitions if we
don't have to worry about "what happens if someone changes the main page
to this format" or "how do I write a lossless conversion script from a
flow page and back, just in case an admin converts a flow page to my new
type and back."

>
> Matt Flaschen
>
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Our CAPTCHA is very unfriendly

2014-12-04 Thread Chris Steipp
On Wed, Dec 3, 2014 at 9:15 PM, Chad  wrote:
> On Wed Dec 03 2014 at 8:18:53 PM MZMcBride  wrote:
>
>> svetlana wrote:
>> >On Thu, 4 Dec 2014, at 15:02, MZMcBride wrote:
>> >>
>> >> We disabled the CAPTCHA entirely on test.wikipedia.org a few weeks ago.
>> >> The wiki seems to be about the same. It probably makes sense to continue
>> >> slowly disabling the CAPTCHA on wikis until users start to shout.
>> >>Perhaps we'll disable the CAPTCHA on the rest of the phase 0 wikis next?
>> >
>> >It would be nice. What are these wikis? Can we ask them for permission on
>> >their village pumps?
>>
>> Err, group 0, not phase 0!
>>
>>  is the
>> list. I think we can convince the various affected communities. :-)
>>
>>
> Patch up for review: https://gerrit.wikimedia.org/r/#/c/177494/

Do we have metrics for the effect the change has had on testwiki,
before we roll this out further? Number of accounts registered, number
of those who are later blocked for spam, number of edits reverted as
spam, etc?

With SUL, disabling captcha for account creation on one wiki
effectively means we have disabled it on all wikis (create account on
testwiki, visit enwiki, you're autocreated, edit!).

I definitely support experimenting, I just want to make sure we're
collecting and watching the right numbers while we experiment.


>
> -Chad
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Visibility of "action" in API for deleted log entries

2014-12-01 Thread Chris Steipp
Hi list,

I wanted to get some feedback about https://phabricator.wikimedia.org/T74222.
In the last security release, I changed the return of the api to remove the
"action" for log entries that had been revdeleted with "Hide action and
target". However, ever since 2009 / r46917, we've assumed that "Hide action
and target" didn't mean the actual action field in the db, but rather the
target and the text of the message about the action, which might include
other parameters. So the message about what's being hidden and the intended
protection of that option could have slightly different interpretations.

I'd like to hear if anyone has intended for the actual log action to be
deleted / suppressed. If not, I'm happy to revert the recent patch, and
we'll just update the wording in the deletion UI to be more clear about
what is being removed.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Our CAPTCHA is very unfriendly

2014-11-10 Thread Chris Steipp
On Sunday, November 9, 2014, Platonides  wrote:

> On 07/11/14 02:52, Jon Harald Søby wrote:
>
>> The main concern is obviously that it is really hard to read, but there
>> are
>> also some other issues, namely that all the fields in the user
>> registration
>> form (except for the username) are wiped if you enter the CAPTCHA
>> incorrectly. So when you make a mistake, not only do you have to re-type a
>> whole new CAPTCHA (where you may make another mistake), you also have to
>> re-type the password twice *and*  your e-mail address. This takes a long
>> time, especially if you're not a fast typer (which was the case for the
>> first group), or if you are on a tablet or phone (which was the case for
>> some in the second group).
>>
>
> Only the password fields are cleared (in addition to the captcha). It is
> debatable if clearing them is the right thing or not, there must be some
> papers talking about that. But I think we could go with keeping them filled
> with the user password.
>
> Another idea I am liking is to place the captcha at a different page (as a
> second step), where we could offer several options (captchas, puzzles, irc
> chat, email...) to confirm them, then gather their success rate.
>


I like both of these ideas.

On the general topic, I think either a captcha or verifying an email adds
a small barrier to building a bot, but it's significant enough that it
keeps the amateur bots out. I'd be very interested in seeing an experiment
run to see what the exact impact is though.

Google had a great blog post on this subject where they made recaptcha
easier to solve, and instead,

"The updated system uses advanced risk analysis techniques, actively
considering the user's entire engagement with the CAPTCHA--before, during
and after they interact with it. That means that today the distorted
letters serve less as a test of humanity and more as a medium of engagement
to elicit a broad range of cues that characterize humans and bots. " [1]

So spending time on a new engine that allows for environmental feedback
from the system solving the captcha, and that lets us tune lots of things
besides whether the "user" sent back the right string of letters, would I
think be well worth our time.


[1] -
http://googleonlinesecurity.blogspot.com/2013/10/recaptcha-just-got-easier-but-only-if.html

>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] MediaWiki:Common.js and MediaWiki:Common.css blocked on Special:Login and Special:Preferences

2014-11-07 Thread Chris Steipp
On Thursday, November 6, 2014, Daniel Friesen 
wrote:

> On 2014-11-06 4:45 PM, Chris Steipp wrote:
> > On Thu, Nov 6, 2014 at 11:41 AM, Derric Atzrott
> > > wrote:
> >> This seems completely reasonable to me. I'd merge it personally.  Is
> >> there
> >> any reason not to?
> > It's fairly easy to inject javascript via css, so merging that patch
> > means an admin can run javascript on the login/preferences page, while
> > we specifically block javascript from Common.js, etc.
> >
> > For me, I like knowing that when I login on a random wiki in our
> > cluster, a site admin can't have (maliciously or unintentionally) put
> > javascript on the login page to sniff my password. I'd prefer Kunal's
> > patch had a feature flag so we could disable this on WMF wikis, but
> > sites with robust auditing of their common.css can enable it.
>
> I should probably take some time to remind everyone that thinks hiding
> any form of JS from single pages like the login pages makes them secure,
> that restrictions like those are not that hard to bypass by using JS on
> a non-login page to use AJAX and history.pushState to trick someone
> clicking the login link into thinking that they actually navigated the
> page and are safe from site-js, when in reality they're actually still
> on the same page with malicious site-js running.
>
>
Very true, but the paranoid can still type in Special:UserLogin and get the
correct page. A url parameter to disable site css/js would be just fine by
me too...




> ~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org 
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] MediaWiki:Common.js and MediaWiki:Common.css blocked on Special:Login and Special:Preferences

2014-11-06 Thread Chris Steipp
On Thu, Nov 6, 2014 at 11:41 AM, Derric Atzrott
 wrote:
> This seems completely reasonable to me. I'd merge it personally.  Is there
> any reason not to?

It's fairly easy to inject javascript via css, so merging that patch
means an admin can run javascript on the login/preferences page, while
we specifically block javascript from Common.js, etc.

For me, I like knowing that when I login on a random wiki in our
cluster, a site admin can't have (maliciously or unintentionally) put
javascript on the login page to sniff my password. I'd prefer Kunal's
patch had a feature flag so we could disable this on WMF wikis, but
sites with robust auditing of their common.css can enable it.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Changing edit token length

2014-10-20 Thread Chris Steipp
On Mon, Oct 20, 2014 at 11:00 AM, Zack Weinberg  wrote:
> On Mon, Oct 20, 2014 at 1:38 PM, Chris Steipp  wrote:
>> * Tokens can be time limited. By default they won't be, but this puts
>> the plumbing in place if it makes sense to do that on any token checks
>> in the future.
>> * The tokens returned in a request will change on each request. Any
>> version of the token will be good for as long as the time limit is
>> valid (which again, will default to infinite), but this protects
>> against ssl-compression attacks (like BREACH) where plaintext in a
>> request can be brute-forced by making many requests and watching the
>> size of the response.
>>
>> To do this, the size of the token (which has been a fixed 32 bytes +
>> token suffix for a very long time) will change to add up to 16 bytes
>> of timestamp (although in practice, it will stay 8 bytes for the next
>> several years) to the end of the token.
>
> I have no objection to the change itself, but I would like to make a
> few related comments:
>
> 1) Since this is changing anyway, it would be a good time to make the
> token size and structure independent of whether the user is logged on
> or not.  (This is probably not the only place where MediaWiki leaks
> "is this user logged on?" via request or response size, but it is an
> obvious place.)  I think that would involve generating 'wsEditToken'
> whether or not isAnon() is true, which should be fine?  And then
> matchEditToken() would be simpler.  And anonymous editing tokens could
> also be time-limited.

This is the direction I'm pushing towards. The way we handle caching
at the WMF keeps this from being as simple as you have here, but yes,
it's a long overdue change.

> 2) Since this is changing anyway, it would be a good time to stop
> using MD5.  SHA256 should be good for a while.

Preimage attacks on md5 are still just slightly faster than brute
force, so while I don't think we're in danger, I'm not opposed to
strengthening this. Hopefully after this, everyone will use
dynamically sized buffers, so this would be a fairly trivial change in
the future.

> 3) You're using the per-session 'wsEditToken' value as the HMAC secret
> key.  Is there anywhere that the raw 'wsEditToken' might be exposed to
> the client?  Such a leak would enable a malicious client to forge
> editing tokens and bypass the time-limiting.

There shouldn't be. If any extension does that, I would treat it as a security bug.

> 4) Architecturally speaking, does it make sense to time-limit the
> *token* rather than the *session*?

That would be nice, but it makes it harder to do rolling validity, and
this way we can also limit different types of tokens (so a checkuser
token can be limited to a few minutes, while an edit token can have
several hours) without having to track more secrets in a user's
session.

> zw
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Changing edit token length

2014-10-20 Thread Chris Steipp
Hi list,

tl;dr: If you use a fixed length buffer to store edit tokens, you'll
need to update your code.

I'm planning to +2 https://gerrit.wikimedia.org/r/#/c/156336/ in the
next day or so. That provides for two hardening measures:

* Tokens can be time limited. By default they won't be, but this puts
the plumbing in place if it makes sense to do that on any token checks
in the future.
* The tokens returned in a request will change on each request. Any
version of the token will be good for as long as the time limit is
valid (which again, will default to infinite), but this protects
against ssl-compression attacks (like BREACH) where plaintext in a
request can be brute-forced by making many requests and watching the
size of the response.

To do this, the size of the token (which has been a fixed 32 bytes +
token suffix for a very long time) will change to add up to 16 bytes
of timestamp (although in practice, it will stay 8 bytes for the next
several years) to the end of the token.

If that's a problem for anyone, please add a review in gerrit, or
respond here. Otherwise 1.25wmf5 will have the change included.
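
For anyone generating or parsing tokens by hand, the new shape is
roughly the following (a sketch only, not the exact patch; '+\' is the
existing edit token suffix):

  function generateToken( $secret, $suffix = '+\\' ) {
      $timestamp = time();
      // Hashing the timestamp in makes every request return a different
      // token string (the BREACH mitigation), while any token issued
      // within the time limit stays verifiable.
      return hash_hmac( 'md5', (string)$timestamp, $secret )
          . dechex( $timestamp ) . $suffix;
  }

  function matchToken( $secret, $token, $maxAge = 0, $suffix = '+\\' ) {
      if ( substr( $token, -strlen( $suffix ) ) !== $suffix ) {
          return false;
      }
      $body = substr( $token, 0, -strlen( $suffix ) );
      $hash = substr( $body, 0, 32 );             // fixed-length HMAC part
      $timestamp = hexdec( substr( $body, 32 ) ); // variable-length part
      if ( $maxAge > 0 && time() - $timestamp > $maxAge ) {
          return false; // time-limited checks reject stale tokens
      }
      return hash_equals(
          hash_hmac( 'md5', (string)$timestamp, $secret ), $hash
      );
  }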

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Tor and Anonymous Users (I know, we've had this discussion a million times)

2014-10-13 Thread Chris Steipp
On Mon, Oct 13, 2014 at 9:10 AM, Derric Atzrott
 wrote:
>> Although my suggestion is similar in kind to what had already been proposed,
>> the main object to it was that it would create too much work for our
>> already constrained resources. The addition of rate limiting is a technical
>> solution that may or may not be feasible.
>>
>> The people on this list can best answer that.
>
> Does anyone know of any extensions that do something similar to the rate
> limiting that he described?  Force edits into a queue to be reviewed
> (sort of like FlaggedRevs), but limit selected users to only a
> single edit?  I can't imagine something like that would be hard to modify
> to pull from the list of Tor nodes to get its list of users.

AbuseFilter can rate limit per account globally, and edits via tor
have an abuse filter variable set. So a global filter (and all wikis
would have to enable global filters... which is another political
discussion) could be used to rate limit tor edits, and also tag any
that are made.

The review queue I'm not sure about... I don't know if FlaggedRevs can keep
a queue of edits with a particular tag.

>
> I'll take a look at the TorBlock extension and the FlaggedRevs extension
> code and see what I can see.
>
> Thank you,
> Derric Atzrott
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Security fixes for CentralAuth and MobileFrontend extensions

2014-10-08 Thread Chris Steipp
-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA256


A number of security issues in MediaWiki extensions have been fixed.
Users of these extensions should update to the latest version.

* CentralAuth: Internal review found multiple issues that have been resolved:
** (bug 70469) Special:MergeAccount failed to validate the anti-csrf
token in its forms when performing actions.

** (bug 70468) The internal function to attach multiple local wiki
accounts into a single, global account did not re-check that the
requesting user owned the "home wiki" for that username, but assumed
that user did own this account. This could allow a user to add their
local account edits to a global account that they didn't own.

** (bug 71749) Incomplete fix for bug 70468. The fix wasn't applied to
the new feature where accounts were globalized automatically on login.

** (bug 70620) When globally renaming a user, the antispoof table,
which prevents similar looking names from being created, wasn't
updated. This potentially allowed another user to register an account
with a name that looked identical to the username of a user who had
been globally renamed.


* MobileFrontend: (bug 70009) Sherif Mansour discovered that POST
parameters were being added to links generated by MobileFrontend,
which could reveal the user's password after login.



**
   Extension:CentralAuth
**
Information and Download:
https://www.mediawiki.org/wiki/Extension:CentralAuth

**
   Extension:MobileFrontend
**
Information and Download:
https://www.mediawiki.org/wiki/Extension:MobileFrontend


-----BEGIN PGP SIGNATURE-----
Version: GnuPG v2.0.22 (GNU/Linux)

iF4EAREIAAYFAlQ1lJoACgkQ7h9mNGLYTwGdgAD/X7q6WfaBoE2SdKjZeoLE9yvs
wg07Fs4kytmmSQDXa4IBAKBgaYuhuRt5j+G5Q9YNdfCCkvlSqnz7heCIX1Ddn5ma
=cOb1
-----END PGP SIGNATURE-----

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] OAuth and callbacks

2014-08-27 Thread Chris Steipp
For those who run one of our 76(!) approved OAuth apps, or are using
the OAuth extension on their own wiki...

We have a patch [1] from Mitar to allow OAuth apps to pass a
configurable callback during the OAuth handshake. This will probably
make a lot of app authors' lives easier, but can also open up a couple
of avenues of abuse-- specifically, it's needed for covert redirect
attacks [2]. If OAuth app authors choose loose callback requirements,
which we can assume will happen if we make approvals automatic (bug
65750), and we ever allow public consumers (huggle was asking for that
for a long time), then it would be possible for attackers to abuse our
OAuth setup.

So far, I've been really conservative about how we use OAuth (there
are two other features we would have to enable to make this attack
likely). I'd like to hear other's thoughts about:

* Assuming we implement one or two of: dynamic callbacks, automatic
approval of apps, or public consumers, but not all three, which are
most desired?

* If we do implement all three, we can limit how the callback can
differ from what is registered (one possible rule is sketched below). I
put some suggestions on the gerrit patch, but would that cause more
confusion than help?
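
For example, one of the simpler rules would be prefix matching: the
supplied callback has to extend the registered one. A sketch of the idea
(not what the patch implements; isCallbackAllowed() is a made-up name):

  function isCallbackAllowed( $registered, $supplied ) {
      $reg = parse_url( $registered );
      $sup = parse_url( $supplied );
      if ( !$reg || !$sup ) {
          return false;
      }
      foreach ( array( 'scheme', 'host', 'port' ) as $part ) {
          $regPart = isset( $reg[$part] ) ? $reg[$part] : null;
          $supPart = isset( $sup[$part] ) ? $sup[$part] : null;
          if ( $regPart !== $supPart ) {
              return false; // covert-redirect host swaps fail here
          }
      }
      // The supplied path must start with the registered path.
      $regPath = isset( $reg['path'] ) ? $reg['path'] : '/';
      $supPath = isset( $sup['path'] ) ? $sup['path'] : '/';
      return strpos( $supPath, $regPath ) === 0;
  }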


[1] - https://gerrit.wikimedia.org/r/153983
[2] - http://tetraph.com/covert_redirect/oauth2_openid_covert_redirect.html

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] News about stolen Internet credentials; reducing Wikimedia reliance on usernames and passwords

2014-08-07 Thread Chris Steipp
On Wed, Aug 6, 2014 at 8:26 AM, Tyler Romeo  wrote:
> In terms of external authentication, we need Extension:OpenID to catch up to 
> the OpenID standard in order to do that.
>
> In terms of two-factor, I have like eight patches for Extension:OATHAuth 
> attempting to make it production-worthy.
>
> https://gerrit.wikimedia.org/r/132783

Nice! I hadn't realized you had got so far on this. Maybe Ryan and I
can get those merged in...

To address Risker's comment, OATH is an open standard with lots of
tools to generate the tokens, so you can use a secure token if you
want to be more secure, or a browser plugin if you're just worried
about someone stealing your password (which would significantly help
our threat model in countries where we can't force https).

Client TLS certificates are sadly really hard to manage in any sort of
secure way, when you don't control the end user's machines.

> --
> Tyler Romeo
> 0x405D34A7C86B42DF
>
> From: svetlana 
> Reply: Wikimedia developers >
> Date: August 6, 2014 at 7:57:12
> To: wikitech-l@lists.wikimedia.org >
> Subject:  Re: [Wikitech-l] News about stolen Internet credentials; reducing 
> Wikimedia reliance on usernames and passwords
>
> On Wed, 6 Aug 2014, at 21:49, Andre Klapper wrote:
>> On Tue, 2014-08-05 at 22:05 -0700, Pine W wrote:
>> > After reading this [1] I am wondering if Wikimedia should start taking
>> > steps to reduce reliance on usernames and passwords.
>>
>> What "steps" do you refer to, or is this intentionally vague?
>> Disallowing usernames and logins?
>> Two-step authentication/verification?
>> Something else?
>>
>> andre
>
> from what i could read and parse:
> use less of external things like skype and google accounts
> so that there is only 1 username for everything
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Release Engineering team (new! improved!)

2014-07-29 Thread Chris Steipp
On Tue, Jul 29, 2014 at 2:06 PM, Pine W  wrote:
> The everyday difference that this change makes may be trivial, but it makes
> sense to me to think of QA (and Security Engineering) as being part of
> RelEng.

I doubt we disagree too much, but I'll put on my security evangelist
hat and get on my soapbox, since you phrased it that way.

It's not uncommon to see security placed (organizationally) as part of
the release process. But while security reviews and security
regression testing are important, I really hope that for MediaWiki,
security isn't just a hurdle to deployment. I believe that security
has to be a part of the entire development process to be effective. If
the features aren't designed for security, security is always going to
loose versus the need to deploy things that we've spent resources to
develop. I think MediaWiki benefited a lot from having Tim be both the
security evangelist and technical lead for so many years.

So I try to spend a significant portion of my time working early in
the development lifecycle, training developers and working towards
more secure architecture, rather than focusing on the release process
to fix all the bugs before we push something out. Sometimes that
happens, and other times (like this week) I spend most of my time
fixing issues after they are already in production. Core has been a
good place to do that work from so far.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Release Engineering team (new! improved!)

2014-07-29 Thread Chris Steipp
On Tue, Jul 29, 2014 at 11:58 AM, Pine W  wrote:
> To clarify, is the QA team now under Release Engineering as Chris' comment
> seems to imply, and how does this org change effect security engineering?

For now, I (the only security engineer) am staying in core, although
much of my role spans both groups. I'll continue working with Chris,
Greg, and other engineers across the WMF and developer community to
build security features, find and respond to vulnerabilities, release
security updates, and improve the secure development process in
general.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] logging out on one device logs user out everywhere

2014-07-23 Thread Chris Steipp
On Tuesday, July 22, 2014, MZMcBride  wrote:

> Chris Steipp wrote:
> >I think this should be managed similar to https-- a site preference,
> >and users can override the site config with a user preference.
>
> Please no. There's been a dedicated effort in 2014 to reduce the number
> of user preferences. They're costly to maintain and they typically
> indicate a design flaw: software should be sensible by default and a user
> preference should only be a tool of last resort. The general issue of user
> preferences-creep remains particularly acute as global (across a wikifarm)
> user preferences still do not exist. Of course in this specific case,
> given the relationship with CentralAuth, you probably could actually have
> a wikifarm-wide user preference, but that really misses the larger point
> that user preferences should be avoided, if at all possible.
>
> I'll start a new thread about my broader thoughts here.
>

I think we have too many preferences also, no disagreement there.

But like Risker, I too want to always destroy all my sessions when I logout
(mostly because I log in and out of accounts a lot while testing, and I
like knowing that applies to all the browsers I have open). So I'm biased
towards thinking this is preference worthy, but I do think it's one of
those things that if it doesn't behave as a user expects, they're going to
think it's a flaw in the software and file a bug to change it.

I'm totally willing to admit the expectations I have are going to be the
minority opinion. If it's a very, very small number of us, then yeah,
preference isn't needed, and we can probably get by with a gadget.

Your proposal for account info and session management is good too. I hope
someone's willing to pick that up.



>
> MZMcBride
>
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org 
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] logging out on one device logs user out everywhere

2014-07-22 Thread Chris Steipp
Cool. My $.02 on the feature,

I think this should be managed similar to https-- a site preference,
and users can override the site config with a user preference. I'd
prefer if we could make the site preference (logout all sessions, or
logout only the current session) configurable, so we can start
with keeping the setting as is (and users can opt in), then we can
change the site preference later if we decide it's a better tradeoff.

Unlike https, since this feature is for CentralAuth, let's not reuse
core's session management pages (like Special:UserLogout). If we
really have to add another page, it should be a new central auth page.

On Mon, Jul 21, 2014 at 3:03 PM, Jon Robson  wrote:
> Sounds good.
> Adding design mailing list.
>
>
> On Mon, Jul 21, 2014 at 2:02 PM, Steven Walling
>  wrote:
>> On Mon, Jul 21, 2014 at 1:20 PM, Jon Robson  wrote:
>>
>>> http://en.wikipedia.org/wiki/Special:UserLogout might be an obvious
>>> place (closest to the action)... although not sure how discoverable.
>>>
>>> Do you want to logout everywhere  
>>> [] Remember this decision
>>>
>>> It seems like we could split this into 2 features though in the
>>> interest of getting things done. Right now I'm interested in just
>>> fixing the logout behaviour - in this day and age too many people are
>>> using too many different devices and this experience seems very
>>> broken.
>>
>>
>> This seems potentially overcomplicated. Other sites doing this (Facebook,
>> Google, others) don't put this kind of "close all sessions" option directly
>> on logout. Let's get some input here from the UX team.
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
>
>
> --
> Jon Robson
> * http://jonrobson.me.uk
> * https://www.facebook.com/jonrobson
> * @rakugojon
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Anonymous editors & IP addresses

2014-07-11 Thread Chris Steipp
On Friday, July 11, 2014, Daniel Kinzler  wrote:

> Am 11.07.2014 17:19, schrieb Tyler Romeo:
> > Most likely, we would encrypt the IP with AES or something using a
> > configuration-based secret key. That way checkusers can still reverse the
> > hash back into normal IP addresses without having to store the mapping
> in the
> > database.
>
> There are two problems with this, I think.
>
> 1) No forward secrecy. If that key is ever leaked, all IPs become "plain".
> And
> it will be, sooner or later. This would probably not be obvious, so this
> feature
> would instill a false sense of security.
>

This is probably the biggest issue. Even if we hmac it, it's trivial to
brute force the entire ipv4 (and with intelligent assumptions about
generation, most of the ipv6) range in seconds, if the key was ever known.


>
> 2) No range blocks. It's often quite useful to be able to block a range of
> IPs.
> This is an important tool in the fight against spammers, taking it away
> would be
> a problem.
>

Range blocks, I imagine, would continue working the same way they do.
Someone would have to identify the correct range (which is very difficult
when administrators can't see IP's), but on submission, we have the IP
address to check against the blocks. (Unless someone proposes to store
block ranges as hashes, that would definitely get rid of range blocks).
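
That per-edit check is cheap because the plaintext IP is in hand at save
time. An IPv4-only sketch (MediaWiki's real logic lives in its IP and
Block classes):

  function ipInRange( $ip, $cidr ) {
      list( $net, $bits ) = explode( '/', $cidr );
      $mask = -1 << ( 32 - (int)$bits );
      return ( ip2long( $ip ) & $mask ) === ( ip2long( $net ) & $mask );
  }

  var_dump( ipInRange( '192.0.2.55', '192.0.2.0/24' ) );  // bool(true)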


>
> -- daniel
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org 
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] MediaWiki Bug Bounty Program

2014-06-25 Thread Chris Steipp
On Wed, Jun 25, 2014 at 5:49 PM, Alex Monk  wrote:
> Chris, why don't we leave privacy policy compliance to the users posting on
> the bug? Wikimedia personal user data shouldn't be going to the security
> product.

There are a few cases where there may be legitimate private data in a
security bug ("look, sql injection, and here are some rows from the
user table!", "Hey, this was supposed to be suppressed, and I can see
it", "This user circumvented the block on this IP"). But there might
be ways to flag or categorize a report as also including private data?
Someone with more bugzilla experience would need to comment.

> Why does WMF get the right to control access to MediaWiki security bugs
> anyway? Could we not simply host MediaWiki stuff externally? Perhaps on the
> servers of any other major MediaWiki user.

This certainly could be done. That "other major MediaWiki user" would
have to be someone everyone trusts, and preferably with a strong track
record of being able to keep their infrastructure secure. If there's a
legitimate proposal to try it, let's definitely discuss.

> Alex
> Sent from phone
>
> On Wed, Jun 25, 2014 at 4:28 PM, Tyler Romeo  wrote:
>> Hey everybody,
>>
>> So today at the iSEC Partners security open forum I heard a talk from Zane
>> Lackey,
>> the former security lead for Etsy, concerning the effectiveness of bug
>> bounties.
>>
>> He made two points:
>>
>> 1) Bug bounties are unlikely to cause harm, especially for Wikipedia,
>> which
>> I asked
>> him about, because the mere popularity of our service means we are already
>> being
>> scanned, pentested, etc. With a bounty program, there will be incentive
>> for
>> people to
>> report those bugs rather than pastebin them.
>>
>> 2) Even without a monetary reward, which I imagine WMF would not be able
>> to
>> supply,
>> crackers are motivated simply by the "hall of fame", or being able to be
>> recognized for
>> their efforts.
>>
>> Therefore, I thought it may be beneficial to take that over to Wikipedia
>> and
>> start our own
>> bug bounty program. Most likely, it would be strictly a hall of fame like
>> structure where
>> people would be recognized for submitting bug reports (maybe we could even
>> use the
>> OpenBadges extension *wink* *wink*). It would help by increasing the
>> number
>> of bugs
>> (both security and non-security) that are found and reported to us.
>>
>> Any thoughts? (Of course, Chris would have to approve of this program
>> before
>> we even
>> consider it.)
>
> I've been thinking of at least putting up a list of top contributors
> on mediawiki.org for a while, and just hadn't had the time to do it.
> If anyone wants to compile that list from the list of closed security
> bugs, I'd be very supportive.
>
> As for a more official program, the downside that I predict we would
> quickly hit (from talking to a few people who have run these) is the
> high volume of very low quality reports that have to be investigated
> and triaged. Which is something that just takes time from a human...
> so my evil_plans.txt towards this was (I really had almost this
> exactly in my todo list):
> * Get more volunteers access to security bugs
> ** {{done}} get list of top contributors
> ** Find out from Philippe how to get a bunch of volunteers identified
> *** Doh, we're probably changing our identification process soon. On hold.
>
> So, I was planning to wait until we have a more streamlined process
> for getting volunteers access to data that could potentially be
> covered by our privacy policy, then invite some people who have
> contributed significantly to MediaWiki's security in the past to get
> access to those bugs and help triage/assign/fix bugs, then look into
> starting something official or semi-official. But if a few of you
> would be willing to deal with our current identification/NDA process
> and are willing to help investigate reports, I'm happy to start
> working on it sooner.
>
>>
>> --
>> Tyler Romeo
>> 0xC86B42DF
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] MediaWiki Bug Bounty Program

2014-06-25 Thread Chris Steipp
On Wed, Jun 25, 2014 at 4:28 PM, Tyler Romeo  wrote:
> Hey everybody,
>
> So today at the iSEC Partners security open forum I heard a talk from Zane
> Lackey,
> the former security lead for Etsy, concerning the effectiveness of bug
> bounties.
>
> He made two points:
>
> 1) Bug bounties are unlikely to cause harm, especially for Wikipedia, which
> I asked
> him about, because the mere popularity of our service means we are already
> being
> scanned, pentested, etc. With a bounty program, there will be incentive for
> people to
> report those bugs rather than pastebin them.
>
> 2) Even without a monetary reward, which I imagine WMF would not be able to
> supply,
> crackers are motivated simply by the "hall of fame", or being able to be
> recognized for
> their efforts.
>
> Therefore, I thought it may be beneficial to take that over to Wikipedia and
> start our own
> bug bounty program. Most likely, it would be strictly a hall of fame like
> structure where
> people would be recognized for submitting bug reports (maybe we could even
> use the
> OpenBadges extension *wink* *wink*). It would help by increasing the number
> of bugs
> (both security and non-security) that are found and reported to us.
>
> Any thoughts? (Of course, Chris would have to approve of this program before
> we even
> consider it.)

I've been thinking of at least putting up a list of top contributors
on mediawiki.org for a while, and just hadn't had the time to do it.
If anyone wants to compile that list from the list of closed security
bugs, I'd be very supportive.

As for a more official program, the downside that I predict we would
quickly hit (from talking to a few people who have run these) is the
high volume of very low quality reports that have to be investigated
and triaged. Which is something that just takes time from a human...
so my evil_plans.txt towards this was (I really had almost this
exactly in my todo list):
* Get more volunteers access to security bugs
** {{done}} get list of top contributors
** Find out from Philippe how to get a bunch of volunteers identified
*** Doh, we're probably changing our identification process soon. On hold.

So, I was planning to wait until we have a more streamlined process
for getting volunteers access to data that could potentially be
covered by our privacy policy, then invite some people who have
contributed significantly to MediaWiki's security in the past to get
access to those bugs and help triage/assign/fix bugs, then look into
starting something official or semi-official. But if a few of you
would be willing to deal with our current identification/NDA process
and are willing to help investigate reports, I'm happy to start
working on it sooner.



>
> --
> Tyler Romeo
> 0xC86B42DF

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Browser tests for core

2014-06-24 Thread Chris Steipp
On Jun 24, 2014 6:13 PM, "Dan Garry"  wrote:
>
> On 24 June 2014 17:05, Risker  wrote:
> >
> > Sorry to be a bit OT, but if you guys are going to test, please don't
do it
> > in article space on enwiki, or this is what is going to happen to the
> > accounts.  We've had to almost kick WMF staff off enwiki before because
> > they kept testing in live article space, please don't do that.
> >
>
> Or you'll accidentally insert vandalism into articles when testing the
> abuse filter. Because I've *totally never* done that by accident. :-)
>
> If these browser tests don't interact with the site in a way that leaves
> anything user-facing behind (e.g. making edits or taking actions that create log
> entries), there's no problem with running them on enwiki. Otherwise, they
> should be using our test or beta sites.

They do leave some artifacts (documented in the change, iirc). So yeah,
they should only be used against your dev environment or beta.

>
> Dan
>
> --
> Dan Garry
> Associate Product Manager for Platform and Mobile Apps
> Wikimedia Foundation
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Browser tests for core

2014-06-24 Thread Chris Steipp
I just +2'ed a change to add a few basic selenium tests to core [1]. I
think it will benefit us all to have a set of automated tests to
quickly make sure mediawiki is working correctly. From a security
perspective, this also takes a step towards more efficient security
testing, which I'm also a fan of (if you've tried blindly scanning
mediawiki, you know what I'm talking about...).

I think the QA group is working on vagrant-izing these, but if you
have ruby >1.9.3 and firefox, then setting up and running these tests
on your local dev system is 4-6 commands,

$ cd tests/browser
$ gem update --system
$ gem install bundler
$ bundle install

You can either set your environment variables yourself, or edit
environment_variables and run `source environment_variables` to set
them. Then it's just

$ bundle exec cucumber features/

to run the tests. They currently complete in 36 seconds on my laptop.

I'd like to see more tests added and backported to REL1_23 to make
sure we have an ongoing suite to check releases against for next few
years that we support that LTS. If anyone is interested in both
mediawiki core and browser tests, I'm sure the QA team would like to
get you involved.

Big thanks to hashar, Chris McMahon, and Dan Duvall for indulging me
and getting this done. I'll let them jump in with all the details I've
missed.


[1] - https://gerrit.wikimedia.org/r/#/c/133507/

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] SVG linking of external images/bitmaps - xlink:href should support http(s) resources

2014-06-20 Thread Chris Steipp
On Thu, Jun 19, 2014 at 11:15 PM, "Christian Müller"  wrote:
>> Sent: Dienstag, 27. Mai 2014 um 21:21 Uhr
>> From: "Chris Steipp" 
>> To: "Wikimedia developers" 
>> Subject: Re: [Wikitech-l] SVG linking of external images/bitmaps - 
>> xlink:href should support http(s) resources
>> On Tue, May 27, 2014 at 9:37 AM, "Christian Müller"  wrote:
>>
>> > https://bugzilla.wikimedia.org/show_bug.cgi?id=65724#c3
>>
>> [..] Trusting an image library to correctly speak http
>> without a memory corruption seems a little scary as well, but I'll admit I
>> haven't looked at librsvg's code myself.
>
> In any case, it'd be the image library to fix.  Restricting access is an
> arguably crude workaround due to diffuse fears.  It breaks the standard
> and makes technology less useful to its users.
>
>> [..], if there are any
>> major browsers that will pull those resources in and let an attacker see
>> the user's IP address, we shouldn't allow that... hmm, and now that I read
>> the bug, I see this is firefox'es behavior in the image you uploaded. We
>> probably want to block that behavior.
>
> Yeah, Firefox's decision to adhere fully to the SVG standard is right imho,
> since it has to measure itself in compatibility tests with other browsers.
>
> If WP decides to cripple the standard for security reasons, that's their
> beer, but please stop starting to cripple user browsers.  Security of that
> is in the hand of users, they have to make the decision wich browser to
> use and whether that ought to be a security enhanced one with less standard
> compliance, or a full featured one like FF.

I meant that because those browsers are fully implementing the spec,
MediaWiki needs to protect our users' privacy in case that is used. We
have no influence over Firefox development, and I agree, the browsers
should implement the spec. We just need to ensure we are taking
precautions in that context.

>
>> Allowing a whitelist of WMF domains via https may be possible. In general,
>> the security checking we do on uploaded files is complex enough that I
>> don't like adding another layer of specific checks and exceptions, but if
>> we can find a relatively simple way to do it that maintains our security
>> and privacy requirements, then I wouldn't stand in the way.
>
> Ok, within WP scope, hosting external dep files on foreign servers is out
> of reach, security- and longevity-wise - it seems everyone agrees on this.
>
> Afai am concerned, two short-term achievable issues remain:
>
> 1) allow certain WMF domains via https for thumbnail generation and librsvg
>    processing in general - this is to adhere to SVG standard, as long as
>    dependent files remain in wikimedia universe.
>    (Is there a chance for this to make it into 1.24git?)

Like I said, if someone can find a simple way to do this, we can allow
it in MediaWiki. If someone wants to work on it, one of the first
steps is to get the security/privacy requirements defined (along with
the functional requirements, like cscott brought up in the reference
below). Most have been brought up here or on that bug, but someone
should distill those somewhere.
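
As a very rough sketch of what I mean (this is a hypothetical helper, not
something that exists in our upload checks today, and the whitelist is just
a placeholder), the href check itself would be a small amount of code --
defining the requirements around it is the hard part:

function isAllowedImageHref( $href ) {
	// Hypothetical whitelist check for xlink:href values in uploaded SVGs.
	// Only https to a fixed set of WMF hosts passes; everything else fails
	// the upload check, the way all external hrefs do now.
	$allowedHosts = array( 'upload.wikimedia.org' ); // assumed whitelist, TBD
	$bits = wfParseUrl( $href );
	if ( !$bits || $bits['scheme'] !== 'https' ) {
		return false;
	}
	return in_array( $bits['host'], $allowedHosts, true );
}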

> 2) fixing chunked upload to not bail out on chunks that are exclusively
>    base64 encoded and hence make valid files that include this base64
>    chunk fail on upload - with an unusable error description.

This will unfortunately require a different approach to how we do
stashed/chunked uploads. Currently, each chunk is actually available
from the server as a file. So each piece has to be checked for xss
vectors, which is why your chunks currently fail. The stash will need
to be inaccessible to end users.

> Farther off might be the need to rethink part of the file infrastructure,
> to either broadly allow formats that are not self contained OR make a
> strong and reasoned decision against that and document it for wikipedians.
> This has been suggested here:
>   http://lists.wikimedia.org/pipermail/wikitech-l/2014-May/076700.html
>
>
> Regards,
> Christian
>
> ps:
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] MW-Vagrant improvements at the Zürich Hackathon

2014-06-13 Thread Chris Steipp
Thanks Adam!

I'd like to hear more about your exact use case. Getting all of the
wikis to run out of a single codebase was a major part of the
challenge of getting things set up. If you want a repo per wiki, that
should be a straightforward case of setting up another apache
virtualhost, or using a different subdirectory for each wiki. If you
want CentralAuth across them, that's pretty trivial too. Controlling
all that with puppet, I defer to bd808 on :)

But it sounds like you may have something else in your use case, let me know.

On Fri, Jun 13, 2014 at 4:09 PM, Adam Wight  wrote:
> Bryan and Chris,
> The multiwiki work is fantastic, a big thank you for pursuing this!  I
> tried to use your new module to provide a vagrant development environment
> for Fundraising's "payments" wiki [1], and I ran up against a large and
> very solid-looking wall that I think is worth mentioning.  We maintain a
> special release branch of MediaWiki for payments, with a bit of security
> hardening.  We cannot follow trunk development without carefully reading
> over the new features, and we need to develop against this target so that
> we catch version incompatibilities before deployment.
>
> I see that multiwiki encapsulates the various wikis by configuration only,
> and they all share the main codebase.  Do you have multiple checkouts of
> MediaWiki-core on your roadmap, or are we a fringe case?  I'd like to help
> support our development under vagrant, but this issue is a bit of a
> blocker.  Any advice would be appreciated.
>
> Thanks,
> Adam
>
> [1] https://gerrit.wikimedia.org/r/135326, production is
> https://payments.wikimedia.org
>
>
> On Wed, May 21, 2014 at 9:55 AM, Bryan Davis  wrote:
>
>> On Fri, May 16, 2014 at 2:40 PM, Arthur Richards
>>  wrote:
>> >
>> > CentralAuth/Multiwiki:
>> > Bryan Davis, Chris Steipp, and Reedy spent a lot of time hacking on this,
>> > and we now have support for multiwiki/CentralAuth in Vagrant! There is
>> > still some cleanup work being done for the role to remove kludge/hacks/etc
>> > (see https://gerrit.wikimedia.org/r/#/c/132691/).
>>
>> The CentralAuth role and the associated puppet config that allows
>> creation of multiple wikis as Apache virtual hosts on a single
>> MediaWiki-Vagrant virtual machine have been merged! Go forth and
>> debug/extend CentralAuth. :)
>>
>> I'd love to see additional roles created that use the multwiki::wiki
>> Puppet define to add interesting things for testing/debugging like RTL
>> wikis or other complex features such as WikiData that use a
>> collaboration between multiple wikis in the WMF production cluster.
>> If you're interested in working on something like this and get stuck
>> with the Puppet code needed or find shortcomings in the setup that
>> Chris and I developed I'd be glad to try and help work through the
>> issues.
>>
>> Bryan
>> --
>> Bryan Davis  Wikimedia Foundation
>> [[m:User:BDavis_(WMF)]]  Sr Software Engineer   Boise, ID USA
>> irc: bd808   v:415.839.6885 x6855
>>
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Getting phpunit working with Vagrant

2014-06-13 Thread Chris Steipp
On Fri, Jun 13, 2014 at 10:44 AM, Jon Robson  wrote:
> Has anyone had success with this...?
>
> This is what happens when I try to run:
>
> master x ~/git/vagrant/mediawiki/tests/phpunit $ php phpunit.php
>
> Warning: require_once(/vagrant/LocalSettings.php): failed to open
> stream: No such file or directory in
> /Users/jrobson/git/vagrant/mediawiki/LocalSettings.php on line 130
>
> Fatal error: require_once(): Failed opening required
> '/vagrant/LocalSettings.php' (include_path='.:') in
> /Users/jrobson/git/vagrant/mediawiki/LocalSettings.php on line 130

I use it frequently, and iirc the setup was nearly exactly what the
instructions on mediawiki.org said.

One other issue I did hit was that running even the databaseless tests
on a default vagrant setup ran out of memory. I upped my vagrant
config to use 2GB, and things work fine.


csteipp@herou:~/tmp/vagrant2> cat Vagrantfile-extra.rb
#Vagrant.configure('2') do |config|
#  config.vm.synced_folder './browsertests', '/srv/browsertests',
#    id: 'vagrant-browsertests',
#    owner: 'vagrant',
#    group: 'vagrant'
#end

Vagrant.configure('2') do |config|
  config.vm.provider :virtualbox do |vb|
    # See http://www.virtualbox.org/manual/ch08.html for additional options.
    vb.customize ['modifyvm', :id, '--memory', '2048']
    vb.customize ['modifyvm', :id, '--cpus', '2']
    vb.customize ['modifyvm', :id, "--cpuexecutioncap", "90"]
  end
end



> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Upgrading to 1.23

2014-06-12 Thread Chris Steipp
On Thu, Jun 12, 2014 at 10:15 AM, Beebe, Mary J  wrote:
> 4.   General security vulnerabilities. - I would love to have any 
> specifics here.

You can start with
https://bugzilla.wikimedia.org/buglist.cgi?f1=product&f2=product&f3=creation_ts&f4=resolution&list_id=321311&o1=changedfrom&o2=equals&o3=greaterthan&o4=equals&query_format=advanced&v1=Security&v2=MediaWiki&v3=2011&v4=FIXED

That's 55 reasons to upgrade :). CVE-2014-1610 is a compelling one for
many installs.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Help: Needed in OAuth

2014-06-05 Thread Chris Steipp
On Thu, Jun 5, 2014 at 9:23 AM, Amanpreet Singh <
amanpreet.iitr2...@gmail.com> wrote:

> Dear Chris
> I tried this but still no result it gives same error NULL,
> I also copied your entire demo and pasted it to test but it also returned
> same result, maybe its something related to canonicalServerUrl,
> I also tried Magnus one it gives 'Error retrieving token:
> mwoauth-oauth-exception'.
>

One of the best ways to debug this is to start using a local instance, so
you can look at the server's debug log. Then you can see what error the
server is giving. MediaWiki vagrant has an oauth role, so you can just
enable the role and start using it.

canonicalServerUrl is only used for the identity check-- it sounds like
you're not even at that point yet, so that shouldn't be an issue (if it
was, the client library would say you have an invalid JWT).

Feel free to ping me on IRC (csteipp) and I can try to walk you through.
You may want to try this script here:

https://www.mediawiki.org/wiki/User:CSteipp/OAuth_debug_client

That should at least prove it's not a connectivity / curl issue.
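
If you want a quick sanity check outside of any OAuth library first,
something like this helps (just a sketch -- it sends an unsigned request,
so the server should answer with an OAuth error message rather than
nothing at all; a false or empty response points at curl/TLS rather than
your signing code):

$url = 'https://www.mediawiki.org/w/index.php?title=Special:OAuth/initiate'
	. '&format=json&oauth_callback=oob';
$ch = curl_init( $url );
curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true );
curl_setopt( $ch, CURLOPT_USERAGENT, 'oauth-debug-check/0.1' ); // any UA works
$body = curl_exec( $ch );
if ( $body === false ) {
	echo 'curl error: ' . curl_error( $ch ) . "\n";
} else {
	echo 'HTTP ' . curl_getinfo( $ch, CURLINFO_HTTP_CODE ) . "\n" . $body . "\n";
}
curl_close( $ch );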


>
>
> On Thu, Jun 5, 2014 at 9:14 PM, Chris Steipp 
> wrote:
>
> > On Thursday, June 5, 2014, Amanpreet Singh  >
> > wrote:
> >
> > > Thanks for quick reply,
> > > I am just getting NULL after making an OAuth call and that callback
> > wasn't
> > > confirmed, I hope I am making call to correct url which is
> > >
> > >
> >
> https://mediawiki.org/wiki/index.php?title=Special:OAuth/initiate&format=json&oauth_callback=oob
> > > What I should get back is a token with key and a secret.
> > >
> >
> > Try using www.mediawiki.org-- otherwise the redirect will happen and the
> > signature won't verify.
> >
> >
> > >
> > >
> > > On Thu, Jun 5, 2014 at 8:27 PM, Magnus Manske <
> > magnusman...@googlemail.com
> > > >
> > > wrote:
> > >
> > > > If all you want is some quick code infusion, I can offer my PHP
> class:
> > > >
> > > >
> > > >
> > >
> >
> https://bitbucket.org/magnusmanske/magnustools/src/ecb01ddc26c8129737d260d0491ccb410c4c62a3/public_html/php/oauth.php?at=master
> > > >
> > > > You'd have to patch the "loadIniFile" method to point at your ini
> file,
> > > but
> > > > the rest should work as is. High-level method are towards the end,
> > > usually
> > > > self-explaining like "setLabel".
> > > >
> > > > Cheers,
> > > > Magnus
> > > >
> > > >
> > > > On Thu, Jun 5, 2014 at 3:51 PM, Andre Klapper <
> aklap...@wikimedia.org
> > > >
> > > > wrote:
> > > >
> > > > > Hi,
> > > > >
> > > > > On Thu, 2014-06-05 at 20:13 +0530, Amanpreet Singh wrote:
> > > > > > I need some help regarding my GSoC project in which I need to
> > > implement
> > > > > an
> > > > > > OAuth login system for a browser based plugin, so we can identify
> > > > users.
> > > > > > But I am stuck and not able to get anything here >
> > > > > >
> > > > >
> > > >
> > >
> >
> https://github.com/apsdehal/Mediawiki-login-api/blob/master/lib/OAuth/MWOAuthClient.php#L77
> > > > > > . Kindly help me, and tell me if further information is needed.
> > > > >
> > > > > What is the problem you're facing, what have you tried already,
> > what's
> > > > > the output you get, etc.?
> > > > >
> > > > > andre
> > > > > --
> > > > > Andre Klapper | Wikimedia Bugwrangler
> > > > > http://blogs.gnome.org/aklapper/
> > > > >
> > > > >
> > > > > ___
> > > > > Wikitech-l mailing list
> > > > > Wikitech-l@lists.wikimedia.org 
> > > > > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> > > > >
> > > >
> > > >
> > > >
> > > > --
> > > > undefined
> > > > ___
> > > > Wikitech-l mailing list
> > > > Wikitech-l@lists.wikimedia.org 
> > > > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> > > >
> > >
> > >
> > >
> > > --
> > > Amanpreet Singh,
> > > IIT Roorkee
> > > ___
> > > Wikitech-l mailing list
> > > Wikitech-l@lists.wikimedia.org 
> > > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> >
>
>
>
> --
> Amanpreet Singh,
> IIT Roorkee
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Hardening WP/WM against traffic analysis (take two)

2014-06-05 Thread Chris Steipp
On Thu, Jun 5, 2014 at 9:45 AM, Zack Weinberg  wrote:

> I'd like to restart the conversation about hardening Wikipedia (or
> possibly Wikimedia in general) against traffic analysis.  I brought
> this up ... last November, I think, give or take a month?  but it got
> lost in a larger discussion about HTTPS.
>

Thanks Zack, I think this is research that needs to happen, but the WMF
doesn't have the resources to do it itself right now. I'm very interested in
seeing the results you come up with.


>
> For background, the type of attack that it would be nice to be able to
> prevent is described in this paper:
>
> http://sysseclab.informatics.indiana.edu/projects/sidebuster/sidebuster-final.pdf
>  Someone is eavesdropping on an encrypted connection to
> LANG.wikipedia.org.  (It's not possible to prevent the attacker from
> learning the DNS name and therefore the language the target reads,
> short of Tor or similar.  It's also not possible to prevent them from
> noticing accesses to ancillary servers, e.g. Commons for media.)  The
> attacker's goal is to figure out things like
>
> * what page is the target reading?
> * what _sequence of pages_ is the target reading?  (This is actually
> easier, assuming the attacker knows the internal link graph.)
> * is the target a logged-in user, and if so, which user?
> * did the target just edit a page, and if so, which page?
> * (... y'all are probably better at thinking up these hypotheticals than
> me ...)
>

Anything in the logs-- Account creation is probably an easy target.


>
> Wikipedia is different from a tax-preparation website (the case study
> in the above paper) in that all of the content is public, and edit
> actions are also public.  The attacker can therefore correlate their
> eavesdropping data with observations of Special:RecentChanges and the
> like.  This may mean it is impossible to prevent the attacker from
> detecting edits.  I think it's worth the experiment, though.
>
> What I would like to do, in the short term, is perform a large-scale
> crawl of one or more of the encyclopedias and measure what the above
> eavesdropper would observe.  I would do this over regular HTTPS, from
> a documented IP address, both as a logged-in user and an anonymous
> user.  This would capture only the reading experience; I would also
> like to work with prolific editors to take measurements of the traffic
> patterns generated by that activity.  (Bot edits go via the API, as I
> understand it, and so are not reflective of "naturalistic" editing by
> human users.)
>

Make sure to respect typical bot rate limits. Anonymous crawling should be
fine, although logged in crawling could cause issues. But if you're doing
this from a single machine, I don't think there's too much harm you can do.
Thanks for warning us in advance!

Also, mobile looks very different from desktop. May be worth analyzing it
as well.


>
> With that data in hand, the next phase would be to develop some sort
> of algorithm for automatically padding HTTP responses to maximize
> eavesdropper confusion while minimizing overhead.  I don't yet know
> exactly how this would work.  I imagine that it would be based on
> clustering the database into sets of pages with similar length but
> radically different contents.  The output of this would be some
> combination of changes to MediaWiki core (for instance, to ensure that
> the overall length of the HTTP response headers does not change when
> one logs in) and an extension module that actually performs the bulk
> of the padding.  I am not at all a PHP developer, so I would need help
> from someone who is with this part.
>

Padding the page in OutputPage would be a pretty simple extension,
although ensuring the page comes out a specific size after the web server
gzips it would be more difficult to do efficiently. However, iirc the
most obvious fingerprinting technique was just looking at the number and
sizes of images loaded from commons. Making sure those are consistent sizes
is likely going to be hard.
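
For the response-padding piece, a minimal sketch of the kind of extension
I mean (BeforePageDisplay is a real hook; the fixed 4KB bucketing is made
up, and would really need to come out of your clustering results):

$wgHooks['BeforePageDisplay'][] = function ( OutputPage $out, Skin $skin ) {
	// Pad the body HTML up to the next 4KB boundary with a comment, so
	// uncompressed response sizes collapse into a small set of buckets.
	// (The comment markup itself adds a few bytes, and real padding would
	// also have to account for gzip and for header sizes.)
	$bucket = 4096;
	$len = strlen( $out->getHTML() );
	$pad = ( $bucket - ( $len % $bucket ) ) % $bucket;
	$out->addHTML( '<!-- ' . str_repeat( 'x', $pad ) . ' -->' );
	return true;
};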


>
> What do you think?  I know some of this is vague and handwavey but I
> hope it is at least a place to start a discussion.
>

One more thing to take into account is that the WMF is likely going to
switch to SPDY, which will completely change the characteristics of the
traffic. So developing a solid process that you can repeat next year would
be time well spent.


>
> zw
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Help: Needed in OAuth

2014-06-05 Thread Chris Steipp
On Thursday, June 5, 2014, Amanpreet Singh 
wrote:

> Thanks for quick reply,
> I am just getting NULL after making an OAuth call and that callback wasn't
> confirmed, I hope I am making call to correct url which is
>
> https://mediawiki.org/wiki/index.php?title=Special:OAuth/initiate&format=json&oauth_callback=oob
> What I should get back is a token with key and a secret.
>

Try using www.mediawiki.org-- otherwise the redirect will happen and the
signature won't verify.


>
>
> On Thu, Jun 5, 2014 at 8:27 PM, Magnus Manske  >
> wrote:
>
> > If all you want is some quick code infusion, I can offer my PHP class:
> >
> >
> >
> https://bitbucket.org/magnusmanske/magnustools/src/ecb01ddc26c8129737d260d0491ccb410c4c62a3/public_html/php/oauth.php?at=master
> >
> > You'd have to patch the "loadIniFile" method to point at your ini file,
> but
> > the rest should work as is. High-level method are towards the end,
> usually
> > self-explaining like "setLabel".
> >
> > Cheers,
> > Magnus
> >
> >
> > On Thu, Jun 5, 2014 at 3:51 PM, Andre Klapper  >
> > wrote:
> >
> > > Hi,
> > >
> > > On Thu, 2014-06-05 at 20:13 +0530, Amanpreet Singh wrote:
> > > > I need some help regarding my GSoC project in which I need to
> implement
> > > an
> > > > OAuth login system for a browser based plugin, so we can identify
> > users.
> > > > But I am stuck and not able to get anything here >
> > > >
> > >
> >
> https://github.com/apsdehal/Mediawiki-login-api/blob/master/lib/OAuth/MWOAuthClient.php#L77
> > > > . Kindly help me, and tell me if further information is needed.
> > >
> > > What is the problem you're facing, what have you tried already, what's
> > > the output you get, etc.?
> > >
> > > andre
> > > --
> > > Andre Klapper | Wikimedia Bugwrangler
> > > http://blogs.gnome.org/aklapper/
> > >
> > >
> > > ___
> > > Wikitech-l mailing list
> > > Wikitech-l@lists.wikimedia.org 
> > > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> > >
> >
> >
> >
> > --
> > undefined
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org 
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> >
>
>
>
> --
> Amanpreet Singh,
> IIT Roorkee
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org 
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] SVG linking of external images/bitmaps - xlink:href should support http(s) resources

2014-05-28 Thread Chris Steipp
On Tue, May 27, 2014 at 10:10 PM, Matthew Flaschen
wrote:

> On 05/27/2014 10:52 PM, Brian Wolff wrote:
>
>> I specifically said bits.wikimedia.org and upload.wikimedia.org (and not
>>>
>> commons.wikimedia.org), neither of which host user JavaScript.
>>
>>>
>>> Matt Flaschen
>>>
>>>
>>>
>> Gadgets are on bits and they are user controlled. Ditto for
>> mediawiki:common.js et al. (Unless you mean users as in non admins).
>> I see no usecase from allowing from bits. If someone wants an extension
>> asset they can upload it.
>>
>
> You're right, I was completely wrong about the user JavaScript. Actually,
> user scripts are on bits too.  Conceivably, it could limit it to
> directories starting with static-..., but that starts getting complicated.
>  It's probably safer to limit it to user-uploaded Commons files as you said.
>

It *should* be difficult to get javascript to run inside an image-- you
would have to find an element that we allow whose content gets interpreted
as javascript source. If anyone comes up with a way, I'd be very interested in hearing
about it. If the javascript is already in an svg, then it's much easier to
get it to execute.

But overall it's much safer to just not allow it, which is why we currently
don't.


>
> Matt Flaschen
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] SVG linking of external images/bitmaps - xlink:href should support http(s) resources

2014-05-27 Thread Chris Steipp
On Tue, May 27, 2014 at 9:37 AM, "Christian Müller"  wrote:

> Hi,
>
>
> a recent discussion in
>   https://bugzilla.wikimedia.org/show_bug.cgi?id=65724#c3
>
> revealed that parts of the SVG standard are deliberately broken on
> commons.  While I see some reasons to not adhere fully to the standard,
> e.g. external resources might break over time, if they are moved or
> deleted, I don't feel it's good to break the standard as hard as it's done
> right now.  It puts a burden on creators, on the principle of sharing
> within the wikimedia environment and overall, it's even technically
> inferior and leads or might lead to useless duplication of content.
>

I'm far more concerned about the security/privacy issues than about
an external resource going away. The checks that you're hitting are likely
the security checks we do on the svg.


>
> The SVG standard defines an image element.  The image resource is linked
> to using the xlink:href attribute.  Optionally the image is embedded into
> the SVG using the data URI scheme
> (https://en.wikipedia.org/wiki/Data_URI_scheme).
>
> Combining SVGs with traditional bitmap images is useful in several ways:
> It allows creators sharing the way an image is manipulated and eases future
> modification that are hard to do or even impossible using traditional
> bitmap/photo editing.  It basically has the same advantages that mash-up
> web content has over static content:  Each layer or element can be modified
> individually without destroying the other elements.  It's easy to see that
> a proper SVG is more to its potential users than a classig JPG or PNG with
> only one layer, being the result of all image operations.
>
> These reasons point out the necessity for barrier-free access to the image
> element.
>
> Currently, commons cripples this access laid out in the standard and
> originally implemented by "librsvg".  It disables the handling of HTTP(S)
> resources.  Users needing the same bitmap in more than one SVG are forced
> to base64-embed their source, and hence duplicate it, in each individual
> SVG.  Indeed, there is quite some burden on creators and on wikimedia
> servers that duplicate lots of data right now and potentially even more in
> the future.  Note that this duplication of data goes unnoticed by the
> routines specifically in place for bitmaps right now, that check uploads on
> MD5 collision and reject the upload on dup detection.  Space might be cheap
> as long as donations are flowing, but reverting bad practice once it is
> common is harder than promoting good practice /now/ by adhering to the
> standard as closely as possible.
>
> Therefore I advocate change to librsvg in one of the two ways layed out in
> comment 3 of the bug report given above and (re)support linking to external
> bitmaps in SVGs.  Two strategies that come to mind to prevent disappearance
> of an external resource in the web are:
>
>  1) cache external refs on thumbnail generation, check for updates on
> external server on thumbnail re-generation
>
>  2) allow external refs to images residing on wikimedia servers only
>
>
> Point 2) should be considered the easiest implementation, 1) is harder to
> implement but gives even more freedom to SVG creators and would adhere more
> closely to SVG standard.  However, another argument for 2) would be the
> licensing issue:  It ensures that only images are linked to that have been
> properly licensed by commons users and the upload process (and if a license
> violation is detected and the linked-to bitmap removed from commons, the
> SVG using such a bitmap breaks gracefully).
>


Having our servers do arbitrary calls to external resources (option 1)
isn't a realistic option from a security perspective. There are some fun
poc svg files that abuse this to scan a server's dmz, attack other sites
with sql injections, etc. Trusting an image library to correctly speak http
without a memory corruption seems a little scary as well, but I'll admit I
haven't looked at librsvg's code myself.

From a privacy perspective, we also don't want to allow the situation where
a reader's device is reaching out to a server that we don't control. So if
someone includes a link to the original svg on a webpage, if there are any
major browsers that will pull those resources in and let an attacker see
the user's IP address, we shouldn't allow that... hmm, and now that I read
the bug, I see this is Firefox's behavior in the image you uploaded. We
probably want to block that behavior.

Allowing a whitelist of WMF domains via https may be possible. In general,
the security checking we do on uploaded files is complex enough that I
don't like adding another layer of specific checks and exceptions, but if
we can find a relatively simple way to do it that maintains our security
and privacy requirements, then I wouldn't stand in the way.


>
>
> Regards,
> Christian
>
>
> ___
> Wikitech-l mailing list

Re: [Wikitech-l] Bot flags and human-made edits

2014-05-20 Thread Chris Steipp
On Tue, May 20, 2014 at 6:05 AM, Jon Robson  wrote:

> I'm confused. Why wouldn't you just mark a user account as being a bot and
> simply determine bot edits from username alone?
>

Volume? Cluebot does a high volume of edits, but as mentioned, doesn't want
its edits hidden from RC.


>
> Any other mechanism seems prone to abuse or being inaccurate...
> On 20 May 2014 07:36, "Amir Ladsgroup"  wrote:
>
> > Thank you legoktm for the examples,
> > Another case that happened in Persian Wikipedia is creating bot-generated
> > articles by user request; this task is too controversial to be marked as
> > bot, so we didn't mark it, but other edits of my bot are marked as bot
> >
> > Best
> >
> >
> > On Tue, May 20, 2014 at 11:01 AM, Legoktm  > >wrote:
> >
> > >
> > > On 5/19/14, 6:39 PM, Dan Garry wrote:
> > >
> > >> On 19 May 2014 19:36, Amir Ladsgroup  wrote:
> > >>
> > >>  As a bot operator I think API parameter about flagging bot or not is
> > >>> necessary
> > >>>
> > >>>  Sure, but as I'm not a bot operator, can you explain why and what
> you
> > >> use
> > >> this for, to help me understand? :-)
> > >>
> > >
> > > If the edits should show up in users watchlists/recentchanges for
> humans
> > > to look at. An example would be ClueBot NG on enwp which doesn't flag
> > it's
> > > edits with the bot flag so humans can review them.
> > >
> > > Another case where this recently came up is in MassMessage (bug 65180).
> > > Some edits like those to user talk pages should be marked as a bot
> since
> > > the user will receive a notification regardless, but ones that are made
> > to
> > > Project (or other) namespaces, should not be flagged as bot so users
> will
> > > see them in their watchlists.
> > >
> > > -- Legoktm
> > >
> > >
> > > ___
> > > Wikitech-l mailing list
> > > Wikitech-l@lists.wikimedia.org
> > > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> > >
> >
> >
> >
> > --
> > Amir
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Login to Wikimedia Phabricator with a GitHub/Google/etc account?

2014-05-16 Thread Chris Steipp
On May 16, 2014 5:20 PM, "Chad"  wrote:
>
> On Fri, May 16, 2014 at 4:38 PM, MZMcBride  wrote:
>
> > Chris Steipp wrote:
> > >Accounts are kinda namespaced, so github user foo and sul user foo can
> > >both have phabricator accounts.
> > >
> > >Since we're using OAuth though, that requires a global wiki account so
> > >local only accounts would not be able to join. So we probably need
> > >password or LDAP auth at minimum.
> >
> > I suppose you could rely only on global (in the CentralAuth extension
> > sense) accounts, but it really would make sense for Wikimedia to get its
> > own house in order first: we should finish fully unifying login across
> > Wikimedia wikis before delving into concurrent authentication systems.
> >
> >
> Yes, let's please. But that's another thread.
>
> I'm less concerned about non-unified accounts than I am about the other
> (much more obvious) problem of "how do we use Phabricator if the cluster
> is down." Ryan suggested Labs LDAP and I agree, it's a very sane fallback.
> It's very unlikely for the cluster *and* LDAP to be down at the same time,
> and if they are it's probably network-related and we'll be screwed on using
> Phabricator anyway.
>
>
> > I think this mailing list thread suffers from an analysis of what the
> > potential negative consequences of allowing third-party login are. The
> > positive to users (one less username and password to remember) is
clearer
> > to see. What are the drawbacks of doing this? I'd like to see the pros
and
> > cons outlined on mediawiki.org or meta.wikimedia.org.
> >
> >
> The positive side of "I can use one less login" is nice, don't get me wrong.
>
> I'm mostly worried about security issues in 3rd party implementations of
> oAuth that we can't control. I asked Chris S. about this earlier today and
> I hope he'll expand on this some more--especially concerning to me was the
> concrete example he gave with Facebook's own oAuth. Also he mentioned that
> Twitter's oAuth is known to be insecure in its implementation.

I don't want to start a rumor that using Twitter's OAuth for authentication
is insecure, but OAuth 1 (which phabricator is using for the login) isn't
made for authentication... Insert broken record track of me talking about
this ;)

More authentication systems means a bigger attack surface we have to
secure. If you look at the vulnerabilities fixed in phabricator via their
bounty program [1], 3 are login with OAuth bugs. This makes me nervous (but
kudos to them for running the program and fixing these).

Although it wasn't possible in any of these reported bugs yet, the big risk
is that an attack will allow adding a login account to an existing
phabricator account via csrf, allowing the attacker to add their 3rd party
account to my phabricator account and then log in as me using their
Facebook, etc., account. This famously happened to Stack Exchange via the
Facebook login last year.

So I'll do an audit on the methods we decide to go with, but I'd like to
keep that number fairly small. Turning them on isn't totally "free".

[1] https://hackerone.com/phabricator
>
> Depending on how Github's oAuth is implemented that's the one I could see
> the strongest case being made for.
>
> Enabling all of them seems like it'll just make the login page cluttered
> with
> options used by about 1-2 people each but I could be wrong.
>
> -Chad
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Login to Wikimedia Phabricator with a GitHub/Google/etc account?

2014-05-16 Thread Chris Steipp
On May 15, 2014 3:56 PM, "hoo"  wrote:
>
> On Thu, 2014-05-15 at 14:20 -0700, Quim Gil wrote:
> > This is a casual request for comments about the use of 3rd party
> > authentication providers for our future Wikimedia Phabricator instance.
> >
> > Wikimedia Phabricator is expected to replace Bugzilla, Gerrit and many
> > other tools, each of them having their own registration and user
account.
> > The plan is to offer Wikimedia SUL (your Wikimedia credentials) as the
> > default way to login to Phabricator -- details at
http://fab.wmflabs.org/T40
> >
> > However, Phabricator can support authentication using 3rd party
providers
> > like GitHub, Google, etc. You can get an idea at
> > https://secure.phabricator.com/auth/start/
> >
> > There are good reasons to plan for Wikimedia SUL only (consistency with
the
> > rest of Wikimedia projects), and there are good reasons to plan for
other
> > providers as well (the easiest path for most first-time contributors).
> >
> > What do you think? Should we offer alternatives to Wikimedia login? If
so,
> > which ones?
> >
> >
>
> Seeing the mess with user accounts we have on the Wikis these days,
> please make sure we won't run into naming conflicts.
> A wiki user with the global account "foo" should always be able to use
> that account on Phabricator, no matter what users from other sources did
> before.

Accounts are kinda namespaced, so github user foo and sul user foo can both
have phabricator accounts.

Since we're using OAuth though, that requires a global wiki account so
local only accounts would not be able to join. So we probably need password
or LDAP auth at minimum.

>
> Cheers,
>
> Marius
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Vagrant CentralAuth role

2014-05-05 Thread Chris Steipp
I just found out about that from Ori too. Problem solved. Thanks!


On Mon, May 5, 2014 at 12:42 PM, Bryan Davis  wrote:

> On Mon, May 5, 2014 at 1:17 PM, Chris Steipp 
> wrote:
> > Different domains is closer to how we run things in production, but it
> > would require copying dns settings to your host (there doesn't seem to be
> > a good, cross-platform way to do this from vagrant itself, but if anyone has a
> > solution, that would make the decision easy).
>
> We have a public wildcard DNS record for *.local.wmftest.net that
> resolves to 127.0.0.1 for just this sort of thing. The Wikimania
> Scholarships role uses it to setup a named vhost for
> http://scholarships.local.wmftest.net:8080/.
>
> Bryan
> --
> Bryan Davis  Wikimedia Foundation
> [[m:User:BDavis_(WMF)]]  Sr Software Engineer   Boise, ID USA
> irc: bd808   v:415.839.6885 x6855
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Vagrant CentralAuth role

2014-05-05 Thread Chris Steipp
Hi all,

I'm planning to spend some time in Zurich getting a centralauth role for
vagrant working (part of
https://www.mediawiki.org/wiki/Z%C3%BCrich_Hackathon_2014/Topics#Production-like_Vagrant).
I wanted to get opinions (probably more bikeshed) about how you would like
to access multiple wikis on a single vagrant instance. If anyone is
interested in using CentralAuth on vagrant for development/testing, please
chime in!

We can either use a different subdirectory per wiki, or a different domain
per wiki.

Different domains is closer to how we run things in production, but it would
require copying dns settings to your host (there doesn't seem to be a good,
cross-platform way to do this from vagrant itself, but if anyone has a
solution, that would make the decision easy). So you would work on,

http://localhost:8080/ (main vagrant wiki)
http://loginwiki.dev:8080/ (loginwiki)
etc.

Different subdirectories is how I currently do development and I personally
don't mind it, but it makes turning CentralAuth on and off more of a
challenge, since the current wiki is in the web root. So

http://localhost:8080/wiki/ (main vagrant wiki)
http://localhost:8080/loginwiki/ (loginwiki)
etc.

Preferences?

Chris
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Fwd: Security precaution - Resetting all user sessions today

2014-04-08 Thread Chris Steipp
Due to the speed of the script, it will take a while for everyone to be
logged out.

If you hit this issue, logging out and logging in again seems to fix the
problem. I'm still trying to track down why this is happening.


On Tue, Apr 8, 2014 at 4:43 PM, Greg Grossmeier  wrote:

> Chris S is actively looking into this. Thanks for the note.
>
> --
> Sent from my phone, please excuse brevity.
> On Apr 8, 2014 4:18 PM, "Risker"  wrote:
>
> > Thanks for the heads-up, Greg.  However, I'm finding that I am being
> > repeatedly logged out...it's happened every other edit I've made tonight,
> > which is a real pain.  Will report on IRC as well.
> >
> > Risker/Anne
> >
> >
> > On 8 April 2014 16:57, Greg Grossmeier  wrote:
> >
> > > FYI to this audience as well:
> > >
> > > We're resetting all user session tokens today due to heartbleed.
> > >
> > > What I didn't state below is that we have already replaced our SSL certs
> > > as well as upgraded to the fixed version of openssl.
> > >
> > > - Forwarded message from Greg Grossmeier -
> > >
> > > > Date: Tue, 8 Apr 2014 13:54:26 -0700
> > > > From: Greg Grossmeier 
> > > > To: Wikitech Ambassadors 
> > > > Subject: Security precaution - Resetting all user sessions today
> > > >
> > > > Yesterday a widespread issue in OpenSSL was disclosed that would allow
> > > > attackers to gain access to privileged information on any site running a
> > > > vulnerable version of that software. Unfortunately, all Wikimedia
> > > > Foundation hosted wikis are potentially affected.
> > > >
> > > > We have no evidence of any actual compromise to our systems or our
> > > > users' information, but as a precautionary measure we are resetting all
> > > > user session tokens. In other words, we will be forcing all logged in
> > > > users to re-login (ie: we are logging everyone out).
> > > >
> > > > All logged in users send a secret session token with each request to
> > > > the site and if a nefarious person were able to intercept that token
> > > > they could impersonate other users. Resetting the tokens for all users
> > > > will have the benefit of making all users reconnect to our servers using
> > > > the updated and fixed version of the OpenSSL software, thus removing
> > > > this potential attack.
> > > >
> > > > As an extra precaution, we recommend all users change their passwords
> > > > as well.
> > > >
> > > >
> > > > Again, there has been no evidence that Wikimedia Foundation users were
> > > > targeted by this attack, but we want all of our users to be as safe as
> > > > possible.
> > > >
> > > >
> > > > Thank you for your understanding and patience,
> > > >
> > > > Greg Grossmeier
> > > >
> > > >
> > > > --
> > > > | Greg Grossmeier   GPG: B2FA 27B1 F7EB D327 6B8E |
> > > > | identi.ca: @greg   A18D 1138 8E47 FAC8 1C7D |
> > >
> > >
> > >
> > > - End forwarded message -
> > >
> > > --
> > > | Greg Grossmeier   GPG: B2FA 27B1 F7EB D327 6B8E |
> > > | identi.ca: @greg   A18D 1138 8E47 FAC8 1C7D |
> > >
> > > ___
> > > Wikitech-l mailing list
> > > Wikitech-l@lists.wikimedia.org
> > > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> > >
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Optimizing our captcha images

2014-04-01 Thread Chris Steipp
I'm fairly sure not, although you might be able to pull those numbers from
the logs. I would really like to see a feedback mechanism in fancycaptcha
(or all captchas for that matter) so we could generate those numbers
automatically.

On Tue, Apr 1, 2014 at 11:30 AM, Ryan Kaldari wrote:

> Has anyone ever collected statistics on which of our captcha images are
> most commonly entered incorrectly? I've noticed that some of our images are
> quite difficult to read and should probably be removed from rotation. I
> could imagine applying a heuristic like:
>
> 0-10% wrong: Delete - too easy
> 10-30% wrong: Keep - just right
> 30-100% wrong: Delete - too hard
>
> Ryan Kaldari
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] CentralAuth questions

2014-03-27 Thread Chris Steipp
On Thu, Mar 27, 2014 at 6:01 PM, John  wrote:

> You can also use the localuser table in the CA database.
>

Yep. Localuser keeps track of the attachments, so any entry there for a
username + wiki means the global username of the same name is attached on
that wiki. It's all done via username, not user id.

If you're using PHP to do the processing, you can use
CentralAuthUser::attachedOn() to test if the account is attached, or
use listAttached() to get the array of all wikis where the user is attached.
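
For example, something along these lines (just a sketch; the username and
wiki ID are made up, and error handling is omitted):

$caUser = new CentralAuthUser( 'SomeUsername' );
if ( $caUser->exists() && $caUser->attachedOn( 'enwiki' ) ) {
	// The global account exists and is attached on enwiki.
}
// Array of wiki IDs where the account is attached, e.g. array( 'enwiki', 'dewiki' )
$wikis = $caUser->listAttached();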


>
> On Thu, Mar 27, 2014 at 8:35 PM, Dan Andreescu  >wrote:
>
> > Thank you very much for the reply Max, +1 beer for next time we meet.
> >
> >
> > On Thu, Mar 27, 2014 at 5:10 PM, MZMcBride  wrote:
> >
> > > Teresa Cho wrote:
> > > >I'm trying to add a feature to Wikimetrics that will allow users to
> > > >create a cohort with a username and find all accounts across wikis. I
> > > >want to use the CentralAuth database, because as far as I can tell, it
> > > >stores the global username and all the local usernames. However, I
> don't
> > > >see where it connects the globalusers to the localusers. Is it just
> the
> > > >username?
> > > >
> > > >Does the username have to be the same across local wikis and you query
> > > >the localuser table with what you think is the global username? If
> > that's
> > > >the case, I suppose I don't need to look at the global table.
> > >
> > > Hi.
> > >
> > > Broadly, I think the answer for you're looking for is no: CentralAuth
> > > accounts (global user accounts that match to local user accounts) are
> not
> > > fully unified on Wikimedia wikis. It's a long-term goal, but it's a
> > > disruptive change to make, so it's taken a while. :-)
> > >
> > > It sounds like you want programmatically retrieve the info from:
> > > .
> > >
> > > If so, I'd recommend the MediaWiki Web API
> > > () for this. Perhaps the
> > > globaluserinfo API module?
> > >
> >
> https://www.mediawiki.org/w/api.php?action=query&meta=globaluserinfo&guiuse
> > > r=Jimbo+Wales&guiprop=groups|merged|unattached
> > >
> > > If you must directly query the MediaWiki database using SQL, you'll
> > likely
> > > need to read through the source code of the CentralAuth MediaWiki
> > > extension to figure out exactly what the PHP and SQL is doing with the
> > > underlying data. The source code of the CentralAuth MediaWiki extension
> > > can be found here:
> > > <
> https://git.wikimedia.org/tree/mediawiki%2Fextensions%2FCentralAuth.git
> > >.
> > > You'll likely want to read through central-auth.sql in particular.
> > >
> > > Dan Andreescu wrote:
> > > >Any links to documentation on consuming data from the CentralAuth
> > > >databases is welcome.  We searched a bit and found mostly installation
> > > >instructions.
> > >
> > > Well, very generally you (or your program) probably shouldn't be
> querying
> > > the databases directly, but if you can provide more specific
> information
> > > about where you looked, we can probably add some redirects for future
> > > w[ao]nderers.
> > >
> > > For general clarity, while I used www.mediawiki.org in the examples in
> > > this e-mail, because the Web API is retrieving global (wiki farm-wide)
> > > data, the equivalent URL paths should work on other Wikimedia wikis
> such
> > > as en.wikipedia.org or meta.wikimedia.org.
> > >
> > > MZMcBride
> > >
> > >
> > >
> > > ___
> > > Wikitech-l mailing list
> > > Wikitech-l@lists.wikimedia.org
> > > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> > >
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> >
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] HTML templating systems & MediaWiki - is this summary right?

2014-03-26 Thread Chris Steipp
On Wed, Mar 26, 2014 at 10:30 AM, Nuria Ruiz  wrote:

> >Additionally, how you escape a plain parameter like class vs. an
> >href vs. a parameter that is inserted into a url vs. an id attribute are
> >all different escaping strategies.
> Urls in the template engine need to be handled on their own, sure. But what
> template engine does not work in this fashion? There are three separate
> "entities" you normally deal with when doing replacement: translations,
> urls and plain attributes.
>

When looking at a typical web page, you need several escaping strategies.
OWASP roughly groups them into HTML body, plain attributes, URL context,
JavaScript context, and CSS context. My point was that you need several
MakeWhateverSafe functions, and have to use them in the right context. So
that is a long way of saying I disagree with you when you said that this
could be automated without some process having knowledge of the HTML
context and verifying that the right escaping is being applied.
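
To make that concrete, here is a minimal sketch (the variable names are
made up) of three of those contexts side by side; each slot needs a
different function, and using the wrong one in the wrong slot is exactly
the kind of bug code review has to catch:

// Plain attribute + html body context: htmlspecialchars(), and the
// attribute also has to be quoted.
$html = '<div class="' . htmlspecialchars( $userClass ) . '">'
	. htmlspecialchars( $userText ) . '</div>';
// URL parameter context: urlencode first, then attribute-escape.
$link = '<a href="/w/index.php?search='
	. htmlspecialchars( rawurlencode( $userQuery ) ) . '">search</a>';
// Javascript context: something like MediaWiki's Xml::encodeJsVar().
$js = 'var msg = ' . Xml::encodeJsVar( $userMessage ) . ';';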


>
>
> >$html = Html::element( 'div', array( 'class' => $anything ), $anythingElse
> I see. Sorry but where I disagree is that the "quote me this replacement"
> is a lawful case for the template engine. The line above is doing a lot
> more than purely templating and on my opinion it does little to separate
> data and markup. Which is the very point of having a template engine.
>
> But if you consider that one a lawful use case, you are right. The example
> I provided does not help you.
>
>
> On Wed, Mar 26, 2014 at 6:15 PM, Chris Steipp 
> wrote:
>
> > On Wed, Mar 26, 2014 at 9:44 AM, Daniel Friesen
> > wrote:
> >
> > > On 2014-03-26, 9:32 AM, Nuria Ruiz wrote:
> > > >> The issue is that they apply the same escaping, regardless of the
> > > >> html context. So, in Twig and mustache, <div class={{something}}> is
> > > >> vulnerable, if something is set to "1234 onClick=doSomething()".
> > > > Right, the engine would render:
> > > >
> > > >  <div class=1234 onClick=doSomething()>
> > > >
> > > > because it only escapes HTML by default.
> > > > Now, note that the problem can be fixed with
> > > > <div class={{makeStringSafe something}}>
> > > >
> > > > Where "makestringSafe" is a function defined by us and executed there
> > > that
> > > > escapes to our liking.
> > > How does a custom function jammed into the middle of a Mustache
> template
> > > fix the issue when the issue is not that foo={{something}} doesn't
> > > escape, but is that quoting is needed instead of escaping, and Mustache
> > > isn't context sensitive so neither Mustache or a custom function know
> > > that foo={{something}} is an attribute value in need of quoting?
> > >
> >
> > Exactly. Additionally, how you escape a plain parameter like class vs. an
> > href vs. a parameter that is inserted into a url vs. an id attribute are
> > all different escaping strategies. So there would be many different
> > "makeStringSafe" and probably "quoteAndMakeStringSafe" functions,
> > and code review would have to make sure the right one was being used in
> the
> > right place. Which means someone who is familiar with all of the xss
> > techniques would need to code review almost all the templates.
> >
> > For comparison, using our current html templating (as much as it sucks):
> >
> > $html = Html::element( 'div', array( 'class' => $anything ), $anythingElse );
> >
> > The developer doesn't need to have any knowledge of what escaping needs to
> > apply to the class attribute vs the text.
> >
> >
> >
> > > ~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]
> > >
> > >
> > >
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> >
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] HTML templating systems & MediaWiki - is this summary right?

2014-03-26 Thread Chris Steipp
On Wed, Mar 26, 2014 at 10:30 AM, Nuria Ruiz  wrote:

> >Additionally, how you escape a plain parameter like class vs. an
> >href vs. a parameter that is inserted into a url vs. an id attribute are
> >all different escaping strategies.
> Urls in the template engine need to be handled on their own, sure. But what
> template engine does not work in this fashion? There are three separate
> "entities" you normally deal with when doing replacement: translations,
> urls and plain attributes.
>
>
> >$html = Html::element( 'div', array( 'class' => $anything ), $anythingElse
> I see. Sorry but where I disagree is that the "quote me this replacement"
> is a lawful case for the template engine.


I'm not sure I understand what you're saying here. Do you mean
makeStringSafe in your example shouldn't quote the text, but should instead
remove space characters?


> The line above is doing a lot
> more than purely templating and on my opinion it does little to separate
> data and markup. Which is the very point of having a template engine.
>
> But if you consider that one a lawful use case, you are right. The example
> I provided does not help you.
>
>
> On Wed, Mar 26, 2014 at 6:15 PM, Chris Steipp 
> wrote:
>
> > On Wed, Mar 26, 2014 at 9:44 AM, Daniel Friesen
> > wrote:
> >
> > > On 2014-03-26, 9:32 AM, Nuria Ruiz wrote:
> > > >> The issue is that they apply the same escaping, regardless of the
> > > >> html context. So, in Twig and mustache, <div class={{something}}> is
> > > >> vulnerable, if something is set to "1234 onClick=doSomething()".
> > > > Right, the engine would render:
> > > >
> > > >  <div class=1234 onClick=doSomething()>
> > > >
> > > > because it only escapes HTML by default.
> > > > Now, note that the problem can be fixed with
> > > > <div class={{makeStringSafe something}}>
> > > >
> > > > Where "makestringSafe" is a function defined by us and executed there
> > > that
> > > > escapes to our liking.
> > > How does a custom function jammed into the middle of a Mustache
> template
> > > fix the issue when the issue is not that foo={{something}} doesn't
> > > escape, but is that quoting is needed instead of escaping, and Mustache
> > > isn't context sensitive so neither Mustache or a custom function know
> > > that foo={{something}} is an attribute value in need of quoting?
> > >
> >
> > Exactly. Additionally, how you escape a plain parameter like class vs. an
> > href vs. a parameter that is inserted into a url vs. an id attribute are
> > all different escaping strategies. So there would be many different
> > "makeStringSafe" and probably "quoteAndMakeStringSafe" functions,
> > and code review would have to make sure the right one was being used in the
> > right place. Which means someone who is familiar with all of the xss
> > techniques would need to code review almost all the templates.
> >
> > For comparison, using our current html templating (as much as it sucks):
> >
> > $html = Html::element( 'div', array( 'class' => $anything ), $anythingElse );
> >
> > The developer doesn't need to have any knowledge of what escaping needs to
> > apply to the class attribute vs the text.
> >
> >
> >
> > > ~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]
> > >
> > >
> > >
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> >
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] HTML templating systems & MediaWiki - is this summary right?

2014-03-26 Thread Chris Steipp
On Wed, Mar 26, 2014 at 9:44 AM, Daniel Friesen
wrote:

> On 2014-03-26, 9:32 AM, Nuria Ruiz wrote:
> >> The issue is that they apply the same escaping, regardless of the
> >> html context. So, in Twig and mustache, <div class={{something}}> is
> >> vulnerable, if something is set to "1234 onClick=doSomething()".
> > Right, the engine would render:
> >
> >  <div class=1234 onClick=doSomething()>
> >
> > because it only escapes HTML by default.
> > Now, note that the problem can be fixed with
> > <div class={{makeStringSafe something}}>
> >
> > Where "makestringSafe" is a function defined by us and executed there
> that
> > escapes to our liking.
> How does a custom function jammed into the middle of a Mustache template
> fix the issue when the issue is not that foo={{something}} doesn't
> escape, but is that quoting is needed instead of escaping, and Mustache
> isn't context sensitive so neither Mustache or a custom function know
> that foo={{something}} is an attribute value in need of quoting?
>

Exactly. Additionally, how you escape a plain parameter like class vs. an
href vs. a parameter that is inserted into a url vs. an id attribute are
all different escaping strategies. So there would be many different
"makeStringSafe" and probably "quoteAndMakeStringSafe" functions,
and code review would have to make sure the right one was being used in the
right place. Which means someone who is familiar with all of the xss
techniques would need to code review almost all the templates.

For comparison, using our current html templating (as much as it sucks):

$html = Html::element( 'div', array( 'class' => $anything ), $anythingElse );

The developer doesn't need to have any knowledge of what escaping needs to
apply to the class attribute vs the text.
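
To make that concrete, here is a self-contained sketch; escapeElement() is
a stand-in for Html::element(), and the escaping shown is illustrative
rather than MediaWiki's exact implementation:

function escapeElement( $tag, array $attribs, $text ) {
    $html = '<' . $tag;
    foreach ( $attribs as $name => $value ) {
        // Always quote, and escape quotes too (ENT_QUOTES), so no value
        // can break out of the attribute and add an event handler.
        $html .= ' ' . $name . '="' . htmlspecialchars( $value, ENT_QUOTES ) . '"';
    }
    return $html . '>' . htmlspecialchars( $text ) . '</' . $tag . '>';
}

$anything = '1234 onClick=doSomething()';
echo "<div class=$anything>text</div>\n";      // injected event handler
echo escapeElement( 'div', array( 'class' => $anything ), 'text' ) . "\n";
// <div class="1234 onClick=doSomething()">text</div>  -- inert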



> ~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]
>
>
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] HTML templating systems & MediaWiki - is this summary right?

2014-03-26 Thread Chris Steipp
On Wed, Mar 26, 2014 at 3:21 AM, Nuria Ruiz  wrote:

> >So for string-based systems to be
> >as safe as dom ones, we also need a layer of policy and code review that
> we
> >might not need with a dom-based system.
> String-based template engines (like handlebars) do escape by default; you
> have to use "special" markup for it not to escape. Can you explain in more
> detail what is the security concern with those?
>

Correct. The issue is that they apply the same escaping, regardless of the
html context. So, in Twig and mustache, <div class={{something}}> is
vulnerable, if something is set to "1234 onClick=doSomething()". So
policy/code review is needed to say that attributes with user-supplied data
must be quoted in a way compatible with the templating engine (' or " for
Twig, " for Mustache since Mustache doesn't escape single quotes).


>
>
>
>
>
> On Wed, Mar 19, 2014 at 7:51 PM, Chris Steipp 
> wrote:
>
> > On Tue, Mar 18, 2014 at 8:27 PM, Sumana Harihareswara <
> > suma...@wikimedia.org
> > > wrote:
> >
> > > I'm trying to understand what our current situation is and what our
> > > choices are around HTML templating systems and MediaWiki, so I'm gonna
> > > note what I think I understand so far in this mail and then would love
> > > for people to correct me. TL;DR - did we already consense on a
> > > templating system and I just missed it?
> > >
> > > Description: An HTML templates system (also known as a templating
> > > engine) lets you (the programmer) write something that looks more like
> a
> > > document than it looks like code, then has hooks/entry points/macro
> > > substitution points (for user input and whatnot) that then invoke code,
> > > then emits finished HTML for the browser to render.
> > >
> > > Examples: PHP itself is kinda a templating language. In the PHP world,
> > > Smarty is a somewhat more mature/old-school choice. Mustache.js is a
> > > popular modern choice. And in other languages, you'd pick a lot of the
> > > MVC frameworks that are popular, e.g. Django or Jinja in Python.
> > >
> > > Spectrum of approaches: One approach treats HTML as a string ("here's a
> > > bunch of bytes to interpolate"). From a security perspective, this is
> > > dangerously easy to have vulnerabilities in, because you just naively
> > > insert strings. Then on the other end of the spectrum, you have code
> > > that always keeps the document object model (DOM) in memory, so the
> > > programmer is abstractly manipulating that data model and passing
> around
> > > an object. Sure, it spits out HTML in the end, but inherent in the
> > > method for turning those objects into HTML is a sanitization step, so
> > > that's inherently more secure. There's some discussion at
> > > https://www.mediawiki.org/wiki/Parsoid/Round-trip_testing/Templates .
> I
> > > presume we want the latter, but that the former model is more
> performant?
> > >
> >
> > I don't want to build too much of a straw man against string-based
> systems,
> > so it's probably more appropriate to say that the same escaping is
> applied
> > to all strings regardless of the html context, or the developer is
> > responsible for applying custom escaping. So for string-based systems to
> be
> > as safe as dom ones, we also need a layer of policy and code review that
> we
> > might not need with a dom-based system.
> >
> > Performance of the dom-based systems has turned out to be not that bad,
> but
> > performance is a major factor in any engine we go with.
> >
> >
> >
> > >
> > > We talked about this stuff in
> > >
> >
> https://www.mediawiki.org/wiki/Architecture_meetings/RFC_review_2014-02-21
> > > and
> > >
> > >
> >
> https://www.mediawiki.org/wiki/Talk:Architecture_Summit_2014/HTML_templating#Wrap_up:_Next_steps
> > > . Based on that plus
> > >
> > >
> >
> https://www.mediawiki.org/wiki/Architecture_Summit_2014/RFC_clusters#HTML_templating
> > > it seems like we are supposed to get consensus on which system(s) to
> > > use, and we kind of have four things we could choose:
> > >
> > > * oojs - https://www.mediawiki.org/wiki/OOjs_UI -- could use this
> > > toolkit with one of the template approaches below, or maybe this is
> > > enough by itself! Currently used inside VisualEditor and I am not sure
> > > whether any other MediaWiki extensions or teams are using it?

Re: [Wikitech-l] HTML templating systems & MediaWiki - is this summary right?

2014-03-19 Thread Chris Steipp
On Tue, Mar 18, 2014 at 8:27 PM, Sumana Harihareswara  wrote:

> I'm trying to understand what our current situation is and what our
> choices are around HTML templating systems and MediaWiki, so I'm gonna
> note what I think I understand so far in this mail and then would love
> for people to correct me. TL;DR - did we already consense on a
> templating system and I just missed it?
>
> Description: An HTML templates system (also known as a templating
> engine) lets you (the programmer) write something that looks more like a
> document than it looks like code, then has hooks/entry points/macro
> substitution points (for user input and whatnot) that then invoke code,
> then emits finished HTML for the browser to render.
>
> Examples: PHP itself is kinda a templating language. In the PHP world,
> Smarty is a somewhat more mature/old-school choice. Mustache.js is a
> popular modern choice. And in other languages, you'd pick a lot of the
> MVC frameworks that are popular, e.g. Django or Jinja in Python.
>
> Spectrum of approaches: One approach treats HTML as a string ("here's a
> bunch of bytes to interpolate"). From a security perspective, this is
> dangerously easy to have vulnerabilities in, because you just naively
> insert strings. Then on the other end of the spectrum, you have code
> that always keeps the document object model (DOM) in memory, so the
> programmer is abstractly manipulating that data model and passing around
> an object. Sure, it spits out HTML in the end, but inherent in the
> method for turning those objects into HTML is a sanitization step, so
> that's inherently more secure. There's some discussion at
> https://www.mediawiki.org/wiki/Parsoid/Round-trip_testing/Templates . I
> presume we want the latter, but that the former model is more performant?
>

I don't want to build too much of a straw man against string-based systems,
so it's probably more appropriate to say that the same escaping is applied
to all strings regardless of the html context, or the developer is
responsible for applying custom escaping. So for string-based systems to be
as safe as dom ones, we also need a layer of policy and code review that we
might not need with a dom-based system.

Performance of the dom-based systems has turned out to be not that bad, but
performance is a major factor in any engine we go with.



>
> We talked about this stuff in
> https://www.mediawiki.org/wiki/Architecture_meetings/RFC_review_2014-02-21
> and
>
> https://www.mediawiki.org/wiki/Talk:Architecture_Summit_2014/HTML_templating#Wrap_up:_Next_steps
> . Based on that plus
>
> https://www.mediawiki.org/wiki/Architecture_Summit_2014/RFC_clusters#HTML_templating
> it seems like we are supposed to get consensus on which system(s) to
> use, and we kind of have four things we could choose:
>
> * oojs - https://www.mediawiki.org/wiki/OOjs_UI -- could use this
> toolkit with one of the template approaches below, or maybe this is
> enough by itself! Currently used inside VisualEditor and I am not sure
> whether any other MediaWiki extensions or teams are using it? This is a
> DOM-based templating system.
>
> Template approaches which are competing?:
> * MVC framework - Wikia has written their own templating library that
> Wikia uses (Nirvana). Owen Davis is talking about this tomorrow in the
> RFC review meeting.
> https://www.mediawiki.org/wiki/Requests_for_comment/MVC_framework
> * mustache.js stuff - Ryan Kaldari and Chris Steipp mentioned this I think?
> * Knockout-compatible implementation in Node.js & PHP
>
> https://www.mediawiki.org/wiki/Requests_for_comment/HTML_templating_library/KnockoutProposal#Longer-term_architecture
> and
>
> https://www.mediawiki.org/wiki/Requests_for_comment/HTML_templating_library/Knockoff_-_Tassembly
> , being worked on by Gabriel Wicke, Matt Walker, and others. DOM-based.
>


I think
https://www.mediawiki.org/wiki/Requests_for_comment/HTML_templating_library
captures most of the current thinking. While Knockoff is being developed,
Handlebars (and the php port of it) seems to be the leader for a
string-based solution.


>
> There's also an OutputPage refactor suggested in
> https://www.mediawiki.org/wiki/Requests_for_comment/OutputPage_refactor
> that's part of the HTML Templating RFC Cluster
>
> https://www.mediawiki.org/wiki/Architecture_Summit_2014/RFC_clusters#HTML_templating
> .
>
> I guess my biggest question right now is whether I have all the big
> moving parts right in my summary above. Thanks.


> --
> Sumana Harihareswara
> Senior Technical Writer
> Wikimedia Foundation
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] OAuth upload

2014-03-19 Thread Chris Steipp
I'm guessing the crop tool developer figured it out. That's not one use
case I have code for. If anyone has working code, I'd love a link to it so
I can get a demo posted.

There is a trick to getting the form type right, since OAuth's signature,
as the spec explicitly defines it, doesn't cover multipart forms. I got it
working at one point in our implementation; I'll see if I can dig up that
code.
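
Roughly, the trick is to sign only the oauth_* parameters, since RFC 5849
excludes a multipart/form-data body from the signature base string. A
sketch, untested against MediaWiki's OAuth extension (the helper name and
parameter choices are illustrative):

function oauthAuthHeader( $url, $ckey, $csecret, $token, $tsecret ) {
    $p = array(
        'oauth_consumer_key'     => $ckey,
        'oauth_token'            => $token,
        'oauth_signature_method' => 'HMAC-SHA1',
        'oauth_timestamp'        => (string)time(),
        'oauth_nonce'            => md5( mt_rand() . microtime() ),
        'oauth_version'          => '1.0',
    );
    ksort( $p );
    $pairs = array();
    foreach ( $p as $k => $v ) {
        $pairs[] = rawurlencode( $k ) . '=' . rawurlencode( $v );
    }
    // The multipart body is deliberately absent from the base string.
    $base = 'POST&' . rawurlencode( $url ) . '&'
        . rawurlencode( implode( '&', $pairs ) );
    $key = rawurlencode( $csecret ) . '&' . rawurlencode( $tsecret );
    $p['oauth_signature'] = base64_encode( hash_hmac( 'sha1', $base, $key, true ) );
    $parts = array();
    foreach ( $p as $k => $v ) {
        $parts[] = $k . '="' . rawurlencode( $v ) . '"';
    }
    return 'Authorization: OAuth ' . implode( ', ', $parts );
}

$url = 'https://commons.wikimedia.org/w/api.php';
$ch = curl_init( $url );
curl_setopt( $ch, CURLOPT_POST, true );
curl_setopt( $ch, CURLOPT_HTTPHEADER, array(
    // Credentials are assumed to have been obtained beforehand.
    oauthAuthHeader( $url, $consumerKey, $consumerSecret, $accessToken, $accessSecret ),
) );
// Passing an array makes cURL send multipart/form-data; none of these
// fields are part of the signature above.
curl_setopt( $ch, CURLOPT_POSTFIELDS, array(
    'action'   => 'upload',
    'format'   => 'json',
    'filename' => 'Example.jpg',
    'token'    => $csrfToken,           // edit token fetched beforehand
    'file'     => '@/tmp/example.jpg',  // '@' syntax pre-5.5; CURLFile on 5.5+
) );
curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true );
$result = curl_exec( $ch );
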
On Mar 19, 2014 9:08 AM, "Magnus Manske" 
wrote:

> OK, this is killing me. I'm trying to upload files to Commons (using
> PHP/CURL).
>
> * I can upload local files with my own bot user.
> * I can upload from remote URLs using OAuth, /if the user is an admin/
>
> What I can't figure out is how to upload local files via OAuth. It's either
>
> "File upload param file is not a file upload; be sure to use
> multipart/form-data for your POST and include a filename in the
> Content-Disposition header."
>
> or (trying to add multipart/form-data to the header)
>
> "The authorization headers in your request are not valid: Invalid
> signature"
>
> Is there any example code for uploading local files to Commons via OAuth? A
> trick I can't find? Anything?
>
> Cheers,
> Magnus
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] MediaWiki, Cookies and EU Privacy Policy 95/46/EG

2014-03-10 Thread Chris Steipp
On Mon, Mar 10, 2014 at 8:46 AM, Manuel Schneider <
manuel.schnei...@wikimedia.ch> wrote:

> Dear all,
>
> not sure if this discussion already happens somewhere else, I couldn't
> find it on MediaWiki.org or by googling.
>
> The issue at hand is: EU privacy policy 95/46/EG[1] allows usage of
> cookies only if
> * the user has been informed beforehand in detail
> * the user has accepted the cookie
> * this acceptance was given freely, without doubt and through by action
> (This is the summary by the Article 29 Working Party issued in a Working
> Document 02/2013[2] on October 2nd, 2013.)
>
> An example how this is being implemented can be seen on sourceforge.org
> or here:
> * http://ec.europa.eu/justice/cookies/index_en.htm
>
> I checked MediaWiki:
> * anonymous users don't get a cookie, unless the site owner added
> something (eg. Google Analytics, Piwik or content served by another site
> using cookies)
> -> this is fine
>
> * as soon as I click the "Login" button on the wiki, a cookie is being set
> -> here we need to work, we need to ask first
>
> So I see two possibilities:
>
> 1) catch the click on the "Login" link to show a banner first to ask for
> the users consent, on acceptance forward the user to the login page
>
> 2) modify the login process to set the cookie after the actual login and
> put an additional text on the login page like "by logging in I accept
> the usage of cookies by this website"
>

The cookie on the login page is for the anti-csrf (and captcha if needed)
validation, so getting rid of it would be problematic from a technical
perspective (or would require a second click on the login page).
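
For background, here is a bare-bones sketch of why the form needs a cookie
before login even happens (plain PHP sessions; MediaWiki's real token
handling differs in the details):

session_start(); // sets the session cookie when the form is first served

if ( $_SERVER['REQUEST_METHOD'] === 'POST' ) {
    // The token only proves anything because it is bound to state the
    // browser carries -- i.e. the session cookie set on the GET request.
    $sent = isset( $_POST['wpLoginToken'] ) ? $_POST['wpLoginToken'] : '';
    if ( !isset( $_SESSION['loginToken'] ) || $sent !== $_SESSION['loginToken'] ) {
        die( 'Login token mismatch (possible CSRF or an expired session)' );
    }
    // ...now check the credentials themselves...
} else {
    $_SESSION['loginToken'] = bin2hex( openssl_random_pseudo_bytes( 16 ) );
    echo '<form method="post"><input type="hidden" name="wpLoginToken" value="'
        . htmlspecialchars( $_SESSION['loginToken'] ) . '">...</form>';
}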



> -> as the login is an action which implies the consent, if we inform
> properly on the login form already
>
> Any thoughts about this?
>
> This issue also concerns all our Wikimedia websites, basically every
> MediaWiki out there where people may log into.
>
> The Austrian Communication Law (§ 96 Abs. 3 TKG) defines a penalty of
> 37.000 EUR.
>
> /Manuel
>
> [1]
>
> http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:31995L0046:en:html
>
> [2]
>
> http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2013/wp208_en.pdf
> --
> Wikimedia CH - Verein zur Förderung Freien Wissens
> Lausanne, +41 (21) 34066-22 - www.wikimedia.ch
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Gerrit Commit Wars

2014-03-06 Thread Chris Steipp
On Thu, Mar 6, 2014 at 4:08 PM, Erik Bernhardson  wrote:
>
> Does core have any  policies related to merging?  The core features team
> has adopted a methodology(although slightly different) that we learned of
> from the VE team.  Essentially +2 for 24 hours before a deployment branch
> is cut is limited to fixes for bugs that  were introduced since the last
> deployment branch was cut or reverts for patches that turned out to not be
> ready for deployment.  Core is certainly bigger and with more participants,
> but perhaps a conversation about when to +2 and how that affects the
> deployment process would be beneficial?
>
>
Formally, no (not that I know of). Informally, I know a lot of us do a lot
of merging on Fridays, partly for this reason. I resisted merging a big
patch this morning because I want it to sit in beta for a while. I know a
few patches were merged this morning so that they *would* make it into
today's deploy. Everyone with +2 should always think about how/when things
will be deployed, and merge as appropriate. And it seems like most people
use good judgement most of the time.

If this is coming up a lot, then yeah, let's make some policy about it, or
just enforce it in how we do the branch cutting.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Two factor auth reset needed on wikitech

2014-02-28 Thread Chris Steipp
Correct, the scratch codes are the only way to login.

If you don't have this, you'll have to get someone to remove your
preference in the db.
On Feb 28, 2014 1:32 PM, "Matthew Walker"  wrote:

> Don't have them :p
>
> ~Matt Walker
> Wikimedia Foundation
> Fundraising Technology Team
>
>
> On Fri, Feb 28, 2014 at 1:23 PM, Jeremy Baron 
> wrote:
>
> > On Fri, Feb 28, 2014 at 9:15 PM, Matthew Walker 
> > wrote:
> > > I wasn't able to find any documentation on wikitech about how to reset
> it
> > > -- so I need your help to do that I think? I still know my password; so
> > I'm
> > > not looking to reset that -- maybe just temporarily disable two factor
> > auth
> > > on my account (Mwalker) and I'll re-enroll myself?
> >
> > I don't know that much about the process but I believe step one is to
> > find the slips of paper that you wrote down the codes that you're
> > supposed to use in this very situation.
> >
> > -Jeremy
> >
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] MediaWiki Security and Maintenance Releases: 1.22.3, 1.21.6 and 1.19.12

2014-02-28 Thread Chris Steipp
That was a mistake this release. We'll continue those going forward.
On Feb 27, 2014 7:56 PM, "Matthew Walker"  wrote:

> I note that there are security fixes in these release's -- did I miss
> Chris' email about these patches or are we moving away from the model where
> we send out an email to the list a couple of days before release?
>
> ~Matt Walker
> Wikimedia Foundation
> Fundraising Technology Team
>
>
> On Thu, Feb 27, 2014 at 6:55 PM, Brian Wolff  wrote:
>
> > > * (bug 61346) SECURITY: Make token comparison use constant time. It
> seems
> > > like
> > >   our token comparison would be vulnerable to timing attacks. This will
> > > take
> > >   constant time.
> >
> > Not to be a grammar nazi, but that should presumably be something
> > along the lines of "Using constant time comparison will prevent this"
> > instead of "This will take constant time", as that could be
> > interpreted as the attack would take constant time.
> >
> > --bawolff
> >
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Drop support for PHP 5.3

2014-02-24 Thread Chris Steipp
I know a few people who will be happy if they can keep running on stock
RHEL 6 (PHP 5.3). That would also mean EPEL can package 1.23.

After 1.19 is when we went to 5.3, so I think following precedent is good
too.
On Feb 23, 2014 6:04 PM, "Chad"  wrote:

> +1 here as well. Let's look at this for 1.24 :)
>
> -Chad
> On Feb 23, 2014 8:42 AM, "David Gerard"  wrote:
>
> > On 23 February 2014 01:25, Markus Glaser  wrote:
> >
> > > I'd like to see the next MediaWiki LTS version (1.23) to support PHP
> 5.3.
> > > MW1.23LTS has a scheduled release date at end of April (we might add a
> > week or
> > > two for safety). After that, no problem from my side (release
> > management) with
> > > dropping PHP5.3 support.
> >
> >
> > As an LTS user (typically on Ubuntu 12.04; assume hosting environments
> > won't go 14.04 straight away), that would make me very happy :-) And
> > would probably do, yes.
> >
> >
> > - d.
> >
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] deploying the most recent MediaWiki code: which branch?

2014-02-20 Thread Chris Steipp
On Thu, Feb 20, 2014 at 2:37 PM, Ryan Lane  wrote:

> Note that unless you're willing to keep up to date with WMF's relatively
> fast pace of branching, you're going to miss security updates. No matter
> what, if you use git you're going to get security updates slower, since
> they are released into the tarballs first, then merged into master, then
> branches (is this accurate?). Sometimes the current WMF branch won't even
> get the security updates since they are already merged locally onto
> Wikimedia's deployment server.
>

I've been releasing tarballs, then pushing the fixes into the release
branches and master in gerrit. It all happens within a couple of hours, but
the tarballs have a slightly narrower timeframe. I rarely push to wmfXX
branches, since those already have the patches applied on the cluster, and
the next branch cut from master will contain the fix from master.

We're potentially moving to pushing them into gerrit and having jenkins
build the tarballs, so this process might be flipped in the near future.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Let's improve our password policy

2014-02-11 Thread Chris Steipp
On Sat, Feb 8, 2014 at 8:14 AM, Brian Wolff  wrote:

> On 2/7/14, Steven Walling  wrote:
> > I feel like I should reiterate why I proposed this change. Maybe no one
> > cares, but I think it might help convince folks this is NOT an argument
> for
> > "let's reduce user freedom in the name of security."
> >
> > I didn't work on the RFC because I love tinkering with password
> security
> > in my spare time and know lots about it. Far from it. I did it because I
> > think we're failing MediaWiki users on *all installations* by inviting
> them
> > to sign up for an account, and then failing to set default requirements
> > that help them adequately secure those accounts. Users tend to follow
> > defaults and do the minimum effort to reach their goals -- in this case
> to
> > sign up and then get editing. It's our job as the MediaWiki designers and
> > developers to set good defaults that encourage account security without
> > being excessively annoying.
> >
> > In addition to just being sane about security defaults, there is more.
> > Allow me to wax poetic a moment... If you can edit anonymously, why do we
> > allow and encourage registration at all? Many reasons of course, but one
> of
> > them is because it is a rewarding experience to have a persistent
> identity
> > on a wiki. We all know how real that identity becomes sometimes. When I
> > meet Krinkle or MZMcbride in real life, I don't call them Timo and Max.
> Or
> > if I do, I don't think of them as those names in my head.
> >
> > When wiki users start an account, they might think that they are just
> > creating something unimportant. They may actually have bad intentions.
> But
> > part of this is that we're offering people an account because it gives
> them
> > a chance to be recognized, implicitly and explicitly, for the work they
> do
> > on our wikis.
> >
> > I think setting a default of 1 character passwords required doesn't
> > reinforce the idea that an account is something you might actually come
> to
> > cherish a bit, and that it might even represent you in some important way
> > to others. By signaling to new users that an account is so worthless that
> > it's cool if you have a one character password... well, is that really
> such
> > a good thing?
> >
> > On Thu, Feb 6, 2014 at 5:44 PM, MZMcBride  wrote:
> >
> >> P.S. I also casually wonder whether there's a reasonable argument to be
> >> made here that requiring longer passwords will hurt editor retention
> more
> >> than it helps, but this thought is still largely unformed and unfocused.
> >>
> >
> > I think that's a canard. There are many many sites that do not have user
> > acquisition or retention problems, while also having sane password length
> > requirements. Yes, this is a potential extra roadblock, which may
> slightly
> > reduce conversion rates on the signup form by slowing people down.
> However,
> > one of the clear arguments in favor of doing this now (as opposed to say,
> > back in 2001) is that users will largely expect an account on a popular
> > website to require them to have a password longer than 1 character.
> >
> > If we really are scared about the requirements in our signup form driving
> > people away from editing, we can make many user experience improvements
> > that would, like every other site, offset the terrible awful horrible
> evil
> > of requiring a six character password. I'd be happy to list specifics if
> > someone wants, but this email is already too long.
> >
> > Steven
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
> Thanks for the background, I think its important to know the "why" for
> a change, not just a what. However it doesn't address what I see as
> the main concern being raised about this proposal - the lack of a
> threat model. Who is the enemy we're concerned about breaking into
> accounts? What is the enemy's resources? Anything done for security
> should be in reference to some sort of threat model. Otherwise we will
> probably end up implementing security that does not make sense, things
> that protect one aspect without protecting the important aspect, etc.
> While most people think having distinct identities on wiki is
> important, what we need to protect them from is going to vary wildly
> from person to person. It wouldn't surprise me if the hard-core
> SoftSecurity people would argue for an honour system...
>

Totally agree, and I added a first pass for it at
https://www.mediawiki.org/wiki/Requests_for_comment/Passwords#Threats


>
> > Users tend to follow
> > defaults and do the minimum effort to reach their goals -- in this case
> to
> > sign up and then get editing.
>
> 'password' is probably less secure than most one letter passwords.
>
> --bawolff
>
> p.s. I don't think stronger password requirements will have much of an
> effect on user retention assuming the requirements aren't insane (e.g.
>

Re: [Wikitech-l] Password Hash

2014-02-06 Thread Chris Steipp
On Wed, Feb 5, 2014 at 8:26 PM, C. Scott Ananian wrote:

> Password hashing algorithms are not the same as general hash algorithms.  I
> would prefer we didn't use whirlpool; it is "recommended by NESSIE and ISO"
> as a hash function, but not as a password hash.  CWE916 recommends "bcrypt,
> scrypt, and PBKDF2" specifically for password hashing.
>
> To be clear, I have nothing against the Whirlpool hash algorithm itself:
> it's got a long pedigree with a decent amount of cryptoanalysis.  It's just
> the extension to password hashing which is nonstandard.  If you wanted to
> use Whirlpool as a password hash, you should apply it as part of PBKDF2,
> which is parameterizable.  That would be a reasonable way to distinguish
> the WMF hash to avoid general attacks without inventing new cryptography.
>  The default PRF for PBKDF2 is HMAC-SHA-1; you would be replacing this with
> HMAC-Whirpool.  This would be much preferable to using
> str_repeat+Whirlpool.
>   --scott
>

Sorry for the misleading phrasing. "Tim's algorithm" was indeed in reference
to using str_repeat vs. the tight xor loop of pbkdf2. Here is the relevant
work that each does:

pbkdf2:

for ( $j = 1; $j < $this->params['rounds']; ++$j ) {
    $lastRound = hash_hmac( $this->params['algo'], $lastRound, $password, true );
    $roundTotal ^= $lastRound;
}

Tim's:

for ( $i = 0; $i < $iter; $i++ ) {
    $h = hash( 'whirlpool', str_repeat( $h . $this->args[0], 100 ), true );
    $h = substr( $h, 7, 32 );
}

If you look at Whirlpool's compression function for long messages, and see
pbkdf2 as pretty much a Davies-Meyer construction, they have very similar
properties. Except where they're subtly different, of course ;).

The first subtle difference that I like about pbkdf2 is that the password
is mixed in at each round throughout, whereas Tim only mixes it in directly
in the first iteration (which is roughly the same as 3 rounds of pbkdf2 for
an 8 character password and 8 byte salt, since whirlpool is operating on
512-bit blocks). This could make pbkdf2 weaker if a key recovery attack
suddenly showed up in the hmac function, although that seems very unlikely
for hmac-sha256.

Additionally, since Tim's assigns the output to $h instead of xoring into
the previous round, that would be the same as pbkdf2 doing an assignment
every 14 rounds, which would feel a little weaker to me. Tim's could be
updated to keep the last block and do an xor instead, and they would be
more similar.
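
A sketch of that xor variant (illustration only, not a vetted construction;
$salt stands in for $this->args[0]):

$h = hash( 'whirlpool', $salt . $password, true );
$total = $h;
for ( $i = 0; $i < $iter; $i++ ) {
    // PHP's ^ on equal-length strings is a byte-wise xor, so $total
    // accumulates every round's 64-byte output instead of discarding it.
    $h = hash( 'whirlpool', str_repeat( $h . $salt, 100 ), true );
    $total ^= $h;
}
$finalHash = substr( $total, 7, 32 );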

For someone doing a custom crypto scheme, I think Tim does better than
most, but overall it seems like most people prefer complying with a well
recommended standard than being unique.

So far no one has said they dislike pbkdf2, while bcrypt would require an
extra hash in serial to make sure long passwords can be handled, and would
require the php version bump. Anyone have strong opinions against pbkdf2?



>
>
>
> On Wed, Feb 5, 2014 at 10:00 PM, Marc A. Pelletier 
> wrote:
>
> > On 02/05/2014 09:34 PM, Tim Starling wrote:
> > > Maybe Chris's phrasing misled you: I didn't invent the Whirlpool
> > > algorithm
> >
> > And so it did; something a quick google would have revealed. In my
> > defense, "The Whirlpool algorithm by Tim" was pretty convincing
> > attribution.  :-)
> >
> > I'd need to read up on that algorithm a bit before I have an opinion on
> > whether length-extension attacks are not an issue with it (which is
> > often particularly nasty when the message repeats or is cyclical).  Most
> > hashes fare better by prepending a nonce as salt than they do by padding
> > or repeating.
> >
> > -- Marc
> >
> >
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> >
>
>
>
> --
> (http://cscott.net)
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Let's improve our password policy

2014-02-06 Thread Chris Steipp
On Wed, Feb 5, 2014 at 8:00 PM, MZMcBride  wrote:

> Hi.
>
> Tyler Romeo wrote:
> >On Wed, Feb 5, 2014 at 2:20 AM, MZMcBride  wrote:
> >> Ultimately, account security is a user's prerogative. [...] Banks and
> >>even e-mail providers have reason to implement stricter authentication
> >>requirements.
> >
> >This is conflicting logic. If it is the user's job to enforce their own
> >account security, what reason would banks or email providers have to
> >require long passwords?
>
> I'm not sure the logic is conflicting. I tried to separate individual
> thoughts into individual paragraphs. The common thread of my message was
> that I haven't yet seen enough evidence that the cost here is worth the
> benefit. The benefits to securing valueless accounts remains unclear,
> while the implementation cost is non-negligible.
>
> E-mail accounts are often used in identity verification processes and
> banks are banks. While you and I may disagree with their password
> policies, there's at least a reasonable explanation for implementing more
> stringent requirements in these two cases. Compare with MediaWiki user
> accounts. What's the argument here? Why is this worth any effort?
>

I think there are a couple of reasons why we have a duty to enforce strong
passwords. Let me try to convince you.

1) As I understand it, the reason we went from 0 to 1 character required is
spammers were actively trying to find accounts with no password so they
could edit with an autoconfirmed account. We rely on "number of
combinations of minimum passwords" to be greater than "number of tries
before an IP must also solve captcha to login" to mitigate some of this,
but I think there are straightforward ways for a spammer to get accounts
with our current setup. And I think increasing the minimum password length
is one component.
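
As a back-of-envelope illustration, the math looks like this (the numbers
are hypothetical, not the actual throttle configuration):

$alphabet   = 95;                           // printable ASCII characters
$minLength  = 1;
$space      = pow( $alphabet, $minLength ); // 95 possible minimum passwords
$triesPerIp = 5;                            // login attempts before captcha
echo ceil( $space / $triesPerIp );          // ~19 IPs cover the whole space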

2) We do have a duty to protect our user's accounts with a reasonable
amount of effort/cost proportional to the weight we put on those
identities. I think we would be in a very difficult spot if the foundation
tried to take legal action against someone for the actions they took with
their user account, and the user said, "That wasn't me, my account probably
got hacked. And it's not my fault, because I did the minimum you asked me."
So I think we at least want to be roughly in line with "industry standard",
or have a calculated tradeoff against that, which is roughly 6-8 character
passwords with no complexity requirements. I personally think the
foundation and community _does_ put quite a lot of weight into user's
identities (most disputes and voting processes that I've seen have some
component that assume edits by an account were done by a single person), so
I think we do have a responsibility to set the bar at a level appropriate
to that, assuming that all users will do the minimum that we ask. Whether
it's 4 or 6 characters for us I think is debatable, but I think 1 is not
reasonable.



>
> I personally regularly use single-character passwords on test MediaWiki
> wikis (and other sites) because, as a user, it's my right to determine
> what value to place in a particular account.
>
> If one day MediaWiki wikis (or Wikimedia wikis, really) allow per-user
> e-mail (i.e., mzmcbr...@wikipedia.org) or if there comes a time when
> identity verification becomes part of the discussion (compare with
> Twitter's blue checkmark verified account practice), then it may make
> sense to require (l|str)onger passwords in those specific cases. Even
> today, if you want to make Jimmy or members of the Wikimedia Foundation
> staff have crazy-long passwords, that may be reasonable or prudent or
> what-have-you, but that doesn't mean MediaWiki core should go along.
>
> >If somebody guesses a user's password and empties their bank account, the
> >bank could care less, since it is the customer's fault for not making
> >sure their password is long enough.
>
> I'm not sure this is true, but it's too off-topic to discuss here. A
> thread about global banking laws and practices, particularly with regard
> to liability and insurance and criminal activity, would certainly be
> interesting to read, though. :-)
>
> >I'm sure a very heavy Wikipedia editor, who uses his/her account
> >to make hundreds of edits a month but isn't necessarily an administrator
> >or other higher-level user, sees their account as something more than a
> >throwaway that can be replaced in an instant.
>
> I absolutely agree with you on this point. And I think we can encourage
> stronger passwords, even on the login form if you'd like. Rather than only
> using user groups, we could also use edit count or edit registration date
> or any number of other metrics. The catch, of course, is (a) finding
> developer consensus on a reasonable implementation of a password strength
> meter and (b) finding local community consensus to make changes on a
> per-variable basis.
>
> >For example, MZMcBride, what if your password is "wiki", and somebody
> >compromises your account, and cha

Re: [Wikitech-l] Password Hash

2014-02-05 Thread Chris Steipp
On Wed, Feb 5, 2014 at 3:08 PM, Zachary Harris wrote:

> tl;dr PBKDF2 and bcrypt are both perfectly acceptable for security.
>
>
> PBKDF2 and bcrypt, as well as scrypt, are all well regarded by current
> infosec industry standards (with "current" being a key word). " While
> there is active debate about which of these is the most effective, they
> are all stronger than using salts with hash functions [that have] very
> little computing overhead" (CWE 916). Feel free to use whichever one
> best meets the project needs in terms of implementation, user version
> migration, etc.
>
> Custom crypto algorithms should indeed always be completely off the
> table, including any (supposed) "minor" custom modifications to crypto
> standards. Indeed, custom _implementations_ themselves are best to be
> avoided in favor of established libraries whenever possible.
>
> I had not heard of Whirlpool before. While (based on WP) the algo has a
> reputable designer, hashes can be built for different purposes, and the
> WP page does not appear to indicate that this one was designed for the
> purpose of strengthening the difficulty to crack. Indeed, the phrase
> "... was changed ... to one which ... is easier to implement in
> hardware" is an _undesirable_ quality when it comes to the goal of key
> stretching.
>
> Note that much confusion on the web about key lengths with bcrypt ("72"
> vs. "56" bytes) comes from the fact that there are TWO algorithms called
> "bcrypt" which both happen to use Blowfish. One is an encryption
> algorithm, and the other is a hash algorithm. While they share a common
> core component, the purposes are thereby entirely different. For the
> sake of the bcrypt HASHING/key-strengthening algorithm (which we care
> about now), the 72-byte input parameter is in no way a theoretical
> problem at all, even for non-Latin UTF-8 based passphrases which eat up
> 3 bytes per unicode point. The reason is because the text-based
> passphrase itself needs to "somehow" be converted into 18 words of
> 32-bits each anyway. (If we were _encrypting_ then the 56 bytes limit
> for THAT algorithm would come into play.) Even if you restrict your
> attention to ASCII it would not be ideal to simply convert ASCII code
> points to a (zero-padded) juxtaposed stream of 32-bit chunks anyway,
> because, well for one thing, you would be throwing away entropy due to
> not using the upper 1-bit range in each char, not to mention the range
> of unavailable non-printable ASCII characters. As noted on WP:bcrypt,
> "Mapping of password to input is unspecified in the original revision of
> bcrypt." So, despite the strict no-custom-crypto principle already
> noted, the "passphrase to bcrypt input" mapping is one place where the
> standard leaves you to just use practical smarts. Seeking to optimize
> the entropy range in this stage is almost certainly overkill anyway.
> Still, I believe we can do better than a "truncation of utf-8 text
> encoding" rule without great trouble.
>

Yes, we can hash the password first.
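
A minimal sketch of that idea, using PHP 5.5's password_hash() for brevity
(parameters illustrative, not what the patch actually does):

// Pre-hash to sidestep bcrypt's 72-byte input limit; base64 avoids NUL
// bytes, which crypt()-style bcrypt treats as a string terminator.
// 48 raw sha384 bytes become 64 base64 chars, safely under the limit.
$pre    = base64_encode( hash( 'sha384', $password, true ) );
$stored = password_hash( $pre, PASSWORD_BCRYPT, array( 'cost' => 12 ) );

// Verification takes the same path:
$ok = password_verify(
    base64_encode( hash( 'sha384', $candidate, true ) ),
    $stored
);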


>
> References:
> https://www.owasp.org/index.php/Password_Storage_Cheat_Sheet
> http://cwe.mitre.org/data/definitions/916.html  (Use of Password Hash
> With Insufficient Computational Effort)
> https://cwe.mitre.org/data/definitions/327.html  ( ... Do not develop
> custom or private cryptographic algorithms)
> https://bugzilla.wikimedia.org/show_bug.cgi?id=28419  (Re: The Whirlpool
> recommendation)
> https://en.wikipedia.org/wiki/Whirlpool_(cryptography)
> http://en.wikipedia.org/wiki/Bcrypt (Note reference to the "other"
> bcrypt algorithm near the bottom of External Links)
>
> -Zach Harris, PhD
> Secure Code Analyst
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Password Hash

2014-02-05 Thread Chris Steipp
On Wed, Feb 5, 2014 at 1:03 PM, Brion Vibber  wrote:

> Offhand I'd say "use bcrypt", but from http://us3.php.net/password_hash --
>
> "*Caution*
>
> Using the *PASSWORD_BCRYPT* for the *algo* parameter, will result in the
> *password* parameter being truncated to a maximum length of 72 characters.
> This is only a concern if are using the same salt to hash strings with this
> algorithm that are over 72 bytes in length, as this will result in those
> hashes being identical."
>
> Is the 72-byte truncation a general bcrypt problem or specific to
> password_hash()? Any concerns or a non-issue? Note that some non-Latin
> strings can only fit 24 chars in 72 bytes of UTF-8. Long enough for most
> passwords, but some people like passphrases. :)
>

It's an issue with bcrypt itself (it only uses 18 32-bit subkeys). Good point.



>
> -- brion
>
>
> On Wed, Feb 5, 2014 at 12:53 PM, Chris Steipp 
> wrote:
>
> > Hi all, I wanted to bikeshed just a little bit, to make sure there is
> some
> > consensus.
> >
> > tl;dr We're upgrading the password hash used to store passwords to make
> > offline cracking more difficult. In doing that, we need to set one of the
> > options as default. Speak up if you have strong feelings about one over
> the
> > other.
> >
> >
> > Along with refactoring how passwords are stored and checked,
> > https://gerrit.wikimedia.org/r/#/c/77645 implements two strong hashing
> > algorithms PBKDF2 [1] and bcrypt [2]. I added a followup commit to add in
> > the algorithm that Tim came up with in 2010 using Whirlpool as a hash
> > function [3].
> >
> > For any of these, there is a maintenance script to wrap current passwords
> > with one of the strong ones, so we can upgrade the whole database without
> > interaction from the users. It's also simple to upgrade the work factor
> or
> > change to a new algorithm, if we decide that is needed in the future. But
> > for the actual default...
> >
> > Bcrypt is probably the most common option for password storage in webapps
> > that I see. PHP 5.5 uses it as the default for the new password_hash()
> > function. The only issue is that PHP before 5.3.7 had a flaw in their
> > implementation which resulted in weak hashes. If we set bcrypt as
> default,
> > we would want to raise the minimum php version to 5.3.7 (it's currently
> > 5.3.2) for MediaWiki 1.23.
> >
> > PBKDF2 is an RSA standard and is included in PHP 5.5. Tyler did an
> > implementation in the patch to make it backwards compatible. The only
> > downside to it is the connection to RSA, who may have knowingly
> > standardized weak algorithms, although the security properties of PBKDF2
> > are fairly well studied and haven't been called into question.
> >
> > The Whirlpool algorithm by Tim would force password cracking software to
> do
> > a custom implementation for our hashes. It has very similar work effort
> to
> > bcrypt, and should keep our passwords as safe as using bcrypt. The theory
> > behind it seems good, but obviously, we might discover a gaping hole in
> it
> > at some point.
> >
> > Is there any strong preference among these options? My personal vote is
> for
> > bcrypt, if bumping the php version doesn't seem like a big deal to
> > everyone.
> >
> >
> > [1] - https://en.wikipedia.org/wiki/PBKDF2
> > [2] - https://en.wikipedia.org/wiki/Bcrypt
> > [3] -
> > http://www.mail-archive.com/wikitech-l@lists.wikimedia.org/msg08830.html
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Password Hash

2014-02-05 Thread Chris Steipp
Hi all, I wanted to bikeshed just a little bit, to make sure there is some
consensus.

tl;dr We're upgrading the password hash used to store passwords to make
offline cracking more difficult. In doing that, we need to set one of the
options as default. Speak up if you have strong feelings about one over the
other.


Along with refactoring how passwords are stored and checked,
https://gerrit.wikimedia.org/r/#/c/77645 implements two strong hashing
algorithms PBKDF2 [1] and bcrypt [2]. I added a followup commit to add in
the algorithm that Tim came up with in 2010 using Whirlpool as a hash
function [3].

For any of these, there is a maintenance script to wrap current passwords
with one of the strong ones, so we can upgrade the whole database without
interaction from the users. It's also simple to upgrade the work factor or
change to a new algorithm, if we decide that is needed in the future. But
for the actual default...

Bcrypt is probably the most common option for password storage in webapps
that I see. PHP 5.5 uses it as the default for the new password_hash()
function. The only issue is that PHP before 5.3.7 had a flaw in its
implementation which resulted in weak hashes. If we set bcrypt as default,
we would want to raise the minimum PHP version to 5.3.7 (it's currently
5.3.2) for MediaWiki 1.23.

PBKDF2 is an RSA standard and is included in PHP 5.5. Tyler did an
implementation in the patch to make it backwards compatible. The only
downside to it is the connection to RSA, who may have knowingly
standardized weak algorithms, although the security properties of PBKDF2
are fairly well studied and haven't been called into question.
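
For reference, PHP 5.5's built-in looks roughly like this (the parameter
choices are illustrative, not the patch's defaults):

$salt = openssl_random_pseudo_bytes( 16 );
$hash = hash_pbkdf2( 'sha256', $password, $salt, 10000, 0, true );
// Store the algorithm, iteration count, $salt and $hash together, so the
// work factor can be raised later and existing hashes re-wrapped.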

The Whirlpool algorithm by Tim would force password cracking software to do
a custom implementation for our hashes. It has very similar work effort to
bcrypt, and should keep our passwords as safe as using bcrypt. The theory
behind it seems good, but obviously, we might discover a gaping hole in it
at some point.

Is there any strong preference among these options? My personal vote is for
bcrypt, if bumping the php version doesn't seem like a big deal to everyone.


[1] - https://en.wikipedia.org/wiki/PBKDF2
[2] - https://en.wikipedia.org/wiki/Bcrypt
[3] -
http://www.mail-archive.com/wikitech-l@lists.wikimedia.org/msg08830.html
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Please update for the latest security patch

2014-02-03 Thread Chris Steipp
Hi lists,

If you haven't patched with the last security release, or know of a wiki
that hasn't patched yet, please do so immediately. An exploit was released
on the full disclosure mailing list over the weekend[1] that targets the
vulnerability in the PdfHandler extension.

If you're not able to patch for some reason, you may be able to work around
the issue:
* If you have never allowed .djvu files to be uploaded, but you do allow
pdf files, you can simply disable the PdfHandler extension (typically by
removing the include from your LocalSettings.php, as sketched below).
* If you have any .djvu files saved on your wiki, then there is no
workaround-- you need to apply the security patch to MediaWiki core.
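
For example, the line to comment out typically looks like this (assuming
the standard extension path; adjust it to match your install):

# In LocalSettings.php -- commenting this out disables PdfHandler:
# require_once( "$IP/extensions/PdfHandler/PdfHandler.php" );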

If anyone is running an unsupported branch of MediaWiki (1.20 was recently
EOL'ed), and needs help creating a patch for their instance, I'm happy to
try and work with you to get the vulnerability closed. Contact me off list,
or on irc.


[1] - http://seclists.org/fulldisclosure/2014/Feb/6
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] MediaWiki Security Releases: 1.22.2, 1.21.5 and 1.19.11

2014-01-28 Thread Chris Steipp
I would like to announce the release of MediaWiki 1.22.2, 1.21.5 and
1.19.11.

Your MediaWiki installation is affected by a remote code execution
vulnerability if you have enabled file upload support for DjVu (natively
supported by MediaWiki) or PDF files (in combination with the PdfHandler
extension). Neither file type is enabled by default in MediaWiki
installations. If you are affected, we strongly urge you to update
immediately.

Affected supported versions: All

== Security fixes ==

* Netanel Rubin from Check Point discovered a remote code execution
vulnerability in MediaWiki's thumbnail generation for DjVu files. Internal
review also discovered similar logic in the PdfHandler extension, which
could be exploited in a similar way. (CVE-2014-1610)


== Bug Fixes in 1.22.2 ==
* (bug 58253) Check for very old PCRE versions in installer and updater
* (bug 60054) Make WikiPage::$mPreparedEdit public


Full release notes for 1.22.2:


Full release notes for 1.21.5:


Full release notes for 1.19.11:


For information about how to upgrade, see



**
   1.22.2
**
Download:
http://download.wikimedia.org/mediawiki/1.22/mediawiki-1.22.2.tar.gz

Patch to previous version (1.22.1):
http://download.wikimedia.org/mediawiki/1.22/mediawiki-1.22.2.patch.gz

GPG signatures:
http://download.wikimedia.org/mediawiki/1.22/mediawiki-core-1.22.2.tar.gz.sig
http://download.wikimedia.org/mediawiki/1.22/mediawiki-1.22.2.tar.gz.sig
http://download.wikimedia.org/mediawiki/1.22/mediawiki-1.22.2.patch.gz.sig
http://download.wikimedia.org/mediawiki/1.22/mediawiki-i18n-1.22.2.patch.gz.sig

Public keys:
https://www.mediawiki.org/keys/keys.html

**
   1.21.5
**
Download:
http://download.wikimedia.org/mediawiki/1.21/mediawiki-1.21.5.tar.gz

Patch to previous version (1.21.4):
http://download.wikimedia.org/mediawiki/1.21/mediawiki-1.21.5.patch.gz

GPG signatures:
http://download.wikimedia.org/mediawiki/1.21/mediawiki-core-1.21.5.tar.gz.sig
http://download.wikimedia.org/mediawiki/1.21/mediawiki-1.21.5.tar.gz.sig
http://download.wikimedia.org/mediawiki/1.21/mediawiki-1.21.5.patch.gz.sig
http://download.wikimedia.org/mediawiki/1.21/mediawiki-i18n-1.21.5.patch.gz.sig

Public keys:
https://www.mediawiki.org/keys/keys.html

**
   1.19.11
**
Download:
http://download.wikimedia.org/mediawiki/1.19/mediawiki-1.19.11.tar.gz

Patch to previous version (1.19.10):
http://download.wikimedia.org/mediawiki/1.19/mediawiki-1.19.11.patch.gz

GPG signatures:
http://download.wikimedia.org/mediawiki/1.19/mediawiki-core-1.19.11.tar.gz.sig
http://download.wikimedia.org/mediawiki/1.19/mediawiki-1.19.11.tar.gz.sig
http://download.wikimedia.org/mediawiki/1.19/mediawiki-1.19.11.patch.gz.sig
http://download.wikimedia.org/mediawiki/1.19/mediawiki-i18n-1.19.11.patch.gz.sig

Public keys:
https://www.mediawiki.org/keys/keys.html

**
   Extension:PdfHandler
**
Information and Download:
https://www.mediawiki.org/wiki/Extension:PdfHandler
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Pre-Release Announcement for MediaWiki 1.22.2, 1.21.5, and 1.19.11

2014-01-27 Thread Chris Steipp
This is a notice that on Tuesday, Jan 28th between 21:00-22:00 UTC (1-2pm
PST) Wikimedia Foundation will release critical security updates for
current and supported branches of the MediaWiki software and extensions.
Downloads and patches will be available at that time, with the git
repositories updated soon after. The vulnerable feature is not enabled in
MediaWiki by default; however, many sites will want to upgrade as soon as
the patch is available.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] How to collaborate when writing OAuth applications?

2014-01-21 Thread Chris Steipp
Yeah, it's not possible to drop it yourself yet. Let me, or any OAuth admin
(the stewards), know that you want it dropped, and we can reject it.
On Jan 21, 2014 6:31 AM, "Dan Andreescu"  wrote:

> >
> > Another question is: i would like to drop my first "test-app"
> > consumer. How can I do it?
> >
>
> I'm not sure, but I would like to drop a consumer as well.  Last time I
> asked it was not yet possible.
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] MediaWiki Security Releases: 1.22.1, 1.21.4 and 1.19.10

2014-01-13 Thread Chris Steipp
I would like to announce the release of MediaWiki 1.22.1, 1.21.4 and
1.19.10.
These releases fix a number of security related bugs that could affect
users of
MediaWiki. In addition, MediaWiki 1.22.1 is a maintenance release. It fixes
several bugs. You can consult the RELEASE-NOTES-1.22 file for the full list
of
changes in this version. Download links are given at the end of this email.


== Security fixes ==

* MediaWiki user Michael M reported that the fix for bug 55332
(CVE-2013-4568)
allowed insertion of escaped CSS values which could pass the CSS validation
checks, resulting in XSS. (CVE-2013-6451)


* Chris from RationalWiki reported that SVG files could be uploaded that
include external stylesheets, which could lead to XSS when an XSL was used
to
include JavaScript. (CVE-2013-6452)


* During internal review, it was discovered that MediaWiki's SVG
sanitization
could be bypassed when the XML was considered invalid. (CVE-2013-6453)


* During internal review, it was discovered that MediaWiki's CSS
sanitization
did not filter -o-link attributes, which could be used to execute
JavaScript in
Opera 12. (CVE-2013-6454)


* During internal review, it was discovered that MediaWiki displayed some
information about deleted pages in the log API, enhanced RecentChanges, and
user watchlists. (CVE-2013-6472)


Additionally, the following extensions have been updated to fix security
issues:

* TimedMediaHandler: Bawolff discovered an XSS vulnerability with the way
the
extension stored and used HTML for showing videos. (CVE-2013-4574)


* Scribunto: Internal review found a NULL pointer dereference in
php-luasandbox, which could be used for DoS attacks. (CVE-2013-4570)


* Scribunto: Internal review found a buffer overflow in php-luasandbox. It's
not known if this could be used for code execution on the server.
(CVE-2013-4571)


* CentralAuth: Eran Roz reported that MediaWiki usernames could be leaked to
other websites. Javascript returned for CentralAuth's login would update the
page DOM with the username, even when included on other sites.
(CVE-2013-6455)


* SemanticForms: Ravindra Singh Rathore reported a missing CSRF check to
Mozilla, who reported the issue to us. Several other forms in the extension
were also fixed.


== Bug fixes in 1.22.1 ==

* (bug 59945) 1.22 tarball offers Extension SimpleAntiSpam which is supposed
to be in core.

* (bug 58178) Restore compatibility with curl < 7.16.2.

* (bug 56931) Updated the plural rules to CLDR 24. They are in new format
which is detailed in UTS 35 Rev 33. The PHP parser and evaluator as well as
the JavaScript evaluator were updated to support the new format. Plural
rules
for some languages have changed, most notably Russian. Affected software
messages have been updated and marked for review at translatewiki.net.
This change is backported from the development branch of MediaWiki 1.23.

* (bug 58434) The broken installer for database backend Oracle was fixed.

* (bug 58167) The web installer no longer throws an exception when PHP is
  compiled without support for MySQL yet with support for another DBMS.

* (bug 58640) Fixed a compatibility issue with PCRE 8.34 that caused pages
to appear blank or with missing text.

* (bug 47055) Changed FOR UPDATE handling in Postgresql


Full release notes for 1.22.1:


Full release notes for 1.21.4:


Full release notes for 1.19.10:


For information about how to upgrade, see



**
   1.22.1
**
Download:
http://download.wikimedia.org/mediawiki/1.22/mediawiki-1.22.1.tar.gz

Patch to previous version (1.22.0), without interface text:
http://download.wikimedia.org/mediawiki/1.22/mediawiki-1.22.1.patch.gz
Interface text changes:
http://download.wikimedia.org/mediawiki/1.22/mediawiki-i18n-1.22.1.patch.gz

GPG signatures:
http://download.wikimedia.org/mediawiki/1.22/mediawiki-core-1.22.1.tar.gz.sig
http://download.wikimedia.org/mediawiki/1.22/mediawiki-1.22.1.tar.gz.sig
http://download.wikimedia.org/mediawiki/1.22/mediawiki-1.22.1.patch.gz.sig
http://download.wikimedia.org/mediawiki/1.22/mediawiki-i18n-1.22.1.patch.gz.sig

Public keys:
https://www.mediawiki.org/keys/keys.html


Re: [Wikitech-l] Jake requests enabling access and edit access to Wikipedia via TOR

2014-01-13 Thread Chris Steipp
On Mon, Jan 13, 2014 at 8:32 AM, Zack Weinberg  wrote:

> To satisfy Applebaum's request, there needs to be a mechanism whereby
> someone can edit even if *all of their communications with Wikipedia,
> including the initial contact* are coming over Tor or equivalent.
> Blinded, costly-to-create handles (minted by Wikipedia itself) are one
> possible way to achieve that; if there are concrete reasons why that
> will not work for Wikipedia, the people designing these schemes would
> like to know about them.
>

This should be possible, according to https://meta.wikimedia.org/wiki/NOP,
which Nemo also posted. The user sends an email to the stewards (using tor
to access email service of their choice). Account is created, and user can
edit Wikimedia wikis. Or is there still a step that is missing?
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Pre-Release Announcement for MediaWiki 1.19.10, 1.21.4, and 1.22.1

2014-01-10 Thread Chris Steipp
This is a notice that on Tuesday, January 14th between 00:00-01:00 UTC
(*Monday* January 13th, 4-5pm PST) Wikimedia Foundation will release
security updates for current and supported branches of the MediaWiki
software, as well as several extensions. Downloads and patches will be
available at that time.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
