[Wikitech-l] What's our Bus Factor?

2013-01-09 Thread Jay Ashworth
It's the new year, and in light of the recent poll about which devs are 
working on what, let me make another, albeit vaguely macabre, suggestion:

If you're a developer, or other staffer, can the people around you pick
up the pieces if you get hit by a bus?  How badly would it impact the
delivery, and the delivery schedule, of what you're working on?

Is the institutional knowledge about our architecture and plans sufficiently
well documented and spread out that we don't have an unreasonably low bus
factor anywhere?

Cheers,
-- jra
-- 
Jay R. Ashworth  Baylink   j...@baylink.com
Designer The Things I Think   RFC 2100
Ashworth & Associates http://baylink.pitas.com 2000 Land Rover DII
St Petersburg FL USA   #natog  +1 727 647 1274

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Update on Ashburn data center switchover / migration – target date is week of 1/22/13

2013-01-11 Thread Jay Ashworth
I have forwarded this to the Outages mailing list, so that people who 
want to know/get complaints about such things have advance warning.

Cheers,
-- jra

- Original Message -
> From: "Ct Woo" 
> To: "Wikimedia developers", "Development and Operations Engineers"
> 
> Sent: Friday, January 11, 2013 3:07:15 PM
> Subject: [Wikitech-l] Update on Ashburn data center switchover / migration – 
> target date is week of 1/22/13
> All,
> 
> The Migration team is in the last lap of completing the remaining tasks to
> ready our software stack and Ashburn infrastructure for the big switchover
> day.
> 
> Per my last update, with the Fundraising activity behind us now, the team
> has scheduled the *week of 22nd January*, 2013 to perform the switchover.
> We are going to block an 8-hour migration window on the *22nd, 23rd and
> 24th*. During those periods, *17:00 UTC to 01:00 UTC (9am to 5pm PST)*,
> there will be intermittent blackouts, and they will be treated as
> 'planned' outages. You can follow the migration on irc.freenode.org in
> the #wikimedia-operations channel.
> 
> The team is putting the finishing touches on the last few tasks, and we
> will make the final Go/No-Go decision on 18th Jan, 2013. An update will
> be sent out then. For those interested in tracking the progress, the
> meeting notes are captured on this wikitech page.
> 
> *Please note that we will be restricting code deployment during that
> week, allowing only emergency and critical changes.*
> 
> Thanks.
> 
> CT Woo


Re: [Wikitech-l] Update on Ashburn data center switchover / migration – target date is week of 1/22/13

2013-01-14 Thread Jay Ashworth
- Original Message -
> From: "Guillaume Paumier" 

> On Fri, Jan 11, 2013 at 10:48 PM, Jay Ashworth 
> wrote:
> > I have forwarded this to the Outages mailing list, so that people who
> > want to know/get complaints about such things have advance warning.
> 
> Thank you :)
> 
> For those, like me, who upon reading that message wondered if there
> was an "outages-l" among the gazillion Wikimedia mailing lists, Jay is
> referring to a third-party mailing list:
> https://puck.nether.net/mailman/listinfo/outages

Yeah; nerdview is bad even among nerds.

Outages is a collection of 3 mailing lists, a wiki, and a social-media
report tracker run by Virendra Rode with some help from Frank Bulk and me;
Jared Mauch supplies the list reflectors.

Our wiki has, among other things, a page that collects useful network
testing and diagnostic tools, which I really need to groom again -- if you
look at it, and find something missing or broken, let me know.  :-)

It runs, of course, MediaWiki.  (Is there anything else?)

Cheers,
-- jra


Re: [Wikitech-l] The ultimate bikeshed: typos in commit messages

2013-01-16 Thread Jay Ashworth
- Original Message -
> From: "." 

> It could be interesting (though I have no idea if it is feasible) if git
> automatically recognized elements in a commit message and colorized them
> on the terminal screen (or maybe bolded them if the screen renders using
> TrueType fonts). This way, if you have written wikidata many times, you
> would quickly spot a problem if the commit renders as "fixed wkidata bug
> by reversing the polarity" and wikidata is not bolded/colored differently.
> An alternative would be for a script/program to extract keywords and
> present them to you, so if you notice the commit lacks the label
> "wikidata", there's something wrong.

Talk to the people over on the derivative wikitext list, spun off to
contain discussions relevant to creating a replacement MWText parser
(there are, I think, 5 or 6 projects in varying degrees of activity,
though the list itself is pretty quiet).

Cheers,
-- jra


[Wikitech-l] New 1.20.2 install didn't prompt for license or private mode

2013-01-21 Thread Jay Ashworth
Is that stuff supposed to come after one chooses "no, ask me more questions"?

'Cause I was not asked *any* more questions.

In a related story: if I give the installer a table prefix for which the
tables already exist, what will it do?  I have 3 wikis in the same DB, and
I therefore cannot simply drop the DB... and you apparently can't use
wildcards in DROP TABLE.
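Since DROP TABLE accepts no wildcards, one workaround is to generate a
statement per matching table. A minimal sketch follows; the "oldwiki_"
prefix and table names are made up, and in practice the list would come
from SHOW TABLES:

```python
# Sketch: build DROP TABLE statements for every table carrying a given
# prefix, since DROP TABLE itself has no wildcard support. The resulting
# SQL can be fed back to the mysql client.
PREFIX = "oldwiki_"  # hypothetical prefix of the wiki being removed

def drop_statements(tables, prefix):
    """Return one DROP TABLE statement per table that has the prefix."""
    return [f"DROP TABLE `{t}`;" for t in tables if t.startswith(prefix)]

# Table names as SHOW TABLES might list them; only the prefixed ones
# belong to the wiki being deleted.
tables = ["oldwiki_page", "oldwiki_revision", "otherwiki_page"]
for stmt in drop_statements(tables, PREFIX):
    print(stmt)
```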

Cheers,
-- jr 'Bobby, where are ya?' a


Re: [Wikitech-l] Integrating MediaWiki into MS SBS

2013-02-01 Thread Jay Ashworth
- Original Message -
> From: "Dan Andreescu" 

> The following manual seems to be the most actively maintained guide
> for getting MediaWiki installed on Windows:
> 
> http://www.mediawiki.org/wiki/Manual:Running_MediaWiki_on_Windows
> 
> If you run into any problems, I'd suggest adding them to the manual
> along with any resolutions you or others come up with. Good luck!

I'm not sure he actually wants to run it on Windows.  

He may just need SSO with Active Directory.

   https://encrypted.google.com/search?q=mediawiki+active+directory

Cheers,
-- jra

Re: [Wikitech-l] Corporate needs are different (RE: How can we help Corporations use MW?)

2013-02-07 Thread Jay Ashworth
- Original Message -
> From: "Daniel Barrett" 

> 1. A desire for a department to have "their own space" on the wiki.
> I'm not talking about access control, but (1) customized look & feel,
> and (2) ability to narrow searches to find articles only within that
> space. The closest related concept in MediaWiki is the namespace,
> which can have its own CSS styling, and you can search within a
> namespace using Lucene with the syntax "NamespaceName:SearchString".
> However, this is not a pleasant solution, because it's cumbersome to
> precede every article title with "NamespaceName: " when you create,
> link, and search.
> 
> If the *concept* of namespaces could be decoupled from its title
> syntax, this would be a big win for us. So a namespace would be a
> first-class property of an article (like it is in the database), and
> not a prefix of the article title (at the UI level). I've been
> thinking about writing an extension that provides this kind of UI when
> creating articles, searching for them, linking, etc.
> 
> Some way to search within categories reliably would also be a huge
> win. Lucene provides "incategory:" but it misses all articles with
> transcluded category tags.
> 
> 2. Hierarchy. Departments want not only "their own space," they want
> "subspaces" beneath it. For example, "Human Resources" wiki area with
> sub-areas of Payroll, Benefits, and Recruiting. I realize Confluence
> supports this... but we decided against Confluence because you have to
> choose an article's area when you create it (at least when we
> evaluated Confluence years ago). This is a mental barrier to creating
> an article, if you don't know where you want to put it yet. MediaWiki
> is so much better in this regard -- if you want an article, just make
> it, and don't worry where it "goes" since the main namespace is flat.
> 
> I've been thinking about writing an extension that superimposes a
> hierarchy on existing namespaces, and what the implications would be
> for the rest of the MediaWiki UI. It's an interesting problem. Anyone
> tried it?

What you want, I think, is what Zope 2 called "acquisition".  It's like
OO subclass inheritance, but it's *geographic*, depending on where you
are in the tree; the old Mac Frontier system did something like it too.

You want links to have a Search Path, that starts with whatever part/
subpart of the tree the current page is in, and then climbs the tree, 
ending in the unadorned Main namespace, whenever a user clicks them.
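That climb-the-tree lookup can be sketched in a few lines; everything here
(the page store, the "/" path encoding, the function name) is hypothetical
illustration, not any real MediaWiki interface:

```python
# Sketch of acquisition-style link resolution: look for the target in the
# current subspace first, then in each ancestor space, ending at the flat
# Main namespace. Spaces are modeled as path segments joined with "/".
def resolve_link(target, current_space, pages):
    """Return the first page found while climbing from current_space to Main."""
    space = list(current_space)
    while True:
        candidate = "/".join(space + [target]) if space else target
        if candidate in pages:
            return candidate
        if not space:
            return None  # reached Main without a hit; page doesn't exist
        space.pop()      # climb one level toward Main

pages = {"HR/Payroll/Schedule", "HR/Handbook", "Schedule"}
# From HR/Payroll, [[Schedule]] resolves to the local page, shadowing Main's.
print(resolve_link("Schedule", ["HR", "Payroll"], pages))
# From HR/Payroll, [[Handbook]] is acquired from the parent HR space.
print(resolve_link("Handbook", ["HR", "Payroll"], pages))
```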

That breaks the semantics of wikilinks somewhat, but that's probably OK
for your use.  It *might* be generally useful; I'm trying to figure out
whether there are any obvious common use cases it breaks, how you tell
where in the tree a page lives when it's created, and how you would show
that to users.

Cheers,
-- jra

Re: [Wikitech-l] Bringing OpenID as a provider to Wikimedia projects

2013-02-22 Thread Jay Ashworth
- Original Message -
> From: "Ryan Lane" 

> I believe the OpenID extension is matured to the point where it's usable on
> the Wikimedia projects, acting as an OpenID provider. The extension still
> needs review and such, but I think it's a good time to discuss how we'd
> like to implement this on the projects.

I, too, want to clarify: you're proposing centralizing WMF identity to
whatever extent it is not already centralized, and then using OpenID
*within WMF*, so that all WMF sites and installed extensions can auth
users against our own user database?

Not authenticating users against external OID providers (which, as nearly
as I can tell, largely amount to "I am whom I say I am"), or allowing
external non-WMF sites to authenticate against our user database.

Cheers,
-- jra

Re: [Wikitech-l] Bringing OpenID as a provider to Wikimedia projects

2013-02-22 Thread Jay Ashworth
- Original Message -
> From: "Ryan Lane" 

> Any OpenID consumer, whether WMF or not, would be able to use us as an
> authentication provider.

So, then, all OpenID guarantees is "this provider says it's the same person 
it was last time"?

Cheers,
-- jra

Re: [Wikitech-l] Bringing OpenID as a provider to Wikimedia projects

2013-02-22 Thread Jay Ashworth
- Original Message -
> From: "Ryan Lane" 

> I see no reason in doing so. If third parties want to allow Wikimedia
> as a provider, I don't see why we'd object.

There is no potential liability there?

Cheers,
-- jra

Re: [Wikitech-l] Bringing OpenID as a provider to Wikimedia projects

2013-02-22 Thread Jay Ashworth
- Original Message -
> From: "Marc A. Pelletier" 

> On 02/22/2013 10:44 PM, Jay Ashworth wrote:
> > There is no potential liability there?
> 
> IANAL, but I can't think of a scenario where allowing a user to prove "I
> am user X on Wikimedia projects" can create liability; if the client is
> pleased with the (proven) assertion for their purposes, they can use
> it. If not, they won't.

If those are the accepted semantics of the reply, then I retract the concern.

Cheers,
-- jra

Re: [Wikitech-l] Bringing OpenID as a provider to Wikimedia projects

2013-02-22 Thread Jay Ashworth
- Original Message -
> From: "Marc A. Pelletier" 

> On 02/22/2013 10:43 PM, Jay Ashworth wrote:
> > So, then, all OpenID guarantees is "this provider says it's the same
> > person it was last time"?
> 
> The exact semantics are, IIRC, "that person has presented a credential
> to us that we accept as identifying them as our user $IDENTIFIER".
> Whether the client trusts that $IDENTIFIER is reasonably stable for
> their purposes, or trusts our word at all, is their call.

I'm translating that as "yes".  :-)

I've always looked with rather a jaundiced eye at OpenID, as it was sold
as "you can run your own authenticator service", and that always struck me
as "I am who I say I am", which is, obviously, pretty useless, in the
general case.  (Early examples showed login boxes where you *provided
the URL of a random OID provider*; clearly, if the site doesn't trust
said provider, the transaction is useless.)

Cheers,
-- jra

Re: [Wikitech-l] cleaning database of spam

2013-02-26 Thread Jay Ashworth
- Original Message -
> From: "Platonides" 

> > What is exact procedure of properly removing page from database so
> > that it doesn't break anything? What needs to be deleted and in
> > which order?
> 
> maintenance/deleteArchivedRevisions.php permanently removes the
> content of deleted pages from the db.
> 
> For removing those users, see
> http://www.mediawiki.org/wiki/Extension:User_Merge_and_Delete
> 
> Also remember that due to the way mysql works, it may not release
> those 20GB back to the filesystem.

In particular, to get anything for your trouble, you will probably need
to dump the database, drop it, shut down MySQL and switch it to innodb
tablespace-per-file, turn it back on, and then reload the dump, as I 
recently had to.

This way, at least, once you clean it up, you can do the same dump and 
reload procedure on only one table, not the whole shootin' match.
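For the record, the cycle I mean is roughly the following; a sketch only,
with a made-up database name and dump path, and details will vary by
installation:

```python
# Illustrative command sequence for reclaiming InnoDB tablespace by
# dumping, dropping, switching to per-table tablespaces, and reloading.
# Names and paths are hypothetical; adjust for your own setup.
def rebuild_commands(db, dump):
    """Return the shell steps for the dump-and-reload cycle, in order."""
    return [
        f"mysqldump {db} > {dump}",                        # 1. dump the database
        f"mysql -e 'DROP DATABASE {db}'",                  # 2. drop it
        "service mysql stop",                              # 3. shut MySQL down
        "# edit my.cnf: switch to per-table tablespaces",  # 4. config change
        "service mysql start",                             # 5. bring it back up
        f"mysql -e 'CREATE DATABASE {db}' && mysql {db} < {dump}",  # 6. reload
    ]

for step in rebuild_commands("wikidb", "/tmp/wikidb.sql"):
    print(step)
```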

Cheers,
-- jra

Re: [Wikitech-l] cleaning database of spam

2013-02-26 Thread Jay Ashworth
- Original Message -
> From: "Petr Bena" 

> You meant innodb_file_per_table

Yes; I forgot the exact name, and tried (apparently unsuccessfully) to 
make that look as little like an exact parameter as possible.

Happily, the OP runs that way anyway.

Cheers,
-- jra

Re: [Wikitech-l] DevOps/Continuous Deployment discussion?

2013-02-26 Thread Jay Ashworth
- Original Message -
> From: "Juliusz Gonera" 

> On 02/20/2013 12:04 PM, Luke Welling WMF wrote:
> > I am strongly of the opinion that within broad ranges deployment
> > frequency
> > does not matter. It really does not matter if you deploy twice an
> > hour or
> > every second day.
> 
> What teams deploy every second day?

The ones who accidentally shipped a brown-paper-bag bug 2 days ago. ;-)

Cheers,
-- jra

Re: [Wikitech-l] Nagios is dead, long live icinga!

2013-02-26 Thread Jay Ashworth
- Original Message -
> From: "Ryan Lane" 

> On Tue, Feb 26, 2013 at 5:32 PM, Leslie Carr 
> wrote:
> 
> > As some may have noticed, we are phasing out nagios in favor of
> > icinga ( https://www.icinga.org/ )
> >
> > nagios.wikimedia.org now redirects to icinga.wikimedia.org ! Please
> > let us know if you notice anything that has broken or is
> > inconsistent.

> Awesome work Leslie!

And thanks for the pointer, too; didn't realize the fork had happened.

Cheers,
-- jra

Re: [Wikitech-l] Nagios is dead, long live icinga!

2013-02-26 Thread Jay Ashworth
- Original Message -
> From: "Liangent" 

> > nagios.wikimedia.org now redirects to icinga.wikimedia.org ! Please
> > let us know if you notice anything that has broken or is
> > inconsistent.
> 
> So now there's no public view of server monitoring info?
> 
> http://status.wikimedia.org/ always shows nagios as disrupted now.

No, having done this sort of thing before, I would speculate that it
just slipped off their checklist, and they thank you for reminding them.

I thank you for reminding *me* that was there in the first place, though
I see that I once knew, for it is already listed here:

   http://wiki.outages.org/index.php/Dashboard

Cheers,
-- jra

Re: [Wikitech-l] Nagios is dead, long live icinga!

2013-02-26 Thread Jay Ashworth
- Original Message -
> From: "Leslie Carr" 

> Icinga is public.

It may be, but that URL goes to an HTTPS Auth dialog, with nothing 
behind it if one cancels.  Perhaps something was missed?
-- jra

Re: [Wikitech-l] Nagios is dead, long live icinga!

2013-02-26 Thread Jay Ashworth
- Original Message -
> From: "Jeremy Baron" 

> On Feb 26, 2013 11:25 PM, "Matthew Bowker" wrote:
> > I hate to be "that guy," but is it supposed to be password protected?
> > Is there somewhere non-ops people can look for server status, or is
> > http://status.wikimedia.org it?
> 
> try HTTP instead of HTTPS. (I don't know anything about why they're
> not the same or how long they've been like that.)

Noted.

Understand that for people who have HTTPS Everywhere installed (which
should be approximately everyone by now), that will be a common question.

Cheers,
- jra

Re: [Wikitech-l] Nagios is dead, long live icinga!

2013-02-27 Thread Jay Ashworth
- Original Message -
> From: "Leslie Carr" 

> Can you try with https now ? I had forgotten to reload apache when
> pushing out a change to the https config (to allow https without
> login). You can also use http.

https://icinga.wikimedia.org is now confirmed accessible, yes.

One issue, possibly specific to me:

I'm old, and my laptop has a 12" screen, so I am prone to put Firefox in
"Zoom Text Only" mode and run the zoom up to read stuff.  Icinga handles
that pretty well in our implementation, with one exception: the tab at top
right that has the Icinga logo in it also appears to contain some summary
data, and that part blows off the right edge of the screen (though it
impinges on the Icinga text logo even at normal size).

Not sure that's fixable, but I thought I'd mention it.

Thanks for getting this up, regardless.

And that service that's running status. is very spiffy; is that commercial?

Cheers,
-- jra

Re: [Wikitech-l] switching to something better than irc.wikimedia.org

2013-03-01 Thread Jay Ashworth
- Original Message -
> From: "Tyler Romeo" 

> > I think a very lightweight proxy that only passes subscribe commands
> > to redis would work. A read-only redis slave could be provided, but I
> > don't think it includes a way to limit what commands clients can run,
> > including administrative ones. I think we'd want a thin proxy layer in
> > front anyway, to track and, if necessary, selectively limit access. It
> > could be very simple though.
> >
> 
> Mhm, in that case this might be a viable solution.

Dumb question: is the work ESR recently did on irker on-topic for this
conversation, and did anyone know it existed?  :-)

Cheers,
-- jra

Re: [Wikitech-l] OpenID as a provider project page

2013-03-02 Thread Jay Ashworth
- Original Message -
> From: "Ryan Lane" 

> I wrote up some quick documentation on OpenID as a provider. Feel free to
> modify it, especially for inaccurately used terminology. It's also likely a
> good time to start bikeshed discussions on the urls, as I think it'll
> end up taking a while to lock that down.

I suppose if I take issue with calling URL choice a bikeshed, that would 
constitute bikeshedding, right?  :-)

Cheers,
-- jra

Re: [Wikitech-l] OpenID as a provider project page

2013-03-03 Thread Jay Ashworth
- Original Message -
> From: "Ryan Lane" 

> > I suppose if I take issue with calling URL choice a bikeshed, that
> > would constitute bikeshedding, right? :-)

> If you have purely technical reasons for the choice of a domain name,
> rather than aesthetic ones, I'm all for not calling those reasons
> bikeshedding ;)

Any reasons I would ever have for advancing a specific domain name for
such a thing would be technical.  In this case, the job is to make it as
generic as possible *without* making it so generic that it's not obvious
to the general public how it serves the most common case.

Since I don't have a complete apprehension of what it does, I won't be
advancing a specific suggestion at this time (though I am a published
author on this topic :-).

Cheers,
- jra

Re: [Wikitech-l] Seemingly proprietary Javascript

2013-03-05 Thread Jay Ashworth
- Original Message -
> From: "Mark Holmquist" 

> The minification process, however, does *not* cause a problem. We can
> simply add the comments to the file(s) after the minification. It does
> mean we'll need to include, potentially, multiple license headers in
> one HTTP response, but that shouldn't cause much issue.

I am neither an engineer nor a WMF staffer, but I want to throw a flag
here anyway.

Yes, it will cause an issue.  If that extra data is going in every reply,
multiply its size by our replies-per-day count, won't you?  I don't know
what that number is, but I'm quite certain it's substantial.

*Every single byte* that goes in a place where it will be included in
every reply directly affects our 95th-percentile data transfer, I should
think, and thus our budget.  Bytes are not always free.
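A back-of-envelope sketch makes the point; both numbers below are invented
purely for illustration (the thread cites none):

```python
# Rough cost of shipping one extra license header in every response.
# Both inputs are hypothetical, chosen only to show the multiplication.
header_bytes = 600                # assumed size of one added license header
responses_per_day = 500_000_000   # assumed daily JS/CSS responses served

extra_bytes_per_day = header_bytes * responses_per_day
extra_gib_per_day = extra_bytes_per_day / 2**30
print(f"about {extra_gib_per_day:.0f} GiB of extra transfer per day")
```

With these made-up inputs the overhead is on the order of hundreds of GiB
per day, which is exactly the 95th-percentile concern raised above.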

Cheers,
-- jra

Re: [Wikitech-l] Seemingly proprietary Javascript

2013-03-05 Thread Jay Ashworth
- Original Message -
> From: "Tyler Romeo" 

> > Yes, it will cause an issue. If that extra data is going in every
> > reply, multiply its size by our replies per day count, won't you? I
> > don't know what that number is, but I'm quite certain it's substantial.
> >
> > *Every single byte* that goes in a place where it will be included in
> > every reply directly affects our 95%ile data transfer, I should think,
> > and thus our budget. Bytes are not always free.
> 
> True, but if it's legally required it's not like we have an option.

Certainly.  But I see no reason to think it's legally required.  And
while I, too, only play one on the Internet, I've been doing it since 1983.

And I haven't been surprised all that often.

Mr Villa will come up with a more researched decision, certainly, but I
am relatively certain that a defensible case can be made that minifying is
equivalent to compiling, for the purposes of the license.

And in the unlikely event that's not good enough, the Foundation may well
be able to get a codicil license on the relevant libraries, acknowledging
that it needn't include the license text in on-the-wire minified copies.

My personal opinion, though, is that the proper approach is for the
license to be officially interpreted by its issuer to exempt this sort
of minification-caused potential violation, as otherwise minification
will negatively affect everyone who uses it, many of whom don't have
WMF's budget.

Cheers,
-- jra

Re: [Wikitech-l] Seemingly proprietary Javascript

2013-03-05 Thread Jay Ashworth
- Original Message -
> From: "Tyler Romeo" 

> But WMF getting a license doesn't help everybody else who uses MW.

Minification is a WMF cluster issue, not a MW software issue, is it not?

Cheers,
-- jra

Re: [Wikitech-l] Seemingly proprietary Javascript

2013-03-06 Thread Jay Ashworth
- Original Message -
> From: "Brian Wolff" 

> > Minification is a WMF cluster issue, not a MW software issue, is it
> > not?

> Mediawiki minifies things regardless of if its being run by the WMF or
> somebody else.

Ah; thanks.  Have not looked at internals lately.  Since minification to
me as a netadmin is a strategic "size of pipe" issue, I assumed it was 
something deployed on WMF sites, not something baked into the base package.

Cheers,
-- jra

Re: [Wikitech-l] Seemingly proprietary Javascript

2013-03-06 Thread Jay Ashworth
- Original Message -
> From: "David Gerard" 

> People will say any spurious bollocks

What's the license on that observation, David?  :-)

Cheers,
-- jr 'I wanna steal that' a

Re: [Wikitech-l] Seemingly proprietary Javascript

2013-03-06 Thread Jay Ashworth
- Original Message -
> From: "MZMcBride" 

> The Open Source Initiative doesn't seem to really like the idea:
> .
> 
> A number of former and current contributors (notably Lee Daniel Crocker)
> have released their creative works and inventions into the public domain:
> .
> 
> I've always found CC-Zero and its surrounding arguments to be pretty
> stupid. I release most of the code I write into the public domain
> (though most of it lacks sufficient creativity in any case).

My understanding is that CC-Zero exists *because "the public domain" does 
not exist in the IP law of many countries*.

Cheers,
-- jra

Re: [Wikitech-l] Seemingly proprietary Javascript

2013-03-06 Thread Jay Ashworth
- Original Message -
> From: "Chris Grant" 

> This is based on a flawed reading of the GPL. The GPL covers the
> distribution of program code. The license specifically states that “The act
> of running the Program is not restricted”. (Furthermore: “Activities other
> than copying, distribution and modification are not covered by this
> License; they are outside its scope.”)
> 
> The terms you are all referring to relate to the distribution of the
> software, not the running of the software. Wikipedia.org, does not
> distribute the software, that is MediaWiki.org's job. If Wikipedia wanted
> to, we could remove all licensing information from the software and it
> would still be completely legal. The GPL *only* comes into effect once
> you start distributing the software.

The problem here, Chris, is "what constitutes 'distributing the software'?"

WP is *sending a copy of the JS from its servers to a client PC, there to
be executed*.  *We* consider that "incidental", but a court might not; 
decisions I'm aware of have gone both ways.  So that might *be* the 
distribution step, legally, and trigger the license requirement.

Cheers,
-- jra

Re: [Wikitech-l] Seemingly proprietary Javascript

2013-03-06 Thread Jay Ashworth
- Original Message -
> From: "Platonides" 

> Regarding GPL requisites, it seems clear that minified javascript is
> “object code” [1], which we can convey per section 6d [2], which is
> already possible if you know how the RL works, although we should
> probably provide those “clear directions”. Most problematic would be
> that you should also obey sections 4 and 5 (although I see a bit of
> contradiction there, how are you supposed to “keep intact all notices”
> where most notices are present in comments, designed to be stripped
> when
> compiled?)
> 
> But are we conveying it?

> > To “convey” a work means any kind of propagation that enables other
> > parties to make or receive copies. Mere interaction with a user
> >through a computer network, with no transfer of a copy, is not
> >conveying.
> 
> As javascript is executed in the client, it probably is.

Perhaps.  But HTML is also "executed" in the client, and some legal
decisions have gone each way on whether the mere viewing of a page
constitutes "copying" in violation of copyright; the trend is towards
"no", thankfully.  :-)

Cheers,
-- jra

Re: [Wikitech-l] Seemingly proprietary Javascript

2013-03-06 Thread Jay Ashworth
- Original Message -
> From: "Jack Phoenix" 

> Let me just state this for the record: I find copyright paranoia and
> associated acts, such as this very thread with 59 (and counting!)
> messages absurd, ridiculous and a complete waste of time.

We note that you have spoken.

Alas, the other 153 people who own copyright in the code in question
haven't.  No offense, Jack, but assuming they will share your outlook --
when it's on the record that developers' opinions on this range to both
ends -- is probably not a safe enough bet for the Foundation.

:-}

Cheers,
-- jra

Re: [Wikitech-l] Seemingly proprietary Javascript

2013-03-06 Thread Jay Ashworth
- Original Message -
> From: "Chad" 

> Jack is not alone. The amount of bikeshedding on this list has reached
> truly epic proportions in the last couple of weeks...to the point where I've
> started ignoring the vast majority of the list (and I've always been
> an advocate for the usefulness of this list).

While I disagree as to whether minified code needs a human readable 
embedded license, I don't think it's reasonable to characterize the 
discussion as bikeshedding, Chad.  I care about the licensing on my
code.  I'm not alone.

Cheers,
-- jra

Re: [Wikitech-l] New unified SSL certificate deployed

2013-03-13 Thread Jay Ashworth
- Original Message -
> From: "Ryan Lane" 

> We just finished deploying a new SSL certificate to the sites. Now all
> *.m and *. certificates are included in a single certificate, except
> mediawiki.org. Unfortunately we somehow forgot mediawiki.org when we
> ordered the updated cert. We'll be replacing this soon with another
> cert that had mediawiki.org included.
> 
> This should fix any certificate errors that folks have been seeing on
> non-wikipedia m. domains.

Hey, Ryan; did you see, perhaps on outages-discussion, the after action
report from Microsoft about how their Azure SSL cert expiration screwup
happened?

Cheers,
-- jra

Re: [Wikitech-l] New unified SSL certificate deployed

2013-03-13 Thread Jay Ashworth
- Original Message -
> From: "Ryan Lane" 

> > Hey, Ryan; did you see, perhaps on outages-discussion, the after action
> > report from Microsoft about how their Azure SSL cert expiration screwup
> > happened?

> What's the relevance here?

"Does ops have a procedure for avoiding unexpected SSL cert expirations,
and does this affect it in any way other than making it easier to implement?",
I would think...
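For what it's worth, both halves of such a procedure are easy to automate. A rough sketch of the two checks involved -- expiry distance and SAN coverage (hypothetical helpers; not WMF's actual monitoring, and the wildcard matching here is simplified to one label):

```python
import ssl
from datetime import datetime, timezone

def days_until_expiry(not_after: str, now: datetime) -> int:
    # not_after uses OpenSSL's text form, e.g. "Mar 13 12:00:00 2014 GMT"
    expires = datetime.fromtimestamp(ssl.cert_time_to_seconds(not_after),
                                     tz=timezone.utc)
    return (expires - now).days

def missing_sans(cert_sans, required_hosts):
    # A wildcard SAN covers exactly one leading label.
    def covered(host):
        wild = ("*." + host.split(".", 1)[1]) if "." in host else host
        return host in cert_sans or wild in cert_sans
    return [h for h in required_hosts if not covered(h)]

sans = {"wikipedia.org", "*.wikipedia.org"}
print(missing_sans(sans, ["en.wikipedia.org", "mediawiki.org"]))
# -> ['mediawiki.org']
```

A nightly job would alert whenever the missing-SAN list is non-empty, or when the days-remaining figure drops below some threshold.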

Cheers,
-- jra

Re: [Wikitech-l] New unified SSL certificate deployed

2013-03-13 Thread Jay Ashworth
- Original Message -
> From: "Jeremy Baron" 

> Can you just link to the discussion archive?

Was a posting:

http://blogs.msdn.com/b/windowsazure/archive/2013/03/01/details-of-the-february-22nd-2013-windows-azure-storage-disruption.aspx

Cheers,
-- jra

Re: [Wikitech-l] Fwd: New unified SSL certificate deployed

2013-03-14 Thread Jay Ashworth
- Original Message -
> From: "Ryan Lane" 

> On Wed, Mar 13, 2013 at 9:24 PM, Jay Ashworth  wrote:
> 
> > - Original Message -
> > > From: "Ryan Lane" 
> >
> > > > Hey, Ryan; did you see, perhaps on outages-discussion, the after
> > > > action
> > > > report from Microsoft about how their Azure SSL cert expiration
> > > > screwup
> > > > happened?
> >
> > > What's the relevance here?
> >
> > "Does ops have a procedure for avoiding unexpected SSL cert
> > expirations,
> > and does this affect it in any way other than making it easier to
> > implement?",
> > I would think...
> >
> >
> We didn't have a certificate expiration. We replaced all individual
> certificates, delivered by different top level domains, with a single
> unified certificate. This change was to fix certificate errors being
> shown
> on all non-wikipedia domains for HTTPS mobile users, who were being
> delivered the *.wikipedia.org certificate for all domains.
> 
> The unified certificate was missing 6 Subject Alternative Names:
> mediawiki.org, *.mediawiki.org, m.mediawiki.org, *.m.mediawiki.org,
> m.wikipedia.org and *.m.wikipedia.org. Shortly after deploying the
> certificate we noticed it was bad and reverted the affected services (
> mediawiki.org and mobile) back to their individual certificates. The
> change
> only affected a small portion of users for a short period of time.
> 
> If you notice, I've already mentioned how we'll avoid and more quickly
> detect problems like this in the future:
> 
> "Needless to say I'll be writing a script that can be run against a
> cert to
> ensure it's not missing anything. We'll also be adding monitoring to
> check
> for invalid certificates for any top level domain."

I don't really think it was necessary to be this defensive, do you?

Well, clearly, you do.  My apologies for trying to be helpful in making 
sure you saw an analysis with useful information in it.

Cheers,
-- jra

[Wikitech-l] Fwd: [ PRIVACY Forum ] French homeland intelligence threatens a volunteer sysop to delete a Wikipedia Article

2013-04-06 Thread Jay Ashworth
In case y'all missed this:

- Forwarded Message -
> From: "PRIVACY Forum mailing list" 
> To: privacy-l...@vortex.com
> Sent: Saturday, April 6, 2013 3:10:01 PM
> Subject: [ PRIVACY Forum ] French homeland intelligence threatens a volunteer 
> sysop to delete a Wikipedia Article
> French homeland intelligence threatens a volunteer sysop to delete a
> Wikipedia Article
> 
> http://j.mp/16C8Cxn (Wikimedia France)
> 
> "Unhappy with the Foundation's answer, the DCRI summoned a Wikipedia
> volunteer in their offices on April 4th. This volunteer, which was one
> of those having access to the tools that allow the deletion of pages,
> was forced to delete the article while in the DCRI offices, on the
> understanding that he would have been held in custody and prosecuted
> if he did not comply. Under pressure, he had no other choice than to
> delete the article, despite explaining to the DCRI this is not how
> Wikipedia works. He warned the other sysops that trying to undelete
> the article would engage their responsability before the law. This
> volunteer had no link with that article, having never edited it and
> not even knowing of its existence before entering the DCRI offices. He
> was chosen and summoned because he was easily identifiable, given his
> regular promotional actions of Wikipedia and Wikimedia projects in
> France."
> 
> - - -
> 
> The return of "Vichy France" mentalities, apparently.
> 
> --Lauren--
> Lauren Weinstein (lau...@vortex.com): http://www.vortex.com/lauren
> Co-Founder: People For Internet Responsibility:
> http://www.pfir.org/pfir-info
> Founder:
> - Network Neutrality Squad: http://www.nnsquad.org
> - PRIVACY Forum: http://www.vortex.com/privacy-info
> - Data Wisdom Explorers League: http://www.dwel.org
> - Global Coalition for Transparent Internet Performance:
> http://www.gctip.org
> Member: ACM Committee on Computers and Public Policy
> Lauren's Blog: http://lauren.vortex.com
> Google+: http://vortex.com/g+lauren / Twitter:
> http://vortex.com/t-lauren
> Tel: +1 (818) 225-2800 / Skype: vortex.com
> 
> ___
> privacy mailing list
> http://lists.vortex.com/mailman/listinfo/privacy


Re: [Wikitech-l] Have production server access? Please read this document

2013-04-25 Thread Jay Ashworth
- Original Message -
> From: "Leslie Carr" 

> The Ops team has been working on a document about best practices with
> regards to production machines. If you have access to a production
> machine, please read this document
> 
> https://wikitech.wikimedia.org/wiki/Server_access_responsibilities

Very nicely done.  Noting the license, I'll probably steal it in turn.  :-)

Cheers,
-- jra

Re: [Wikitech-l] python vs php

2013-07-27 Thread Jay Ashworth
I will pass your approbation on to ESR.  :-)

Cheers,
-- jra

- Original Message -
> From: "Yuvi Panda" 
> To: "Wikimedia developers" 
> Sent: Saturday, July 27, 2013 2:55:46 PM
> Subject: Re: [Wikitech-l] python vs php
> Can we all just agree that haskell, clojurescript and INTERCAL are the
> best ever, and move on?
> 
> 
> --
> Yuvi Panda T
> http://yuvi.in/blog
> 
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] [outages] www.wikipedia.com from Level3 via IPv6 not working

2013-08-11 Thread Jay Ashworth
- Original Message -
> From: "Mark Bergsma" 

> We had the same result in the Level3 looking glass, but while we were
> debugging it and trying to gather more info or hosts/networks
> affected, it started working again in the L3 LG as well. So it appears
> that the problem was resolved.

Problems reported on NANOG or Outages often 'magically fix themselves'.  :-)

Cheers,
- jra

Re: [Wikitech-l] Wikimedia's anti-surveillance plans: site hardening

2013-08-16 Thread Jay Ashworth
- Original Message -
> From: "Zack Weinberg" 

> The first step really must be to enable HTTPS unconditionally for
> everyone (whether or not logged in). I see on the roadmap that there
> is concern that this will lock out large groups of users, e.g. from
> China; a workaround simply *must* be found for this. Everything else that is
> worth doing is rendered ineffective if *any* application layer data is
> *ever* transmitted over an insecure channel. There is no point
> worrying about traffic analysis when an active man-in-the-middle can inject
> malicious JavaScript into unsecured pages, or a passive one can steal
> session cookies as they fly by in cleartext.

I understand your goal, and your argument, but I've just this week been 
reminded that It Isn't Always China.

I found myself stuck on a non-rooted Android phone, and having to use
a demo version of a tethering app ... which wouldn't pass HTTPS on 
purpose.  Ironically, that's why it was the demo: I couldn't get through
it to PayPal to buy it from them.

My point here, of course, is that you have to decide whether you're
forcing HTTPS *for the user's good* or *for the greater good*... and 
if you think it's the former, remember that the user sometimes knows
better than you do.

If it's the latter, well, you have to decide what percentage of false
positives you're willing to let get away: are there any large populations
of WP users *who cannot use HTTPS*?  EMEA users on cheap non-smart phones
that have a browser, but it's too old -- or the phone too slow -- to 
do HTTPS?

Cheers,
-- jra

Re: [Wikitech-l] Wikimedia's anti-surveillance plans: site hardening

2013-08-16 Thread Jay Ashworth
- Original Message -
> From: "Brian Wolff" 

> Thanks for taking the time to write these two emails. You raise an
> interesting point about having everything on one domain. I really
> don't think that's practical for political reasons (not to mention
> technical disruption), but it would allow people to be more lost in
> the crowd, especially for small languages. Some of the discussion
> about this stuff has taken place on bugzilla. Have you read through
> https://bugzilla.wikimedia.org/show_bug.cgi?id=47832 ?

I should think we might be able to run a proxy that would handle such 
hiding, no?

> Personally I think we need to make a more formal list of who all the
> potential threats we could face are, and then expand that list to
> include what we would need to do to protect ourselves from the
> different types of threats (or which threats we chose not to care
> about). Some kid who downloads a firesheep-type program is very
> different type of threat then that of a state agent, and a state agent
> that is just trying to do broad spying is different from a state agent
> targeting a specific user. Lots of these discussion seem to end up
> being: lets do everything to try to protect against everything, which
> I don't think is the right mindset, as you can't protect against
> everything, and if you don't know what specifically you are trying to
> protect against, you end up missing things.

Definitely: the potential attack surfaces need to be explicitly 
itemized.

Cheers,
-- jra

Re: [Wikitech-l] You know, we really should shift to Windows

2013-08-21 Thread Jay Ashworth
- Original Message -
> From: "David Gerard" 

> http://www.rightscale.com/blog/cloud-cost-analysis/cloud-cost-analysis-how-much-could-wikipedia-save-cloud

How many machines do we have right now? Couple hundred?

What's a Win2008 server license going for? 

What percentage of our budget is that, anyway?  50?

Cheers,
-- jra

Re: [Wikitech-l] editsection styling

2013-09-01 Thread Jay Ashworth
- Original Message -
> From: "Lee Worden" 

> I ask because I've been producing editsection-like links for a long time
> in our extension project, with commas in between - for example a LaTeX
> document will come with a list of links like "[log, pdf, dvi]". Maybe
> I should switch to using pipes instead of commas.

I'm not sure if there's a *policy* answer, but I would say that my opinion
is that a pipe is a better separator than a comma for two reasons:

1) Commas have a left-affinity (which pipes don't) and
2) Commas expect a following space (whereas you can do pipes with or without
as long as you make the same choice on both sides).

Therefore, for this category of separator, I think pipes would be less
jarring to readers -- at least English readers.
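The two styles side by side, as a trivial sketch for a hypothetical link list:

```python
links = ["log", "pdf", "dvi"]
print("[" + ", ".join(links) + "]")   # [log, pdf, dvi]
print("[" + " | ".join(links) + "]")  # [log | pdf | dvi]
```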

Cheers,
-- jra

Re: [Wikitech-l] [RFC]: Clean URLs- dropping /wiki/ and /w/index.php?title=..

2013-09-16 Thread Jay Ashworth
- Original Message -
> From: "MZMcBride" 

> The RFC currently seems to gloss over what problem is attempting to be
> solved here and what benefits a new URL structure might bring. I'd like to
> see a clearer statement of a problem and benefits to a switch, taking
> into account, for example, the overarching goal of making URLs fully
> localized.

Concur, especially in light of the fact that *this does not permit you to
break the old URLs*.  They are everywhere, *and they must continue to work
forever*.

I hope I don't even have to justify why.

Cheers,
-- jra

Re: [Wikitech-l] [RFC]: Clean URLs- dropping /wiki/ and /w/index.php?title=..

2013-09-16 Thread Jay Ashworth
- Original Message -
> From: "Steven Walling" 

> How about the following?
> 
> Our current URL structure is extremely obtuse for non-technical users,
> and generally defies their expectations. To most people,
> en.wikipedia.org/Dogor even
> wikipedia.org/Dog should work just fine, not produce a 404.

Any collection of "most people" large enough to justify a change like this
is, I assert, too technically unsophisticated to be attempting to construct
URLs by hand (rather than by copy/pasta).

Do you propose to "fix" also the capitalization and spacing and URLescaping
rules, which are much more complicated than that?

My considered reaction, now after several hours, is that this is fixing
a problem which is not really broken for *anyone* except those who are
OCD about hiding the "tech-y" look in the Location box.  No offense. :-)

Cheers,
-- jra

[Wikitech-l] In case you missed this:

2013-09-17 Thread Jay Ashworth
How many printers would it take to keep up with updates to Wikipedia?

   http://what-if.xkcd.com/59/

Cheers,
-- jra

Re: [Wikitech-l] Improving anti-vandalism tools (twinkle, huggle etc) - suspicious edits queue

2013-09-26 Thread Jay Ashworth
- Original Message -
> From: "MZMcBride" 

> Much of the content on Wikipedia and other Wikimedia wikis comes from
> non-vested contributors. That is, many, many helpful additions and
> corrections come from people who will make only a few edits in their
> lifetime. While I can't disagree with the suggestion that reverting is
> easier than fact-checking, I very much doubt that assuming bad faith
> helps build a better project or a better community. And this is to say
> nothing of the fact that the seemingly simple act of providing a reference is
> often painful and unintuitive, particularly in established articles
> that employ complicated markup (infoboxes, citation templates, and ref
> tags).

My first 2 edits at TV Tropes had this property: not only were they reverted,
they were both reverted with snotty comments about procedure, and *the second
one was me doing what the first one had yelled at me for not doing*.  And I 
got yelled at the second time for following instructions.

I gave up.  It's fun to read, but not worth my time to contribute to.

I concur with MZM: We don't want to become that.

Cheers,
-- jra

Re: [Wikitech-l] Wikipedia Url Shortner Service

2012-03-23 Thread Jay Ashworth
- Original Message -
> From: "Petr Bena" 

> That doesn't allow you to type wi.ki/en/Donut in order to open
> article, shortened url is also hard to remember

Well, there's nothing for that.  wi.ki/en/Donut is not a shortened URL --
because what if the article you were interested in was "List of episodes of
Buffy The Vampire Slayer"?

base36 shortnames, all lower case, please.
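A base36 shortname scheme is only a few lines -- e.g. keyed off the article's numeric page ID (hypothetical sketch; `wi.ki` is not an actual WMF service):

```python
DIGITS = "0123456789abcdefghijklmnopqrstuvwxyz"

def to_base36(n: int) -> str:
    # Repeated divmod by 36, most significant digit first.
    if n == 0:
        return "0"
    out = ""
    while n:
        n, r = divmod(n, 36)
        out = DIGITS[r] + out
    return out

def from_base36(s: str) -> int:
    return int(s, 36)  # Python parses base-36 natively

page_id = 18741371  # made-up page ID for illustration
print("wi.ki/" + to_base36(page_id))  # short, all lower case
assert from_base36(to_base36(page_id)) == page_id
```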

Cheers,
-- jra


Re: [Wikitech-l] Can we kill DBO_TRX? It seems evil!

2012-09-26 Thread Jay Ashworth
- Original Message -
> From: "Asher Feldman" 

> If lock timeout throws an exception that closes the connection to mysql, at
> least that will result in a rollback. If the connection is pooled and
> reused, it can likely result in a commit.

I would assert that if that's true, then connection pooling is unacceptably 
Broken As Designed.

Cheers,
-- jra


Re: [Wikitech-l] How would you disrupt Wikipedia?

2010-12-31 Thread Jay Ashworth
- Original Message -
> From: "George Herbert" 

> MW was designed to build an encyclopedia with Web 1.5 technology. It
> was a major step forwards compared to its contemporaries, but sites
> like Gmail, Facebook, Twitter are massive user experience advances
> over where we are and can credibly go with MediaWiki.

MediaWiki is nearly perfectly usable from my Blackberry with CSS, images,
and JavaScript disabled; please don't break that.

Cheers,
-- jra



Re: [Wikitech-l] How would you disrupt Wikipedia?

2010-12-31 Thread Jay Ashworth
- Original Message -
> From: "Neil Kandalgaonkar" 

> Meanwhile, MediaWiki is perhaps too powerful and too complex to
> administer for the small organization. I work with a small group of
> artists that run a MediaWiki instance and whenever online collaboration
> has to happen, nobody in this group says "Let's make a wiki page!"

Why not?

> That used to happen, but nowadays they go straight to Google Docs. 

Oh.

Well, that's bad.  But people will choose the wrong tools; I don't think
that's evidence that MediaWiki's Broken As Designed.

"Too powerful and complex to administer"?

It needs administration?  In a small organization?

I set one up at my previous employers, and used it to take all my notes,
which required exactly zero administration: I just slapped it on a box,
and I was done.

And my successor is *very* happy about it.  :-)

Cheers,
-- jra



Re: [Wikitech-l] WYSIWTF working demo

2011-01-01 Thread Jay Ashworth
- Original Message -
> From: "Mathias Schindler" 
> To: "Wikimedia developers" 
> Sent: Saturday, January 1, 2011 10:34:00 AM
> Subject: Re: [Wikitech-l] WYSIWTF working demo
> On Sat, Jan 1, 2011 at 1:50 PM, Magnus Manske
>  wrote:
> 
> > What I would like is some discussion about
> > * if this approach (working pseudo-WYSIWYG instead of unattainable
> > perfect WYSIWYG) is the way to go
> > * if the code I wrote would be a suitable basis for a system we can
> > throw at the general public
> > * if anyone is willing to help me with that
> 
> I love it and for the sake of demonstration the deep impact of the
> approach, I recommend one minor change:
> 
> * deactivate or hide or shrink the "read" and "edit" tab as they are
> now obsolete or make the WYSIWTF tab the default display.

Are like hell.

Some people -- you can include me in this -- *actively* hate WYSIWYG editing,
thank-you-very-much.  Good bet at least 30-50% of Wikipedia's "power editors" 
are very well versed in MWtext[1], and how to use it to get what they want;
I wouldn't recommend making it hard for those people to keep doing what they've
been doing.

Cheers,
-- jra 

[1] Yes, I've just made that statistic up, but I expect it will track with
other similar statistics.



Re: [Wikitech-l] WYSIWTF working demo

2011-01-02 Thread Jay Ashworth
- Original Message -
> From: "Mathias Schindler" 

> On Sat, Jan 1, 2011 at 4:49 PM, Jay Ashworth  wrote:
> > Some people -- you can include me in this -- *actively* hate WYSIWYG
> > editing,
> 
> my comment was in no way about the pros and cons of WYSIWYG editing
> and the decision at Wikimedia to have its own turing complete language
> for content. My comment was about this demonstrator which should have
> a huge selection bias of non-haters :)
> 
> I reject the WYSIWYG concept itself much less then I dislike the
> effects it has on humans who start to "make things look nice".

Oh: you were suggesting that *his demonstrator* not show the plaintext
edit tabs.

Well, remember that those looking at it to evaluate it will *also* note
that they're missing; this will have an effect on people's perceptions
one way or the other, even though it's not a production implementation.

My apologies for misunderstanding you, and for sounding like Get Offa My
Lawn Guy, something I have to guard against increasingly as I dive into
my forties.  :-)

Cheers,
-- jra



Re: [Wikitech-l] What would be a perfect wiki syntax? (Re: WYSIWYG)

2011-01-02 Thread Jay Ashworth
- Original Message -
> From: "lampak" 

> I've been following the discussion and as I can see it's already
> become rather unproductive*. So I hope my cutting in will not be very much
> out of place (even if I don't really know what I'm talking about).
> 
> Many people here has stated the main reason why a WYSIWYG editor is
> not feasible is the current wikitext syntax.
> 
> What's actually wrong with it?

Oh god!  *Run*!

:-)

This has been done a dozen times in the last 5 years, lampak.  The short
version, as much as *I* am displeased with the fact that we'll never have
*bold*, /italic/ and _underscore_, is that the installed base, both of 
articles and editors, means that Mediawikitext will never change.

It *might* be possible to *extend* it, but that requires that at least
one of the 94 projects to write a formally defined parser for it, in 
something resembling yacc, would have to complete -- and to my knowledge, 
none has done so.

Cheers,
-- jra

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] What would be a perfect wiki syntax? (Re: WYSIWYG)

2011-01-04 Thread Jay Ashworth
- Original Message -
> From: "Alex Brollo" 

> Just a brief comment: there's no need of seaching for "a perfect wiki
> syntax", since it exists: it's the present model of well formed
> markup, t.i. xml.

I believe the snap reaction here is "you haven't tried to diff XML, have you?"

My personal snap reaction is that the increase in cycles necessary to process
XML in both directions, *multiplied by the number of machines in the WMF
data center*, will make XML impractical, but I'm not a WMF engineer.

Cheers,
-- jra

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] WikiCreole (was Re: What would be a perfect wiki syntax? (Re: WYSIWYG))

2011-01-04 Thread Jay Ashworth
- Original Message -
> From: "Brion Vibber" 

> Requiring people to do all their document creation at this level is
> like asking people to punch binary ASCII codes into cards by hand -- it's
> low-level grunt work that computers can handle for us. We have
> keyboards and monitors to replace punchcards; not only has this let most 
> people stop
> worrying about memorizing ASCII code points, it's let us go beyond
> fixed-width ASCII text (a monitor emulating a teletype, which was
> really a friendlier version of punch cards) to have things like _graphics_.
> Text can be in different sizes, different styles, and different languages. We
> can see pictures; we can draw pictures; we can use colors and shapes to create
> a far richer, more creative experience for the user.

None of which will be visible on phones from my Blackberry on down, which,
IIRC, make up more than 50% of the Internet access points on the planet.

Minimalism is your friend; I can presently *edit* wikipedia on that BB,
with no CSS, JS, or images.  That's A Good Thing.

Cheers,
-- jra

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] What do we want to accomplish? (was Re: WikiCreole)

2011-01-04 Thread Jay Ashworth
- Original Message -
> From: "Mark A. Hershberger" 

> The problem naturally falls back on the parser: As I understand it,
> the only reliable way of creating XHTML from MW markup is the parser that
> is built into MediaWiki and is fairly hard to separate (something I
> learned when I tried to put the parser tests into a PHPUnit test harness.)
> 
> I think The first step for creating a reliable, independent parser for
> MW markup would be to write some sort of specification
> (http://www.mediawiki.org/wiki/Markup_spec) and then to make sure our
> parser tests have good coverage.

The last time I spent any appreciable time on wikitech (which was 4 or 5
years ago), *someone* had a grammar and parser about 85-90% working.  I
don't have that email archive due to a crash, so I can't pin a name to
it or comment on whether it's someone in this thread...

or, alas, comment on what happened later.  But he seemed pretty excited 
and happy, as I recall.

Cheers,
-- jra

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] WikiCreole (was Re: What would be a perfect wiki syntax? (Re: WYSIWYG))

2011-01-05 Thread Jay Ashworth
- Original Message -
> From: "Brion Vibber" 

> A good document structure would allow useful editing for both simple
> paragraphs and complex features like tables and templates even on such
> primitive devices, by giving a dedicated editing interface the
> information it needs to address individual paragraphs, template
> parameters, table cells, etc.

A 'dedicated editing interface' is the canonical counter example to my 
#1 fundamental tenet of program and systems design: "Get The Glue Right".

The Right Glue, in this case, is bare HTML, which can be run nearly 
everywhere these days.

> I would go so far as to say that this sort of fallback interface would
> in fact be far superior to editing a big blob of wikitext on a small cell
> phone screen -- finding the bit you want to edit in a huge paragraph full of
> references and image thumbnails is pretty dreadful at the best of
> times.

Of course it would.

But the target audience here isn't people who *have* anything else; it's
people in the Sudan.  Well, the target audience I see from up here at 43,000 
feet.

Cheers,
-- jra

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] What do we want to accomplish? (was Re: WikiCreole)

2011-01-05 Thread Jay Ashworth
- Original Message -
> From: "David Gerard" 

> Many, many bright people have dashed their foreheads against the
> problem.
> 
> Andreas Jonsson thinks he's largely cracked it:
> 
> http://davidgerard.co.uk/notes/2010/08/22/staring-into-the-eye-of-cthulhu/
> 
> - and even that required custom patches to ANTLR. The result runs in C
> and is of comparable speed to PHP.

I suspect it was Steve Bennett's attack run I was remembering.

Did anyone ever pull statistics about exactly how many instances of that
Last Five Percent there really were, as I suspect I suggested at the time?

Cheers,
-- jra

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] What would be a perfect wiki syntax? (Re: WYSIWYG)

2011-01-05 Thread Jay Ashworth
- Original Message -
> From: "Daniel Kinzler" 

> On 05.01.2011 05:25, Jay Ashworth wrote:
> > I believe the snap reaction here is "you haven't tried to diff XML,
> > have you?
> 
> A text-based diff of XML sucks, but how about a DOM based (structural)
> diff?

Sure, but how much more processor horsepower is that going to take?

Scale is a driver in Mediawiki, for obvious reasons.
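That said, a structural diff isn't magic; a toy version in Python's stdlib
(purely illustrative, nothing MediaWiki actually uses) shows the shape of
the work involved:

```python
import xml.etree.ElementTree as ET

def tree_diff(a, b, path="/"):
    """Recursively compare two ElementTree nodes; yield readable changes."""
    if a.tag != b.tag:
        yield f"{path}: tag {a.tag!r} -> {b.tag!r}"
        return
    if (a.text or "").strip() != (b.text or "").strip():
        yield f"{path}{a.tag}: text changed"
    # Pairwise walk of children; a real differ would do alignment/matching.
    for old, new in zip(list(a), list(b)):
        yield from tree_diff(old, new, f"{path}{a.tag}/")
    if len(a) != len(b):
        yield f"{path}{a.tag}: child count {len(a)} -> {len(b)}"

old = ET.fromstring("<doc><p>hello</p><p>world</p></doc>")
new = ET.fromstring("<doc><p>hello</p><p>there</p></doc>")
changes = list(tree_diff(old, new))  # -> ["/doc/p: text changed"]
```

Even this naive walk touches every node of both trees, which is the kind of
per-request cost I'm worried about at WMF scale.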

Cheers,
-- jra

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] What do we want to accomplish? (was Re: WikiCreole)

2011-01-05 Thread Jay Ashworth
- Original Message -
> From: "George Herbert" 

> On Wed, Jan 5, 2011 at 7:35 PM, Jay Ashworth  wrote:
> > Did anyone ever pull statistics about exactly how many instances of
> > that Last Five Percent there really were, as I suspect I suggested at the
> > time?
> 
> Expansion off "how many instances..?" -

The thing you want expanded, George, is "Last Five Percent"; I refer 
there to (I think it was) David Gerard's comment earlier that the 
first 95% of wikisyntax fits reasonably well into current parser
building frameworks, and the last 5% causes well adjusted programmers
to consider heroin... or something like that. :-)

> At some point in the corner, the fix is to change the templates and
> pages to match a more sane parser's capabilities or a more standard
> specification for the markup, rather than make the parser match the
> insanity that's already out there.
> 
> If we know what we're looking at, we can assign corner cases to an
> on-wiki cleanup "hit squad". Who knows how many of the corners we can
> outright assassinate that way, but it's worth a go... The less used
> it is and harder to code for it is, the easier it is for us to justify
> taking it out.

Yup; that's the point I was making.

The argument advanced was always "there's too much usage of that ugly
stuff to consider Just Not Supporting It" and I always asked whether
anyone with larger computers than me had ever extracted actual statistics,
and no one ever answered.

Cheers,
-- jra

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] What would be a perfect wiki syntax? (Re: WYSIWYG)

2011-01-06 Thread Jay Ashworth
- Original Message -
> From: "George Herbert" 

> >> A text-based diff of XML sucks, but how about a DOM based
> >> (structural)
> >> diff?
> >
> > Sure, but how much more processor horsepower is that going to take.
> >
> > Scale is a driver in Mediawiki, for obvious reasons.
> 
> I suspect that diffs are relatively rare events in the day to day WMF
> processing, though non-trivial.

Every single time you make an edit, unless I badly misunderstand the current 
architecture; that's how it's possible for multiple people editing the 
same article not to collide unless their edits actually collide at the
paragraph level.

Not to mention pulling old versions.
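If I've got the scheme right (a guess at the general idea, not MediaWiki's
actual code), collision detection reduces to diffing each concurrent edit
against the common base revision and checking whether the changed regions
overlap; Python's stdlib difflib is enough for a toy sketch:

```python
import difflib

def changed_blocks(base, edited):
    """Return the indices of base paragraphs touched by an edit."""
    sm = difflib.SequenceMatcher(a=base, b=edited)
    touched = set()
    for op, i1, i2, _, _ in sm.get_opcodes():
        if op != "equal":
            touched.update(range(i1, i2))
    return touched

base = ["intro", "history", "references"]
edit_a = ["intro (expanded)", "history", "references"]   # rewrites para 0
edit_b = ["intro", "history", "references", "see also"]  # appends at the end

# Pure insertions touch no base paragraph, so this toy treats them as
# non-conflicting; real merge logic is fussier about boundaries.
conflict = bool(changed_blocks(base, edit_a) & changed_blocks(base, edit_b))
```

Here the two edits don't overlap, so `conflict` comes out False and both
could be applied.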

Can someone who knows the current code better than me confirm or deny?

Cheers,
-- jra

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] WMF and IPv6

2011-02-03 Thread Jay Ashworth
- Original Message -
> From: "George Herbert" 

> I just checked and determined that there appear to be no AAAA records
> yet for the WMF servers.
> 
> I have to admit to having been negligent in examining the IPv6
> readiness of the Mediawiki software. Is it generally working and
> ready to go on IPv6?

Is Apache?  That's the base question, is it not?  I think the answer is
yes.

> The importance of this is going to be high in the Asia-Pacific region
> within a few months:
> http://www.potaroo.net/tools/ipv4/rir.jpg
> 
> (APNIC runs out of IPv4 space to give to providers somewhere around
> August, statistically; RIPE in Feb or March 2012, ARIN in July 2012).

ARIN issued the last 5 available /8s to RIRs *today*; we've been talking
about it all day on NANOG.

> In each region, ISPs then will start running out of IPv4 to hand out
> within a month to three months of the registry exhaustion.
> 
> We have a few months, but by the end of 2012, any major site needs to
> be serving IPv6.
> 
> Out of curiosity, is anyone from the Foundation on the NANOG mailing
> lists?

Oh yeah; that's what triggered this.  :-)

Cheers,
-- jra

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] WMF and IPv6

2011-02-03 Thread Jay Ashworth
- Original Message -
> From: "River Tarnell" 

> It doesn't matter if Apache supports IPv6, since the Internet-facing
> HTTP servers for wikis are reverse proxies, either Squid or Varnish.
> I believe the version of Squid that WMF is using doesn't support IPv6.

Oh, of course.  

> As long as the proxy supports IPv6, it can continue to talk to Apache
> via IPv4; since WMF's internal network uses RFC1918 addresses, it
> won't be affected by IPv4 exhaustion.

It might; how would a 6to4NAT affect blocking?

> Apache does support IPv6, though; some other content which is served
> using Apache, like lists.wm.o, is available over IPv6.
> 
> MediaWiki itself supports IPv6 fine, including for blocking. This was
> implemented a while ago. Training admins to handle IPv6 IPs could be
> interesting.

I mused on NANOG yesterday as to what was going to happen when network
techs started realizing they couldn't carry around a bunch of IPs in 
their heads anymore...
 
> >> (APNIC runs out of IPv4 space to give to providers somewhere around
> >> August, statistically; RIPE in Feb or March 2012, ARIN in July
> >> 2012).
> >ARIN issued the last 5 available /8s to RIRs *today*; we've been
> >talking about it all day on NANOG.
> 
> Not exactly. IANA issued the last 5 /8s to RIRs, of which ARIN is one,
> today. But George is talking about RIR exhaustion, which is still some
> months away.

His phrasing seemed a bit... insufficiently clear, to me.  That was me, 
attempting to clarify.

> >> Out of curiosity, is anyone from the Foundation on the NANOG
> >> mailing
> >> lists?
> >Oh yeah; that's what triggered this. :-)
> 
> Does any useful discussion still take place on that list?

Sure.  The S/N is still lower than the Hats would prefer, but that's 
the nature of an expanding universe.

Cheers,
- jra

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] WMF and IPv6

2011-02-03 Thread Jay Ashworth
- Original Message -
> From: "River Tarnell" 

> Jay Ashworth  wrote:
> >- Original Message -
> >> From: "River Tarnell" 
> >> As long as the proxy supports IPv6, it can continue to talk to
> >> Apache
> >> via IPv4; since WMF's internal network uses RFC1918 addresses, it
> >> won't be affected by IPv4 exhaustion.
> >It might
> 
> No, it won't. The internal network IPs (which are used for
> communication between the proxy and the back-end Apache) are not
> publicly visible and are completely inconsequential to users.
> 
> >how would a 6to4NAT affect blocking?
> 
> ISP NATs are a separate issue, and might be interesting; if nothing
> else, as one reason (however small) for ISPs to provide IPv6 to end
> users. ("Help! I can't edit Wikipedia because my ISP's CGNAT pool was
> blocked!".)

You misunderstood me.

If we NAT between the squids and the apaches, will that adversely affect
the ability of MW to *know* the outside site's IP address when that's v6?

You're not just changing addresses, you're changing address *families*;
is there a standard wrapper for the entire IPv4 address space into v6?
(I should know that, but I don't.)
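(For what it's worth, I'm told the standard answer is the IPv4-mapped range
::ffff:0:0/96 from RFC 4291; a quick illustration with Python's stdlib
ipaddress module, offered only as an aside:)

```python
import ipaddress

# RFC 4291 reserves ::ffff:0:0/96 as the IPv4-mapped IPv6 range:
# every IPv4 address has exactly one counterpart in it.
mapped = ipaddress.IPv6Address("::ffff:192.0.2.1")
original = mapped.ipv4_mapped  # recovers IPv4Address('192.0.2.1')

# Addresses outside that range map back to nothing.
plain_v6 = ipaddress.IPv6Address("2001:db8::1")
```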

> >His phrasing seemed a bit.. insufficiently clear, to me. That was me,
> >attempting to clarify.
> 
> Okay. I feel your clarification was not very clear ;-)
>
> ARIN didn't issue any /8s today, IANA did. ARIN was one of the
> *recipients* of those /8s.

Acronym failure; sorry.  Yes; Something-vaguely-resembling-IANA issued those
last 5 blocks, in keeping with a long-standing sunset policy.

Cheers,
-- jra

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] WMF and IPv6

2011-02-03 Thread Jay Ashworth
- Original Message -
> From: "George Herbert" 

> > It might; how would a 6to4NAT affect blocking?
> 
> It's not really a 6to4 NAT per se - it's a 6to4 application level
> proxy. The question is, what does Squid hand off to Apache via a IPv4
> back end connection if the front end connection is IPv6.
> 
> Which, frankly, I have no idea (and am off investigating...).

I rarely have answers, but I do try to ask good questions.

And yes, NAT was a poor choice of terms.

Cheers,
-- jra

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] WMF and IPv6

2011-02-03 Thread Jay Ashworth
- Original Message -
> From: "Tim Starling" 

> It's not necessary for the main Squid cluster to support IPv6 in order
> to serve the main website via IPv6.
> 
> The amount of IPv6 traffic will presumably be very small in the short
> term. We can just set up a single proxy server in each location (Tampa
> and Amsterdam), and point all of the relevant  records to it. All
> the proxy has to do is add an X-Forwarded-For header, and then forward
> the request on to the relevant IPv4 virtual IP. The request will then
> be routed by LVS to a frontend squid.

That's so obvious I'm embarrassed I didn't think of it.

Given how big we are, though, "very small" may be most websites' "medium
traffic day".  :-)
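The header bookkeeping Tim describes is simple enough; a toy sketch of what
such a proxy would do before forwarding (illustrative only, not WMF's actual
setup):

```python
def add_forwarded_for(existing_xff, client_ip):
    """Append the connecting client's IP to X-Forwarded-For, as a v6
    front-end proxy would before handing the request to the IPv4 VIP."""
    if existing_xff:
        return existing_xff + ", " + client_ip
    return client_ip

# A v6 client hits the proxy directly...
hdr = add_forwarded_for(None, "2001:db8::1")
# ...and an intermediate hop would append itself to the chain.
hdr2 = add_forwarded_for(hdr, "203.0.113.7")
```

The back end then trusts the header (from known proxies only) to recover
the real client address for logging and blocking.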

Cheers,
-- jra

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] A call for skins

2011-02-07 Thread Jay Ashworth
> I would LOVE skin designs right now. If you've got a nice idea for a
> skin feel free to mock it up and post the mockup images for it. If it
> looks interesting I'll consider turning it into a real skin,
> ESPECIALLY
> if it break our de-facto traditions on what makes up a wiki skin. Our
> rigid skin structure is one of the big limitations of our skinning
> system right now, skins that define things beyond the current
> restrictions are good examples needed while we break open the rigid
> structure of our skins.

Have you seen the skin on the DD-WRT wiki?  They designed it to look like
their UI.

Cheers,
-- jra

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] secure.wikimedia.org commons sockpuppet site

2011-02-12 Thread Jay Ashworth
He's complaining, in effect, that there is more than one URL for identical
content, which is in fact generally a bad idea, but in this case, of course,
he's wrong: different *access protocols* are being used, so it's not possible
to conform the two...

Whether it is in fact still a Best Practice to make sure that they're the
same is another matter; I understand *why* we have a separate domain name 
for https, architecturally, but I'm not sure I *like* it.

Cheers,
-- jra


- Original Message -
> From: "Huib Laurens" 
> To: "Wikimedia developers" 
> Sent: Saturday, February 12, 2011 10:18:33 PM
> Subject: Re: [Wikitech-l] secure.wikimedia.org commons sockpuppet site
> I don't understand the email also... The secure site has been arround
> for years...
> 
> 2011/2/13, MZMcBride :
> > jida...@jidanni.org wrote:
> >> Someone posted a link to
> >>> https://secure.wikimedia.org/wikipedia/commons/wiki/Category:Hogtie_bondage
> >>
> >> Delving further, we find
> >> https://secure.wikimedia.org/wikipedia/commons/wiki/Main_Page says
> >> Welcome to Wikimedia Commons, when in fact the real site is
> >> http://commons.wikimedia.org/wiki/Main_Page . Or is it?
> >>
> >> In fact on any page on either site, one cannot find any link to the
> >> corresponding page on the other site.
> >>
> >> So now everybody will be passing around two times the amount of
> >> links to
> >> what in fact is the same material.
> >>
> >> OK, I found the pattern for converting one to the other:
> >> https://secure.wikimedia.org/wikipedia/commons/wiki/Category:Hogtie_bondage
> >> _
> >> http://commons.wikimedia.org/wiki/Category:Hogtie_bondage
> >>
> >> One would hope the owners of Wiki[pm]would redirect etc. one to the
> >> other, to stem the proliferation of non-canonical links.
> >
> > What in the Christ are you talking about?
> >
> > Reading with squinted eyes and a cocked head, it sounds like you've
> > discovered the secure site. It's documented here:
> >
> > * http://en.wikipedia.org/wiki/Wikipedia:Secure_server
> > * http://wikitech.wikimedia.org/view/secure.wikimedia.org
> >
> > I have no idea what this has to do with hogtie bondage or why a post
> > to this
> > mailing list (or the gendergap mailing list, for that matter) was
> > necessary.
> >
> > MZMcBride
> >
> >
> >
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> >
> 
> --
> Verzonden vanaf mijn mobiele apparaat
> 
> Regards,
> Huib "Abigor" Laurens
> 
> 
> 
> Support Free Knowledge: http://wikimediafoundation.org/wiki/Donate
> 
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] secure.wikimedia.org commons sockpuppet site

2011-02-12 Thread Jay Ashworth
- Original Message -
> From: "Chad" 
> > In fact on any page on either site, one cannot find any link to the
> > corresponding page on the other site.
> 
> Yeah, secure.wikimedia.org's URL scheme isn't really friendly
> to outsiders. Historically, this is because SSL certificates are
> expensive, and there just wasn't enough money in the budget
> to get more of them for the top-level domains. Maybe this isn't
> the case anymore.

Is that in fact the root cause, Chad?  I assumed, myself, that it's because
of the squid architecture.

> > So now everybody will be passing around two times the amount of
> > links to what in fact is the same material.
> 
> If people are pasting double links, then they're being silly. I
> imagine a lot of stuff on Commons uses {{fullurl:}} so the links
> are properly generated by MediaWiki.

No, in fact the root cause of his complaint is pretty likely to be
HTTPS-everywhere, which redirects users to the https site in case they're
at an insecure wifi spot, so their creds don't get stolen.

This is likely to markedly increase https traffic; I've been wondering myself
whether that's been noticed over the last month.

> Redirection would be pointless. Serving them from the same
> domain (eg: https://commons.wikimedia.org) would be great
> and is already posted as a bug[0]. I think this is your primary
> complaint, but as usual you spent half of your post insulting
> people and creating straw men.

Asperger's syndrome is a bitch.

Cheers,
-- jra

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] secure.wikimedia.org commons sockpuppet site

2011-02-13 Thread Jay Ashworth
- Original Message -
> From: "River Tarnell" 

> In article
> <18849937.7157.1297583642909.javamail.r...@benjamin.baylink.com>,
> Jay Ashworth  wrote:
> > > Yeah, secure.wikimedia.org's URL scheme isn't really friendly
> > > to outsiders. Historically, this is because SSL certificates are
> > > expensive, and there just wasn't enough money in the budget
> > > to get more of them for the top-level domains. Maybe this isn't
> > > the case anymore.
> 
> > Is that in fact the root cause, Chad? I assumed, myself, that it's
> > because
> > of the squid architecture.
> 
> LVS is in front of Squid, so it would be fairly simple to send SSL
> traffic (port 443) to a different machine; which is how secure.wm.o
> works now, except that instead of using LVS, it requires a different
> hostname.

Got it.

> However, I think the idea is not to start allowing
> https://en.wikipedia.org URLs until there's a better SSL
> infrastructure
> which can handle the extra load an easy-to-use, widely advertised SSL
> gateway is likely to create. secure.wm.o is currently a single machine
> and sometimes falls over, e.g. when Squid breaks for some reason and
> people notice that secure still works.

You did get the "EFF is pushing a Firefox plugin that has a rule that 
redirects all WP accesses to the secure site" part of that report, though,
right?  This curve has probably already started to ramp; now might be a
good time for someone ops-y to be thinking about this.

Cheers,
-- jra

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] secure.wikimedia.org commons sockpuppet site

2011-02-13 Thread Jay Ashworth
- Original Message -
> From: "Ryan Lane" 

> On Sun, Feb 13, 2011 at 1:51 AM, Jay Ashworth  wrote:
> > He's complaining, in effect, that there are more than one URL for
> > identical
> > content, which is in fact generally a bad idea, but in this case, of
> > course,
> > he's wrong: different *access protocols* are being used, so it's not
> > possible
> > to conform the two...
> >
> > Whether it is in fact still a Best Practice to make sure that
> > they're the
> > same is another matter; I understand *why* we have a separate domain
> > name
> > for https, architecturally, but I'm not sure I *like* it.
> >
> 
> This is something I'd very much like to fix. I had a fairly in depth
> discussion with the other ops folks about this last week. I think I'm
> going to put it on my goal list; however, we have a lot of higher
> priority tasks, so I wouldn't expect anything too soon.

Oh, I'm not, and secure.* is fine for me, for now.  But see my other note
to River.

Cheers,
-- jra

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Making code review happen in 1.18

2011-02-14 Thread Jay Ashworth
- Original Message -
> From: "Jeroen De Dauw" 

> > +1 to migrate to a DVCS
> 
> Unless I'm mistaken no one has actually suggested doing that.

0 + 1 = 1, right?  :-)

Cheers,
-- jra

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] How users without programming skills can help

2011-02-14 Thread Jay Ashworth
Wow, it's difficult to unwind a middle-posted message.

I think that having a wizard to make reporting bugs easier is a great
idea, and will likely increase the number of problem reports you have
to work with...

as long as you don't get it in *my* way when I'm trying to report a
bug, thank-you-very-much.  :-)

Making an easy interface available is wonderful, as long as you don't 
drive power users up the wall with it, an opinion of mine which I'm
sure won't be a surprise to anyone who's watched me post here on 
wikitext and parsers/editors.  :-)

Cheers,
-- jra

- Original Message -
> From: "Diederik van Liere" 
> To: "Wikimedia developers" 
> Sent: Sunday, February 13, 2011 10:53:48 PM
> Subject: Re: [Wikitech-l] How users without programming skills can help
> Dear James, Amir and fellow wikimedia devs,
> 
> I understand your concern and I am not suggesting that we should force
> a
> user to enter all Bugzilla fields but add those 5 questions as a
> guideline
> in the free-text form. Reporters can use it when they feel uncertain
> what
> information we are looking for but they are not forced to stick to any
> format in particular.
> 
> Additionally, I think that Mediawiki users are as technological
> advanced as
> Firefox users so I don't think this will scare somebody away. If we
> really
> want to make it easier for people to file a bug then we should add a
> simple
> wizard to guide them through the process. In particular choosing the
> right
> product and component can be quite confusing / intimidating for
> somebody new
> to Medawiki.
> 
> On Sun, Feb 13, 2011 at 9:43 PM, James Alexander
> wrote:
> 
> > On 2/13/2011 8:46 PM, Diederik van Liere wrote:
> > > I think we can draw some inspiration from Mozilla's use of
> > > Bugzilla and
> > > particular the format they are encourage users when submitting a
> > bugreport:
> > >
> > > 1) Steps to reproduce
> > > 2) Expected result
> > > 3) Actual result
> > > 4) Reproducible (by bugreporter): always / sometimes
> > > 5) Version information, extensions installed, database used (this
> > > information is dependent on the skill level of the bugreporter and
> > > maybe
> > we
> > > can add make this information easily retrievable if it's current
> > > not easy
> > to
> > > determine.
> > >
> > > So maybe we can paste these 5 steps (or something similar) in the
> > > initial
> > > form used to file a bugreport.
> > >
> > > This would increase the quality of bugreports and make it easier
> > > for bug
> > > triaging.
> >
> > I can totally understand the idea behind this but I think Amir
> > brings up
> > the concern about this best:
> >
> > On 2/13/2011 5:56 PM, Amir E. Aharoni wrote:
> > > bugzilla.wikimedia.org is the tracker where i report more bugs
> > > than
> > > elsewhere. The second is bugzilla.mozilla.org . It's not because
> > > Firefox has less bugs (quite the contrary!) but because Mozilla's
> > > tracker requires me to fill more fields, such as steps for
> > > reproduction. This may encourage detailed reporting that helps
> > > developers solve the bugs, but it may also discourage people from
> > > reporting them in the first place.
> > > ___
> > > Wikitech-l mailing list
> > > Wikitech-l@lists.wikimedia.org
> > > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> > >
> >
> > Gathering all that information on a bug report form could quite
> > clearly
> > make it easier to reproduce bugs and may make resolving them easier
> > but
> > I worry that the harder and/or more complicated we make the
> > reporting
> > the more likely we are to scare someone away from taking the time to
> > file the bug (which we want). I'm not totally sure where the best
> > balance there is.
> >
> > --
> > James Alexander
> > Associate Community Officer
> > Wikimedia Foundation
> > jalexan...@wikimedia.org
> > +1-415-839-6885 x6716
> >
> >
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> >
> 
> 
> 
> --
> Check out my about.me profile: http://about.me/diederik
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] How users without programming skills can help

2011-02-14 Thread Jay Ashworth
- Original Message -
> From: "Diederik van Liere" 

> "If you know which version you are using or you have other information
> that you think might be helpful please add it as well.
> You can also describe the problem in your own words and not sticking
> to the abovementioned questions."

I do want to grab one misconception by the horns here, though it is not as
applicable to Wikipedia proper as it is to MediaWiki, and so may not apply
to what Diederik had in his head:

If you don't know what version you're using, the bug report will be next
to useless, particularly if it's fire and forget.

If anyone on this thread hasn't already read Simon 'putty' Tatham's
absolutely *excellent* guide to filing bug reports, "How to Report Bugs
Effectively", you probably should.

Cheers,
-- jra

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] How users without programming skills can help

2011-02-15 Thread Jay Ashworth
> With the impending Bugzilla 4 release, I would like to take some
> time in setting up the test instance to perhaps play with some of
> these options to see if we can tweak it into being more useful to
> everyone.

BZ4 is coming?  Cool.  You can't *imagine* how happy I'd be to help with
that project, Chad.  :-)

BZ3 can suck, but a couple of custom implementations of it (Red Hat's and
SUSE/Novell's, I think) showed that it could be implemented *well* if you
put the work into it...

Cheers,
-- jra

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] are you sure you want everything via HTTPS?

2011-02-15 Thread Jay Ashworth
- Original Message -
> From: jida...@jidanni.org

> Is that how Facebook™ or Google™ operate, sending every single
> component via HTTPS?
>
> No. Only the vital personal settings, password stuff is done that way.

Wrong.

Both Google and Facebook will be perfectly happy to let you conduct
your entire session in https, these days.
 
> As for not letting people know what pages you are browsing, well, I
> don't now. Does Google™ offer a way to not let wiretapping people know
> what pages you are searching? Probably. We Geritol™ Generation users
> aren't exactly sure to tell you the truth.

Quit trolling, dude.

Cheers,
-- jra

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] are you sure you want everything via HTTPS?

2011-02-15 Thread Jay Ashworth
- Original Message -
> From: "Thomas Dalton" 

> Ok, so offering HTTPS for everything isn't essential. What harm does
> it do, though?

It imposes on your server cluster some requirements -- and some load --
with which it would otherwise not have to deal.

Cheers,
-- jra



[Wikitech-l] OT: Word-wrap

2011-02-20 Thread Jay Ashworth
- Original Message -
> From: "Thomas Dalton" 

> On 15 February 2011 22:43, K. Peachey  wrote:
> > Release Notes will be corrected and formatted to make it appropriate
> > for release, nothing is going to change it. The standard is 80 (or
> > is it 72?) characters wide and it will be corrected if there are errors.
> > This is just something that will always happen.
> > -Peachey
> 
> Why are we imposing such an outdated rule? CLIs and text editors got
> the ability to automatically wrap text several decades ago.

In fact, they *lost* the ability to automatically wrap text.

They thought that was an acceptable tradeoff for gaining the ability
to wrap *received* text if it was too long to fit.
 

I've just proven why that's a bad tradeoff.
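(For what it's worth, wrapping at send time is trivial to automate; a minimal sketch using only Python's stdlib, purely illustrative and not any mail client's actual implementation:)

```python
import textwrap

def wrap_message(body: str, width: int = 72) -> str:
    """Hard-wrap each paragraph of a plain-text message at `width` columns."""
    paragraphs = body.split("\n\n")
    return "\n\n".join(textwrap.fill(p, width=width) for p in paragraphs)

msg = ("Release notes will be corrected and formatted before release; "
       "the standard is 72 or 80 characters wide.")
print(wrap_message(msg, width=40))
```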

Cheers,
-- jra



[Wikitech-l] HTTPS revisited

2011-02-20 Thread Jay Ashworth
This thread at LWN seems like it might have some information which would
be interesting to those people who might be charged, down the road, with
the SSLizing of Wikimedia:

  http://lwn.net/Articles/428594/

In particular, it discusses SSL session-caching across a cluster, which I
hadn't realized was possible.

Cheers,
-- jra



Re: [Wikitech-l] stop changing the whitespace in RELEASE-NOTES please

2011-02-20 Thread Jay Ashworth
- Original Message -
> From: "Platonides" 

> Ryan Lane wrote:
> > I don't think we should encourage people to run trunk in production.
> > We should encourage people to run release candidates in production,
> > and possibly betas for those that know the software *really* well.
> > We should likely encourage people to run trunk on their live testing
> > environments, though.
> >
> > Occasionally security issues pop up in trunk that get caught in code
> > review. People who run trunk are much more likely to have security
> > problems, so on a production site, it's a problem. Similarly, it's
> > possible that commits may come in that can cause data loss, which
> > will later get caught in code review.
> 
> It is the aim of the MediaWiki community to have an always-working
> trunk.
> It /should/ be possible to run a small wiki from trunk without much
> more than verifying before deployment that the tests pass and keeping a
> look out for fixmes.

"should be possible" != "is a good idea".

Just sayin'

Cheers,
-- jra



Re: [Wikitech-l] stop changing the whitespace in RELEASE-NOTES please

2011-02-20 Thread Jay Ashworth
- Original Message -
> From: "Platonides" 

> Jay Ashworth wrote:
> > "should be possible" != "is a good idea".
> >
> > Just sayin'
> 
> Specially when we are not there yet ;)

Best time to take policy decisions with large potential impact, no?  :-)

Cheers,
-- jra



Re: [Wikitech-l] FIXME: Wall of Shame

2011-03-26 Thread Jay Ashworth
- Original Message -
> From: "K. Peachey" 

> Since we havn't done one of these in awhile, a wall of shame for
> fixmes,
> If it looks weird, copy it into a plain text editor.

(I believe you've misspelt "get a real email client".  :-)

Cheers,
-- jra



Re: [Wikitech-l] FIXME: Wall of Shame

2011-03-26 Thread Jay Ashworth
- Original Message -
> From: "K. Peachey" 

> > (I believe you've mispelt "get a real email client". :-)

> It's longer than 80 characters wide, so some "real" clients will still
> display it wrong.

Fair point.  Why I always ran mutt in an xterm.  :-)

Cheers,
-- jra



Re: [Wikitech-l] Change in Bugzilla's defaults

2011-04-13 Thread Jay Ashworth
- Original Message -
> From: "OQ" 

> On Wed, Apr 13, 2011 at 2:44 PM, Krinkle 
> wrote:
> > I agree. Defaulting new bugs to a low priority doesn't seem very
> > friendly
> > to new users. They don't know (and shouldn't have to know) what the
> > bugmeister's organization is.
> 
> Then make a triage priority and default them all to that.

Be my vote.

Cheers,
-- jra



Re: [Wikitech-l] New SVN committer

2011-04-14 Thread Jay Ashworth
- Original Message -
> From: "Priyanka Dhanda" 

> Extension and Core:
> * Patrick Reilly (preilly)
> 
> Patrick has joined the WMF engineering team as Sr. Software Developer
> for mobile.

Congrats to him.

I've got an Android phone, 6 years as a Wikipedian, and 25 years experience
writing good tickets, if he needs any help from the field.  :-)

Cheers,
-- jra



Re: [Wikitech-l] facebook like box in mediawiki

2011-04-18 Thread Jay Ashworth
- Original Message -
> From: "Roan Kattouw" 

> Presumably it'll pass a referer header, which means the 3rd-party site
> would know who (IP and account, if they have one on their site, like
> on Facebook) visited which Wikipedia page when. I'm not very familiar
> with the privacy policy, but I'm pretty sure this is shady at the very
> least.

If Wikipedia or any other WMF sites deploy anything remotely related to
Facebook, be it Instant Personalization, or even anything slightly less 
intrusive, then I'm gone, both as a reader and an editor, and I'm pretty
sure I wouldn't be alone in that opinion.  Mission creep is alive and well
at Facebook.

Papa *spank*.

Cheers,
-- jra



Re: [Wikitech-l] facebook like box in mediawiki

2011-04-19 Thread Jay Ashworth
- Original Message -
> From: "Tim Starling" 

> On 19/04/11 15:32, Raul Kern wrote:
> > So it's better to find other hosting provider for chapters web site?
> 
> A lot of chapters have their own hosting. Part of the reason is so
> that they can set their own website policies without any interference
> from the Foundation. It's a reasonable solution.
> 
> For example, Wikimedia Israel has a Facebook "like" button, and
> Wikimedia Russia, Wikimedia France and Wikimedia Portugal have Google
> Analytics, all without JRA quitting.



The issue is still real, though, Tim, even if you don't minimize it to
a snark at me.  No one's going to cry if I stop editing... but I am not
by any means the only person unhappy with the behaviour of Facebook;
hell, Aaron Sorkin got an Oscar for dramatizing why you *should* be
concerned about it.

And they're a very large private corporation, resorting to financial
subterfuge to *continue* to stay private; I see no reason WMF should 
commingle its destiny with theirs.

Cheers,
-- jra



Re: [Wikitech-l] Announcing the Open Source Sweble Wikitext Parser v1.0

2011-05-01 Thread Jay Ashworth
- Original Message -
> From: "Hannes Dohrn" 

> Hi everyone,
> 
> We are happy to announce the general availability of the first public
> release of the Sweble Wikitext parser, available from
> http://sweble.org.
> 
> The Sweble Wikitext parser
> 
> * can parse all complex Wikitext, incl. tables and templates
> * produces a real abstract syntax tree (AST); a DOM will follow soon
> * is open source made available under the Apache Software License 2.0
> * is written in Java utilizing only permissively licensed libraries

You should identify whether you mean "MediaWikitext", or some other
dialect -- MediaWiki Is Not The Only Wiki... 

and you should post to wikitext-l as well.  The real parser maniacs hang 
out over there, even though traffic is low.

Cheers,
-- jra



Re: [Wikitech-l] Announcing the Open Source Sweble Wikitext Parser v1.0

2011-05-01 Thread Jay Ashworth
- Original Message -
> From: "Dirk Riehle" 

> > You should identify whether you mean "MediaWikitext", or some other
> > dialect -- MediaWiki Is Not The Only Wiki...
> >
> > and you should post to wikitext-l as well. The real parser maniacs
> > hang out over there, even though traffic is low.
> 
> It is MediaWiki's Wikitext; elsewhere it is usually called wiki
> markup.

Improperly and incompletely, perhaps, yes.

I'm a MW partisan, and think it's better than nearly all its competitors,
for nearly all uses... but even I try not to be *that* partisan.

Cheers,
-- jra



Re: [Wikitech-l] Announcing the Open Source Sweble Wikitext Parser v1.0

2011-05-01 Thread Jay Ashworth
- Original Message -
> From: "Dirk Riehle" 

> Hmm, never viewed it that way. IMO, MediaWiki (developers) invented a
> wiki
> markup language and called it Wikitext; other engines just call it
> wiki markup
> or what not. For me, Wikitext always was the particular markup of
> MediaWiki,
> much like php or C++ are particular language names.
> 
> Is there any other engine that calls its markup Wikitext? I'd be
> surprised.
> Even for WikiCreole wikicreole.org we used wiki markup.

We ourselves say it's a generic term:

http://en.wikipedia.org/wiki/Wiki_markup

Cheers,
-- jra



Re: [Wikitech-l] WYSIWYG and parser plans (was What is wrong with Wikia's WYSIWYG?)

2011-05-03 Thread Jay Ashworth
- Original Message -
> From: "Andreas Jonsson" 

> Subject: Re: [Wikitech-l] WYSIWYG and parser plans (was What is wrong with 
> Wikia's WYSIWYG?)

> My motivation for attacking the task of creating a wikitext parser is,
> aside from it being an interesting problem, a genuine concern for the
> fact that such a large body of data is encoded in such a vaguely
> specified format.

Correct: Until you have (at least) two independently written parsers, both
of which pass a test suite 100%, you don't have a *spec*. 

Or more to the point, it's unclear whether the spec or the code rules, which
can get nasty.

Cheers,
-- jra



Re: [Wikitech-l] WYSIWYG and parser plans (was What is wrong with Wikia's WYSIWYG?)

2011-05-03 Thread Jay Ashworth
- Original Message -
> From: "MZMcBride" 

> > Now that we have HipHop support, we have the ability to turn
> > MediaWiki's core parser into a fast, reusable library. The performance
> > reasons for limiting the amount of abstraction in the core parser will
> > disappear. How many wikitext parsers does the world really need?
> 
> I realize you have a dry wit, but I imagine this joke was lost on
> nearly everyone. You're not really suggesting that everyone who wants to
> parse MediaWiki wikitext compile and run HipHop PHP in order to do so.

I'm fairly certain that his intention was "If the parser is HipHop compliant,
then the performance improvements that will realize for those who need them
will obviate the need to rewrite the parser in anything, while those who
run small enough wikiae not to care, won't need to care."

That does *not*, of course, answer the "if you don't have more than one
compliant parser, then the code is part of your formal spec, and you
*will* get bitten eventually" problem.

Of course, Mediawiki's parser has *three* specs: whatever formal one 
has been ginned up, finally; the code; *and* 8 or 9 GB of MWtext on the
Wikipedias.

Cheers,
-- jra



Re: [Wikitech-l] WYSIWYG and parser plans (was What is wrong with Wikia's WYSIWYG?)

2011-05-03 Thread Jay Ashworth
- Original Message -
> From: "Neil Harris" 

> I think it cannot be emphasized enough that what's valuable about
> Wikipedia and other similar wikis is the hard-won _content_, not the
> software used to write and display it at any given, which is merely a
> means to that end.
> 
> Fashions in programming languages and data formats come and go, but
> the person-centuries of writing effort already embodied in Mediawiki's
> wikitext format needs to have a much longer lifespan: having a
> well-defined syntax for its current wikitext format will allow the
> content itself to continue to be maintained for the long term, beyond
> the restrictions of its current software or encoding format.

The project of creating a formal specification for Mediawikitext was one
of the primary reasons for the creation of the (largely dormant) wikitext-l
list.  I fell off shortly after it was created myself, so I don't know how
far along that project got -- except that I know it was decided that,
since MWtext was not -- and could not be -- a strict subset of Creole,
Creole was a Pretty Nice Idea... and we had no time for it.

For my money, that means the Creole folks lost[1], but what do I know.

Cheers,
-- jra
[1] Which is not incompatible with observations then that MWtext has some
really unacceptable boners in it...



Re: [Wikitech-l] Licensing (Was: WYSIWYG and parser plans)

2011-05-03 Thread Jay Ashworth
- Original Message -
> From: "Peter Youngmeister" 

> If you guys still think about ideas as things that can be stolen,
> perhaps you should check out the open source movement. Here's a good
> reference:
> 
> http://en.wikipedia.org/wiki/Open_source

Aw, c'mon, Peter.  No strawmen; it's late.

The reason many programmers prefer the GPL to BSD is widely understood:
to keep the work they've invested long hours in, and given away for free,
from being submerged in someone's commercial project with no recompense
to them -- which the GPL forbids and BSD does not.

Myself, I'm firmly convinced after 30 years in this business that, for
all its faults, the choice of the GPL by Linus changed the face of
computing (and damned near everything else) just as much as the Apollo
project's investment in microelectronics gave us all PCs to run them
on in the first place.

You're welcome to disagree (though not on this list; there are lots of 
places better suited to license advocacy), but it would probably be good
not to scoff at people for holding that view.  'Specially when you're
using their code.  :-)

Cheers,
-- jra



Re: [Wikitech-l] WYSIWYG and parser plans (was What is wrong with Wikia's WYSIWYG?)

2011-05-03 Thread Jay Ashworth
- Original Message -
> From: "Daniel Friesen" 


> I'm fairly certain myself that his intention was "With HipHop support
> since the C that HipHop compiles PHP to can be extracted and re-used
> we can turn that compiled C into a C library that can be used anywhere by
> abstracting the database calls and what not out of the php version of
> the parser. And because HipHop has better performance we will no
> longer have to worry about parser abstractions slowing down the parser and as
> a result increasing the load on large websites like Wikipedia where they
> are noticeable. So that won't be in the way of adding those
> abstractions anymore."

What I get for not paying any attention to Facebook Engineering.

*That's* what HipHop does?  

> Naturally of course if it's a C library you can build at least an
> extension/plugin for a number of languages. You would of course have
> to install the ext/plug though so it's not a shared-hosting thing.

True.

But that's still a derivative work.

And from experience, I can tell you that you *don't* want to work
with the *output* of a code generator/cross-compiler.

Cheers,
-- jra



Re: [Wikitech-l] WYSIWYG and parser plans (was What is wrong with Wikia's WYSIWYG?)

2011-05-03 Thread Jay Ashworth
- Original Message -
> From: "Tim Starling" 

> I wasn't saying that the current MediaWiki parser is suitable for
> reuse, I was saying that it may be possible to develop the MediaWiki
> parser into something which is reusable.

Aren't there a couple of parsers already which claim 99% compliance or better?

Did anything ever come of trying to assemble a validation suite, All Those
Years Ago?   Or, alternatively, deciding how many pages it's acceptable to
break in the definition of a formal spec?
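(A validation suite of that sort amounts to differential testing: run each candidate parser over a shared corpus and flag any disagreement. A toy harness sketch follows; the parser callables are hypothetical stand-ins, not real MediaWiki parsers:)

```python
def differential_test(parsers, corpus):
    """Run each named parser over every corpus case; return disagreements."""
    failures = []
    for i, text in enumerate(corpus):
        outputs = {name: fn(text) for name, fn in parsers.items()}
        if len(set(outputs.values())) > 1:  # any pair disagrees
            failures.append((i, outputs))
    return failures

# Toy stand-ins: a "reference" that renders bold markup and a "candidate"
# that leaves it alone, so they disagree on exactly one case.
reference = lambda t: t.replace("'''", "<b>")
candidate = lambda t: t

corpus = ["plain text", "some '''bold''' text"]
for case, outputs in differential_test(
        {"reference": reference, "candidate": candidate}, corpus):
    print("disagreement on case", case)
```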

Cheers,
-- jra



Re: [Wikitech-l] WYSIWYG and parser plans (was What is wrong with Wikia's WYSIWYG?)

2011-05-04 Thread Jay Ashworth
- Original Message -
> From: "Nikola Smolenski" 

> I was thinking whether it would be possible to have two-tier parsing?
> Define what is valid wikitext, express it in BNF, write a parser in C
> and use it as a PHP extension. If the parser encounters invalid
> wikitext, enter the quirks mode AKA the current PHP parser.
> 
> I assume that >90% of wikis' contents would be valid wikitext, and so
> the speedup should be significant. And if someone needs to reuse the
> content outside of Wikipedia, they can use >90% of the content very
> easily, and the rest not harder than right now.

Yeah, I made this suggestion, oh, 2 or 3 years ago... and I was never
able to get the acceptable percentage down below 100.0%.
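(For the record, the two-tier scheme quoted above is easy to sketch. A toy illustration in Python; the strict-grammar check here is a stand-in for a real BNF-generated parser, not actual wikitext validation:)

```python
class StrictParseError(Exception):
    """Raised when input doesn't conform to the formal wikitext grammar."""

def strict_parse(text: str):
    # Stand-in for a fast, grammar-based parser (e.g. generated from BNF).
    # Here we accept only text with balanced [[ ]] link brackets.
    if text.count("[[") != text.count("]]"):
        raise StrictParseError("unbalanced link brackets")
    return ("ast", text)

def quirks_parse(text: str):
    # Stand-in for the slow, permissive parser (i.e. the current PHP one).
    return ("quirks", text)

def parse(text: str):
    """Try the strict fast path; fall back to quirks mode on any violation."""
    try:
        return strict_parse(text)
    except StrictParseError:
        return quirks_parse(text)

print(parse("a [[valid]] link")[0])   # takes the fast path
print(parse("a [[broken link")[0])    # falls back to quirks mode
```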

Cheers,
-- jra


