Re: [Wikitech-l] Anonymous editors & IP addresses

2014-07-11 Thread Happy Melon
On 11 July 2014 17:10, Gilles Dubuc  wrote:

>
> Maybe it's a cookie-based approach you had in mind? Where we automatically
> create an account tied to the user agent. That would mitigate the issue of
> converting a pseudo-account that might have been shared between several
> people to a proper account, but not completely get rid of it.
>

I'd have thought the chain of events "go to a library computer, do some
edits, decide to upgrade to a real account, do so, realise you've
inadvertently swept up all the unsalubrious penis vandalism that has been
made on that computer previously" would be unacceptably common.

--HM

Re: [Wikitech-l] Ensure that user is logged in

2014-06-19 Thread Happy Melon
Blame Reedy [1].  Or ask him for clarification...
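
For what it's worth, assert is an ordinary query parameter, so these two
requests behave identically (a sketch; page titles and the token value are
placeholders):

api.php?action=query&assert=user&prop=info&titles=Main_Page
api.php?action=query&prop=info&titles=Main_Page&assert=user

The token-last advice quoted below concerns only the POST body of write
actions, e.g.:

action=edit&title=Sandbox&text=Hello&assert=user&token=TOKEN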

--HM

[1]
https://www.mediawiki.org/w/index.php?title=API:Edit&diff=410992&oldid=404971


On 19 June 2014 14:27, Bartosz Dziewoński  wrote:

> On Thu, 19 Jun 2014 15:16:22 +0200, MZMcBride  wrote:
>
>  Petr Bena wrote:
>>
>>> Can this parameter be anywhere in the url? for example
>>> api.php?action=query&assert=user&prop=blabla or does it need to be on
>>> a specific position, like token?
>>>
>> It can be anywhere after "api.php", but not anywhere in the URL.
>> I don't believe any token requires a specific position in a URL.
>>
>
> Yeah… The API documentation mentions this, but I think it's wrong or at
> least misleading.
>
> https://www.mediawiki.org/wiki/API:Edit#Token
>
> "When passing this to the Edit API, always pass the token parameter last
> (or at least after the text parameter). That way, if the edit gets
> interrupted, the token won't be passed and the edit will fail."
>
> I'm reasonably sure that the HTTP and HTTPS protocols are smart enough to
> recognize "cut off" requests, and that any servers whatsoever are smart
> enough to implement this behavior.
>
>
> --
> Matma Rex
>
>

Re: [Wikitech-l] Data to improve our code review queue

2014-04-04 Thread Happy Melon
On 4 April 2014 00:38, Brian Wolff  wrote:

>
> > what about directing new volunteers there, asking them to submit their
> code
> > revisions. For a patch that has been waiting in silence for over a year,
> > any feedback will be better than no feedback.
>
> You sure about that? I would imagine that having no one look at your code
> for months, then having someone who doesn't have the authority to approve
> it nitpick it a little, followed by another couple months of waiting, would
> be more frustrating than no feedback at all.
>

This is also completely the wrong way to go about open-source development.
The work priorities of volunteers are the one thing that you, as manager of
paid staff, *can't* control, as opposed to the work priorities of paid
staff, which you very much can.  If reviewing these old patches were in any
way interesting/exciting/fulfilling, volunteers would probably already have
*made* some contributions.  Being occasionally tasked with
uninteresting/unexciting/unfulfilling jobs that Just Need Doing is one of
the things that paid developers *get paid for*.

--HM

Re: [Wikitech-l] captcha idea: proposal for gnome outreach for women 14

2014-03-03 Thread Happy Melon
On 28 February 2014 18:29, Brad Jorsch (Anomie) wrote:

> On Fri, Feb 28, 2014 at 12:07 PM, Mansi Gokhale wrote:
>
>
> Then there's the issue of different interpretation. Take for example
> https://www.mediawiki.org/wiki/File:Find-all-captcha-idea.png. Is the
> second image wearing glasses? Or is that a lorgnette or something like
> opera glasses, both of which are held in front of the eyes rather than
> worn?
>
> https://www.mediawiki.org/wiki/File:Find-the-different-captcha-idea.png has
> a similar problem. The first image is the only one with a cigarette, and
> the only one with non-realistic coloring. The second is the only bald one,
> and the only one with something resembling a lorgnette, and the only one
> not looking in the general direction of the camera, and the only one with a
> book. The fourth is the only child. The sixth is the only obvious female
> (I'm not sure about the cat). The eighth is the only one smiling, and the
> only one with visible teeth.
>

I think this is oversimplifying.  Of course some people can interpret a
picture puzzle in slightly different ways - the whole *point* of a captcha
is to distinguish between the intuitive reasoning of a human and the
formulaic reasoning of a computer; if there were absolutely no ambiguity, it
would be a very poor captcha.  In exactly the same way that the letters on
a captcha will sometimes be distorted in such a way that humans genuinely
make a mistake, sometimes the questions in a picture puzzle can be
distorted to the point that they are answered incorrectly.
The 'difficulty' of *any* captcha obviously needs to be carefully
calibrated to hit the sweet spot between mundanity and ambiguity.  But
putting out nine pictures of humans and one picture of a cat and asking for
the "odd one out" is no easier to misinterpret than a squiggle that might
be a G or might be a 6.

--HM

Re: [Wikitech-l] Dynamic unit conversion

2014-01-17 Thread Happy Melon
Enwiki's {{convert}} template is a behemoth of a structure which is
intended to do this.  I once made an attempt to write a PHP-side extension
to do it (look at the revision history of the ParserFunctions extension),
but it never took off [1].  I don't think there was ever any enthusiasm to
take the ability to "tinker" with the formatting and output ({{convert}}
has a million and one different stylistic variations and parameters) away
from wiki template editors.
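
As a sketch, the skeleton of such an extension is small; the hook and
registration calls below are MediaWiki's real extension points, but the
function names, the magic word, and the toy conversion table are invented
for illustration:

$wgHooks['ParserFirstCallInit'][] = 'efUnitSetup';

function efUnitSetup( Parser $parser ) {
    // Registers {{#unit:20|cm|in}}; the 'unit' magic word would also
    // need an entry in $magicWords.
    $parser->setFunctionHook( 'unit', 'efUnitRender' );
    return true;
}

function efUnitRender( Parser $parser, $value = '', $from = 'cm', $to = 'in' ) {
    // Toy table; {{convert}} knows hundreds of units plus rounding,
    // ranges, spelling and formatting options.
    static $factors = array(
        'cm|in' => 0.393701,
        'in|cm' => 2.54,
        'km|mi' => 0.621371,
    );
    if ( !is_numeric( $value ) || !isset( $factors["$from|$to"] ) ) {
        return "$value $from"; // unknown conversion: echo the input
    }
    return round( $value * $factors["$from|$to"], 2 ) . " $to";
}

The hard part, as the discussion below shows, is everything else: precision,
significant figures, and the thousand stylistic knobs {{convert}} exposes.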

--HM

[1] https://bugzilla.wikimedia.org/show_bug.cgi?id=40039


On 17 January 2014 10:56, Jasper Deng  wrote:

> I would like to ask, how are significant figures going to be dealt with?
> 300 could mean anything from one to three significant figures, for example.
>
>
> On Fri, Jan 17, 2014 at 1:47 AM, Marc Ordinas i Llopis <
> marc...@wikimedia.org> wrote:
>
> > On Fri, Jan 17, 2014 at 9:58 AM, Petr Bena  wrote:
> >
> > > For example you would say the object has width of {{unit|cm=20}} and
> > > people who prefer cm would see 20 cm in article text, but people who
> > > prefer inches would see 7.87 inches.
> >
> >
> > This is a great idea! As proposed it'd be very helpful, but maybe it'd be
> > better if it showed the original text and a conversion on mouse-over
> (maybe
> > with a small icon to indicate it, like external links).
> >
> >
> > > This could be even based on
> > > geolocation for IP users
> > >
> > >
> > Oh, please, don't use IP geolocation for anything. It's terrible for
> people
> > travelling, using proxies, living abroad, living in places where more
> than
> > one language is commonly spoken, learning new languages… If you want to
> > get an initial default, use Accept-Language (like, inches for en-US and
> cm
> > for anyone else :) and allow the user to modify it.
> >
> > Thanks,
> > Marc

Re: [Wikitech-l] Please use sitenotice when a new version of software is deployed

2013-12-05 Thread Happy Melon
On 5 December 2013 23:36, Bartosz Dziewoński  wrote:

> On Thu, 05 Dec 2013 23:00:04 +0100, Dan Garry 
> wrote:
>
>  How about getting this stuff included in the Signpost? I think that's a
>> good medium for it.
>>
>
> Signpost is English-specific, Wikipedia-specific and
> English-Wikipedia-specific.
>


But let's be honest; the disproportionate majority of user discontent is
enwiki-specific as well...  :-p

--HM

Re: [Wikitech-l] Applying nofollow only to external links added in revisions that are still unpatrolled

2013-11-18 Thread Happy Melon
On 18 November 2013 17:46, Nathan Larson  wrote:


> If Google agrees, they
> can stop giving wikis in general, or certain wikis, such influence over
> pagerank. The spammers have market incentives to become more sophisticated,
> but so does Google, since their earnings depend on keeping their search
> results relevant and useful, so that people don't switch to competitors
> that do a better job.
>

Market forces are not our friend.  Google's incentive is to *ignore* spammy
links, not to stop them existing; spammers' incentive is to get their links
wherever they possibly can, and particularly in the places where they're
effective, not to avoid putting links where they're not effective.  Pure
market forces would leave wikis (large and small) attacked by progressively
more sophisticated spam, search engines being progressively smarter about
ignoring the spam, and wikis *still being served with as much spam as
before* (and it being progressively harder to identify and remove).

Wikis can only participate in the arms race by exposing publicly the
*extent* to which spamming is pointless.  Google publicising the fact that
nofollow is ignored (and hence spamming is pointful) is actually a really
unhelpful thing for them to do.  If they really have taken the nofollow
weapon away from wikis altogether, then we need to find a way to get it
back.

--HM

Re: [Wikitech-l] Applying nofollow only to external links added in revisions that are still unpatrolled

2013-11-18 Thread Happy Melon
On 17 November 2013 11:41, Nathan Larson  wrote:

> In my experience, spam is pretty easy to spot
> because the bots aren't very subtle about it.
>

I'm sure spam directed at, say, enwiki, would get very subtle very quickly
if spammers thought there was a real chance of it being able to use
enwiki's pagerank weight.  Don't underestimate spammers' ability to learn
and adapt.

--HM

Re: [Wikitech-l] Is assert() allowed?

2013-07-31 Thread Happy Melon
On 31 July 2013 15:01, Tyler Romeo  wrote:

> On Wed, Jul 31, 2013 at 8:38 AM, Happy Melon wrote:
>
> > Deliberately using a function which reduces the security of your
> > application to relying on everyone choosing the correct type of quotes is
> > definitely asking for trouble.
> >
>
> I don't see how this is an issue. htmlspecialchars() can cause an XSS
> vulnerability if you pass it the wrong ENT_ constant. Should we just stop
> using htmlspecialchars() in case developers pass the wrong constant?
>


Yes, IMO, it should be abstracted away with a carefully-written wrapper
function that bridges the semantic gap between "I want to do some character
conversions" and "I want to make this text safe to echo to the browser",
but that's just the point.  Of course there are plenty of language features
you can point to that open up pitfalls; each one having its own severity
and ease-of-discovery.  htmlspecialchars() has a medium severity and very
easy discovery, and it's a problem that's easy to eliminate by abstracting
the call to ensure it's always given the proper arguments.  My example was
to disprove your point that assert() with string arguments is not as bad as
eval(); it is, for exactly the same reasons.  Of course it's possible to
use eval() safely, just like any other construct, but general consensus is
that eval()'s security holes are severe enough and difficult-to-spot enough
to warrant strongly discouraging its use, and there is no reason not to
treat assert()-with-string-args the same way.
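
As a minimal sketch of the kind of wrapper I mean (the function name is
invented; the point is that callers state an intent rather than remember
flags):

function wfEscapeForHtmlOutput( $text ) {
    // ENT_QUOTES escapes both quote styles, so the result is safe in
    // element content and in single- or double-quoted attributes alike.
    return htmlspecialchars( $text, ENT_QUOTES, 'UTF-8' );
}

Callers then never need to touch the ENT_ constants at all.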

--HM

Re: [Wikitech-l] Is assert() allowed?

2013-07-31 Thread Happy Melon
$_GET["foo"] = 'include( "evil_file.php" )';
assert( '$_GET["foo"] == "fluffy bunny rabbit"' ); // This is fine
assert( "$_GET['foo'] == 'fluffy bunny rabbit'" ); // But this is not

Deliberately using a function which reduces the security of your
application to relying on everyone choosing the correct type of quotes is
definitely asking for trouble.

--HM


On 31 July 2013 13:19, Tyler Romeo  wrote:

> On Wed, Jul 31, 2013 at 7:42 AM, Tim Starling wrote:
>
> > Indeed. In C, assert() will abort the program if it is enabled, which
> > is hard to miss. It is not comparable to the PHP assert() function.
>
>
> ...except PHP's assert() *also* aborts the program if enabled. What am I
> missing here?
>
>
> > The reasons I don't like assert() are:
> >
> > 1. It doesn't throw an exception
> > 2. It acts like eval()
> >
> > We could have a library of PHPUnit-style assertion functions which
> > throw exceptions and don't act like eval(), I would be fine with that.
> > Maybe MWAssert::greaterThan( $foo, $bar ) or something.
> >
>
> 1. It's fairly trivial to use assert_options() to make assertions throw
> exceptions if you really wanted to while developing.
> 2. Except it's not. Again, you're welcome to give an example where code
> provided as a string in an assertion is not exactly the same as having the
> code hardcoded.
>
> --
> Tyler Romeo
> Stevens Institute of Technology, Class of 2016
> Major in Computer Science
> www.whizkidztech.com | tylerro...@gmail.com

Re: [Wikitech-l] MediaWiki extensions as core-like libraries: MediaWiki's fun new landmine for admins

2013-07-21 Thread Happy Melon
On 21 July 2013 11:30, Jeroen De Dauw  wrote:

> Hey,
>
> > > you're adding in a whole new set of incompatibilities.
> > >
> > > How so?
> > >
> > >
> > Extensions that use any of these extension libaries now depend on the
> > version of MediaWiki and the version of the extension. That's a new set
> of
> > dependencies that can be incompatible now.
> >
>
> It adds a component as dependency yes. It however does not add code as
> dependency. One of the main reasons for creating well designed components
> rather than throwing everything into one bucket is that it minimizes source
> code dependencies. If you minimize dependencies (as a good programmer
> should do on all levels), you also minimize the things that can cause
> compatibility conflicts. It thus certainly benefits the user.
>
> Imagine you have a package P, which contains components A, B and C. Only
> package P is versioned and released. That it could be split into 3 packages
> is a detail not visible to the end user. A is stand alone, B depends on A
> and C depends on A. You want to install extension E0 that depends on P v1.0
> or later. In particular, it depends on B, though again, the end user does
> not know this. You want to make this install on a wiki where you already
> have E1 installed. E1 depends on P v0.5 to v0.8. Internally it only cares
> about C. This situation means you can in fact not install E0 and E1 at the
> same time. The only reason for this is that too many things are packaged
> together. If B and C were in their own packages, there would be no issue.
>
> So one can for instance go from depending on package ABC that contains sub
> components A, B and C to depending on A and B. That's two packages instead
> of one if you look at it naively, though two components instead of 3 on
> closer investigation. It is quite bad to force users to care about C while
> there is no reason for doing so. They'll be affected by any issues that
> occur in C for no good reason. Every time ABC has a new release because
> something in C had to change, users will have to deal with that release,
> unable to see that, for them, it is pointless.
>
> These are simple examples, with one component that could be split into 3.
> The problems for the user get significantly worse if one throws more things
> together, and as this problem is replicated down the dependency chain.
>

The fact that this thought experiment is almost impossible to visualise
without pen and paper is symptomatic of the problem.  The end user of
MediaWiki doesn't care about arcane internal dependencies, or the rationale
for them, they want it to Just Work.  So either the system has to actually *
be* simple (version X of an extension requires version Y of MediaWiki, so
if I update my MW install I need to also update extensions (or vice
versa)); or it needs to be *properly* abstracted behind a system which
makes it *appear* simple (you run some script and everything magically
works or gives you informative explanations of why it didn't).

The former is obviously preferable in terms of clarity and maintainability;
and that is realised immediately if all essential components are properly
backwards-compatible.  There should never *be* a dependency of the form
"package X *no later than* vY".  That is a fault of package X, not of the
extension which depends on it.  Package X should have been properly managed
so that its public signature does not change incompatibly; and if a change
is necessary, the change is managed over several versions so that consumers
can safely update their use of the new signature and increase their
dependency version to the one which introduces the new behaviour.  "I need
all my components to be sufficiently recent that they support all the
features I want" is a methodology that end users can understand (and also
one which means that they should *always* be able to get the feature set
they want).  "I need to calculate the intersection of the dependencies of
all my features and resolve them manually or automatically" is not;
especially when there may not even *be* an intersection.

None of that is in any way groundbreaking, it's just basic design
principles which I think we all subscribe to, even if they don't always
materialise.  The point is that MediaWiki is *much* more likely to be able
to police and maintain that proper signature management in core code, than
in extensions and obscure miscellaneous modules.  It's great that modules
like Diff have good beady eyes on code quality and portability, long may
that continue.  That is *definitely* the exception, not the rule, with
extensions.  Our core code definitely isn't perfect in terms of signatures,
or even acceptable in many places.  But it's getting better because that's
where the most attention is focused.

--HM

Re: [Wikitech-l] Server reboots now through next week

2013-05-18 Thread Happy Melon
On 17 May 2013 23:26, Petr Bena  wrote:

> hey, could you point me to that security patch? I am curious as I am
> myself running a bunch of Linux boxes
>

+1

Re: [Wikitech-l] Countdown to SSL for all sessions?

2013-04-30 Thread Happy Melon
On 30 April 2013 18:27, Petr Bena  wrote:

> SSL requires more CPU, both on server and client, and disables all
> kinds of caching (such as Squid or Varnish); some browsers may have
> problems with it, or in some countries encryption may even be illegal.
>
> Whatever you are going to do, you should let people turn it off.
> Wikimedia project itself has horrible security (in this thread I
> started some time ago -
>
> http://www.gossamer-threads.com/lists/wiki/wikitech/277357?do=post_view_threaded#277357
> I was even told that wikimedia doesn't need good security at all,
> because user accounts aren't so critical there); forcing SSL will not
> improve it much.
>

I think you need to check those facts.  How many years do you have to go
back before the extra CPU needed to decrypt an SSL connection becomes
noticeable on a client?  Or how many browser versions before
support becomes imperfect?  SSL support was introduced in Internet Explorer
version *Two*, in 1995.

SSL is about much more than just preventing account hijacking.  It hides
details of what you're doing and what pages you're reading from people who
have no right or need to know.  In some jurisdictions, the correlation
between the publicly-available content of a comment or edit, and the
snoopable identity of the person who made it, can be damning.  The more
routine and commonplace SSL connections are, the safer the people who are
protected by it will be.

--HM

Re: [Wikitech-l] Bugzilla Weekly Report

2013-04-04 Thread Happy Melon
On 4 April 2013 15:23, Željko Filipin  wrote:

> On Thu, Apr 4, 2013 at 4:04 PM, Mark Holmquist wrote:
>
> > In slight contrast, would you mind putting them on a wiki somewhere and
> > linking to them?
>
>
> Sure. Do you have a suggestion where to put them?
>
> Andre, do we already have a place for something like that?
>
> Željko
>

Why not just upload them as images to mediawiki.org?

--HM

Re: [Wikitech-l] switching to something better than irc.wikimedia.org

2013-03-01 Thread Happy Melon
Because we made that mistake with the API, and now we're stuck with a bunch
of deadweight formats that do nothing other than increase maintenance
costs.  If your first preference as a client developer is JSON but the feed
is XML, it's really not that hard to go get a library and convert, or vice
versa.  That's the whole point of a standardised format.

--HM


On 1 March 2013 13:48, Petr Bena  wrote:

> I see that the RFC is considering multiple formats; why not support
> all of them? We could make the client request the format they like,
> either XML or JSON; it would be up to the dispatcher how to
> produce the output data.
>
> On Fri, Mar 1, 2013 at 2:35 PM, Daniel Friesen
>  wrote:
> > We actually have an open RFC on this topic:
> >
> >
> https://www.mediawiki.org/wiki/Requests_for_comment/Structured_data_push_notification_support_for_recent_changes
> >
> > --
> > ~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]
> >
> >
> >

Re: [Wikitech-l] Populating PageImages data

2013-02-01 Thread Happy Melon
But not simply the first image to be found in the source, which in many
cases is the icon in a maintenance template or top icon.  For
https://en.wikipedia.org/wiki/Louis_Bonaparte, for instance, the image
returned is correctly the one from the infobox, not the
book-with-question-mark icon from the needs-more-references template.
There's still room for improvement, for sure; but it's definitely a
legitimate piece of data to want to collect.
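
I.e., something like this returns the infobox portrait rather than the
maintenance icon (a sketch; see the extension's documentation for the exact
parameters):

api.php?action=query&prop=pageimages&titles=Louis%20Bonaparte&pithumbsize=100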

--HM


On 1 February 2013 15:17, John  wrote:

> It's broken; on pages where there are multiple images it just shows the
> first one.
>
> On Friday, February 1, 2013, Max Semenik wrote:
>
> > On 01.02.2013, 18:14 John wrote:
> >
> > > I think there are still some serious issues with this extension, I
> > > have checked several pages, and used the max limit parameter and all
> > > it returns is a single thumb
> >
> > That's the point. If you want to enumerate all images on a page,
> > there's prop=images. PageImages returns just 1, most appropriate,
> > thumb.
> >
> > --
> > Best regards,
> >   Max Semenik ([[User:MaxSem]])
> >
> >

Re: [Wikitech-l] Set $wgUseCombinedLoginLink = false on WMF cluster?

2012-05-28 Thread Happy Melon
On 28 May 2012 22:14, Steven Walling  wrote:

> On Mon, May 28, 2012 at 12:39 PM, Daniel Friesen
> wrote:
>
> > - To have a combined form like that you have to create a brand new
> special
> > page; you can't do that within our current Special:UserLogin
> > - We don't provide a way to simply override the special page used for
> > login links
> > Hence, given these two facts, it's impossible to make an
> alternate
> > login form without replacing stuff inside the personal_urls by using
> hooks
> > and constructing your own arrays to put in the list.
> > Wikia doesn't even bother with hooks, they just make core modifications
> > for this stuff.
> >
>
> Just FYI, work is being done to decouple account creation from
> Special:UserLogin. The initial SignupAPI extension was a GSOC project, and
> it's currently undergoing code review etc. This should allow us to
> implement some long overdue improvements, such as:
> https://bugzilla.wikimedia.org/show_bug.cgi?id=34447
>
> The relevant bug: https://bugzilla.wikimedia.org/show_bug.cgi?id=36225
>

This corner of MediaWiki is horrifically old and rusty; I tried a big
refactoring a while ago but it broke CentralAuth and so got pulled a couple
of times.  I might have another go now we have labs up and running and we
can easily set up a SUL domain to match the cluster's.  SignupAPI basically
copied mountains of code from LoginForm, IIRC; it wasn't a real
refactoring (I could be mis-remembering, though).

--HM


Re: [Wikitech-l] HTMLMultiSelectField as <select multiple>

2012-05-23 Thread Happy Melon
On 23 May 2012 18:16, Daniel Werner  wrote:

> Right now I am implementing a new option (as part of
> https://bugzilla.wikimedia.org/show_bug.cgi?id=36425) for which I'd like
> to
> use a <select multiple> HTML element with <option>s. Right now
> MediaWiki always generates a list of checkboxes instead of that when using
> the HTMLMultiSelectField class. We are talking about 280+ selectable items
> here, so for now we came to the conclusion that a real multi-<select>
> would be nicer and less space consuming for now.
> I have already managed to implement this multiple select by
> modifying HTMLMultiSelectField, adding a new option 'usecheckboxes' which
> can be set to false to disable the known behavior and use a select element
> instead.
>
> This would mainly be for the JavaScript-less ui. If javascript were
> enabled, we could still do something nicer, for example with something like
> jQuery chosen plugin here.
>
> My question would just be, how I should implement these changes preferably.
> Is it ok with the new option for  HTMLMultiSelectField or should this be a
> new class inheriting from  HTMLMultiSelectField? I think
> HTMLMultiSelectField sounds more like describing what I just implemented
> rather than a bunch of select boxes, but of course renaming the existing
> one could "break" extensions (even though both are fully compatible and
> interchangeable). So one option would be simply naming the new one
> HTMLMultiSelectField2 if we don't want to stick with an additional option
> here.
>
> Cheers
> Daniel
>
>
There's some relevant comments and discussion in our draft style guide [1]
that might be interesting/relevant.
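
For reference, this is the field type in question as an HTMLForm
descriptor (a sketch; the field name and options are invented):

$formDescriptor = array(
    'skins' => array(
        'type' => 'multiselect', // maps to HTMLMultiSelectField
        'label' => 'Skins',
        'options' => array(
            'Vector' => 'vector',
            'MonoBook' => 'monobook',
        ),
    ),
);

Whatever the rendering (checkboxes or a real <select multiple>), the
descriptor and the submitted data stay the same, which is one argument for
an option rather than a new class.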

--HM

[1] https://www.mediawiki.org/wiki/Style_guide/Forms


Re: [Wikitech-l] @param documentation

2012-04-26 Thread Happy Melon
On 26 April 2012 20:59, Krinkle  wrote:

>
> Another aspect that is often inconsistent is that some people prefer to
> uppercase primitive types (so instead of "string", "array", "Title", they
> use
> "Mixed", "Number", "String", "Array"). I find that somewhat confusing, but
> not
> sure if we should enforce that in the conventions.
>

This gives me not-entirely-pleasant reminders of "string" versus "String"
in C#...

Another thing that's worth thinking about is how to indicate that the
variable is an array of some uniform type.  We have a lot of
"Array(Title)", etc; but my IDE at least doesn't catch all that information
and only type-hints as far as it being an array.  It prefers, and fully
parses, the syntax "Title[]"; but does that work in Doxygen?
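
For concreteness, the style in question (a hypothetical signature):

/**
 * @param Title[] $titles Pages to operate on
 * @param string $summary Edit summary to record
 * @return bool[] One success flag per input title
 */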

--HM


Re: [Wikitech-l] Inline styles trouble on the mobile site

2012-04-25 Thread Happy Melon
On 25 April 2012 12:56, Jon Robson  wrote:

> > We already don't validate. There's no point to trying to conform to a
> > validator when the spec/validator is wrong. And we already have cases
> like
> > that.
>
> I think we should try to validate though mostly for future proofing...
>
> > Anyways, technically you could already use scoped anyways. Just add the
> > scoped attribute. Don't use css that applies outside the content area.
> And
> > then it'll validate, it'll work in browsers, and when browsers actually
> > implement scoped they'll start restricting the scope.
> 
> > But after you've dealt with all the XSS issues; you've opened up the
> ability
> > to completely destroy the UI from within WikiText. In ways even worse
> than
> > the tricks attempting to simply cover the whole UI with a div. Those
> tricks
> > being ones you could technically eliminate by using overflow+relative on
> the
> > content area and disallowing position: fixed; (The only thing in the way
> of
> > that right now is WP's stupid page icon hack).
> >
> I think if we restricted css to templates that only trusted admins can
> edit then these problems go away somewhat, no?
>

"Is wiki admin" doesn't traditionally mean "is fully trusted not to screw
things up deliberately" and *definitely* doesn't mean "is trusted not to
screw things up accidentally".  It would be pretty easy for an admin to
accidentally add styles which screw up the rendering of the edit form and
make it difficult to undo.  And at least at the moment any such
potentially-troublesome edits are confined to one small, low-traffic
namespace.  Trying to monitor recentchanges to the whole template namespace
on a large wiki for XSS is entirely impractical.

--HM


Re: [Wikitech-l] Lua: return versus print

2012-04-13 Thread Happy Melon
On 13 April 2012 13:12, Petr Bena  wrote:

> I have no knowledge of Lua, but I don't see what the problem with print
> is here; the function print is supposed to print output to the output
> device in most programming languages, just as in this case, so I don't
> understand why we should want to use return (which is supposed to
> return some data / pointer back to the function it was called from) in
> this case. I mean, if we can pick whether print or return is the
> recommended way to print text, I would vote for print(), especially if
> it has better performance than the implementation using return.
>
> On Fri, Apr 13, 2012 at 1:45 PM, Tim Starling 
> wrote:
> > At the moment, in the Lua support extension we have been developing,
> > wikitext is output to the wiki via the return value of a function. For
> > example in wikitext you would have:
> >
> > {{#invoke:MyModule|myFunction}}
> >
> > Then in [[Module:MyModule]]:
> >
> > local p = {}
> > function p.myFunction()
> >   return 'Hello, world!'
> > end
> > return p
> >
> > This is all nice and elegant and will work. There is an alternative
> > convention commonly used in scripting languages (and programming in
> > general for that matter), using a print function:
> >
> > local p = {}
> > function p.myFunction()
> >   print('Hello, world!')
> > end
> > return p
> >
> > I would have been happy to leave it as Victor Vasiliev made it, i.e.
> > using return values, but I happened across a performance edge case in
> > Lua which made me think about it. Specifically, this:
> >
> > function foo(n)
> >   local s = ''
> >   for i = 1, n do
> >   s = s .. tostring(i)
> >   end
> >   return s
> > end
> >
> > has O(n^2) running time. For 100,000 iterations it takes 5 seconds on
> > my laptop. Apparently this is because strings are immutable, so the
> > accumulator needs to be copied for each concatenation. It's very
> > similar to the situation in Java, where a StringBuffer needs to be
> > used in such an algorithm.
> >
> > It's easy enough to work around, but the problem is obscure enough
> > that I think probably most of our users will not realise they need to
> > work around it until it becomes severe.
> >
> > It would be possible to provide a print() function which does not
> > suffer from this problem, i.e.
> >
> > function foo(n)
> >   for i = 1, n do
> >   print(i)
> >   end
> > end
> >
> > could run in O(n log(n)) time. Intuitively, I would expect that
> > providing such a print function would encourage a programming style
> > which would avoid at least some instances of repetitive concatenation.
> >
> > The performance issue is probably no big deal, since most templates
> > are probably not going to be concatenating hundreds of thousands of
> > strings, and 5 seconds is still quicker than the time it takes most of
> > our featured articles to render at the moment. But like I say, it got
> > me thinking about it.
> >
> > Does anyone have any thoughts on return versus print generally? Are
> > there other reasons we would choose one over the other?
> >
> > -- Tim Starling
> >
>

I don't see a problem with supporting both.  Considering that you don't
want the return value of *every* function to always be printed, just the
return value of the function directly called by #invoke, you can just
document #invoke as implicitly translating to "print( foo() )".  Is there
an equivalent parser tag which does *not* print output?  That would make
the parallel even clearer.

Having a print() function would be very useful for debugging; you could
turn 'debug mode' on on sandbox pages with an input arg (I assume #invoke
and friends can take arguments?) and something like
{{#invoke:MyModule|MyFunction|debug={{#ifeq:{{SUBPAGENAME}}|Sandbox|true|false}}}},
and output debugging data with more flexibility if you had a second channel
for printing.

Separately, it would be awesome to have some sort of 'intellisense' hinting
for potential pitfalls like this.  I've recently been doing a lot of work
in MATLAB, and it has a really effective hinter that warns you, for
example, when you change the size of a matrix inside a loop and encourages
you to predefine it, which is a similar concept.  I assume an editor with
syntax highlighting etc is somewhere on the development roadmap, albeit
probably fairly low down, so I guess add this as an even lower priority!

--HM


Re: [Wikitech-l] Save to userspace

2012-04-11 Thread Happy Melon
On 11 April 2012 10:48, Benjamin Lees  wrote:

> On Wed, Apr 11, 2012 at 5:18 AM, Petr Bena  wrote:
> > I have no idea; the page of the extension says that it isn't stable
>

https://bugzilla.wikimedia.org/show_bug.cgi?id=1

--HM


Re: [Wikitech-l] Changes status in Gerrit

2012-04-05 Thread Happy Melon
On 5 April 2012 11:43, Antoine Musso  wrote:

> On 04/04/12 at 22:56, Roan Kattouw wrote:
> 
> >> What that means is that a change could look fine (Verified + CR) and
> >> thus be merged by accident if someone with the correct right clicks the
> >> 'Submit Patch Set 1' button :-D
> >>
> > No, that's not true. A change requires Code Review +2 before it can be
> > submitted, not +1. And only trusted reviewers have +2 powers.
>

So basically, in Gerrit, (1 + 1 != 2)...??  :-)

--HM


Re: [Wikitech-l] Re-introducing "UNCONFIRMED" state to bugzilla

2012-03-13 Thread Happy Melon
On 13 March 2012 21:44, K. Peachey  wrote:

> Why is it even restricted? We don't restrict closing bugs.
>
>
As the rising tide of bugspam demonstrates, we *should*.

--HM


Re: [Wikitech-l] reopening an older discussion and proposing a new global right "right-special-passwordreset-access'

2012-03-11 Thread Happy Melon
On 11 March 2012 16:36, Thomas Gries  wrote:

> Background and older discussions:
> https://www.mediawiki.org/wiki/Special:Code/MediaWiki/95596
> and
>
> http://article.gmane.org/gmane.science.linguistics.wikipedia.technical/55055
> related is bug https://bugzilla.wikimedia.org/show_bug.cgi?id=35121
> "rename right-passwordreset to right-passwordreset-emailsent-capture-view"
>
> Hello,
>
> I want to propose a slight change in
> Special:PasswordReset
> similar to what I proposed last year in
> https://www.mediawiki.org/wiki/Special:Code/MediaWiki/95596
> which was then reverted in
> https://www.mediawiki.org/wiki/Special:Code/MediaWiki/95618
> because the change was not properly justified and discussed last year.
>
>
> I now suggest these changes:
>
> 1. includes/specials/SpecialPasswordReset.php
>
> class SpecialPasswordReset extends FormSpecialPage {
> public function __construct() {
> - parent::__construct( 'PasswordReset' );
> + parent::__construct( 'PasswordReset', 'special-passwordreset-access' );
> }
>
>
> Because we already have "passwordreset" (right to view the password
> reset mail when it was sent to user)
> in order to avoid confusion I choose special-passwordreset-access
> (this should be renamed as suggested in bug35121, not topic of this mail)
>
> 2. includes/DefaultSettings.php:
>
> + $wgGroupPermissions['*']['special-passwordreset-access'] = true;
>
> 3. languages/MessagesEn.php:
>
> 'right-passwordreset' => 'View password reset e-mails',
> + 'right-special-passwordreset-access' => 'Can access
> Special:PasswordReset',
>
> Is it possible to reach a consensus about the new right so I can submit
> such a change to core ?
> I'll be available in #mediawiki now.
>
> Tom
> (Wikinaut)
>
>
Is there a bug for this?

--HM


Re: [Wikitech-l] Who can make Gerrit/Labsconsole accounts?

2012-02-29 Thread Happy Melon
On 29 February 2012 14:27, Petr Bena  wrote:

> OK, in that case we should update the list with more "functions", besides
> svn etc., to "LDAP admin", or whatever is needed to create accounts for
> labs and such, and insert all missing people, like Sumana, Ryan, Sara...
>
>
https://meta.wikimedia.org/w/index.php?title=System_administrators&action=edit

The main purpose of that page is as an on-wiki reference to wiki user
accounts which might be encountered making sysadmin actions, as explained
in the first section of that page.  "Don't revert sysadmin actions" is
effectively a global policy, and the page serves an important function in
describing that policy.  As with all wiki pages on meta, it is intended
mainly for WMF wiki-users, not developers (otherwise it would belong on
mw.org).

--HM


Re: [Wikitech-l] Question: Hook:LinkEnd and Title::getNamespace method for internal page ( such as User or User_talk page ) where the target page does not exist

2012-02-22 Thread Happy Melon
On 22 February 2012 08:28, Thomas Gries  wrote:

> On 21.02.2012 at 23:57, Roan Kattouw wrote:
> >
> > I don't think so. Even non-existent Title objects must have their
> > namespace set.
> >
> Yes, they have.
>
> I have found the problem. It is not the Linker per se, so I come back
> with a modified question.
> It has to do with i18n and localisation of the (in this case) names for
> USER and USER_TALK Namespace.
>
> Basically:
> a link on a page like [[Benutzer:Alice]] is not necessarily the same as
> [[User:Alice]] (even when the latter exists).
>
> It depends on the current setting of
>
> $wgLanguageCode = "en" ;
> $wgLanguageCode = "de" ;
> (during testing my extension I played with this setting)
>
> whether [[Benutzer:Alice]] is in the User namespace or not.
>
> So I was trapped by thinking that _any_ localised Namespace (like
> "Benutzer") is necessarily the same as USER or USER_TALK,
> which was incorrect.
>
> Question:
> ===
> Has anyone an idea how to detect, language-independently, whether a link
> on a page is in the USER or USER_TALK namespace, or in a localised version
> of these (when $wgLanguageCode has been modified)?
>
> The goal is to detect and to mark USER or USER_TALK page links
> language-independently in
> function wfWikiArticleFeedsAddSignatureMarker in E:WikiArticleFeeds line
> 262 .
>
>
> Tom
>

Well if this were an on-wiki template I would suggest you normalise the
namespace name to the localised canonical name using {{NAMESPACE:}},
then compare it with a switch to the various similarly-normalised namespace
names: {{#switch: {{NAMESPACE:}} | {{ns:2}} = (user) | {{ns:3}} = (user
talk) }}.  Programmatically you'd be able to cut out a lot of circularity
in that process; just have a look at what code is used in the NAMESPACE:
parser function and see what you can reuse.
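
In PHP the check is direct, since Title resolves localised names and
aliases for you (a sketch using core classes):

$title = Title::newFromText( $linkTarget );
if ( $title && in_array( $title->getNamespace(), array( NS_USER, NS_USER_TALK ) ) ) {
    // A user or user-talk link, whatever localised or aliased
    // prefix ("Benutzer:", "BD:", ...) appeared in the wikitext.
}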

Or, and I can't quite tell which you want from your comment, are you
looking to detect when a link uses a prefix which is a User: namespace
alias in *any* language, even when that prefix is not in use on the wiki?
Why would you want to do that?

--HM


Re: [Wikitech-l] PHP 5.3 policies

2012-02-12 Thread Happy Melon
On 12 February 2012 15:29, Jeroen De Dauw  wrote:

> Hey,
>
> > On 12 February 2012 15:52, Happy Melon 
> wrote:
> > Eg?
>
> https://www.mediawiki.org/wiki/User:Jeroen_De_Dauw/DBDataObject
>

I had just twigged myself that that was what you were referring to.  That's
not really a good example of an "isolated" utility; I would interpret such
a thing to be something that core genuinely doesn't *need*, not just
something it doesn't *currently* use.  Although retro-engineering that
framework into our core data classes would be an absolute bitch, if we were
rewriting MW from scratch we would most likely use such an abstraction from
the start, and there's no reason not to do so with new core features if
practical.  Except, of course, the issue with PHP dependencies.
Essentially, the "isolation" of this utility is a product of its dependency
flexibility, *not* the other way around.  The only way such isolated
elements would occur in core is if we condone the version heterogeneity you
suggest for them!

--HM


Re: [Wikitech-l] PHP 5.3 policies

2012-02-12 Thread Happy Melon
On 12 February 2012 13:34, Jeroen De Dauw  wrote:

> Hey,
>
>
> My second question is about isolated utilities in core. With isolated I
> mean that although they might be using core, core does not use them. Can we
> at this point introduce such utilities that require 5.3, assuming there is
> good reason to have these utilities and that they cannot be made to work
> (sanely) with 5.2.x?
>

Eg?

--HM


Re: [Wikitech-l] Discussion: MW + Suhosin extension: run-time adaption of $wgResourceLoaderMaxQueryLength

2012-02-05 Thread Happy Melon
On 5 February 2012 08:55, Thomas Gries  wrote:

> Hi all,
>
> I discussed with Platonides and the Suhosin author an automatic MediaWiki
> run-time adaptation of
> $wgResourceLoaderMaxQueryLength [1-3].
>
> Currently, Suhosin and the setting of
> suhosin.get.max_value_length is detected and signalled _only_ during the
> MW installation.
>
> However, if the system (Suhosin) settings are changed after the MW
> installation,
> one needs to know about this fact and to adapt
> the setting manually in every MediaWiki installation on this server.
>
> I now suggest to add something to the core which can adapt the
> $wgResourceLoaderMaxQueryLength also during run-time but still in the
> limits given by a previous
> $wgResourceLoaderMaxQueryLength in LocalSettings.
>
> // Design idea
> //
> // In LocalSettings / DefaultSettings
> // example of a user value from e.g. LocalSettings
> // this value may be cropped at run-time
> // to suhosin.get.max_value_length (if Suhosin extension is active)
> $wgResourceLoaderMaxQueryLength = 5212;
>
>
>
> /* In MW core after LocalSettings */
>
> if ( extension_loaded( "suhosin" ) && ini_get(
> "suhosin.get.max_value_length" ) ) {
>
>   $wgResourceLoaderMaxQueryLength = min( $wgResourceLoaderMaxQueryLength,
> ini_get( "suhosin.get.max_value_length" ) );
>
> }
>
> [1] https://www.mediawiki.org/wiki/Manual:$wgResourceLoaderMaxQueryLength
> [2] https://www.mediawiki.org/wiki/Manual:Suhosin
> [3] https://github.com/stefanesser/suhosin/issues/4#issuecomment-3816249
>
>
It would make more sense to have an Extension:Suhosin which introduced
restrictions of this kind, probably in one of the hooks just after
LocalSettings.  While this is valuable functionality to have for users of
the suhosin patch, it's not applicable to the majority of MW installs.

--HM


Re: [Wikitech-l] Picture of the Year contest extension

2012-02-05 Thread Happy Melon
On 4 February 2012 23:56, Platonides  wrote:

> On 05/02/12 00:32, Mono wrote:
> > This is certainly a "project," but it could be used for every POTY
> contest,
> > perhaps some other contests, and it doesn't have to be elaborate.
> However,
> > as I said, it would be really great if we could get just a couple people
> > who have some experience developing MediaWiki extensions to help program
> > this in time for this year's contest.
> >
> > If anyone is interested in helping out or would like some more
> information,
> > please contact me as soon as possible - anything helps!
> >
> > Thanks,
> > User:Mono
>
> That's a good idea. It is also suitable for new developers getting
> practice. I can help with this.
> The most complex bit would be the possible user requirements.
> Other preferences need to be specified, but its implementation should be
> straightforward.
>
> I think the next step would be gaining feedback and collecting what the
> extension should do at some page, such as
> https://www.mediawiki.org/wiki/Extension:ImagePoll
>

Do you need users to be able to vote for any picture on commons, or just
select from a shortlist gallery?  SecurePoll already allows you to use full
wikimarkup in question/option descriptions; you could just put image tags
in each option and quite easily build a gallery that way.

--HM


Re: [Wikitech-l] Some questions about #switch

2012-01-31 Thread Happy Melon
On 31 January 2012 14:05, Nickanc Wikipedia  wrote:

> > I'm considering introducing a limit on #switch cases of 2000 or so per
> > article, to address this issue. No doubt many templates will break,
> > but it's important to protect our servers, and we've always
> > discouraged this kind of #switch application.
> >
>
> In my opinion, we should first check and list which templates would
> break, suggesting fixes as soon as possible. With so many distinct
> cases, a 2000-case #switch is probably an overused template on its
> wiki. Then introduce the limit.
>
> Nickanc
> [[m:User:Nickanc]]
>
>
Could this be implemented through a parser tracking category?  IIRC the
preprocessor generates other tracking categories when it hits its existing
limits; can it easily add a category *without* affecting the rendering?
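
If the same machinery is reusable, the change would look something like
this inside the #switch handler (a sketch; the message key and the limit
variable are invented, and I haven't checked what the ParserFunctions code
actually has in scope):

if ( count( $cases ) > $wgMaxSwitchCases ) {
    // Adds the message-defined tracking category without changing
    // the rendered output of the page.
    $parser->addTrackingCategory( 'switch-case-limit-category' );
}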

--HM


Re: [Wikitech-l] Please help test MediaWiki 1.19 now - deployments may start in February

2012-01-26 Thread Happy Melon
On 26 January 2012 21:53, Sumana Harihareswara wrote:

> On 01/26/2012 04:45 PM, Chad wrote:
> > On Thu, Jan 26, 2012 at 4:31 PM, MZMcBride  wrote:
> >> What's this about "taking your site down for maintenance"? I don't think
> >> Wikimedia wikis have needed to go into read-only mode or go offline(!)
> for
> >> an upgrade since like MediaWiki 1.5 or something. I skimmed
> >>  and didn't see any
> crazy
> >> database schema changes or anything. Am I missing something?
> >>
> >
> > Not that I can see, there's no major schema changes this go
> > round so I can't see any reason we'd need to go offline/readonly.
> >
> > -Chad
>
> OK.  I was under the impression that we announce these windows partly
> because if something goes very wrong during the upgrade, sites might go
> offline or readonly.  Is that wrong or inapplicable here?  Perhaps
> better wording would instead just talk about possible JS breakage or
> some other, more likely ill effects?  Let me know and I'll send a
> followup mail.
>
> The risk of a new software bug sending a wiki offline is about the same
during a version update as for any other individual update: more changes
are being pushed, but more testing has been done on them.  Breakages can
almost always be remedied by rolling back the deployment, especially now we
have the het-deploy framework in good working order, and the more serious
the breakage, the faster it will be noticed.  Updates leaving the sites
readable but readonly are pretty implausible, and a breakage leaving a site
offline for a substantial period of time pretty unlikely.  I'd say
scheduling around 250 sets of national events against such remote
eventualities is substantial overkill.

--HM


Re: [Wikitech-l] Wording (RE: SOPA banner implementation)

2012-01-17 Thread Happy Melon
On 17 January 2012 15:23, Markus Krötzsch wrote:

> On 17/01/12 15:01, Daniel Barrett wrote:
> >> http://test.wikipedia.org/?banner=blackout
> >
> > As a writer, I believe the current message ("The Wikipedia community has
> authorized...") is long and wordy and therefore not likely to be read by
> most users. I recommend shortening & simplifying it. Here's an example that
> removes 30+ words and preserves the meaning:
> >
> > WE NEED YOU TO PROTECT FREE SPEECH ONLINE
> > The English Wikipedia is "blacked out" for 24 hours to protest two bills
> before the United States Congress, known as SOPA and PIPA. These bills
> endanger free speech in the United States and abroad, setting a frightening
> precedent of Internet censorship for the world.
> >
> > Today we ask you to take action.
> > [[Take action]] [[Learn more]]
>
> +1
>
> The first sentence is really too complicated.
>
> Markus
>

+1 for the first sentence.  The question of "authorised *who*?" is one that
only serves to distract attention from the main message.  I prefer the
current wording of the legislation description: I don't think people will
mind reading a few extra words there... it's not like they have anything
else to read!!  :-)

That said, I'm sure this list is not the best place to discuss the banner
wording.  Where is?

--HM


Re: [Wikitech-l] JavaScript on Special:UserLogin?

2012-01-12 Thread Happy Melon
On 12 January 2012 02:47, Daniel Barrett  wrote:

> On 11 January 2012 21:51, Daniel Barrett  wrote:
> >> * Remove any trailing "@companyname.com" from the username. Users in
> our company are accustomed to logging in this way on
> >>their Windows boxes, and we'd get several support calls per week from
> people who "can't log into the wiki" because they
> >>were adding @companyname.com onto their wiki usernames.
>
> >The way we do it on our Mantis bugtracker is to use the LDAP server
> >for all logins. Something like that for MediaWiki would do the job
> >too. One password!
>
> Yes, we use the LDAPAuthentication extension for MediaWiki. But usernames
> on the login page still must be "foo" rather than "f...@companyname.com".
> Hence the JavaScript to remove @.*.
>
> DanB
>
>
Of course, if the login form code wasn't such a swamp, there'd be a hook
you could use to preprocess the usernames server-side...  :-(

--HM


Re: [Wikitech-l] JavaScript on Special:UserLogin?

2012-01-11 Thread Happy Melon
On 11 January 2012 21:27, Daniel Barrett  wrote:

> In 1.18.0, the page Special:UserLogin no longer runs the JavaScript in
> MediaWiki:common.js.  Is this an intentional change from 1.17, and if so,
> is there a workaround to make custom JS run on the login page?
>
> (I tested this by putting alert('foo') in MediaWiki:common.js.  The alert
> appears on all articles & special pages except Special:UserLogin.)
>
> Thank you,
> DanB
>

Yes, no user-editable scripts are run on pages where password forms reside,
because it is trivially easy for users to use them to introduce
password-sniffing JS attacks, either deliberately or inadvertently.  Or
that's the idea, at least; IIRC there's an open bug about gadgets running
somewhere they probably shouldn't, etc.

You could probably hack around it by using one of the many hooks in the
page rendering stack to add the JS module on the pages where it's not
already there, but before doing so it's worth thinking about the reason why
it was removed in the first place.
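
For example, a sketch of the kind of hack I mean (re-adding the
user-editable 'site' module, i.e. MediaWiki:Common.js, on the login page;
read the paragraph above again before shipping this):

$wgHooks['BeforePageDisplay'][] = 'efSiteJsOnLogin';

function efSiteJsOnLogin( OutputPage $out, Skin $skin ) {
    if ( $out->getTitle()->isSpecial( 'Userlogin' ) ) {
        // Deliberately undoes the protection described above.
        $out->addModules( 'site' );
    }
    return true;
}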

--HM


Re: [Wikitech-l] should we keep $wgDeprecationWhitelist

2012-01-11 Thread Happy Melon
On 11 January 2012 19:04, Aaron Schulz  wrote:

> My inclination is to get rid of the feature for pretty much the reason you
> mentioned.
>
> In any case, is the point to avoid notices getting added to HTML for wikis
> with certain extensions? I don't know why a production site would be set to
> spew notices. Either error log settings or $wgDevelopmentWarnings can
> handle
> this. If it's to avoid them in the log, again, $wgDevelopmentWarnings
> works.
>
> IMO, the notices are most useful for core & extension developers testing
> their code (who deliberately let all warnings get spewed out). If the dev
> has time to work on other extensions, and has the affected one enabled on
> same test wikis that have other extensions being worked on, *then* it might
> be useful to hide certain warnings. However, it seems better to just delay
> the deprecation in core a cycle. The use case for the new global just seems
> too marginal and it seems pretty awkward and hacky.
>

I generally agree that this is not a good addition.  If a developer
suppresses warnings, he will then proceed to forget about the warning that
was suppressed, that's just a simple fact of life.  Then the whole benefit
of the warning system is negated; the developer is just as likely to update
their version and find that things break as if they had turned off
deprecation warnings altogether.

Rather, if a developer upgrades from 1.16 to 1.17 and notices a new
interface and a @deprecated warning on the old one, concludes that he
cannot fix it until 1.19 is released and he can drop support for 1.16, then
upgrades to 1.18 and starts getting warnings, he should hack core to remove
the wfDeprecated statement from the old interface.  Then when he upgrades
to 1.19, the probably-now-forgotten hack is overwritten and the warnings
reappear, reminding him that he should now start seriously thinking about
reworking the extension code.  He can then update, test and release a
1.19-compatible version of the extension before the core function itself is
removed in 1.20.

--HM
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] LiquidThreads status

2012-01-04 Thread Happy Melon
On 3 January 2012 23:15, Erik Moeller  wrote:

>
> "Next" though would likely still mean no sooner than March/April. Any
> help is as always appreciated.
>

You say that help is always appreciated, and I think that, *in general*, it
is; but with regard to specific projects, particularly where there's a lot
of WMF staff involvement, I'm not entirely convinced.  I've been explicitly
*dis*couraged from taking a stab at projects because they were 'on the
list' of staff todos (one, ironically, Andrew's project for *after* LQT...
:-D ).  Unless you give volunteer devs more detail about what they *can*
productively do that won't be duplicated or wasted effort, I doubt anyone
will be able to do very much useful towards this (or any other) project.

Perhaps this, as something that a *lot* of people from everywhere on the
spectrum care about and would probably be enthusiastic to work towards, is
a good opportunity to try a more structured approach towards parceling out
work?  Publicise clearly whatever specification is currently floating
around the office (or take the time *now* to define it if it's not
already); make it clear what work needs to be done, and especially what
would make a good isolated module; and then come back in April and see if
anything has materialised that you can use?  If nothing useful has been
done, you've wasted no time or effort, just rescheduled it a bit; if good
work has been done then you've accelerated the project; and if work has
been done but it doesn't fit with what the staff produce, then you know
that the enthusiasm is there but that you need to work on your
communication for the next project.

--HM
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] grouping users - an idea for a new SUL improvement

2012-01-01 Thread Happy Melon
Basically this comes back again to the "cool... oh wait SUL" thought
process...

Way back at the dawn of CentralAuth, the whole merging accounts 'thing' was
hopefully touted as a transitional phase, with the ultimate nirvana being
having every username on every wiki either part of a global account,
reserved for a global account, or universally available; whereupon SUL
would cease to be an issue for projects like this (User:BobBot is a
universal global account affiliated to the User:Bob global account,
everywhere).  Does anyone have any stats on how far short we are of that
goal?  As in, what fraction of accounts on all wikis are still part of the
'messy' part of SUL rather than the 'clean' part?

--HM



On 1 January 2012 23:27, Platonides  wrote:

> Looks good. Although I'd rename the "member of Group:Bob" to "belongs to
> User:Bob", so you would have:
> User:Bob
> User:BobBot - belongs to User:Bob
> User:Bob (testing) - belongs to User:Bob
> User:Bob (vector skin) - belongs to User:Bob
>
>
> Although this opens the can of accounts with different names on several
> wikis.
> User:BobBot may belong to User:Bob everywhere but in wikis Foo, Bar and
> Baz, where Bob username was taken and he is known as 'Bob2'.
> Showing that "BobBot is of Bob" may be a bit confusing as in that wiki
> Bob is a different guy (even if coherent due to usage of sul usernames).
>
> It's not a problem to have sul Bob2 as belonging to Bob, but the local
> Bob may belong to global "Bob Smith". And if we start user groups, the
> Bob Smiths out there will ask for them to be recognised.
> Maybe there could be local accounts attached to user groups separately
> from sul ones.
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Tag extension detecting its tag name?

2011-12-21 Thread Happy Melon
I don't know if you can access that data directly from the extension
callback, but you can certainly wire it in without *too* bad a hack:

foreach ( array( 'foo', 'bar', 'baz', 'quok' ) as $var ) {
  $parser->setHook( $var, "WrapperClass::myCallback_$var" );
}

class WrapperClass {
  // __callStatic must be declared public static (and needs PHP 5.3+);
  // it recovers the tag name from the fake method name and appends it
  // as an extra argument to the real callback.  The explode() limit
  // keeps tag names containing underscores intact.
  public static function __callStatic( $fname, $args ) {
    list( $junk, $var ) = explode( '_', $fname, 2 );
    $args[] = $var;
    return call_user_func_array( 'myCallback', $args );
  }
}

function myCallback( $input, $args, $parser, $frame, $var ) {
  return 'hello world';
}

It's pretty obviously a retrofitted design change, but it's fairly robust,
especially if you keep the first two bits of code together...

--HM

On 21 December 2011 15:29, Daniel Barrett  wrote:

> If I create a tag extension like this:
>
> $parser->setHook('foobar', 'myCallback');
> function myCallback($input, $args, $parser, $frame) {
>  return 'hello world';
> }
>
> can the callback "myCallback" efficiently detect the name of the parser
> tag, "foobar", that invoked it?
>
> The business problem is this: I use the same callback with 20 different
> tag names, and I'd like the behavior to change slightly depending on which
> tag name was used. (The tag names are the names of database servers in my
> company, and the callback accesses the particular database.)
>
> Right now I am using a very inefficient solution: dynamically creating 20
> callbacks (one for each tag name), each with slightly customized behavior.
> I'd rather do it with one callback.
>
> I realize this would be easy if I'd used a single tag name plus a variable
> argument, but it's too late to make that change. (The tags have been used
> 10,000 times and are widely known in our company.)
>
> Thank you very much.
> DanB
>
> ps: I asked this a few years ago, when the answer was "no," but maybe
> things have changed by now.
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] irc bot

2011-12-15 Thread Happy Melon
On 15 December 2011 19:49, Petr Bena  wrote:

>
> If you believe that current bot is better or that more people would
> like to maintain its source feel free to insert it to svn, and later
> when it is ready to labs (since I don't know java I am really unable
> to help you with that). I didn't want to start a war what language /
> bot is better.
>

I am indeed struggling to see why a volunteer donating their time and
skills to produce a tool which is useful to the community, committing that
code in a way that makes it as accessible as possible to other developers,
prioritising its stability and availability, and soliciting feedback from
other members of the community, is in some way something that should be
treated with scorn and negativity.  If it has both better current
functionality than the current code, and more energy for development from
its maintainer(s), then it's a useful addition to our suite of tools; if
not then it's a useful addition to the corpus of open source software in
general.  In no circumstances is its existence anything worse than neutral.

--HM
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Compiling Debian Packages (was Re: 1.18 PHP version requirement)

2011-11-19 Thread Happy Melon
On 19 November 2011 18:25, Dmitriy Sintsov  wrote:

> On 19.11.2011 20:23, Olivier Beaton wrote:
> > And it should be noted that all you rename in debian-apache are softlinks
> > in the mods-enabled and sites-enabled directories.  You would keep
> regular
> > non-prefixed filenames in mods-available and sites-available. Using this
> > method you could also allow some of your users to own the files
> themselves
> > (although this is a security risk, you may want to install a panel if you
> > have many users).
> >
> > If you don't need to micromanage the load order of a module, debian
> > provides some useful commands for enabling and disabling mods and sites.
> >
> > a2enmod   a2dismod,
> > a2ensite   a2dissite.
> >
> > They merely take the name of the module / vhost in -available directories
> > and it creates or removes the site without having to comment out large
> > chunks of files.  And by using these soft links you can keep the
> > configuration for the vhost while still removing it, without having to
> move
> > it or rename it.  It's convenient.
> >
> > > From what I understand all this isn't great with puppet though, and
> many of
> > those users (like wmf) just flatten the whole thing into a single
> > configuration file, which is your prerogative.
> >
> >
> If one wants to have their virtual hosts in separate small files, there
> is much better setup than Debian / Ubuntu offers:
> 1. Create separate small virtual hosts files as usual.
> 2. Include them line by line in main httpd.conf file.
> 3. Change of virtual hosts order is a simple line cut / paste in
> httpd.conf file. Enabling / disabling one virtual host is just a #
> character in the start of the line.
> 4. No symlinks, no silly numeric prefixes.
> 5. The utility that manages these one line includes in httpd.conf can be
> developed as well.
> 6. Single httpd.conf file read/write is atomic.
> 7. There is no guarantee that every FS will list the filenames in dir in
> ascending literal order. Although modern do, of course.
> etc.
> Dmitriy
>

"Better" here is clearly in the eye of the beholder.  The system you've
just described is virtually semantically identical to the Debian/Ubuntu
setup: you have a load of small vhost files, create a list of them sorted
however you want, and get apache to load them all in order.  Having the
list as symlinks is easier and safer to script, as the corresponding OS
filesystem functions are very well defined, but ordering the list is a
little more difficult.  If you put the list in httpd.conf as a list of
includes you are dependent on having a safe and effective system for
editing it via script, which is substantially harder to develop; but the
ordering is a little more elegant.

Personally I'd say that the list-of-includes method was the "ugly option"
"reminding me of ancient programming languages" and the symlink method was
the "better setup", but there's no cast-iron reason why it should be so;
it's basically a personal preference.  It's the sort of distinction you'd
debate over a beer or ten, not one to discuss with the expectation of a
meaningful outcome.

--HM
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Compiling Debian Packages (was Re: 1.18 PHP version requirement)

2011-11-19 Thread Happy Melon
On 19 November 2011 12:27, Dmitriy Sintsov  wrote:

> On 19.11.2011 2:15, Olivier Beaton wrote:
> > Debian already solves this through a rename hack.  For example the
> default
> > virtualhost is named 000-default   so that it gets loaded first.
> >   Similarly, I've had to rename module links so they are loaded before
> > others (dav before svn).  It's fairly straight forward and once you have
> a
> > lot of modules or vhosts, you'd curse every time you opened a 5,000+ line
> > conf file.
> >
> >
> I don't think that is a good solution. Because inserting / moving a
> vhost in-between requires a rename chain and multiple filename renames
> probably are not atomic. Can one make multiple renames in one kind of
> transaction (locking the dir, multirename, unlocking)? I don't have any
> troubles opening single 350 line conf file in vim (with syntax
> highlighting) and after copy / cut / paste, storing the "monolithic"
> vhosts.conf file is atomic (like transaction).
> Dmitriy
>

No, you can always create a new filename that will sort between two others
without needing to rename either of the other two.  And you can always
change *a* filename without altering the sort order of the collection.  So
if you have a "000-default" file and then a set of "100-foo" files that
must load after default (but it doesn't matter how they load amongst
themselves), then another set of "200-foo" files that must load after the
100- files, you can always choose a name ("050-", "100-", "150-", "200-" or
"250-") that will cause it to load at the time you need, without having to
rename any other files.

--HM
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Coding Convention: Method chaining

2011-11-16 Thread Happy Melon
>
>
> On Tue, Nov 15, 2011 at 11:50 PM, John Du Hart 
> wrote:
> > Right now our coding conventions manual never touches on method chaining,
> > nor have I personally seen this practice in core. So I'm interested in
> what
> > the rest of the community feels about adapting this practice more, and if
> > there are trade offs I'm not aware of. Let's make an example, take this
> > code from Abuse Filter:
> >
> > $out = $this->getOutput();
> > $out->setPageTitle( wfMsg( 'abusefilter-examine' ) );
> >  $out->addWikiMsg( 'abusefilter-examine-intro' );
> >
> > So, instead of writing it like that, it could be written
> >
> >  $this->getOutput()
> > ->setPageTitle( wfMsg( 'abusefilter-examine' ) )
> >  ->addWikiMsg( 'abusefilter-examine-intro' );
> >
> > It's just another style I've encountered on other projects and I
> personally
> > like.
> >
> > --
> > John Du Hart
>

All our (newer) JavaScript is full of this sort of syntax, although it does
somehow look messier with the right-arrow than the period in jQuery.  I
don't see a problem with using the technique in our PHP code as well.
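
For what it's worth, all it takes on the PHP side is for each setter to
return $this; a generic sketch (hypothetical class, not the real
OutputPage):

class ExampleOutput {
	private $title;
	private $messages = array();

	public function setPageTitle( $title ) {
		$this->title = $title;
		return $this; // returning $this is what makes the chain work
	}

	public function addWikiMsg( $msg ) {
		$this->messages[] = $msg;
		return $this;
	}
}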

--HM
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Temporary password too short

2011-10-30 Thread Happy Melon
On 30 October 2011 15:46, Thomas Dalton  wrote:

> On 30 October 2011 15:38, Neil Harris  wrote:
>


> > However, this is way, way, way lower risk than the current risk of
> > brute-forcing low-hanging-fruit user passwords...
>
> A password from /dev/random is extremely insecure.
>

I don't believe these two statements are in any way mutually exclusive.
There are degrees of "extremely insecure" in which "password1" ranks
significantly higher than "the password I keep on the post-it in my desk
drawer".  One is very weak in the face of anyone connected to the internet,
one is very weak in the face of anyone who has access to your office.
Significantly more people have access to the internet than have access to
your office/home/phone/filesystem.  Neither threat is negligible, both are
worth taking sensible measures to counter.  But the point where the
conversation loses all sense of perspective is when it loses all level of
utility.

--HM
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] #parser parser function - does this make any sense?

2011-10-30 Thread Happy Melon
On 29 October 2011 15:08, Platonides  wrote:

> Daniel Werner wrote:
> > I am thinking about creating a very simple parser function #parse doing
> > nothing but returning parameter 1 with an "'noparse' => false" option.
> > Is there anything like this (or what could be abused for this) already
> > or is there any reason why this might be a bad idea?
> >
> > The reason I want to have something like this is, I want to create a
> > template (for template and parser function black-box tests) accepting
> > something like {{((}}#somefunction:a{{!}}b{{!}}c{{))}} as parameter
> > value, showing {{#somefunction|a|b|c}} as output and at the same time
> > calling {{#parse: {{((}}#somefunction:a{{!}}b{{!}}c{{))}} }} so that
> > besides the definition also the result can be shown by the template
> output.
> >
> > regards,
> > Daniel
>
> I think that would make more sense as a tag extension (parse doesn't
> look like a good name, what about ?).
>
> @Happy Melon: I think he wants a funtion which shows both parsed
> wikitext and the original source.
>
>
He intends to *build* such a structure, certainly; but I read the OP as
saying he wanted to implement it via a template like {{demonstrate
template}} [1] but with (just) the backend handled by a new parser
function.  I agree that you'd be better off/would avoid many of the
problems given above by having a tag extension wrapping
{{foo|bar|baz=quok}} that spat out its contents as a
parameter to a customisable system message that read something like
"$1 produces: $1".  If I remember the parse
order of tag extensions versus parser function extensions right, that
should work pretty much straight out of the box??
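
Something along these lines, perhaps (tag name made up, and this is an
untested sketch that skips the system-message indirection):

$wgHooks['ParserFirstCallInit'][] = 'wfDemoTagInit';
function wfDemoTagInit( $parser ) {
	$parser->setHook( 'demo', 'wfDemoTagRender' );
	return true;
}

function wfDemoTagRender( $input, array $args, Parser $parser, PPFrame $frame ) {
	// Show the raw wikitext escaped, then its rendered result.
	$source = '<pre>' . htmlspecialchars( $input ) . '</pre>';
	$result = $parser->recursiveTagParse( $input, $frame );
	return $source . ' produces: ' . $result;
}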

--HM

[1] https://en.wikipedia.org/wiki/Template:Demonstrate_template
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] #parser parser function - does this make any sense?

2011-10-29 Thread Happy Melon
On 29 October 2011 01:01, Daniel Werner  wrote:

> I am thinking about creating a very simple parser function #parse doing
> nothing but returning parameter 1 with an "'noparse' => false" option.
> Is there anything like this (or what could be abused for this) already
> or is there any reason why this might be a bad idea?
>
> The reason I want to have something like this is, I want to create a
> template (for template and parser function black-box tests) accepting
> something like {{((}}#somefunction:a{{!}}b{{!}}c{{))}} as parameter
> value, showing {{#somefunction|a|b|c}} as output and at the same time
> calling {{#parse: {{((}}#somefunction:a{{!}}b{{!}}c{{))}} }} so that
> besides the definition also the result can be shown by the template output.
>
> regards,
> Daniel
>

So basically a function which double-parses wikitext?  There are quite a
few potential gotchas with that: firstly losing the parser state if you
don't implement it properly and throwing UNIQ...QINUs everywhere; although
that can be avoided.  I expect you'd get a lot of double-escaping: {{#parse:
[[Foo]] }} is going to come out as "<a>Foo</a>" or somesuch, as the double brackets
will be expanded to an <a> tag on the first parse, then the tag will be
escaped on the second.  It would probably be easy to get very deeply nested
as well.  How would you handle pre-save-transform substitutions like
signatures and pipe tricks?

--HM
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] So how does WMF set up its Apaches?

2011-10-27 Thread Happy Melon
On 27 October 2011 18:32, Ryan Lane  wrote:

> >> [1] that is being updated. A lot of the live configs are also on NOC
> >> [2]. Between sites like NOC and access to Puppet (via Git), you've got a
> >> majority of the data AS it is actually used (rather than as it was,
> >> when written, on wiki).
> >
> > Well, sure, but that's nap-of-earth; without a solid 5000ft view
> grounding
> > you, I suspect it's hard to make use of.
> >
> > I haven't yet read the book-prep page that was posted the other day; on
> > reflection, I suspect that (and the book chapter that comes from it) will
> > tell the things I'm looking for.
> >
>
> Basically everything is on noc and in puppet. Those will always be
> more up-to-date than our documentation. That said, most of our newer
> services are very well documented, and many of our older services have
> at least adequate documentation.
>
> AFAIK, the only services with less than adequate documentation are the
> PDF servers and search. Mobile is slightly out of date since we just
> changed that entire architecture, but it'll be up to date soon.
>
> - Ryan
>

That's not to say that the configuration files *themselves* couldn't do with
a bit of a spring-clean, though.  There are endless lines of commented-out
hacks and twisted override-chains that make it very difficult to see where
some things are configured from and what they are configured to.  Of
course, changing them runs the risk of inadvertently removing some edgecase
that was providing "expected behaviour" for some site; but that risk is run
every time the config is updated *anyway*, it might as well be done in a
controlled fashion.

--HM
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] File::getPath() refactoring

2011-10-19 Thread Happy Melon
On 19 October 2011 23:06, Platonides  wrote:

> El 19/10/11 21:41, Russell Nelson escribió:
> > In order to support viewing deleted files (currently a blocker bug in
> > SwiftMedia), I'm going to refactor File::getPath() into a new public
> > function File::getLocalPath(), which will return an instance of a new
> > class TempLocalPath, which will have two methods: getPath(), and
> > close(). This class will own a local copy of the file. When it goes
> > out of scope or its close() method is called (same thing), any
> > resources held by the class will be freed.
>  >
> > With the upcoming FileBackend class and subclasses, this class will be
> > a requirement. Since I need it anyway, I may as well do the work now
> > to create it. File::getPath() will remain as a call, but it will throw
> > an exception if SwiftMedia is installed. When I get finished, its only
> > uses will be by extension writers who have chosen not to publish their
> > code in our SVN.
>
> Why is this needed?
> A FileSwift class which needed to fetch the file from Swift could
> acquire it when calling getPath(), and have it deleted on FileSwift
> destruction (getPath() is usually called from several methods, so it
> isn't practical to acquire and destroy every time).
>

Being able to have files as resources which can be made temporarily
available and then destroyed is valuable for deleted images even outside
SwiftMedia, though, no?  The current system where we serve images through
index.php just to add authentication is not pretty...

--HM
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] File::getPath() refactoring

2011-10-19 Thread Happy Melon
On 19 October 2011 20:41, Russell Nelson  wrote:

> In order to support viewing deleted files (currently a blocker bug in
> SwiftMedia), I'm going to refactor File::getPath() into a new public
> function File::getLocalPath(), which will return an instance of a new
> class TempLocalPath, which will have two methods: getPath(), and
> close(). This class will own a local copy of the file. When it goes
> out of scope or its close() method is called (same thing), any
> resources held by the class will be freed.
>
> With the upcoming FileBackend class and subclasses, this class will be
> a requirement. Since I need it anyway, I may as well do the work now
> to create it. File::getPath() will remain as a call, but it will throw
> an exception if SwiftMedia is installed. When I get finished, its only
> uses will be by extension writers who have chosen not to publish their
> code in our SVN.
>

Is there a more standardised name than "close"?  "dispose" is pretty common
in compiled languages; do we have any sort of standard for that behaviour
within MW?  If not, is this a good opportunity to create one?

Other than that it sounds like a good plan to me.
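
For reference, the kind of thing I'm picturing (a sketch based purely on
the description above, not actual code):

class TempLocalPath {
	private $path;

	public function __construct( $path ) {
		$this->path = $path;
	}

	public function getPath() {
		return $this->path;
	}

	// Free the local copy; the destructor calls this too, so going
	// out of scope and calling close() really are the same thing.
	public function close() {
		if ( $this->path !== null ) {
			unlink( $this->path );
			$this->path = null;
		}
	}

	public function __destruct() {
		$this->close();
	}
}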

--HM
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] ResourceLoader and IE's stylesheet limits

2011-10-13 Thread Happy Melon
"Internet Explorer then generates another POST request if the original
request was a POST request. Or, Internet Explorer may send a GET request
instead."

Lolfacepalm.

--HM


On 13 October 2011 21:14, Platonides  wrote:

> Applies to:
> Windows Internet Explorer 9
> Microsoft Internet Explorer 4.01 Service Pack 1
> Microsoft Internet Explorer 6.0
> Microsoft Internet Explorer 6.0 Service Pack 1
> Windows Internet Explorer 7
> Windows Internet Explorer 8
>
> How's that it isn't fixed in newer versions?
> I expected that even if something like that failed in old versions, it
> would have already been fixed. :(
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] New committers

2011-10-12 Thread Happy Melon
On 12 October 2011 14:33, Sumana Harihareswara wrote:

>
> That said, if you find a really easy CSS or Javascript bug, toss it
> to me and I'll try it? :)
>

Well, there's nothing *difficult* about bug 1  ;-)

--HM
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Preliminary git module splitup notes (MediaWiki core & extensions)

2011-10-09 Thread Happy Melon
On 9 October 2011 22:33, Niklas Laxström  wrote:

> On 9 October 2011 21:52, Daniel Friesen  wrote:
> > On 11-10-09 11:20 AM, Platonides wrote:
>
> > What I've been thinking isn't so much putting translations in another
> > repo, or even TWN doing any commits anywhere at all. Frankly the whole
> > idea of TWN reading and writing .php files has felt completely messed up
> > to me anyways. Sure our canonical message forms can be in .php, but
> > having the semi-automated system we use to translate to every other
> > language we support output php files feels like a relic of a time before
> > it existed and a band-aid hack just to make it possible for TWN to do
> > translations back then.
>
> Huge +1. I would sincerely welcome a move away from PHP-based i18n
> files. Having data in executable format is just stupid imho.
>

I don't really see how changing the format is going to have any impact by
itself.  Whether PHP, XML, a hand-rolled data format or anything else, it
still doesn't play nicely with version control.  Fundamentally we want to
make changes to content in a version-controlled project, and we want
everyone to have the latest versions of those changes; but we don't want the
version history *of the i18n content* mixed in with the version history of
the rest of the project.  The solution to that issue is obvious and has
nothing to do with the file format: if you don't want your changes showing
up in the version history of your repository, make your changes outside the
repository!


>  > I'd like to make TWN the proper source for all the translations. Rather
> > than TWN spitting out php for non-en, we have a proper generated output
> > format for translations, and MediaWiki uses that instead of .php for our
> > translations. Instead of TWN having to make this a commit somewhere, I
> > think we should pull those translations right from TWN once we need them.
>
> I'm not sure I want to add that burden to TWN right now. It's just
> single vserver with no uptime guarantees.
> I'm not opposed to the idea though - having efficient l10n update in
> the core, enabled by default, providing always up-to-date translations
> and perhaps also loading new languages on demand[1] would soo awesome.
> But like I said, that would need some serious effort to code and to
> make stable and secure content distribution channel. Any volunteers?
> :)
>
>
This would seem something of a slap in the face to anyone running an
'unplugged' wiki (on an intranet or in a low-connectivity area); *especially* one
using a language other than English.  I doubt this would play very happily
with the Foundation's vision of bringing offline content to, for instance,
rural Africa.  Not all MediaWiki installations run on a webserver
permanently hooked into the internet; some of them run from a USB stick
plugged into an OLPC laptop in the middle of the middle of nowhere.

--HM
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Preliminary git module splitup notes (MediaWiki core & extensions)

2011-10-05 Thread Happy Melon
On 5 October 2011 22:30, Brion Vibber  wrote:

> On Wed, Oct 5, 2011 at 1:59 PM, Brion Vibber  wrote:
>
> This could still end up pulling from 600+ repos -- if there are actually
> changes in them all! -- but should make typical cases a *lot* faster.
>

Pushed localisation updates from translatewiki would produce precisely this
effect (minor but nonzero changes to hundreds of repos) on a daily or at
least weekly basis.  :-(

--HM
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Making developer access easier to get

2011-10-04 Thread Happy Melon
On 4 October 2011 22:42, Yaron Koren  wrote:

> Hi,
>
> So let me propose a technical solution that is hopefully fully
> feasible: instead of separating out access by core MediaWiki vs.
> extensions, separate it out as: 1) core MediaWiki, 2) extensions used
> on public Wikimedia wikis, and 3) all other extensions. Or the
> separation could even be just MediaWiki core + WMF-used extensions vs.
> other extensions - two sets of code, separated by whether they're used
> on Wikipedia and the like.
>

This would be great, apart from one thing: new extensions are installed on
WMF wikis on a fairly regular basis.  Moving extensions between SVN repos,
or even between separate branches, would be extremely disruptive right
across the board, so this disctinction would at best be accurate only at the
time the repository was reconfigured.  At worst, we'd end up forking
extensions once they were installed on the cluster, with all new development
being made to the WMF version while users of the old version are left to
rot, blisfully unaware that the code they are using is no longer getting any
TLC.


> This ties in to something else I've thought about for a while: that
> maybe the CodeReview protocol should similarly be bifurcated, so that
> non-WMF extensions like, say, SocialProfile, don't get the same level
> of scrutiny as extensions like ParserFunctions. As an extension
> developer, I'm grateful for all the helpful reviews that my commits
> get - but if all that reviewing work is coming at the expense of the
> reviewing of code that's used on WMF sites, then it seems like a waste
> of resources.
>

This already happens, although more by default than through design.  Whole
swathes of commits to extensions and branches are already automatically
marked "deferred" in CR; usually that means that the amount of code review
it will receive is nil.  Our problems with code review are not due to the
distraction of random extensions!

--HM
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Native HTTPS support live on all projects

2011-10-03 Thread Happy Melon
On 3 October 2011 21:18, Brion Vibber  wrote:

> On Mon, Oct 3, 2011 at 12:13 PM, Ashar Voultoiz 
> wrote:
>
> > Can you possible enable $wgSecureLogin on all wiki?  The feature let you
> > login under HTTPS when you are come from HTTP.
> >
> > Man page:
> >  http://www.mediawiki.org/wiki/Manual:$wgSecureLogin
> >
> > Revisions:
> >  http://www.mediawiki.org/wiki/Special:Code/MediaWiki/75585
> >
>
> Hmm, this seems to indicate it will return you to http: after
> authenticating; this is an unsafe practice which I would recommend strongly
> against.
>
> If you log in on HTTPS, we want to make sure that no session data (eg login
> cookies) can leak to HTTP -- where someone on your wireless network could
> hijack your session, delete a thousand pages on Wikipedia, and get your
> account locked out.
>

The $wgSecureLogin thing was and is a treatment for a symptom of the problem,
not its cause.  Once the bugs brion mentions are worked through, we should
be encouraging all logged-in users to go to, and stay with, SSL.  What
$wgSecureLogin should do is prompt all visits to Special:UserLogin to be
redirected to https, and *not* send them back.

--HM
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Making developer access easier to get

2011-10-03 Thread Happy Melon
On 3 October 2011 18:00, Jack Phoenix  wrote:

>
> > And out of curiosity - is there a new policy in place?
>
> I wouldn't know, as the process has changed over the years, but I have to
> say that I liked it when commit access requests were on the MediaWiki.org
> wiki ([[mw:Commit access requests]]) -- IMO it was a better and more
> transparent way to manage commit access requests than an OTRS queue or
> whatever is used nowadays; then again, I'm just giving suggestions here,
> I'm
> not here to make any decisions as I'm not employed by the Foundation.
>

The biggest problem of the old mw.org queue was that it was simply neglected
for months at a time; my own commit access request was up there for over six
months before *anyone* looked at it *at all*.  I agree that it was more
transparent and maybe 'better'; but the most important requirement of the
system is that it *works* and is used.  If the OTRS queue works for the
current svn admins, then that's an important merit.

--HM
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Counting revisions in 2011090

2011-09-29 Thread Happy Melon
On 28 September 2011 16:41, Brion Vibber  wrote:

> On Wed, Sep 28, 2011 at 3:27 AM, K. Mueller  wrote:
>
> > Platonides  gmail.com> writes:
> > > Do doing things the slow way:
> > > mysql> SELECT COUNT(*) FROM revision;
> > > | 416988781 |
> >
> > Thanks, this is quite close to the result from September 1st (
> 412,482,641
> > ). I
> > assume that the contents of the "revision" table go into the
> > pages-meta-history
> > dumps..? And I reckon that there is no safe method to verify the result,
> > though.
> >
>
> The pages-meta-history dump (with text) is produced from the
> pages-meta-history _stub_ dump, which is basically just a thin XML shell
> around a big query to the database on the 'page' and 'revision' tables.
>
> In theory, anything missing from the stub dump should be something that's
> not properly recorded at all -- for instance a revision that is not
> attached
> to a live page. This isn't supposed to happen but there probably are a few
> stray ones in the database. :)
>
> -- brion
>

SELECT * FROM `revision` LEFT JOIN `page` ON `rev_page` = `page_id` WHERE
`page_id` IS NULL

Anyone want a sweepstake on how long that'll take?  8-)

--HM
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Decorification of features

2011-09-27 Thread Happy Melon
On 27 September 2011 22:38, Daniel Friesen wrote:

> On 11-09-27 02:18 PM, Olivier Beaton wrote:
> >>> * File path
> >> Special:Filepath is an important special page for linking. If you have
> >> uploads then Special:Filepath is the only way for bots and other 3rd
> >> party tools to reliably get a link to a media file. If we get rid of
> >> Special:Filepath we get rid of the one thing making a rewrite of the
> >> upload system and paths used a reasonable possibility.
>

We should just make http://my.wiki/wiki/Media:Foo.jpg redirect to the
appropriate url; simple and effective.  There's really no need for a special
page for this.  Although now that it's in, we can't really get rid of it...
:-(

--HM
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Decorification of features

2011-09-26 Thread Happy Melon
On 26 September 2011 21:50, Brion Vibber  wrote:

> On Mon, Sep 26, 2011 at 8:13 AM, Olivier Beaton  >wrote:
>
> Things like LanguageConverter have been in MediaWiki for several years and
> date to younger times for the extension interface. Math wasn't broken out
> into an extension until 1.18 because it predates the extension interface
> entirely! Some also work along both language- and functional lines --
> LanguageConverter could perhaps turn into its own module, but then where do
> we bundle the language rules?
>
> In general, things that are infrastructural -- that provide bases for other
> things to work with -- are probably not unreasonable to leave where they
> are.
>
> -- brion
>

LanguageConverter would be a bitch to move out only because there are
precious few people who could tell us how it's *supposed* to work, let alone
whether it still works after we change anything in it.  But LC doesn't AFAIK
form any sort of 'infrastructure': it's a bolt-on module that's rightly
disabled most places because it requires compiled binaries.  Just like Math,
in fact.  As you said on IRC, LC does sport some elements (db tables etc)
that Math doesn't, but nothing that's not commonly found in extensions of
various types.

--HM
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] 1.18 and Pre design

2011-09-24 Thread Happy Melon
On 24 September 2011 07:32, K. Peachey  wrote:

> I'm no SVN user so i'm emailing instead... As the 1.18 users may have
> noticed (eg: For example a common place would be CodeReview) the new
> designs for pre.
>
> In r87173[1] the layout was changed and based on consensus in review
> that was reverted and then 1.18 was rebranched at r92475 so I thought
> they would be back to normal, So has this been changed in another
> revision? (and if so, what revision)
>
> Also this is slightly broken for me (although this could be my browser
> (chrome)), in which if I click and drag my middle button (scroll
> wheel) you can ∞ scroll in all directions
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l


It was redone in r91548, and reverted again in r93912.  That latter change
needs to be backported.

--HM
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Adding MD5 / SHA1 column to revision table

2011-09-20 Thread Happy Melon
>On Mon, Sep 19, 2011 at 12:53 PM, Asher Feldman wrote:
>
>> Since the primary use case here seems to be offline analysis and it may
>> not be of much interest to mediawiki users outside of wmf, can we store
>> the checksums in new tables (i.e. revision_sha1) instead of running large
>> alters, and implement the code to generate checksums on new edits via an
>> extension?
>>
>> Checksums for most old revs can be generated offline and populated before
>> the extension goes live. Since nothing will be using the new table yet,
>> there'd be no issues with things like gap lock contention on the revision
>> table from mass populating it.
>>
>
> That's probably the simplest solution; adding a new empty table will be
> very quick. It may make it slower to use the field though, depending on
> what all uses/exposes it.
>
> During stub dump generation for instance this would need to add a left
> outer join on the other table, and add things to the dump output (and also
> needs an update to the XML schema for the dump format). This would then
> need to be preserved through subsequent dump passes as well.
>
> -- brion

Can we resist the temptation to implement schema changes as new tables
purely to make life easier for Wikimedia?  Core schema changes are certainly
enough of a hurdle to warrant serious discussion, but they are not the
totally-intractable mess that they used to be.  1.19 already includes index
changes to the user and logging tables; it will already require the full
game of musical chairs with the db slaves.  Implementing this as a new
column does not actually make things any more complicated, it would just
mean that an operation that would take three hours before might now take
five.

It may or may not be an architecturally-better design to have it as a
separate table, but that is the basis on which we should be deciding it.
This is a big project which still retains enthusiasm because we recognise
that it has equally big potential to provide interesting new features far
beyond the immediate usecases we can construct now (dump validation and
'something to do with reversions').  Let's not hamstring it at birth based
on the operational pressures of the one MediaWiki end user who is best
placed to overcome said issues.

--HM
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Adding MD5 / SHA1 column to revision table

2011-09-20 Thread Happy Melon
>On Mon, Sep 19, 2011 at 12:53 PM, Asher Feldman wrote:
>
>> Since the primary use case here seems to be offline analysis and it may
>> not be of much interest to mediawiki users outside of wmf, can we store
>> the checksums in new tables (i.e. revision_sha1) instead of running large
>> alters, and implement the code to generate checksums on new edits via an
>> extension?
>>
>> Checksums for most old revs can be generated offline and populated before
>> the extension goes live. Since nothing will be using the new table yet,
>> there'd be no issues with things like gap lock contention on the revision
>> table from mass populating it.
>>
>
> That's probably the simplest solution; adding a new empty table will be
> very quick. It may make it slower to use the field though, depending on
> what all uses/exposes it.
>
> During stub dump generation for instance this would need to add a left
> outer join on the other table, and add things to the dump output (and also
> needs an update to the XML schema for the dump format). This would then
> need to be preserved through subsequent dump passes as well.
>
> -- brion

Can we resist the temptation to implement schema changes as new tables
purely to make life easier for Wikimedia?  Core schema changes are certainly
enough of a hurdle to warrant serious discussion, but they are not the
totally-intractable mess that they used to be.  1.19 already includes index
changes to the user and logging tables; it will already require the full
game of musical chairs with the db slaves.  Implementing this as a new
column does not actually make things any more complicated, it would just
mean that an operation that would take three hours before might now take
five.

It may or may not be an architecturally-better design to have it as a
separate table, although considering how rapidly MW's 'architecture' changes
I'd say keeping things as simple as possible is probably a virtue.  But that
is the basis on which we should be deciding it.  This is a big project which
still retains its enthusiasm because we recognise that it has equally big
potential to provide interesting new features far beyond the immediate
usecases we can construct now (dump validation and 'something to do with
reversions').  Let's not hamstring it at birth based on the operational
pressures of the one MediaWiki end user who is best placed to overcome said
issues.

--HM
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Can we make the life of extension developers easier?

2011-09-02 Thread Happy-melon

"Niklas Laxström"  wrote in message 
news:CAAVd=jbYKaqR83tbwHAOiFiqrb_wcF70Jk=zdlqtwfiumjn...@mail.gmail.com...
> I would like for us to:
> 1) start enforcing @since annotations for new core code
> 2) encourage adding FC interfaces to previous branches where possible
> (and to also make releases out of those branches including the
> interfaces)

I'm all in favour of requiring higher standards for documentation and 
annotations, as long as they are enforced consistently and everyone, however 
experienced, gets the same level of amiable prodding to conform to them. 
I'm also a fan of backporting interfaces as far as possible (I've broken a 
number of 1.19 commits into two bits and marked one of them for backporting 
to 1.18, for instance); but I wouldn't go so far as to say we should be 
adding features to released versions.  I think we have enough of a workload 
with controlling one active release branch, and that trying to manage a 
second would merely distract us from the business of actually implementing 
these new features in the first place.

--HM

 



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Ramping up to 1.18 deployment

2011-08-25 Thread Happy-melon

"Mark A. Hershberger"  wrote in message 
news:878vqhibuc@everybody.org...
> https://bugzilla.wikimedia.org/30352 -- jQuery.makeCollapsible.js should
>support 'autocollapse', 'innercollapse' and 'outercollapse' options
>
> Thanks for any help you can give on these bugs,

Is just arguing that it needn't block deployment cheating?  :-D

--HM 



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] $wgExtensionAliasesFiles deprecated?

2011-08-25 Thread Happy-melon
Surely, since Translate doesn't support it, the fact that none of the 
extensions that Translate does support use it is something of a non 
sequitur?!?  :-p  There are 552 .alias.php files in /trunk/extensions, and 
1786 .i18n.php files.  Obviously it's hard to put a number on how many of 
the difference are just extensions without any SpecialPages, but I'd venture 
that the combined syntax is quite common.
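
For reference, the combined syntax just means one file carrying the various
arrays side by side, roughly like this (extension name made up):

<?php
// MyExtension.i18n.php -- messages and special page aliases together
$messages = array();
$messages['en'] = array(
	'myextension-desc' => 'Does something useful',
);

$specialPageAliases = array();
$specialPageAliases['en'] = array(
	'MySpecial' => array( 'MySpecial' ),
);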

--HM

"Siebrand Mazeland"  wrote in message 
news:414e19dc-3e6b-4a12-b7e1-462161ae6...@xs4all.nl...
> It being possible since whenever does not make it the standard, as out of 
> the 400 or so extension that Translate supports, actually zero do it that 
> way.
>
> As no patches have been submitted to change the behaviour of Translate and 
> because it was never seen as an issue with a high priority, it was never 
> addressed - and probably will not change until someone cares enough.
>
> We'd happily welcome patches to Translate on this feature request or any 
> of the other open issues and features 
> (http://translatewiki.net/wiki/Issues_and_features)!
> --
> Siebrand Mazeland
>
> M: +31 6 50 69 1239
> Skype: siebrand
>
> Op 24 aug. 2011 om 00:24 heeft "Happy-melon"  het 
> volgende geschreven:
>
>> The contrary (all localisations go together in foo.i18n.php) has been the
>> standard since r52503.  Does Translate really still not recognise this
>> format after over two years?
>>
>> --HM
>>
>> "Siebrand Mazeland"  wrote in message
>> news:e2722961-0fe3-4041-9e24-58a2c70ac...@xs4all.nl...
>>> Please put messages, aliases and magic words in different files, 
>>> otherwise
>>> the Translate extension does not understand it.
>>>
>>> Naming convention:
>>> * messages: /.i18n.php
>>> * aliases: /.alias.php
>>> * magic words: /.magic.php
>>>
>>> --
>>> Siebrand Mazeland
>>>
>>> M: +31 6 50 69 1239
>>> Skype: siebrand
>>>
>>> Op 23 aug. 2011 om 06:49 heeft Jeroen De Dauw  
>>> het
>>> volgende geschreven:
>>>
>>>> Hey,
>>>>
>>>> Thanks for the info. I'm still unclear on what the best practices are 
>>>> for
>>>> 1.16 and later code. Am I supposed to put the $specialPageAliases in 
>>>> the
>>>> same file i18n file as the regular messages, or is it better to put 
>>>> them
>>>> in
>>>> a separate file? And in case of the later, how is this file preferably
>>>> named?
>>>>
>>>> Cheers
>>>>
>>>> --
>>>> Jeroen De Dauw
>>>> http://www.bn2vs.com
>>>> Don't panic. Don't be evil.
>>>> --
>>>> ___
>>>> Wikitech-l mailing list
>>>> Wikitech-l@lists.wikimedia.org
>>>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>>
>>
>>
>>
>>
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

 



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] $wgExtensionAliasesFiles deprecated?

2011-08-23 Thread Happy-melon
The contrary (all localisations go together in foo.i18n.php) has been the 
standard since r52503.  Does Translate really still not recognise this 
format after over two years?

--HM

"Siebrand Mazeland"  wrote in message 
news:e2722961-0fe3-4041-9e24-58a2c70ac...@xs4all.nl...
> Please put messages, aliases and magic words in different files, otherwise 
> the Translate extension does not understand it.
>
> Naming convention:
> * messages: /.i18n.php
> * aliases: /.alias.php
> * magic words: /.magic.php
>
> --
> Siebrand Mazeland
>
> M: +31 6 50 69 1239
> Skype: siebrand
>
> Op 23 aug. 2011 om 06:49 heeft Jeroen De Dauw  het 
> volgende geschreven:
>
>> Hey,
>>
>> Thanks for the info. I'm still unclear on what the best practices are for
>> 1.16 and later code. Am I supposed to put the $specialPageAliases in the
>> same file i18n file as the regular messages, or is it better to put them 
>> in
>> a separate file? And in case of the later, how is this file preferably
>> named?
>>
>> Cheers
>>
>> --
>> Jeroen De Dauw
>> http://www.bn2vs.com
>> Don't panic. Don't be evil.
>> --
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

 



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Dropping PHP 5.2 support

2011-08-20 Thread Happy-melon

"Jeroen De Dauw"  wrote in message 
news:camhmagdvn9y92xqypchd1dykwup6aisweuy-bijed5q8fzl...@mail.gmail.com...
> Hey,
>
> Then we can finally make use of the new things introduced in 5.3, which in 
> some cases
> are really useful and right now a complete pain to implement.

Could you provide some examples?  Late Static Binding is an obvious one, but 
are there other really valuable features?
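
For anyone not familiar with it, Late Static Binding is what lets a static
factory method on a base class construct whichever subclass it was actually
called on.  A generic illustration (not MediaWiki code):

class Base {
	public static function create() {
		// Under LSB, 'static' resolves to the called class;
		// 'new self()' would always construct a Base.
		return new static();
	}
}

class Derived extends Base {}

$obj = Derived::create(); // a Derived instance -- PHP 5.3+ only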

--HM 



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] On refactorings and huge changes

2011-07-28 Thread Happy-melon
I have mixed reactions to what's been said here, although some points are 
certainly good.  I don't think the "size" of a change is really the 
important factor for determining both how easy it is to review, and how easy 
it would be to branch and merge; a much better metric is "complexity", how 
tightly it integrates with the rest of the codebase, and how many 'external' 
codepoints must be updated to complete the change.

ResourceLoader, the Installer, the Metadata, are all examples of changes 
which may be quite large, but which are quite isolated within the codebase 
and do not really sprawl across it in a particularly complex way.  the 
Blocking and Skin rewrites and (particularly) RequestContext are examples of 
the opposite, changes which are *not* actually particularly large in terms 
of number-of-lines (the blocking rewrite rewrote four files, RequestContext 
two, compared to about forty for ResourceLoader and 25 for Installer). 
Having code together in a few files totally-rewritten or added afresh is 
both easier to review, *and* easier to branch and merge.

A branch merge of a complicated change (like the Login rewrite I did in 
summer 2009, which bounced three times and is now bitrotted into oblivion) 
looks incredibly scary and is essentially impossible to review, you have to 
take it on faith assuming that the review *of the code while it was in the 
branch* was good enough.  Yes, it will contain iterated fixes for simple 
errors, but the sequence being compressed into one revision is a 
double-edged sword: rewrites are inevitably done progressively, and the 
easiest path for the reviewer to take is the same one as the coder.  When 
everything is compressed into one branch-merge-revision, there is no context 
from which to reconstruct that path.  So, and I think most people are saying 
the same thing, the review will still have to happen in the branch, on the 
individual revisions, with all the false-starts and follow-ups, and merged 
in one (probably very messy) branch merge.

But making complicated changes in a branch doesn't make them any less 
complicated, or any easier to review in the branch.  Complicated changes are 
difficult *because* of their extensive interaction with other code elements, 
not because of a particularly grand scope.  There is no such thing as a 
"small incremental change" when changing one interface in the actual subject 
of the rewrite requires changing fifty callers, the other interfaces on the 
target also called by the fifty callers, and the 200 other places that call 
*them*.

The list of actual *changes* to the blocking code is fairly minimal: 
internalise variables, rewrite the way blocks are constructed from a target 
or targets, separate and modernise the frontend interfaces for blocking, 
unblocking and listing blocks.  It is complicated, difficult to review and 
with a huge number of follow-up commits, because of the large number of 
*other* places that blocks are referenced and manipulated.  Those other 
places still exist in a branch, are still just as difficult to *find* in a 
branch, and still just as time-consuming to fix and review in a branch.

Thus I don't agree that doing rewrites like the Blocking stuff in a branch 
would have made it any easier, or any less time-consuming, to review.  On 
the other hand, you make an excellent point about *discussion* of changes 
before implementation; that needs to happen, and it needs to be thorough.  I 
again don't think that a branch is particularly useful for that purpose 
because to 'discuss' something in code is to go through exactly the same 
process of writing, rewriting and discarding as if you just went ahead and 
did it.  An RfC, on the other hand, allows (and requires) a spec to be 
developed in natural language, with great saving of time and effort.

*If* it gets the participation it deserves.  The people who would be 
spending their time calling "crazy" on checked-in code *must* take the 
attitude that participating in RfCs *is* the alternative and is time well 
spent, that they are not making problems magically go away, only moving them 
to another venue where they can be nipped in the bud.  There are only three 
options: those two, and dismissing the proposal altogether *regardless of 
its merits*.

We currently have ten RfCs listed; three are over a year old, five are 
between three and six months, and two are younger than a month.  Five of 
them, including *all* of the old ones, have no substantive contributions by 
*anyone* apart from the author.  Two more have exactly one other 
contributor, and two (Math and Configuration) are swarming with commenters. 
This is not a healthy system.  It could *become* a healthy system, but only 
if *everyone* is prepared to make it an integral part of their development 
work, not just the people who are being diligent enough to be, or being 
pressured into, writing proposals.

I don't think that anyone really thinks RFC is the magic bullet, but it's 
even l

Re: [Wikitech-l] Showing stub links by default - is it possible in a Wikimedia project?

2011-06-08 Thread Happy-melon

"Leo Koppelkamm"  wrote in message 
news:banlktinsckfvpnrscska4svdgq9zgvu...@mail.gmail.com...
>>
>>   If we proceeded to remove the feature, they could
>> fairly easily add it into Popups or one of the other JS citadels.
>
>
> I don't see a way to do it in JS w/o lengthy & expensive API checks..
> Leo

So they'll do it with lengthy API checks, just like the rest of the data 
Popups gathers.  We tell them not to worry about performance, remember?  The 
number of people who would use a JS implementation is probably small enough 
for it not to be particularly severe.

--HM 



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Showing stub links by default - is it possible in a Wikimedia project?

2011-06-07 Thread Happy-melon

"Platonides"  wrote in message 
news:isjjo3$dmq$1...@dough.gmane.org...
> Ashar Voultoiz wrote:
>> On 06/06/11 00:56, K. Peachey wrote:
>> 
 Since it skips cache, can not we disable that stub highlighter once for 
 all?
>>> Logged in users don't get cached versions of the page...
>>
>> I am well aware of that.  The root cause being the various options
>> available to users, my proposal is merely to get ride of one of the
>> options :-)
>
> I think you would find opposition from wikipedians when you tried to
> ditch the stub threshold option.

We find opposition from some subset of Wikipedians when we try to do just 
about anything.  The presence of a small group of extremely vocal users 
should certainly be noted, but shouldn't be an automatic blocker, or we'd 
never get anything done.  If we proceeded to remove the feature, they could 
fairly easily add it into Popups or one of the other JS citadels.

--HM

 



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Code review process (was: Status of more regular code deployments)

2011-06-01 Thread Happy-melon

"Brion Vibber"  wrote in message 
news:BANLkTikTz=V77o8vYMxbA91dj=PNNbvO=w...@mail.gmail.com...
> On Wed, Jun 1, 2011 at 12:43 PM, Aryeh Gregor 
> > wrote:
>
>> So then what happens if volunteers' contributions aren't reviewed
>> promptly?
>
>
> Indeed, this is why we need to demonstrate that we can actually push code
> through the system on a consistent basis... until we can, nobody seems
> willing to trust pre-commit review.
>
> -- brion

+1.  Pre-commit-review, post-commit-lifetime, branching, testing, whatever; 
all of the suggestions I've seen so far are IMO doomed to fail because they 
do not fix the underlying problem that not enough experienced manhours are 
being dedicated to Code Review for the amount of work (not the 'number of 
commits', the amount of *energy* to make changes to code) in the system.  A 
pre-commit-review system doesn't reduce the amount of work needed to get a 
feature into deployment, it just changes the nature of the process.  At the 
moment revisions sit unloved in trunk until they fossilise in; in that 
system with the current balance of time they would sit unloved in a bugzilla 
thread or branch until they bitrot into oblivion.

There *are* strategies that could be implemented (like the 
review-and-super-review processes used by Mozilla) that *can* streamline the 
process, but as has been said elsewhere, that *still* needs top-level 
direction.  The members of Foundation staff who consistently get involved 
with these discussions do generally seem only to have hold of the 
deckchairs, not the wheelhouse.

--HM
 



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Code review process (was: Status of more regular code deployments)

2011-06-01 Thread Happy-melon

"Chad"  wrote in message 
news:banlktinrbx45wsg6yyclwpzpo7mjb3r...@mail.gmail.com...
> On Tue, May 31, 2011 at 7:49 PM, Happy-melon  wrote:
>> Every way of phrasing or describing the problem with MW CR can be boiled
>> down to one simple equation: "not enough qualified people are spending
>> enough time doing Code Review (until a mad rush before release) to match 
>> the
>> amount of code being committed".
>>
>
> Maybe people shouldn't commit untested code so often.
>
> I'm not joking.
>
> -Chad

That's a worthy goal, but one that's orthogonal to Code Review.  Every 
single person on this list will have committed some unreviewed code to some 
repository at some time; the fewer times you've done it, the more likely you 
are to have crashed the cluster the times you did.  People doing some 
unquantifiably greater amount of testing doesn't mean we can spend any less 
time on CR per revision.  Automated testing, regression testing, other 
well-defined infrastructures (I think Ryan's Nova Stack is going to be of 
huge benefit in this respect) *do* save CR time *because reviewers know 
exactly what has been tested*.  A policy like "every bugfix must include 
regression tests" would definitely improve things in that area.

Of course, it's undeniable that more testing would lead to fewer broken 
commits, and that that's a Good Thing.  If we implement processes which set 
a higher bar for commits 'sticking' in the repository, whether that's 
pre-commit review, a branch/integrate model, post-commit countdown, 
whatever; people will rise to that level.

--HM

 



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Code review process (was: Status of more regular code deployments)

2011-05-31 Thread Happy-melon

"Russell N. Nelson - rnnelson"  wrote in message 
news:777bc8a5a78cc048821dbfbf13a25d39208f1...@exch01.ad.clarkson.edu...
> Robla writes:
> 1.  We say that a commit has some fixed window (e.g. 72 hours) to get
> reviewed, or else it is subject to automatic reversion.  This will
> motivate committers to make sure they have a reviewer lined up, and
> make it clear that, if their code gets reverted, it's nothing
> personal...it's just our process.
>
> Worse than pre-commit review. What if you make a change, I see it, and 
> make changes to my code using your changes. 72 hours later, they get 
> reverted, which screws my code. Okay, so then nobody's going to compile 
> anything, or use anything, if it has 72-hour code in it. The alternative, 
> if they want the code badly enough, is to review it so it will stick. 
> Well, that devolves to the same thing as pre-commit review.
>
> And ... this only makes sense for core, or maybe for stable extensions. It 
> doesn't make sense for experimental extensions where only one person is 
> likely to understand or care what the code says.

By far the most likely outcome of this is that in the two months following 
its introduction, 95% of all commits are reverted, because the people who 
are supposed to be reviewing them don't spend any more time than usual 
reviewing them.  If I make a change to the preprocessor, HipHop, Sanitizer, 
SecurePoll, passwords, tokens, or any of a number of other things, I'm going 
to need Tim to review them.  I don't begrudge Tim the thousand-and-one other 
things he has to do besides review my code within 72 hours.  Does that mean 
that no one should make *any* changes to *any* of those areas?  More 
dangerously still, does that mean that **only people who can persuade Tim to 
review** be allowed to make changes to those areas?  I know what the answers 
to those questions are *supposed* to be, that's not the point.  The point is 
**what actually happens**.

Every way of phrasing or describing the problem with MW CR can be boiled 
down to one simple equation: "not enough qualified people are spending 
enough time doing Code Review (until a mad rush before release) to match the 
amount of code being committed".  Any attempt at a solution which does not 
change that fundamental equation is doomed to failure.  There are at least 
six things that could be changed ("enough people", "qualified people", 
"enough time", "Code Review", "match[ing]", "being committed"); we almost 
certainly need to change more than one.  The most likely outcome of this 
particular suggestion is simply to radically reduce the amount of code being 
committed.  I'm not sure that that's the best way to deal with the problem.

--HM 



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Status of more regular code deployments

2011-05-31 Thread Happy-melon

"Neil Kandalgaonkar"  wrote in message 
news:4de56cf9.9090...@wikimedia.org...
> On 5/31/11 3:20 PM, MZMcBride wrote:
>
>> taking a page out of the rest of the business world's book, you set a
>> deadline and then it just fucking gets met. No excuses, no questions.
>
> I think you have an optimistic view of how businesses actually work. :)

The only modification needed to bring that sentence in line with business 
reality is adding "it just fucking gets met **or someone's head rolls**".

> But, in any case, in a business, there is a Plan that everyone is trying
> to follow and in theory, deviations from that Plan are avoided. In our
> environment we want to be responsive to the schedule of a volunteer
> developer, who may be completely unaware or uninterested in our plans.

Plenty of businesses work on a rolling-development model, probably more 
businesses than have totally static specs.  The difference between that and 
WMF, and even between WMF and other non-businesses like Linux and Mozilla, 
is that if a release is mandated by some higher power and something is 
holding up that release, **whatever it is gets steamrollered out of the 
way**.  If there is a clear roadmap saying that any feature that's not 
debugged and ready-to-go by Wednesday morning, by the first Tuesday of the 
month, by the 32nd of June, whenever, gets reverted, then no one is going 
to complain when, lo and behold, such features get reverted.  *Everyone* is 
going to complain if the 32nd of June becomes the 32nd of December before 
the feature even makes it onto the cluster.

> Perhaps the answer is that we have to give the volunteer developers some
> obvious pathway to harmonizing their and our priorities. Like, if you're
> working on files and multimedia, you should be emailing Bryan, me, or
> maybe Tim or Russell. Could it be that simple?
>
>
>> The
>> problem seems to be finding anyone to lay down the damn law.
>
> Well, it's not like wiki pages happen by someone cracking a whip either.
> That said, we would benefit from some urgency towards correcting the
> problem.

Wiki pages don't need a whip, and nor does MediaWiki.  Wiki pages are 
*missing* several layers of delays and checkpoints in the 
volunteer-writes-something-to-volunteer-sees-it-in-use chain.  MediaWiki is 
currently like a wiki with FlaggedRevisions turned on on every articlespace 
page, but with all the admins on the wiki working on other things such that 
no one gets round to cleaning out the Special:OldReviewedPages list more 
than once every *nine months*.  That would kill a wiki community stone dead 
*pretty much instantly*.

>"Brion Vibber"  wrote in message 
>news:BANLkTinZzef=orvuw9dwicvahybuwxn...@mail.gmail.com...
>
> Sing it, brother! We're getting *some* stuff through quicker within the
> deployment branches, but not nearly as much as we ought to.

The fact that some things are *not* getting stuck in the CR quagmire is part 
of the *problem*, not the solution.  The upper levels of the developer 
hierarchy ***obviously know*** that the mainline CR process is substantially 
broken, BECAUSE ***THEY'RE NOT USING IT*** FOR THINGS THAT ARE IMPORTANT TO 
THEM.  The unavoidable implication of that observation is that the work of 
the volunteer developers *DOESN'T* matter to them.  Whether or not that 
implication is correct (I have confidence that it's not) is irrelevant, the 
fact that it's there is what's doing horrible damage to the MediaWiki 
community.

--HM
 



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] help: error with Extension:Runphp

2011-05-23 Thread Happy-melon

"Benjamin Lees"  wrote in message 
news:banlktim_wnfewckh4gkxfwnai_jitau...@mail.gmail.com...
> In the line
> function parsePHP( $input, $argv, &$parser ) { return runphpExecCode(
> $input, $argv ); }
> Change &$parser to $parser

And then maybe delete that whole line, and all the other lines in the file? 
On your head be it, obviously, but I can't think of *any* scenario in which 
I'd use this extension, or recommend anyone else do so.  Being on an 
intranet or restricting account creation will not protect your wiki 
against XSS vulnerabilities of the kind commonly found in MediaWiki and other 
software, and here you're basically staking the security of your entire 
filesystem on the assumption that they will.

--HM 



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Feedback on online communications around the Berlin hackathon (including remote participants)

2011-05-17 Thread Happy-melon

"Steven Walling"  wrote in message 
news:BANLkTi=tgx+f01w6bjftqo40ortoubd...@mail.gmail.com...
> On Mon, May 16, 2011 at 12:43 PM, Guillaume Paumier
> wrote:
>
>> Setting up the live video feed isn't easy, and taking notes in real
>> time is pretty time- and energy-consuming, but we can make efforts to
>> continue to do it in the future, if it's worth it. We'd just like to
>> know how useful it is.
>>
>
> The EtherPad notes were especially useful for me. The live video was less
> useful because of the time difference. The videos after the fact are fun, 
> so
> if it was a combination of notes and post-produced videos I'd be happy.
>
> Thanks for all the hard work on documentation. It sure was nice to read 
> the
> blog posts as you all went along.
>
> Steven

Being in a similar timezone avoided that problem for me, and it was a 
welcome distraction from RL to watch some of the videos; but I didn't gain 
anything particular out of it being live because there was no obvious way to 
interact back.  Admittedly I was hampered by being unable to use IRC, but 
were people on it 'live' during the event?  I was tempted to drop a question 
onto the EtherPad and see if anyone picked it up.

I suspect that there actually was a reasonable amount of scope for remote 
interactivity, just that it wasn't as obvious or well-advertised as it could 
have been.  But since the only advantage to *outputting* media feeds in 
real-time is if remote viewers can *input* straight back, that's something 
that would be worth consolidating in the future.

--HM

 



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Shell requests backlog

2011-05-17 Thread Happy-melon

"Brion Vibber"  wrote in message 
news:BANLkTi=fayp8ekgw4aoag+j+-lkpz+n...@mail.gmail.com...
> On Tue, May 17, 2011 at 3:56 AM, MZMcBride  wrote:
>
> A 'NEEDINFO' resolution would probably be better than 'WONTFIX' for this
> case, and we should see if we can enable that. Ensuring that devs know 
> where
> to go to ask for a status & consensus update in case the original filer &
> CC'd people are unavailable would also help to make these go more
> consistently.
>
> Again, it would be better to have a 'NEEDINFO' resolution enabled for us 
> to
> use in this case.
>
>
> -- brion vibber (brion @ wikimedia.org / brion @ pobox.com)

The default BZ configuration comes, AFAIK, with a REMIND resolution.  This 
was briefly reenabled during the BZ4 upgrade, but was subsequently disabled 
again and the bugs I had closed with it reclassed as LATER.  This resolution 
seems to be broadly what you had in mind -- "We can't do anything with this 
as it stands because we need you to provide something, the ball's back in 
your court, ping us if you still want something done" -- and I don't really 
understand why it was (re)disabled.

--HM 



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] MediaWiki tech poster ideas?

2011-05-13 Thread Happy-melon

"Ashar Voultoiz"  wrote in message 
news:iqkdt1$6g3$1...@dough.gmane.org...
> On 10/05/11 20:52, Brion Vibber wrote:
> 
>> Any other nice poster-size visualizations hiding around?
>
> I really like
> http://commons.wikimedia.org/wiki/File:Wikimedia_Servers.svg
>
> It needs to be updated a bit, nonetheless it gives a good overview of
> our technical mission.
>
> -- 
> Ashar Voultoiz

I like 
http://commons.wikimedia.org/wiki/File:Wikimedia-servers-2010-12-28.svg in 
the same vein, although it's a bitch to keep updated.  It would be awesome 
if we could find or write some widget that would update a visualisation like 
this automagically...

--HM 



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] search=steven+tyler gets Steven_tyler

2011-05-13 Thread Happy-melon

"Daniel Friesen"  wrote in message 
news:4dcdb7af.7020...@nadir-seen-fire.com...
> Fortunately I think most of the space/underscore switching done by code
> is actually isolated to a subset of Title and perhaps a few other core
> classes (probably ones like User and the filerepo stuff), most code
> should be using the title interface.

http://toolserver.org/~krinkle/wikimedia-svn-search/do-live.php?path=%2Ftrunk%2Fphase3&recursive=on&searchphrase=%27_%27&timestamp=1305329107

Sure, keep telling yourself that...  :-P
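
For the record, the interface most code *should* be funnelling through 
looks like this -- a minimal sketch using the real Title accessors on a 
made-up page name:

$title = Title::newFromText( 'Steven tyler' );
echo $title->getText();     // "Steven tyler"  -- spaces, for display
echo $title->getDBkey();    // "Steven_tyler"  -- underscores, for storage
echo $title->getLocalURL(); // underscored, URL-encoded form for links

The grep above shows how much code is still juggling the two forms by 
hand instead.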

--HM 



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Mediawiki Development IDE

2011-05-09 Thread Happy-melon

"Bryan Tong Minh"  wrote in message 
news:BANLkTinddq+HJgTE=4dwc5ofbvnht7p...@mail.gmail.com...
> On Mon, May 9, 2011 at 3:08 PM, Tod  wrote:
>> Is there an IDE that the MW developer community has settled on and can
>> recommend?
>>
> Sam has a license for phpStorm, iirc, available for all MediaWiki 
> developers.
>
>
> Bryan

I use phpStorm (got the license key from Sam), which is pretty good.  I 
probably use less than 1% of all its features, but the little things it adds 
(like being able to click on a function call, hit a key and get a popup of 
its documentation, or jump to its definition with another keypress) are 
useful little timesavers.  I started with Eclipse, but as Brion says it gets 
*very* grumpy when trying to load large projects, and seems to have quite a 
few memory leaks.

Ultimately, the codebase is sufficiently large and esoteric that few-to-no 
IDEs are able to fully discharge the task of "integrating" development, 
testing and deployment (I have yet to hear many success stories of 
debugging/breakpointing/etc, for instance, and you'll probably waste as much 
time (and hair) getting it working as you'd save by using it); so basically, 
go for whatever will do the things you want with the minimum of extraneous 
"features" which will probably just end up consuming time, memory and 
sanity...

--HM 



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Bugzilla Weekly Report

2011-05-02 Thread Happy-melon

"reporter"  wrote in message 
news:e1pprj3-bs...@kaulen.wikimedia.org...
> MediaWiki Bugzilla Report for November 29, 2010 - December 06, 2010

...

"reporter"  wrote in message 
news:e1qgwls-000389...@kaulen.wikimedia.org...
> MediaWiki Bugzilla Report for April 25, 2011 - May 02, 2011
>

Wow, now here's a blast from the past... :-D  A lot of these stats are now 
in the BZ4 report page, but it's still very nice to have the weekly 
reminder.  Cookie for whoever dug it out and got it going again!

--HM 



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Release schedule for the rest of 2011 (was: Status of 1.17 & 1.18)

2011-05-02 Thread Happy-melon

"Chad"  wrote in message 
news:BANLkTikd4Eb-V8W-kA1qe+=bnjxmj+o...@mail.gmail.com...
>
> I understand, respect, and value the contributions of people who want to
> add new features. Features are what moves the product forward, and at
> no point do we want to be hostile to people willing to put in their time 
> and
> effort to add functionality.
>
> Given that our patch review process isn't fantastic, I really don't think 
> it
> would affect the majority of bugs anyway. Mainly affected would be
> people with core access who write a new feature without putting it
> through BZ first. Given that our core group is relatively small, I figured
> we could come to some sort of consensus, if we do indeed move forward
> with this.
>
> ...
>
> I don't know what our development community wants. I just happened to
> think it was a good idea and so brought it up for discussion. If we
> collectively decide I'm nuts, we can toss this proposal, I won't be upset.
> I know we'd need to keep development on a very strict timeframe, which
> is why I outlined a more strict branching and timeline to stick to. As 
> Roan
> and others pointed out, 6 months is a little long. I don't think we 
> couldn't
> decide on a branch point and stick to it, especially if we're not holding 
> up
> a branch for someone to finish sorting out a rewrite or major feature they
> didn't quite resolve.
>
>> Given our past record, I'm not really confident that we can pull that
>> off. There's a risk of screwing it up badly and offending a lot of
>> people. Release management isn't exactly an organisational strength.
>>
>
> I agree it's not our strength, but I don't think we can't do it. I
> think sticking
> to a firm branch date would largely alleviate this issue. I remain 
> convinced
> that a stability-focused release would be a good idea at some point, 
> whether
> we do it now or in a year.

Feature freezes don't have much potential in the current development climate 
because the choice is basically between committing a feature to trunk and 
not committing a feature at all.  Development work done in branches might as 
well be left in a working copy for all the attention it gets, BZ patches 
doubly so.  What would almost certainly happen in a feature freeze would be 
that developers who are interested in core rewrites / major features would 
simply queue up their work for the next release, which would make 1.20 
another massively-rewritten monster.  That, if not properly managed, is just 
creating a bigger problem down the line; although conversely you could say 
it would make for a particularly Awesome(TM) milestone release.

If a feature freeze is to work, it has to either a) be for a very short 
period so that developers neither get disenchanted and wander off nor start 
stockpiling working-copy changes to empty onto trunk once it's thawed, or b) 
be part of a fundamental change in the way we approach rewrites.  It would 
be perfectly acceptable to move to a completely different paradigm where 
branches are used properly, regularly reviewed, get plenty of 
inter-developer testing and can be smoothly merged back into trunk in an 
organised fashion.  But right now, the only way to reliably get external 
eyes on code is to put it in trunk; the occasions when multiple developers 
are working on the same branch are both rare and not quite the same thing.

My login rewrite was done as a branch merge and was reverted three times 
from 1.16 (not unreasonably, for sure, but for bugs flagged up by precisely 
the sort of diverse testing that being in trunk provides and being in a 
branch doesn't); it's now 30,000 revisions bitrotted and will probably have 
to be redone from scratch.  The 1.18 blocking rewrite was done in pieces in 
trunk and looks to have settled in reasonably well.  A feature freeze will 
probably result in rather more of those Big Scary Commits (TM) -- either 
branch merges or whole features put together in a working copy -- and fewer 
features implemented through incremental changes.  If we don't have 
provisions in place to handle that in some way, it will probably create more 
problems than it solves.

--HM



 



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Release schedule for the rest of 2011 (was: Status of 1.17 & 1.18)

2011-04-30 Thread Happy-melon

"Chad"  wrote in message 
news:BANLkTi=mwb7gdcfzftm4vxbwln3ikst...@mail.gmail.com...
> On Wed, Apr 27, 2011 at 12:43 PM, Chad  wrote:
> Tim was concerned about the release notes, but as I pointed out in my 
> previous
> e-mail, Sam's tidied this up (and it's low-hanging fruit if someone wants 
> to
> check behind us for sanity). That being said, I don't see any reason why 
> we
> can't drop a beta1 sometime this week. Give it a week, and drop a beta2. 
> Wait
> another week, then go final I think, all depending on what response we get 
> from
> the betas.

+1

> As for 1.18, I say we branch it the same day we drop 1.17 final (making 
> the
> branch is easy).

+1.  I think porting 1.17 fixes to 1.18 is a much lesser evil than allowing 
the 1.18 branch to get any bigger.

> There's still quite a bit of code to review, but going ahead
> and giving ourselves a cutoff point will make catching up easier. Large 
> projects
> still outstanding in 1.18 to review are the img_metadata merge, and 
> rewrites of
> Skin and Action code.

The plan *was* to revert the Action rewrite from 1.18 and put it into 1.19. 
If that's not going to happen then we should probably either a) push on with 
its development and try and get it fully in place for 1.18, b) (my 
preference) stabilise what's already there and leave it as a 
partially-used-framework like Message, or c) revert it altogether.  We can't 
roll it back out of 1.18 *and* 1.19.

> Looking ahead to 1.19, I'd like to do the same and branch soon after 1.18 
> has
> been dropped.

+1

> Going back over the past couple of releases, we've had quite a few 
> "rewrites"
> of major portions of code. While these are a necessary part of the process 
> of
> developing MW, they are difficult to review due to their complexity. This
> complexity also makes it more likely for things to break. If I may be so 
> bold,
> I would like to ask that 1.19 not contain any of these rewrites. Let's 
> focus on
> making it a bugfix/cleanup release. Personally I think it would make for a 
> very
> clean and polished release, as well as reducing the time for us to review 
> and
> ship it.

+0  -- I don't have time in the next six months to take the hedgecutters to 
anything else in the codebase anyway... :-D

> If we go this route, I don't see any reason we couldn't ship 1.19 by year 
> end
> (or if we really push, 11.11.11, as the other thread suggested). I
> think it would
> put us in a really good place to move forward into 2012, and help get us 
> back
> into a somewhat regular release pattern.

If it helps with getting onto a regular release pattern, then that's good. 
We need to be careful that that doesn't come at the expense of Useful Stuff 
Not Getting Done: it would be very easy to maintain a release schedule if we 
never added any new features, but it would be fairly pointless.  I do 
understand what you mean, and I think it would be a worthwhile exercise; 
just as long as we treat it as an experiment.

> I really would love to hear from people to see if they think I'm crazy or 
> if
> this could work out fairly well. I know it's pretty tl;dr for most people, 
> but
> the ones who read it are the ones I wanna hear from anyway ;-)

I approve of this response-filtration scheme... :-D

--HM

 



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] 11.11.11

2011-04-27 Thread Happy-melon
http://www.wefearchange.org/2011/04/release-that-rewrite-on-11.html

MediaWiki 1.18 (or even 1.19) on 11th November?  If mailman can manage it, 
why can't we...?

--HM 



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Centralize PHP (and other) minimum requirements / MoveDefines.php up the call stack

2011-04-12 Thread Happy-melon

"Krinkle"  wrote in message 
news:e4c549e9-a115-4afe-b030-3f57f2f9b...@gmail.com...
> Right now there's a few points:
>
> * Minimum php versions are all over the source code, putting it in
> DefaultSettings.php or Defines.php would make be a good start, that
> way all hardcoded uses of the versions after those are loaded can be
> centralized.
> However there are many php-versions compared before those are included
> as well ( all (web) entry points and other files parsed before the
> inclusion of DefaultSettings and/or Defines).
>
> So a better solution would be to get those versions available right at
> the beginning of the web entry points.
>
> Possible solutions:
>
> 1) Instead of putting the define() or $wg...= in DefaultSettings.php /
> Defines.php, create namethisfile.php and put them in there and include
> it in the all entry points.
>
> This seems like a simple and quick solution but introduces yet another
> always-included file and puts them far away from other global
> variables and defines.
>
> 2) Put it in Defines.php
> * make it independant (ie. only defines(), nothing else, as the
> filename suggests)
> * and move it up the call stack
>
> Things like inclusion of UtfNormalDefines could be put in the places
> where Defines.php is currenty included[1] and assignment of the
> $wgFeedClasses variable shouldn't be in Defines.php anyway.
>
> 3) Just put them in DefaultSettings.php and Defines.php and replace
> all uses with the globals where they are hardcoded and available. Any
> uses before this file is loaded (entry points) can hard code it
>
> The third solution is basically what I was going to do, and can be
> safely done. But before I do so I'd like to know if the solutions that
> cover all scenarios are do-able.
>
>
> --
> Krinkle
>
> [1] UtfNormalDefines.php may not have to be moved though, looks good
> on second thought. It's included everywhere anyway so it doesn't save
> load by loading it later or earlier.

To add some context here, in r85918 I made some changes to our handling of 
old PHP versions.  Coaxing the PHP 4 parser to even get as far as letting us 
die() is quite a challenge, and just about every file in the codebase is 
incompatible with it (structures like wfFoo()->bar() are invalid, for 
instance).  However, early PHP 5 versions are equally broken: no version of PHP < 
5.2.3 that I tried (and I now have eight of them floating around :D) was 
able to load a page without a fatal error.

What I have done is to move the PHP version check from WebStart.php (which 
was unparseable since Tim added a try/catch block in r85327) to the entry 
points index.php, api.php, load.php.  That way, only those files have to be 
PHP 4 compatible.  It is definitely worth doing this despite the age of PHP 
4, because the error message that previously resulted ("Parse error: 
unexpected T_CONST_FOOBAR" or somesuch) is so spectacularly unhelpful that 
people are far more likely to just give up in disgust than actually realise 
even what the problem is.
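
For illustration, a minimal sketch of the kind of entry-point check that 
stays PHP 4-parseable -- no exceptions, no chained calls -- with the exact 
wording and layout as placeholders rather than the committed code:

// Must stay parseable by PHP 4, or we never reach the message at all.
if ( !function_exists( 'version_compare' )
    || version_compare( PHP_VERSION, '5.2.3' ) < 0
) {
    echo "MediaWiki requires PHP 5.2.3 or higher; you are running PHP " .
        PHP_VERSION . ".\n";
    exit( 1 );
}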

The issue this raises is that we now have the minimum supported PHP version 
hardcoded in at least six different places in the codebase.  It would be 
nice to have it centralised in a define() constant, but any file that it's 
put in then needs to be kept PHP 4 compatible.  The two options which stand 
out are either creating a new file for it, or putting it in Defines.php and 
moving that file to be included directly from the entry points, rather than 
from WebStart.php.  The question is: are there any gotchas associated with 
moving that up the call stack?

--HM 



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Actions and Special Pages

2011-04-06 Thread Happy-melon

"Bryan Tong Minh"  wrote in message 
news:BANLkTimhwb5EZR=5tpwsh8m34c4gvgx...@mail.gmail.com...
> On Tue, Apr 5, 2011 at 4:41 AM, Daniel Friesen
>  wrote:
>> Personally, I like tacking on ?action=edit and especially purge.
>> Prefixing Special:Edit/ doesn't sound nice to me.
>> I know I fixed the issues with things like Special:Movepage not sharing
>> the same UI tabs as the rest of the actions.
>>
> I'm +1 with you on this. I don't have any convincing arguments against
> either way, but action links just look nicer to me.
>
>
> Bryan

I agree that there aren't really killer arguments in either direction.  But 
I *would* say that consistency is a virtue we should strive towards.  I want 
*either* Special:Move/Foo and Special:Edit/Foo, *or* Foo&action=edit and 
Foo&action=move.  There really is no justification for that discrepancy 
apart from "it's always been that way"... and that's what B/C is for.

--HM 



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] Actions and Special Pages

2011-04-04 Thread Happy-melon
In the beginning, there was Article.php.  And the Article class had methods 
like Article::view(), Article::delete(), Article::protect(), etc.  And the 
fundamental codeflow of Mediawiki consisted of something along the lines of:

$article = new Article( $title );   // the one class that did everything
$action = wfGetTheActionSomehow();  // 'view', 'edit', 'protect', ...
$article->$action();                // dispatch via variable method call

Over time Article has grown and bloated to become our third largest class 
(after ZhConversion and Parser).  Several of its action methods have been 
spun out to separate files (EditPage.php, ProtectionForm.php and 
HistoryPage.php, among others).  It's long overdue that this process be 
carried through to its natural conclusion with all action methods spun out 
into some new structure.

There are essentially two competing possibilities for structuring this, and 
they reflect the two parallel systems we have for "doing things other than 
viewing" to a wiki.  One is action parameters, and the other is special 
pages.  We have action=edit, or Special:MovePage, for instance; they have a 
similar function but different structure.  We have action=delete to get rid 
of stuff, but then Special:Undelete to bring it back again. 
Special:Whatlinkshere and action=history are another pair of pages which 
have very similar principles (getting data that relates to a given page) but 
different implementations.

For either case in the backend I would think we'd want to create an 
ActionPage base class and an EditActionPage from that, which looks 
internally rather like a SpecialPage construct, might even subclass it.  I'd 
like to ask people's opinions about which they think would work better in 
the frontend for, say, editing or protecting.  If people think it would be 
better as a special page we'd make 
http://foo.example.com/w/index.php?title=Bar&action=edit a hard redirect to 
Special:Edit/Bar; that has the significant advantage of being able to be 
formed as an internal link.  Conversely if we'd like to keep it an action it 
would make sense to redirect Special:MovePage/Bar back to 
?title=Bar&action=move.  Or is something more exotic like 
[[Action:Edit/Bar]] a possibility?
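
To make the shape of that concrete, a very rough sketch -- every name here 
is hypothetical except EditPage, and the real thing would obviously need 
permission checks, hooks and so on:

abstract class ActionPage {
    protected $article;

    public function __construct( Article $article ) {
        $this->article = $article;
    }

    // 'edit', 'protect', 'delete', ...
    abstract public function getName();

    // Show the form, or perform the action.
    abstract public function execute();
}

class EditActionPage extends ActionPage {
    public function getName() { return 'edit'; }

    public function execute() {
        // Initially just delegate to the existing EditPage machinery.
        $editor = new EditPage( $this->article );
        $editor->edit();
    }
}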

Thoughts?

--HM 



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Focus on sister projects

2011-04-01 Thread Happy-melon

"MZMcBride"  wrote in message 
news:c9bbcf19.1056...@mzmcbride.com...
> Ryan Kaldari wrote:
>> Yeah, the local CSS/JS cruft is definitely a problem. I've tried doing
>> clean-up on a few wikis, but I usually just get chewed out by the local
>> admins for not discussing every change in detail (which obviously
>> doesn't scale for fixing 200+ wikis). I would love to hear ideas for how
>> to address this problem.
>
> This caught my eye as Wikimedia has far more than 200 wikis. There seems 
> to
> be a shift happening within the Wikimedia Foundation. The sister projects
> have routinely been ignored in the past, but things seem to be going 
> further
> lately
>
> Personally, I'm in favor of disbanding all of the projects that Wikimedia
> has no intention of actively supporting in the near-future or even 
> mid-range
> future. I think the current situation in which certain sister projects are
> supported in name only is unacceptable to the users and to the public.
>
> MZMcBride

I would be very interested to hear what criterion you would use to separate 
out a group of 200 (or any number other than zero, one or all [1]) wikis 
which are "maintained" from the rest which are "unmaintained"; where the 
distinction in quality of service, the ratio of Foundation resources to 
userbase or readership, or any other meaningful statistic, showed any 
obvious jump across the boundary.  You would need to be able to show such a 
thing in order to make anyone believe that there is any "intention" (or lack 
thereof) for the Foundation to do anything with the sister projects.

It's one thing to argue that more of the Foundation's resources should be 
directed to particular projects; that's a perfectly reasonable discussion, 
but for foundation-l, not here.  It's quite another to argue that an 
arbitrary number (don't forget that Ryan is referring to the number of wikis 
with broken JavaScript which are unable to fix it themselves, not any 
attempt to count every wiki in the cluster) represents some Freudian slip 
into some diabolical scheme or even into a subconscious mindset.  Even if 
that is what you want to claim, that belongs in foundation-l as well.  "Our 
shell request workflow could use work" is a time-honoured topic which comes 
and goes and seems to be in a relatively successful era at the moment. 
Anything more political than that has nothing to do with, and no place on, 
wikitech-l.

--HM

[1] http://en.wikipedia.org/wiki/Zero_One_Infinity 



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Bugzilla upgrade on April 1st at 17:00 GMT

2011-03-31 Thread Happy-melon

"Priyanka Dhanda"  wrote in message 
news:4d94c498.7050...@wikimedia.org...
> Hello,
>
> bugzilla.wikimedia.org is scheduled to be upgraded to version 4.0

\O/

> on April 1st ...

Wait a minute...

> If you see any problems after the upgrade, please report them using
> https://bugzilla.wikimedia.org/enter_bug.cgi?product=Wikimedia and use
> the component Bugzilla.

And if that problem is "can't submit any bug which has 'bugzilla' in the 
title"?   :-P   That would make an awesome April Fool's joke... :-D

--HM
 



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] The priority of code review (Re: Code Review tools)

2011-03-28 Thread Happy-melon

"Rob Lanphier"  wrote in message 
news:AANLkTi=lgjrjj-gjx+_sdb0wpc62kptpoxbr7jkm8...@mail.gmail.com...
> On Mon, Mar 28, 2011 at 7:20 AM, Aryeh Gregor 
> > wrote:
>
>> If, as Tim says, Wikimedia developers were un-assigned from code
>> review after the 1.17 deployment, *that* is the problem that needs to
>> be fixed.  We need a managerial decision that all relatively
>> experienced developers employed by Wikimedia need to set aside their
>> other work to do as much code review as necessary to keep current.  If
>> commits are not, as a general rule, consistently reviewed within two
>> or three days, the system is broken.  I don't know why this isn't
>> clear to everyone yet.
>
>
> Hi Aryeh,
>
> You say that as though this were obvious and uncontroversial.  The reason
> why we've been dancing around this issue is because it is not.
>
> Right now, we have a system whereby junior developers get to commit 
> whatever
> they want, whenever they want.  Under the system you outline, the only
> remedy we have to the problem of falling behind is to throw more senior
> developer time at the problem, no matter how ill-advised or low-priority 
> the
> changes the junior developers are making.  Taken to an extreme, this means
> that junior developers maintain complete control over the direction of
> MediaWiki, with the senior developers there purely in a subservient role 
> of
> approving/rejecting code as it comes in.
>
> What comes of this system should be obvious: senior developer burnout.  If
> only reward we offer for becoming an experienced developer is less
> interesting work with less power over day-to-day work, we're not going to
> attract and retain people in senior positions.
>
> To be clear, none of the developers in WMF's General Engineering group 
> have
> been pulled off of code review.  However, not all of the WMF's senior 
> staff
> are part of GenEng.
>
> Rob

The fact that this is a very reasonable (and clearly genuine) problem makes 
it *more* concerning that it's being "danced around", not less.  If this is 
a problem, then it needs a solution, because it sounds like we're not going 
to make much headway into the larger issues without it.  What solution do 
you propose?  Greater focus amongst junior devs?  A Mozilla-style 
'review'/'superreview' model?  Even "all the junior devs going away and 
leaving the senior devs in peace" is *a* solution, although I'm sure that's 
not the preference given that most of our senior devs were once junior devs. 
But something has got to change, one way or another.  Because only about 3% 
of the past 500 reviews have been done by permanent staff, and commits are 
still accumulating around 20% faster than reviews, in bulk.  What's the 
plan...?

--HM

 



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Meaning of "fixme" (Re: code review criticism (Re: Converting to Git?))

2011-03-27 Thread Happy-melon

"Platonides"  wrote in message 
news:imm5c2$rib$1...@dough.gmane.org...
> Ilmari Karonen wrote:
>>
>> I think it might be a good idea to split these two cases into separate
>> states.  My suggestion, off the top of my head, would be to leave
>> "fixme" for the latter and add a new "broken" status for the former.
>
> +1
> We should also add another state for fixmes that are not about problems
> in the revision itself, but request for improving more code (eg. you
> should fix the same thing -added in MW 1.4- in other 10 locations of the
> code, too).

That sort of thing should be a tag, because it is orthogonal to (and can 
actually change independently of) the status of the revision itself.  It 
would make it impossible to 'ok' the revision without losing the 'extend' 
information, which is exactly the opposite of what we want.

--HM 



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Meaning of "fixme" (Re: code review criticism (Re: Converting to Git?))

2011-03-25 Thread Happy-melon

"Chad"  wrote in message 
news:AANLkTi=o1xrb3rnwbk3hvc5fki2xi9n+cs5qbh6nw...@mail.gmail.com...
> On Fri, Mar 25, 2011 at 10:30 AM, Ilmari Karonen  
> wrote:
>> On 03/24/2011 08:00 PM, Roan Kattouw wrote:
>>> * We need to set a clear policy for reverting problematic revisions
>>> (fixme's) if they aren't addressed quickly enough (again, let's say
>>> within a week). Currently we largely leave them be, but I think we
>>> should go back to something more decisive and closer to the "keep
>>> trunk runnable, or else Brion kicks your ass" paradigm and make it a
>>> bit more formal this time

>> I think it might be a good idea to split these two cases into separate
>> states.  My suggestion, off the top of my head, would be to leave
>> "fixme" for the latter and add a new "broken" status for the former.
>>
>
> +1 to everything Roan said and +1 to everything above.
>
> -Chad
>

Definite +1 to this as well.  I don't think that 'broken' commits should be 
immediately reverted **in the workflow we currently operate**; rather I 
think they should be given 12 or 24 hours (probably the latter) after 
notification to be fixed, followed by a policy of prompt reversion.  Reverting 
what might be a 99% successful change at the drop of a hat is the worst of 
all worlds.  If we have a clear policy then the revert makes a subtle shift 
from "aggressive" to "standard procedure", and we also have an upper bound 
on the time trunk can remain broken.

--HM
 



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] code review criticism (Re: Converting to Git?)

2011-03-24 Thread Happy-melon
I think Roan hits it on the nose.  Most of the problems Ashar and Neil raise 
are flaws in our code review process, not flaws in the tools we use *to do* 
code review.  I actually think that CodeReview works quite well, **for the 
system we currently use**.  I think many of us agree that, one way or 
another, *that system* has major flaws.

The fact that one discussion has quickly fragmented into fresh threads on 
*all* of the 'big three' (code review workflow, VCS, and release cycle) 
illustrates how intimately connected all these things are.  It makes no 
sense to choose a VCS which doesn't support our code review workflow; our 
code review is worthless if it does not support a coherent release cycle; 
and the release workflow (and the to-freeze-or-not-to-freeze question) has a 
dependency on the VCS infrastructure.

Ultimately, though, it's a mistake to think of any of these issues as 
technical questions: they are **social** problems.  We have to choose the 
*mindset* which works for us as individuals, as a group and as a charitable 
Foundation.  Currently our development mindset is of the Wild West: pretty 
much everyone works alone, on things which either interest them or which 
they are being paid to be interested in, and while everyone is responsible 
enough to fix their own bugs, our focus is on whatever we, individually, are 
doing rather than the finished product, because the product only *becomes* 
finished once every 6 months or so.  The only reasons now that we keep trunk 
broadly runnable are a) it makes it easier for us to continue our own 
development, and b) the TWN people shout at us whenever we break it.

I'm not, let me be clear, saying that said 'Wild West' mindset is at all a 
bad thing, it is very open and inclusive and it keeps us from the endless 
trivial discussions which lead to cynicism and then flames in more 
close-knit communities.  But as Roan says, it is *not* the only mindset, and 
the alternative is one which is more focussed at every stage on how changes 
affect a continuously-finished product.  We know the regime which is at the 
other end of the scale: the Linux kernel's universal pre-commit review, 
which I'm going to suggest we call the 'Burnt Offering' approach to coding 
as patches are worked, reworked, and inevitably reduced in number before 
being presented for divine approval.  That has clear advantages, in ensuring 
very high code quality and probably improving *everyone's* coding skills, 
but also the disadvantages Roan mentions.

The smoketest-trunk-every-week development model, which defies being given a 
crass analogy, is somewhere in the middle, and I think that's closer to 
where we need to be.  If we made an absolute policy of scapping to the WMF 
cluster once a week, every week, it would force a shift in our mindset 
(arguably a shift *back*), but not one that's seen as an artificial 
limitation.  No one will begrudge a release manager reverting changes on 
Tuesday afternoon which people agree will not be fixed in time for a 
Wednesday scap, while the same release manager spending Tuesday *not* 
merging changes for the exact same reason is seen in a much more negative 
light.  We retain people's ability to make rapid and immediate changes to a 
bleeding-edge trunk, but still ensure that we do not get carried away, as we 
did for 1.17 and are still merrily doing for 1.18, on a tide of editing 
which is not particularly focussed or managed (witness the fact that out of 
the 15,000 revisions in 1.17, we can point out only about three 'headline' 
features).

There are implementation questions to follow on from whichever workflow 
regime we move towards: for the weekly-scap process we need to find a 
replacement for Brion and his cluebat which is as reliable and efficient as 
he was; for a Linux-style system we need to sort out how to ensure that 
patches get the review that they need and that it doesn't just kill our 
development stone dead; and even to continue in the Wild West we need to 
sort out how to stop tracing out the Himalayas with the graph of unreviewed 
commits and actually get our damn releases out to prove that the system can 
work.  My main point is that *any* technical discussion, about SVN/Git, 
about CodeReview or its alternatives, even about Bugzilla/Redmine, is 
premature unless we have reached an adequate conclusion about the social 
aspects of this combined issue.  Because Git does not write code, nor does 
CodeReview or Bugzilla.  *We* write MediaWiki, and we could in principle do 
it in notepad or pico if we wanted (some of us probably do :-D).  The most 
important question is what will make us, as a group, more effective at 
writing cool software.  Answers on a postcard.

--HM
 



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] On HTML5

2011-03-22 Thread Happy-melon

"K. Peachey"  wrote in message 
news:aanlktinanrjcoho_0rac4amfs3gt98hkr0nz2yzpk...@mail.gmail.com...
> On Wed, Mar 23, 2011 at 2:25 AM, Max Semenik  
> wrote:
>> As the matter of fact, MediaWiki serves HTML5 by default. The only
>> reason why it is still not enabled on Wikipedia is backward
>> compatibility with numerous screen-scraping scripts/tools. However,
>> they had their last warning recently - HTML5 was briefly enabled a
>> couple of times and there's no guarantee that next time it will not
>> stick :D
> Was this written anywhere? where and what is the date?
>
> I thought it was the issue that broke access for certain IE users that
> resulted in it disabled last time. We have been telling people not
> to screen scrape for YEARS and to use the api instead where possible,
> We should just re-enable it and let them solve their own problems.

In a perfect bonsai model of the IPv6 implementation, the recent enabling of 
$wgHTML5 prompted a mass panic to rewrite TWINKLE, enwiki's flagship 
javascript anti-vandal-plus-hundreds-of-extra-random-bells-and-whistles 
package, which had sat essentially untouched throughout our many warnings. 
It's moderately amusing to watch, but to be fair to them it is a huge 
codebase and it's not unreasonable to give them a month or so to sort it 
out.  Most other wikis just copy from enwiki, so they're both the driving 
force, and the heel draggers... :-D

--HM 



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Converting to Git?

2011-03-22 Thread Happy-melon

"Chad"  wrote in message 
news:aanlktikrre_3o+pycjdx2+qil6zt3tpohqucodwe_...@mail.gmail.com...
> On Tue, Mar 22, 2011 at 12:11 PM, Siebrand Mazeland
>  wrote:
>
> Also, the
> comment about code review is also a point. Right now, CodeReview
> does not support Git, and really the implementation was never built
> with Git in mind. I think we could hack it in, but it wouldn't be pretty
> and if Git's the answer then I think we'll be leaving this tool in favor
> of something else (I and many others like Gerrit quite a bit).

To my mind, this is one of the most important points.  We have built up a 
very comprehensive infrastructure for code review in SVN, and there is a lot 
of manhours behind that work; and just as many hours associated with setting 
up a replacement system for git.  Who is going to put that infrastructure in 
place, in amongst all the many other priorities we have at the moment? 
Moving to a VCS which makes it "easier to review stuff" **in principle** is 
going to be of no use whatsoever if it sends the **practical 
implementation** of that review process back to the stone age.

--HM
 



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] RFC: MediaWiki Style Guide (Forms)

2011-03-10 Thread Happy-melon

"Chad"  wrote in message 
news:aanlktimlnfewazemtst4sv2ujexj-f-ujpf+srgoo...@mail.gmail.com...
> On Thu, Mar 10, 2011 at 1:50 PM, Neil Kandalgaonkar  
> wrote:
>> FYI, there are classes in MediaWiki like HTMLForm and HTMLFormField
>> which abstract a lot of form creation in MediaWiki already. They are not
>> used universally, but very very often.
>>
>
> Actually, HTMLForm isn't used all that much. It's fairly new, and
> only came around when Andrew redid the preference system not
> too long ago.

Indeed, it's currently only used in Special:ComparePages, 
Special:DisableAccount, Special:EmailUser, Special:Preferences and the new 
Special:Upload.  I've been one of the more prolific implementers of it since 
Andrew drew it up for the preferences rewrite in 1.16; I have it in a 
mostly-completed cleanout of the Augean stables that is Special:Blockip, and 
a rather unpolished rewrite of Special:UserLogin which uses it as well.

>> Those libraries are more useful for the technical details of forms (for
>> instance, preventing cross-site request forgery is built-in) but if you
>> want to standardize usability and style maybe you should start there.
>
> ...
>
> +1. I think this is a great idea, and helps enforce UI consistency
> rather than hoping people Get It Right.

+1 from me too.  I'm absolutely of the opinion that HTMLForm abstraction 
should be used much more widely; and this is one good reason of many. 
HTMLForm produces a form which has a clear and (most importantly) consistent 
structure, and also means we can apply style guidelines like these in one 
fell swoop.
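
As a flavour of what that consistency looks like from the caller's side, a 
minimal sketch on a special page -- the field names, message keys and 
submit callback are invented for illustration:

$form = new HTMLForm( array(
    'Target' => array(
        'type' => 'text',
        'label-message' => 'example-target-label',   // invented key
        'required' => true,
    ),
    'Reason' => array(
        'type' => 'selectandother',
        'label-message' => 'example-reason-label',   // invented key
        'options-message' => 'ipbreason-dropdown',
    ),
) );
$form->setTitle( $this->getTitle() );  // the page the form submits to
$form->setSubmitCallback( array( $this, 'onSubmit' ) );  // invented method
$form->show();

Token handling, validation and layout then come out identical for every 
form built this way.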

> An example of a UI element
> I thought of recently that could be used in lots of places and add
> a level of usability would be an auto-suggesting text field for places
> you type in a username (Email, blocking, dozens of other places).

I've been thinking about that too; it would be completely invaluable.  One 
thing I've been slowly working towards is a JS module for htmlforms, which 
is now in existence (/resources/mediawiki/mediawiki.htmlform.js).  HTMLForm 
is highly extensible by creating new HTMLFormField subclasses; for the 
Upload form btongminh created a field for the file-on-your-computer input, 
with various bells and whistles attached for previewing, client-side 
validation, etc.  A user field could very happily be implemented in exactly 
the same way, indeed probably should.  I fairly recently introduced a field 
type for 
"select-dropdown-created-from-a-parsed-system-message-plus-optional-additional-reason-text-field"
 
like we see on the deletion/protection/upload/block forms, in much the same 
vein.

--HM

 



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Easy code review

2011-02-28 Thread Happy-melon

"Platonides"  wrote in message 
news:ikh9dc$9cs$1...@dough.gmane.org...
> With no pressing timelines, we are slacking off again.

Then let's get a new deadline in place.  What's holding us back from 
timetabling a 1.17wmf2??  It strikes me that the features that will appear 
in that are fundamentally different from the blockers on the 1.17.0 tarball, 
and as you say, all the focus seems to be on that at the expense of 
attention to bleeding-edge code.  1.17 is feature-frozen now, we know what's 
going into it and what work needs to be done, and we have an implicit 
deadline (ie ASAP) for getting it finished.  A new WMF release deadline in 
early- to mid-March would give us a much-needed incentive to wage war once 
again against the CR backlog, and move further towards the 
continuous-release-to-wmf model that I think we're pretty much unanimously 
in support of.

--HM 



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Changing CentralAuth login - New proposal

2011-02-19 Thread Happy-melon

"K. Peachey"  wrote in message 
news:AANLkTikYX-NjZowR2mTp3Cbc1mP3GCM73BFA8=gju...@mail.gmail.com...
> On Sun, Feb 20, 2011 at 1:20 AM, Platonides  wrote:
>> Create a multilingual wiki for handling logins, available as
>> https://login.wikimedia.org/
>> https://login.wikipedia.org/ https://login.wikiversity.org/ etc.
> I would recommend users.*, that way in the future when we have decent
> interwiki transclusion we have somewhere to stick user pages etc, as
> well providing somewhere sensible for global user preferences as well.

or global.* ??

--HM 



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Best way to track RELEASE-NOTES changes each svn update?

2011-01-23 Thread Happy-melon

 wrote in message news:87mxmwfie4.fsf...@jidanni.org...
> C> On Sun, Jan 16, 2011 at 7:16 PM, Happy-melon  
> wrote:
>>> Isn't that what release notes are for?
>
> Say, how do you pros see what changed?
> Here is my extra stupid way. I do it every few weeks.
>
> cp RELEASE-NOTES /tmp
> svn update
> diff --ignore-space-change -U0 /tmp RELEASE-NOTES
>
> Often old lines are shown again, because somebody tidied their
> formatting. A svn diff wouldn't be any better.
>
> The worst thing is each 1.17 to 1.18, 1.18 to 1.19 change, when the
> whole file changes.
>
> So how do you folks track RELEASE-NOTES level changes (not source code
> level changes, too many), coinciding with your SVN updates?

One generally gets information out of a release notes file by reading it. 
The purpose of release notes is to be a list of changes between versions; if 
I want to upgrade from 1.15 to 1.17, I read the 1.16 and 1.17 release notes, 
and I know all the changes I should be expecting.  Why would I want the 1.16 
release notes to contain changes which do not affect 1.16?

If you are determined to deploy bleeding-edge code on production sites, you 
will need to be a little more adventurous in getting hold of the latest 
changes.  You can go to [1] and select the latest revision, and the last 
revision you checked out, and read the diff in a slightly prettier format. 
Pretty much by definition, there isn't a nicer way of reading them.

--HM

[1] http://svn.wikimedia.org/viewvc/mediawiki/trunk/phase3/RELEASE-NOTES
 



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

