[Wikitech-l] Testing via GitHub actions

2020-07-07 Thread Jeroen De Dauw
Hey,

Has anyone created a GitHub action to run tests for a MediaWiki extension
yet?

I'd like to run the PHPUnit tests for some of my extensions via GitHub
actions. Unfortunately this is not as simple as in the typical PHP project,
since doing composer install + vendor/bin/phpunit does not cut it. It'd be
fantastic if someone already created a clean action to do this that I can
copy.
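
In case it helps anyone sketch one: the general shape such a workflow would need (a rough, untested sketch on my part; the branch name, PHP version, and extension name are placeholders) is to check out MediaWiki core, put the extension under extensions/, install the wiki, and run PHPUnit from core:

```yaml
# .github/workflows/ci.yml — hypothetical sketch, not a tested action
name: Test
on: [push, pull_request]
jobs:
  phpunit:
    runs-on: ubuntu-latest
    steps:
      - uses: shivammathur/setup-php@v2
        with:
          php-version: '7.3'
      - name: Check out MediaWiki core
        run: git clone --depth 1 --branch REL1_34 https://gerrit.wikimedia.org/r/mediawiki/core mediawiki
      - name: Check out the extension into core
        uses: actions/checkout@v2
        with:
          path: mediawiki/extensions/MyExtension
      - name: Install MediaWiki
        run: |
          cd mediawiki
          composer install
          php maintenance/install.php --dbtype sqlite --dbpath "$PWD" --pass AdminPass12345 TestWiki admin
          echo "wfLoadExtension( 'MyExtension' );" >> LocalSettings.php
      - name: Run extension PHPUnit tests
        run: cd mediawiki && php tests/phpunit/phpunit.php extensions/MyExtension/tests/phpunit/
```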

Best

--
Jeroen De Dauw | www.EntropyWins.wtf <https://EntropyWins.wtf>
Professional wiki hosting and services: www.Professional.Wiki
<https://Professional.Wiki>
Entrepreneur | Software Architect | Open Source | Longtermism
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] What is "revision slot" and why is it necessary to specify it now?

2020-04-06 Thread Jeroen De Dauw
Hey,

Related question: is it possible to edit existing slots and add new ones
via the API?

Cheers

--
Jeroen De Dauw | www.EntropyWins.wtf <https://EntropyWins.wtf>
Professional wiki hosting and services: www.Professional.Wiki
<https://Professional.Wiki>
Entrepreneur | Software Crafter | Open Source | Wikimedia | Speaker
~=[,,_,,]:3

Re: [Wikitech-l] Running JS for parser function preview in Visual Editor

2020-04-05 Thread Jeroen De Dauw
Hey Szabó,

Thanks for the pointers - much appreciated.

My impression is that the approach you are talking about is great for
creating a high level of integration with Visual Editor. I've looked at
Kartographer, which is doing something similar. However, at this point I just
want the map JS to be executed appropriately. Writing 2000+ lines of JS for
this is not reasonable. Probably I don't need quite that much, but it is
hard to tell. It certainly seems like I'd end up with way more code and
coupling to VE than I have now, making my current hack preferable. Or is
registering minimal DM/CE stuff really short and decoupled?

Cheers

--
Jeroen De Dauw | www.EntropyWins.wtf <https://EntropyWins.wtf>
Professional wiki hosting and services: www.Professional.Wiki
<https://Professional.Wiki>
Entrepreneur | Software Crafter | Open Source | Wikimedia | Speaker
~=[,,_,,]:3

[Wikitech-l] Running JS for parser function preview in Visual Editor

2020-04-04 Thread Jeroen De Dauw
Hey,

I am maintainer of the Maps extension for MediaWiki.
https://github.com/JeroenDeDauw/Maps#maps

Recently I worked on improving integration with Visual Editor. Maps used to
not render, showing up as just a grey box [0], since their initialization
JavaScript was not being run. I got things working but am not happy with
the solution and hope there is a better approach.

After some searching I found that if I ran the initialization code in a
handler to the ve.activationComplete hook, the maps would get initialized
as desired [1]. That is not a complete solution though, since you can edit
the maps with Visual Editor [2]. This causes an API request that parses the
new wikitext to HTML, which then replaces the old HTML of the initialized
map. So initialization needs to happen again.

An hour or two of searching through the docs and Visual Editor code did not
yield any usable hook/event. (This was quite surprising to me. Has no one
ever done something similar to what I am doing here?) So I settled on just
running the initialization once every second with setInterval(). I found the
ve.deactivationComplete hook, which I then used to stop the code running
every second upon saving and exiting the visual editor. It turns out that
reopening the visual editor after that does not fire a new
ve.activationComplete event, so I had to remove the termination of the
repeating initialization.

Surely there is a better way than running my code once per second (ad
infinitum) starting with Visual Editor activation? The perfect event for my
use case would be "visual editor rendered a parser function". A broader
event (i.e. "something got rendered") would still be better than nothing.

You can see my current code here [3], including a few commented out bits
which I left in so you can see some of what did not work.

[0]
https://user-images.githubusercontent.com/146040/78461765-18d15780-76cc-11ea-82cd-eb69d0179fd7.png
[1]
https://user-images.githubusercontent.com/146040/78461769-21299280-76cc-11ea-9461-a2c343062482.png
[2]
https://user-images.githubusercontent.com/146040/78461779-369ebc80-76cc-11ea-9495-4a91a24a.png
[3]
https://github.com/JeroenDeDauw/Maps/blob/7.17.1/resources/leaflet/LeafletLoader.js#L24-L45
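
For illustration, the approach described above boils down to something like the following (a simplified sketch, not the actual Maps code; `initializeMaps` stands in for the extension's real Leaflet initialization, and `mw` is passed in only to keep the helper testable outside a MediaWiki environment):

```javascript
// Sketch of the workaround described above: once Visual Editor activates,
// re-run map initialization every second, since no hook fires when VE
// re-renders a parser function.
function startRepeatingInitialization( mw, initializeMaps, intervalMs ) {
	var timer = null;

	mw.hook( 've.activationComplete' ).add( function () {
		if ( timer === null ) {
			timer = setInterval( initializeMaps, intervalMs );
		}
	} );

	// Returns a stop function. As described above, calling this from a
	// ve.deactivationComplete handler breaks re-activation, so in practice
	// the interval ends up never being terminated.
	return function stop() {
		if ( timer !== null ) {
			clearInterval( timer );
			timer = null;
		}
	};
}
```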

Cheers

--
Jeroen De Dauw | www.EntropyWins.wtf <https://EntropyWins.wtf>
Professional wiki hosting and services: www.Professional.Wiki
<https://Professional.Wiki>
Entrepreneur | Software Crafter | Open Source | Wikimedia | Speaker
~=[,,_,,]:3

[Wikitech-l] Resource Loader messes up Leaflet CSS

2020-03-31 Thread Jeroen De Dauw
Hey,

When using the Maps extension together with MediaWiki 1.34 or later the
markers on Leaflet maps do not show correctly.

Example:
https://user-images.githubusercontent.com/146040/78082503-7b1c1680-73b3-11ea-8c15-28552363a7f4.png

Issue thread: https://github.com/JeroenDeDauw/Maps/issues/607

They show fine on older versions of MediaWiki, though only if Maps loads
the leaflet.css file by stuffing an Html::linkedStyle() into the header on
top of using the Resource Loader. Without that hack the bug also shows up
on older MediaWiki versions.

As you can deduce from that hack working, Resource Loader is somehow
involved. This can be confirmed by setting $wgResourceLoaderDebug to true,
which causes the bug to disappear. I tracked this down a bit and found that
this modification "fixes" the bug as well:
https://github.com/JeroenDeDauw/MediaWiki/commit/1713ccde9de7d59634b1a134c58ee3c84ba01642

Without knowing how the Leaflet library is being broken or knowing Resource
Loader internals, this is rather hard to track down further. Any help is
much appreciated.

Some relevant code:

* Resource loader module definition:
https://github.com/JeroenDeDauw/Maps/blob/master/extension.json#L111
* leaflet.css:
https://github.com/JeroenDeDauw/Maps/blob/master/resources/lib/leaflet/leaflet.css

Cheers

--
Jeroen De Dauw | www.EntropyWins.wtf <https://EntropyWins.wtf>
Professional wiki hosting and services: www.Professional.Wiki
<https://Professional.Wiki>
Entrepreneur | Software Crafter | Open Source | Wikimedia | Speaker
~=[,,_,,]:3

Re: [Wikitech-l] Storing some values in the DB

2020-01-29 Thread Jeroen De Dauw
Hey,

> Sounds like you just want a simple key value store that.
> I don't believe mediawiki has a table or interface for this.
>

Indeed.


> Slightly relevant would be https://phabricator.wikimedia.org/T227776
> although
> that RFC will likely come up with something that is in some way bound to a
>

The ticket Kosta Harlan linked before (
https://phabricator.wikimedia.org/T128602) seems more relevant. Even though
an extension providing this functionality would not be of much help for my
use case, unless it came bundled with core. Requiring an extra extension
that needs to be installed using update.php just to avoid needing to run
update.php for my extension does not make any sense.

If you are overwhelmed with SQL or mediawiki migration system please just
> ask for help
>

Thank you Jaime but I think I can manage ;)

Cheers

--
Jeroen De Dauw | www.EntropyWins.wtf <https://EntropyWins.wtf>
Professional wiki hosting and services: www.Professional.Wiki
<https://Professional.Wiki>
Software Crafter | Speaker | Entrepreneur | Open Source and Wikimedia
contributor ~=[,,_,,]:3

Re: [Wikitech-l] Storing some values in the DB

2020-01-27 Thread Jeroen De Dauw
Hey,

> Why are you so reluctant to create a table?

It complicates extension installation and adds yet another thing to
maintain, which is a bit silly for a handful of values. I was hoping
MediaWiki would already have a generic table to put such things in.
Page_props is exactly that, just bound to specific pages.

Cheers

--
Jeroen De Dauw | www.EntropyWins.wtf <https://EntropyWins.wtf>
Professional wiki hosting and services: www.Professional.Wiki
<https://Professional.Wiki>
Software Crafter | Speaker | Entrepreneur | Open Source and Wikimedia
contributor ~=[,,_,,]:3

Re: [Wikitech-l] Storing some values in the DB

2020-01-27 Thread Jeroen De Dauw
Hey Kosta,

The values in my extension are not user specific. It is really just a
handful of values for the entire wiki.

One approach I'm considering is to abuse the page props table and have the
values "linked to page" with ID 1 billion (or -1) or so. I'm not sure how
safe that is though. The data would only be stored in the database, so it
is important that it not get lost when some maintenance process runs.

Cheers

--
Jeroen De Dauw | www.EntropyWins.wtf <https://EntropyWins.wtf>
Professional wiki hosting and services: www.Professional.Wiki
<https://Professional.Wiki>
Software Crafter | Speaker | Entrepreneur | Open Source and Wikimedia
contributor ~=[,,_,,]:3


On Mon, 27 Jan 2020 at 12:30, Kosta Harlan  wrote:

> Hi,
>
> On 1/27/20 12:13 PM, Jeroen De Dauw wrote:
> > Hey,
> >
> > I have a MediaWiki extension in which I'd like to store a handful of
> > values. Example:
> >
> > * Cats 42
> > * Ravens 23
> > * Goats 0
> >
> > The most obvious thing to do would be to add a new database table with a
> > string and an integer column. However I'd like to avoid the need to run
> > update.php to install this extension. Is there some place in the
> MediaWiki
> > DB schema where I can put a few values like these? They could even be in
> a
> > single serialized blob (though preferably can be looked up by string
> key).
>
> You want to store these values per-user, I'm assuming? This is probably
> of interest then: https://phabricator.wikimedia.org/T128602
>
> In GrowthExperiments extension we store small bits of data like that as
> serialized blobs in hidden user preferences, but this isn't great, among
> other reasons because the values are delivered on each page load
> whether you want them or not.
>
> Kosta
>
> >
> > Cheers
> >
> > --
> > Jeroen De Dauw | www.EntropyWins.wtf <https://EntropyWins.wtf>
> > Professional wiki hosting and services: www.Professional.Wiki
> > <https://Professional.Wiki>
> > Software Crafter | Speaker | Entrepreneur | Open Source and Wikimedia
> > contributor ~=[,,_,,]:3
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> >
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Storing some values in the DB

2020-01-27 Thread Jeroen De Dauw
Hey,

I have a MediaWiki extension in which I'd like to store a handful of
values. Example:

* Cats 42
* Ravens 23
* Goats 0

The most obvious thing to do would be to add a new database table with a
string and an integer column. However I'd like to avoid the need to run
update.php to install this extension. Is there some place in the MediaWiki
DB schema where I can put a few values like these? They could even be in a
single serialized blob (though preferably can be looked up by string key).

Cheers

--
Jeroen De Dauw | www.EntropyWins.wtf <https://EntropyWins.wtf>
Professional wiki hosting and services: www.Professional.Wiki
<https://Professional.Wiki>
Software Crafter | Speaker | Entrepreneur | Open Source and Wikimedia
contributor ~=[,,_,,]:3

Re: [Wikitech-l] WebTestCase

2019-10-02 Thread Jeroen De Dauw
Hey,

Your best bet might be making actual web requests. You can use Selenium
> from PHPUnit, ...
>

Is there an example of a test doing either of those?

The web request case is apparently not as simple as stuffing
Title::newFromText()->getCanonicalURL() into Http::get(), and I'd rather
not build an abstraction for this myself, as it requires a bunch of MW
knowledge I do not have.

Therefore, you must call the "controller" directly from your test.
>

What is this "controller" in MediaWiki context? Do you have any example of
how this would work?

Cheers

--
Jeroen De Dauw | www.EntropyWins.wtf <https://EntropyWins.wtf> |
www.Professional.Wiki <https://Professional.Wiki>
Entrepreneur | Software Crafter | Speaker | Open Source and Wikimedia
contributor
~=[,,_,,]:3

Re: [Wikitech-l] WebTestCase

2019-10-02 Thread Jeroen De Dauw
Hey,

> I think selenium is often used for that use case

I'm explicitly looking for a PHPUnit based approach. Similar to what
Symfony provides. Surely this is possible with MediaWiki, though perhaps
not as elegantly?

Cheers

--
Jeroen De Dauw | www.EntropyWins.wtf <https://EntropyWins.wtf> |
www.Professional.Wiki <https://Professional.Wiki>
Entrepreneur | Software Crafter | Speaker | Open Source and Wikimedia
contributor
~=[,,_,,]:3

Re: [Wikitech-l] WebTestCase

2019-10-02 Thread Jeroen De Dauw
Hey,

I want to test against the HTML actually sent to the browser, not some
random intermediate step.

Example. Let's test this code:
https://github.com/JeroenDeDauw/Maps/blob/ccc85caf8bfb5becb6dd0d0035c1b582a7670d00/src/MediaWiki/MapsHooks.php#L93-L100

The code adds some special HTML to missing pages in the GeoJson namespace.
Now we want to make sure this HTML is not showing up on pages in the main
namespace. So we want to get the HTML for /wiki/ThisPageDoesNotExist.
OutputPage->getHtml() is going to return some empty response since this
page has no content, so it is not what we need.

Cheers

--
Jeroen De Dauw | www.EntropyWins.wtf <https://EntropyWins.wtf> |
www.Professional.Wiki <https://Professional.Wiki>
Entrepreneur | Software Crafter | Speaker | Open Source and Wikimedia
contributor
~=[,,_,,]:3

Re: [Wikitech-l] WebTestCase

2019-10-02 Thread Jeroen De Dauw
Hey,

> Sort of.  Once you have the HTML, you can assert various HTML matches and
non-matches using Hamcrest extensions

That's great, but how do I get the HTML in the first place?

Cheers

--
Jeroen De Dauw | www.EntropyWins.wtf <https://EntropyWins.wtf> |
www.Professional.Wiki <https://Professional.Wiki>
Entrepreneur | Software Crafter | Speaker | Open Source and Wikimedia
contributor
~=[,,_,,]:3

[Wikitech-l] WebTestCase

2019-10-01 Thread Jeroen De Dauw
Hey,

Does MediaWiki have something similar to Symfony's WebTestCase? (
https://symfony.com/doc/current/testing.html#functional-tests)

I want to write some integration tests in the form of "does web page
/wiki/MyWikiPage contain HTML snippet XYZ".

Cheers

--
Jeroen De Dauw | www.EntropyWins.wtf <https://EntropyWins.wtf> |
www.Professional.Wiki <https://Professional.Wiki>
Entrepreneur | Software Crafter | Speaker | Open Source and Wikimedia
contributor
~=[,,_,,]:3

Re: [Wikitech-l] PLURAL in mw.msg

2019-10-01 Thread Jeroen De Dauw
Hey Bawolff,

Including mediawiki.jqueryMsg fixed the issue. Thanks a bunch!

I concur that this is a rather peculiar design. It'd be helpful to have
some indication of this in the method docs of the mw.msg() code.
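
For others hitting the same thing: the fix boils down to listing mediawiki.jqueryMsg as a dependency of whichever Resource Loader module calls mw.msg(). A sketch of the relevant extension.json fragment (the module, script, and message names here are placeholders):

```json
{
	"ResourceModules": {
		"ext.maps.editing": {
			"scripts": [ "editSummary.js" ],
			"dependencies": [ "mediawiki.jqueryMsg" ],
			"messages": [ "my-message-key" ]
		}
	}
}
```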

Cheers

--
Jeroen De Dauw | www.EntropyWins.wtf <https://EntropyWins.wtf> |
www.Professional.Wiki <https://Professional.Wiki>
Entrepreneur | Software Crafter | Speaker | Open Source and Wikimedia
contributor
~=[,,_,,]:3

[Wikitech-l] PLURAL in mw.msg

2019-09-29 Thread Jeroen De Dauw
Hey,

I have this message: Removed $1 {{PLURAL:$1|shape|shapes}}

This message is used in JavaScript via mw.msg() to create an edit summary.
Unfortunately this summary is showing up without the PLURAL function being
rendered:
https://user-images.githubusercontent.com/146040/65845180-6cd43b80-e339-11e9-9fb3-12642a835aa0.png

The docs state that PLURAL is supported in JS:
https://www.mediawiki.org/wiki/Manual:Messages_API#Feature_support_in_JavaScript.
I've also tried mw.message('my-key').parse() and some variants of that,
though these all gave me the same result.

Why is this not working and how can I fix it?

Cheers

--
Jeroen De Dauw | www.EntropyWins.wtf <https://EntropyWins.wtf> |
www.Professional.Wiki <https://Professional.Wiki>
Entrepreneur | Software Crafter | Speaker | Open Source and Wikimedia
contributor
~=[,,_,,]:3

[Wikitech-l] mediawiki.api.edit

2019-09-24 Thread Jeroen De Dauw
Hey,

Yesterday I released a new feature in the Maps extension that uses the
mediawiki.api.edit resource loader module.

I developed this against MediaWiki 1.31 (the latest LTS and minimum version
of MediaWiki required by Maps). Today I found out that things are broken on
MediaWiki 1.33.0. Apparently the mediawiki.api.edit resource loader module
got removed.

What is the deprecation policy for such modules? I was rather surprised
that a module with no sign of deprecation is gone 2 releases later, without
any form of fallback.

And more importantly, how can I now make my code work with both MW 1.31 and
1.33? I see that on MW master, the relevant JS file is now part of the
mediawiki.api module. I could include both modules, which would work fine
on MW 1.31, though on 1.33 I'd still have a reference to a module that does
not exist. And since my own module definition is in JSON (extension.json),
I can't even check which MW version I'm dealing with (unless there is some
special syntax for this). I'm not seeing a sane solution that I can use
here, so help is much appreciated.
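
One workaround I can think of (an untested sketch, not something from the thread): skip declaring the dependency in extension.json and resolve the module name at runtime, relying on mw.loader.getState() returning null for modules that are not registered:

```javascript
// Pick whichever API module the running MediaWiki provides: 1.31 registers
// 'mediawiki.api.edit', while on 1.33+ that code lives in 'mediawiki.api'.
function pickApiEditModule( loader ) {
	return loader.getState( 'mediawiki.api.edit' ) !== null ?
		'mediawiki.api.edit' :
		'mediawiki.api';
}

// Hypothetical usage from the extension's own JS, instead of a static
// extension.json dependency:
// mw.loader.using( pickApiEditModule( mw.loader ) ).then( function () {
//     // new mw.Api() is now available on both 1.31 and 1.33
// } );
```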

Cheers

--
Jeroen De Dauw | www.EntropyWins.wtf <https://entropywins.wtf> |
www.Professional.Wiki <https://professional.wiki/>
Software Crafter | Entrepreneur | Speaker | Strategist | Contributor
to Wikimedia
and Open Source
~=[,,_,,]:3

Re: [Wikitech-l] PHPUnit 6+ on older versions of MW

2018-05-15 Thread Jeroen De Dauw
Hey Legoktm,

This is for the Maps extension, though I will also need to deal with this
elsewhere.

Failing tests: https://travis-ci.org/JeroenDeDauw/Maps/builds/375344161

Code: https://github.com/JeroenDeDauw/Maps

It'd be ideal to have the older versions of MediaWiki use PHPUnit 6.x or
later, but I've not found a way to do so yet.

Cheers

--
Jeroen De Dauw | https://entropywins.wtf | https://keybase.io/jeroendedauw
Software Crafter | Speaker | Student | Strategist | Contributor to Wikimedia
and Open Source
~=[,,_,,]:3

[Wikitech-l] PHPUnit 6+ on older versions of MW

2018-05-12 Thread Jeroen De Dauw
Hey,

I updated one of my extensions to use PHPUnit 6+ and am having trouble
getting the tests to run with older versions of MediaWiki. In particular,
I'd like my tests to run with MW 1.27. I suspect this is somehow possible
and has already been done in other extensions.

Cheers

--
Jeroen De Dauw | https://entropywins.wtf | https://keybase.io/jeroendedauw
Software Crafter | Speaker | Student | Strategist | Contributor to Wikimedia
and Open Source
~=[,,_,,]:3

Re: [Wikitech-l] HHVM vs. Zend divergence

2017-09-18 Thread Jeroen De Dauw
Hey,

Going with option 2 would be a rather big f***-*** to the non-WMF part of
the MediaWiki community. Furthermore, it would limit usage of common PHP
libraries, which sooner or later will end up using features not available
in HHVM. This, together with the already outlined reasons, makes me concur
with HHVM not being a viable long term option.

If only WMF was running PHP 7.1 already... guess I'll have more luck asking
for a pony though :)

Cheers

--
Jeroen De Dauw | https://entropywins.wtf | https://keybase.io/jeroendedauw
Software craftsmanship advocate
~=[,,_,,]:3

Re: [Wikitech-l] use of @inheritdoc

2017-05-19 Thread Jeroen De Dauw
Hey,

Personally I'm not using this in my projects and would object to those tags
being added as they are IMO clutter. Those projects have some relevant
differences with MediaWiki that all contribute to those tags being more
clutter than helpful:

* Good OO: small classes and interfaces, favoring of composition over
inheritance and thus very rarely inheriting concrete code
* Simple, well-named methods with no or few parameters: documentation
explaining something beyond type is seldom needed
* Usage of scalar type hints, nullable type hints and return type hints via
usage of PHP 7.1

And of course following any of those points has benefits way beyond not
needing inheritdoc tags as a crutch.

Cheers

--
Jeroen De Dauw | https://entropywins.wtf | https://keybase.io/jeroendedauw
Software craftsmanship advocate
~=[,,_,,]:3

Re: [Wikitech-l] Rewrite of the Wikimedia Deutschland fundraising

2016-12-02 Thread Jeroen De Dauw
Hey all,

Apparently the links I sent are broken in some email clients. These ones
should be OK:

[0]
https://www.entropywins.wtf/blog/2016/11/24/rewriting-the-wikimedia-deutschland-funrdraising/
[1]
https://www.entropywins.wtf/blog/2016/11/24/implementing-the-clean-architecture/

Cheers

--
Jeroen De Dauw | https://entropywins.wtf | https://keybase.io/jeroendedauw
Software craftsmanship advocate | Developer at Wikimedia Germany
~=[,,_,,]:3

[Wikitech-l] Rewrite of the Wikimedia Deutschland fundraising

2016-12-01 Thread Jeroen De Dauw
Hey all,

At Wikimedia Deutschland, we rewrote our fundraising application [0]. For
this rewrite we took an approach rather different from what is typically
done in the Wikimedia-verse [1]. It is my hope that these differences and
the lessons we learned are of interest to some on this list.

[0] https://www.entropywins.wtf/blog/2016/11/24/rewriting-the-
wikimedia-deutschland-funrdraising/
[1] https://www.entropywins.wtf/blog/2016/11/24/implementing-
the-clean-architecture/

Cheers

--
Jeroen De Dauw | https://entropywins.wtf | https://keybase.io/jeroendedauw
Software craftsmanship advocate | Developer at Wikimedia Germany
~=[,,_,,]:3

Re: [Wikitech-l] Crediting the work of our community members

2016-07-06 Thread Jeroen De Dauw
Hey,

No @author tags? But then we won't have treasures like this
https://twitter.com/JeroenDeDauw/status/750841758733463552 :(

Do you not somehow need them for the license info? Apart from that I can't
think of a good reason to add them. Finding out who wrote some code is what
git blame is for, and attribution can be done on a higher level.

Cheers

--
Jeroen De Dauw | https://entropywins.wtf | https://keybase.io/jeroendedauw
Software craftsmanship advocate | Developer at Wikimedia Germany
~=[,,_,,]:3

Re: [Wikitech-l] Versions of libraries used by MW core

2015-07-25 Thread Jeroen De Dauw
Hey,

Not all package maintainers follow semver perfectly. For example, the
> upgrade from monolog 1.11.0 -> 1.12.0 had a backwards compatibility
> break[1] which would have broken our logging if we had used "~1.11" in
> composer.json.
>

That is true. Often it's not pragmatic to follow the rules 100%. There'd
have been no issue if the range used had been "~1.11.0". What about the
libraries that are part of the MediaWiki project itself? Supposedly we can
trust those. If we can't, that seems like a bigger problem to begin with.
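
For readers less familiar with Composer constraint syntax, the distinction being made here is (per Composer's documented tilde semantics):

```
"~1.11"   allows >=1.11.0 <2.0.0   (1.12.0, with its BC break, is permitted)
"~1.11.0" allows >=1.11.0 <1.12.0  (bugfix releases only)
"1.11.0"  pins exactly 1.11.0      (what core's composer.json does)
```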

Normally people do this by putting ranges in the composer.json and
> commiting the composer.lock file to pin to a specific version, but that
> would prevent people from adding arbitrary dependencies to MW for
> extensions due to a dirty composer.lock file ([2], etc.)...so we just
> put the specific versions in composer.json instead.
>

Unfortunately those things are not equivalent. If you use a composer.lock,
one can still run composer update. That is not only needed when one wants
to get bugfixes. Imagine you want to install a MediaWiki extension that
requires version "^1.0.1" of some library while MediaWiki requires "1.0.0".
You end up not being able to install the extension, since MediaWiki's
composer.json says "no, you can't use that bugfix". That seems like a huge
usability fail to me. Am I missing something?

If there are bugfixes in libraries that affect MediaWiki, we
> should backport library updates just like any other bug fix that is
> backported.
>

I hope this was meant to say "that affect *the people using* MediaWiki".

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Developer at Wikimedia Germany
~=[,,_,,]:3

[Wikitech-l] Versions of libraries used by MW core

2015-07-24 Thread Jeroen De Dauw
Hey all,

I just noticed that MediaWiki specifies specific versions of the libraries
it uses in its composer.json:
https://github.com/wikimedia/mediawiki/blob/ca56f1fbc32529622cf430fe2ed44347b85e3c24/composer.json#L19-L31

To me this is somewhat surprising and not something I've seen often. Why
are bugfix releases excluded from the version ranges? And is it really a
good idea considering it causes the users of the latest stable MediaWiki
release to download outdated versions of various libraries?
https://github.com/wikimedia/mediawiki/blob/REL1_25/composer.json#L19-L29

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Developer at Wikimedia Germany
~=[,,_,,]:3

Re: [Wikitech-l] Adding extensions under packagist's "mediawiki" vendor

2015-07-21 Thread Jeroen De Dauw
Hey Bryan,

What exactly justifies such an authoritarian "need to go through some
permission process" setup? Exactly what problems are we currently seeing?
I'm very sceptical about such an approach. Sure, you can say things such as
that it'd be nice for other people to have access. The reality is that most
people don't care about most extensions and that a lot of them end up being
unmaintained and very low quality to begin with. Telling volunteers they
should follow a process they do not want to follow, and that they should
use a code hosting service they do not want to use, has its downsides. This
was also not done in the past: you did not need approval to create a
"certified MediaWiki extension" or something like that.

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Developer at Wikimedia Germany
~=[,,_,,]:3

Re: [Wikitech-l] wfSuppressWarnings deprecation

2015-06-22 Thread Jeroen De Dauw
Hey,

Thanks for explaining.

I did notice the code now resides in a separate library. Congratulations on
that work. I can see how this provides benefit.

It is also clear that naming it wfSuppressWarnings there does not make a
lot of sense. What is not clear to me is why existing MediaWiki extensions
that are completely dependent on MediaWiki should now preferably call the
library functions directly. Why do they have to care that this refactoring
has been made? If you are writing some new code and are trying to decouple
it from MediaWiki, and it needs this functionality, then by all means, use
the namespaced one. For existing extensions where such decoupling is not
going to happen, I see only cost, not benefit.

I'm not suggesting that the library should not be there, or that the
namespacing in it does not make sense. My suggestion is to not deprecate
wfSuppressWarnings.

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Developer at Wikimedia Germany
~=[,,_,,]:3

[Wikitech-l] wfSuppressWarnings deprecation

2015-06-21 Thread Jeroen De Dauw
Hey all,

This thread is much in line with the "wfRunHooks deprecation" one from
January. Rather than turning global functions into static functions, this
time it's about namespacing global functions.

All extensions calling wfSuppressWarnings now are supposed to change this
to MediaWiki\suppressWarnings, for no obvious gain. Important to keep in
mind here is that this is not a simple search and replace, since that'd
make extensions incompatible with anything before MediaWiki 1.26 alpha.
Either we need to ignore the deprecations (which is not something you want
people to learn as good practice), or we need to add some kind of wrapper
in each extension.

There also is the question of consistency. Nearly all global functions are
still namespaced using the wf prefix. Will they all be changed? Or will
just a few functions be migrated?

I'd really prefer this kind of busywork for extension maintainers not to be
created without very good reason. There are enough breaking changes to keep
up with as it is.

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Developer at Wikimedia Germany
~=[,,_,,]:3

[Wikitech-l] Using "," as decimal separator in #expr

2015-06-14 Thread Jeroen De Dauw
Hey,

I'm using the #expr parser function provided by the Parser Functions
extension. I'd like it to use "," as the decimal separator, but it
currently uses ".". Is there a nicer way to change this than installing
String Functions and doing a replace?

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Developer at Wikimedia Germany
~=[,,_,,]:3

[Wikitech-l] Please fix: breaking change for third parties

2015-05-27 Thread Jeroen De Dauw
Hey,

I'd like to draw some additional attention to the fact that the MediaWiki
release tags and branches have disappeared from the git repo mirror on
GitHub.

https://phabricator.wikimedia.org/T100409

Having this disruption for anyone who relies on this mirror be addressed in
a timely manner would be great.

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Developer at Wikimedia Germany
~=[,,_,,]:3

Re: [Wikitech-l] Wikimedia group on GitHub

2015-05-27 Thread Jeroen De Dauw
Hey,

I've noticed that it adds some of my commits to Wikimedia repos to my
> calendar and sometime it doesn't.
>

If you commit using different git email/name/whatever, then GitHub will not
recognize that as you unless you first do the needed config steps. This is
true for all such git using services that I am aware of.

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Developer at Wikimedia Germany
~=[,,_,,]:3

Re: [Wikitech-l] server-side HTML templating now live in core

2015-02-25 Thread Jeroen De Dauw
Hey,

Cool to see MediaWiki move in this direction!

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Evil software architect at Wikimedia Germany
~=[,,_,,]:3

[Wikitech-l] My experiences with PHPCS and PHPMD

2015-02-20 Thread Jeroen De Dauw
Hey,

I've written some stuff about my experiences with PHPCS and PHPMD in the
context of MediaWiki development, which might be of interest to people here.

http://www.bn2vs.com/blog/2015/02/20/phpcs-and-phpmd-my-experiences/

Also, in case you want to see how easy it is to add these tools to a
MediaWiki extension, see
https://github.com/SemanticMediaWiki/SemanticMetaTags/pull/25/files

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Evil software architect at Wikimedia Germany
~=[,,_,,]:3

Re: [Wikitech-l] wfRunHooks deprecation

2015-01-21 Thread Jeroen De Dauw
Hey,

Does the new syntax offer any advantage over the old one?
> Assuming that we want to switch to non-static function calls eventually
> (which I hope is the case), wouldn't it be friendlier towards extension
> maintainers to only deprecate once we are there, instead of forcing them to
> update twice?
>

Good points and questions. While this deprecation is not as problematic as
simply ditching the current hook system altogether, it does indeed seem a
bit like busywork.

The Hooks class has this comment "Used to supersede $wgHooks, because
globals are EVIL.", which is quite amusing if you consider all fields and
methods are static. So it's a switch from a global var to a global field,
thus adding a second global to get rid of the first one. I have this
presentation on static code which has a screenshot of this comment and
class in it :)

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Evil software architect at Wikimedia Germany
~=[,,_,,]:3

Re: [Wikitech-l] wfRunHooks deprecation

2015-01-20 Thread Jeroen De Dauw
Hey,

> Hooks::run()

Oh, in that case there is not much to worry about.
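For extension code the migration is then a plain rename (sketch; the hook name and argument are made up):

```php
// Before (deprecated global function):
wfRunHooks( 'MyExtensionPageSaved', array( &$page ) );

// After (same semantics, static entry point):
Hooks::run( 'MyExtensionPageSaved', array( &$page ) );
```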

Added a note in
https://gerrit.wikimedia.org/r/#/c/186115/1/includes/GlobalFunctions.php

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Evil software architect at Wikimedia Germany
~=[,,_,,]:3

[Wikitech-l] wfRunHooks deprecation

2015-01-20 Thread Jeroen De Dauw
Hey,

I just noticed wfRunHooks got deprecated. The hook mechanism is heavily
depended on by extensions. So if it is going away, what will it be replaced
by? There is no hint of this in the method doc.

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Evil software architect at Wikimedia Germany
~=[,,_,,]:3

Re: [Wikitech-l] MediaWiki 2.0 (was: No more Architecture Committee?)

2015-01-20 Thread Jeroen De Dauw
Hey,

FYI: we had this discussion for SMW some time ago. We were at version 1.9,
following a naming pattern very similar to the MediaWiki one. The question
there was if the next release containing a break should be either 1.10 (no
change), 2.0 (following semver) or 10.0 (dropping the "1."). People
generally preferred 2.0, mostly because "10.0 seems like such a big jump".
So this is what we went with. And while the 2.0 release was not more
significant than 1.9 or 1.8, no one seems to have gotten seriously confused
about it.

>    - Get rid of wikitext on the server-side.
>      - HTML storage only. Remove MWParser from the codebase. All
>        extensions that hook into wikitext (so, almost all of them?) will need to
>        be re-written.
>

Just to confirm: this is not actually on the WMF roadmap right? :)

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Evil software architect at Wikimedia Germany
~=[,,_,,]:3

Re: [Wikitech-l] Getting the full URL of an image

2015-01-20 Thread Jeroen De Dauw
Hey,

Now trying this code: wfFindFile( $title );

With this title object as input: http://pastebin.com/7B9n1NyN (constructed
with Title::makeTitle( NS_FILE, $fileName ))

The output is still false. At the same time, the page definitely exists,
and if I put "[[File:Blue marker.png]]" in a wiki page, it shows up just
fine. I double checked my file permissions, and they seem fine. What am I
missing?
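In full, what I am trying looks roughly like this (a sketch; it assumes $fileName holds the name without the "File:" prefix, since Title::makeTitle prepends the namespace itself):

```php
// Sketch of the intended lookup against the local file repo.
$title = Title::makeTitle( NS_FILE, $fileName ); // e.g. 'Blue marker.png'
$file = wfFindFile( $title );

if ( $file !== false ) {
	// Full URL of the original file, which is what I am after.
	$url = $file->getFullUrl();
}
```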

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Evil software architect at Wikimedia Germany
~=[,,_,,]:3

[Wikitech-l] Getting the full URL of an image

2015-01-19 Thread Jeroen De Dauw
Hey,

On my local wiki I have a page with the name "File:Blue marker.png". The
following code returns false:

$title = Title::makeTitle( NS_FILE, $file );
$title->exists();

That used to return true in the past. Not sure what is broken - my wiki or
MediaWiki itself.

What I want to do is go from string page name, such as "File:Blue
marker.png", to full URL, such as "
http://localhost/phase3/images/6/6f/Blue_marker.png";. What is the
recommended way of doing this now (that works on MW 1.19 and later)?

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Evil software architect at Wikimedia Germany
~=[,,_,,]:3

Re: [Wikitech-l] Fwd: New in Platform Engineering: James Douglas and Stas Malyshev

2014-11-20 Thread Jeroen De Dauw
Hey,

Welcome to James and Stas!

> Much of our infrastructure isn't that specific, so you'll hear us talking a
> lot over the next year about moving toward "Service Oriented
> Architecture", which basically means taking our general purpose
> software and breaking it up into more focused components.
>

\o/

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Evil software architect at Wikimedia Germany
~=[,,_,,]:3

Re: [Wikitech-l] Revision metadata as a service?

2014-11-09 Thread Jeroen De Dauw
Hey,

>  1. As an editor I'd like to flag a revision as reviewed/verified by me
>     from the revision screen or list.
>  2. As an editor I want to see which revisions were verified/had second
>     opinion by other editors.
>
> That's not a "service" by any definition I'm aware of, that's a user
> interaction, one that seems to presume the existence of the Wikimedia base.
>

You are right. I was thinking about what has been described in the first
post of this thread.


> There have been some successful products built using the "code first,
> design later" method, but that doesn't mean it should be encouraged.
>

Fully agree. Writing code and designing are things that go together. If you
write code without thinking about its design, you'll end up with bad code.
If you design without ever writing code, your designs will likely not hold
up well in the real world.

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Evil software architect at Wikimedia Germany
~=[,,_,,]:3

Re: [Wikitech-l] Revision metadata as a service?

2014-11-08 Thread Jeroen De Dauw
Hey,

> I suspect it isn't done because it isn't a very good way to modify a
> complex embedded base of software, Lila. Generally, when modifying a
> complex embedded base, one designs first, iterates implementation and
> internal testing, and then releases a relatively complete piece of
> functionality.
>

As far as I understand this is about adding a new service, not modifying
complex existing behaviour. Is that wrong?

Also, while I can't speak for Lila, it does seem to me she is suggesting to
go with a simple solution that tackles a specific need well, as opposed to
releasing something incomplete. The idea to build small dedicated units
rather than monoliths is not something new or controversial in the world of
software design. Even though it might be in certain insular communities.

See also: https://en.wikipedia.org/wiki/Unix_philosophy

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Evil software architect at Wikimedia Germany
~=[,,_,,]:3

Re: [Wikitech-l] MediaWiki Technical Debt

2014-10-23 Thread Jeroen De Dauw
Hey,

> Since a friend introduced me to Scrutinizer yesterday and the graph above
> seems to be based on it,

SensioLabsInsight != ScrutinizerCI

> I added mediawiki/core to their interface:
>
> https://scrutinizer-ci.com/g/wikimedia/mediawiki/
>

Yay. I tried doing this a year ago or so, and back at that point the
analysis just aborted due to too many issues. Guess the limit was raised :)

> That being said, is there any point in fixing all those "issues"? And if
> so how do we track them and make sure they are not reintroduced with new
> patchsets?
>

Going through the issue list and getting rid of all the warnings is probably
not a good use of your time. Going through and seeing if it points you to
something pressing might be worthwhile. What I personally find very
valuable is that you can get a list of classes sorted by complexity, or by
coupling, or by quality rating, to get an idea of what areas of the
codebase could use some love [0].

At Wikimedia Deutschland we use ScrutinizerCI for most of our PHP
components, and have it run for each commit merged into master. You can
then see the changes per commit [1], get weekly reports [2] and view the
overall trend [3].

[0]
https://scrutinizer-ci.com/g/wmde/WikibaseDataModel/code-structure/master
[1]
https://scrutinizer-ci.com/g/wmde/WikibaseDataModel/inspections/07fab814-f6bc-42df-aab4-80f745d7f0d9
[2] https://scrutinizer-ci.com/g/wmde/WikibaseDataModel/reports/
[3] https://scrutinizer-ci.com/g/wmde/WikibaseDataModel/statistics/

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Evil software architect at Wikimedia Germany
~=[,,_,,]:3

Re: [Wikitech-l] MediaWiki Technical Debt

2014-10-23 Thread Jeroen De Dauw
Hey,

I've looked at the code and its quality for most of these projects at some
point in the last two years, and have to say this graph is much in line
with my impressions. And with what the "knowledgeable" people at PHP
conferences and meetups convey. Want an example of bad code? Wordpress!
Want one of good code? Symfony or Doctrine! Sadly enough MediaWiki seems to
be entirely absent as a topic or even something to rant about at such
meetups and conferences.

> It's probably a useless statistic

It's based on SensioLabsInsight, which I personally quite dislike.
ScrutinizerCI is hands down better at spotting issues (way too many false
positives on SensioLabsInsight).

> Good to know we still have less technical debt than WordPress. ;)

Perhaps not the best standard to set for oneself :)

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Evil software architect at Wikimedia Germany
~=[,,_,,]:3

Re: [Wikitech-l] Invitation to beta-test HHVM

2014-09-21 Thread Jeroen De Dauw
Hey,

> I know requiring unit tests was discussed in the past, anyone remembers
> where and what the result was?
>

The most cohesive discussion on this I can remember was in the thread
"Architecture Guidelines: Writing Testable Code".
https://lists.wikimedia.org/pipermail/wikitech-l/2013-June/thread.html

There was quite some discussion about testing at the time in general,
though that was quite scattered communication wise. This has presumably
resulted in people being more motivated to enhance the testing
infrastructure, learn about testing, and of course write tests. AFAIK there
has never been general agreement that writing tests for most of the code
one writes is a good idea though.

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Evil software architect at Wikimedia Germany
~=[,,_,,]:3

Re: [Wikitech-l] I'm leaving the Wikimedia Foundation

2014-09-12 Thread Jeroen De Dauw
Hey Sumana,

Thanks for writing that up and the bits of helpful advice you gave me over
the years.

And like Micru, I have to agree with this part:

> And I'd like to [...] exclude destructive communication from my life (yes,
> there's some amount of burnout on toxic people and entitlement).
>

This is such a tragic waste on many levels. It's very hard to make real
progress towards an organization's goals and have fun in doing so, if the
environment contains too much of this. Even if every single person involved
is putting effort towards those goals. You are definitely not the first
person to leave the WMF (in part) because of this.

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Evil software architect at Wikimedia Germany
~=[,,_,,]:3

Re: [Wikitech-l] Future platforms, devices, and consumers

2014-08-18 Thread Jeroen De Dauw
Hey,

> Hi Rexford,
>
> What objectives would we achieve if we were to "revamp the MediaWiki system
> from ground up"? Even if we pretend that we have infinite financial and
> human resources to do this, I am not sure what we would accomplish, and we
> would likely introduce a lot of new reliability and security bugs.
>

I can't answer for Rexford, though can provide you with a reply of my own.

You are quite right that the resources needed to rewrite a piece of software as big
and complex as MediaWiki from ground up in one go are not realistic. I
think this is simply not feasible and a bad idea to begin with. There is
plenty of good writing on the topic of migrating away from legacy systems,
often cautioning against "the big rewrite".

That does not mean that moving away from the current platform is a bad
idea. Of course there needs to be good reason to do so. Is the current
platform causing problems severe enough to warrant changing direction? Look
at all the effort being put into the software surrounding Wikipedia and
associated projects. A lot of enthusiastic and smart people are spending
plenty of time on it. So how come progress has slowed down so much? How
come we cannot easily add new functionality? What exactly is causing all this
effort to be spent so inefficiently? And how much more would we be able to
achieve if those issues where not there?

One concern with rewriting or redesigning things that I've seen outlined
often is that it is easy to just end up at the same place again. If no
effort is put into identifying why the old system ended up in a bad state,
then it's naive to expect the new one will not suffer the same fate.

> and we would likely introduce a lot of new reliability and security bugs.

Is that something inherent to writing new software or migrating away from
legacy systems? Or is that simply what would happen if such a task was
attempted without altering development practices first?

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Evil software architect at Wikimedia Germany
~=[,,_,,]:3

Re: [Wikitech-l] more thoughts on project types; what is reusable?

2014-08-11 Thread Jeroen De Dauw
Hey,

> What reusable things come out of each Wikimedia Engineering project?

The Wikibase software, developed primarily by Wikimedia Deutschland,
includes a lot of reusable code. Basically the components listed as
"library" on this page: https://www.mediawiki.org/wiki/Wikibase/Components

This decoupling gives us a lot of flexibility and reduces the cost of
maintenance, so I'm all for moving further down that road.

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Evil software architect at Wikimedia Germany
~=[,,_,,]:3

Re: [Wikitech-l] ExtensionDistributor updates on mediawiki.org

2014-07-31 Thread Jeroen De Dauw
Hey,

> > Feature request 3: be able to specify a bunch of extensions to download and
> > have a compat check be done (based on the package definitions), after which
> > they are bundled together
>
> Could you elaborate on this a bit? For which extensions would this be
> useful?
>

Those that have Composer-defined dependencies. If someone wants to install
Wikibase Repo and Wikibase Query on their wiki, and they don't have CLI
access, then they need to create a build manually. Having a tool that does
this via a web UI would be awesome.
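Such a build boils down to running Composer against a composer.json along these lines (the package names and version constraints here are purely illustrative, not the actual Packagist identifiers):

```json
{
    "require": {
        "example/wikibase-repo": "~0.5",
        "example/wikibase-query": "~0.1"
    }
}
```

A web UI would just generate this file, resolve the dependencies, and offer the result as a download.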

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Evil software architect at Wikimedia Germany
~=[,,_,,]:3

Re: [Wikitech-l] ExtensionDistributor updates on mediawiki.org

2014-07-31 Thread Jeroen De Dauw
Hey,

\o/

Feature request 1: be able to download specific versions of extensions
based on the tags they provide.

Feature request 2: include dependencies defined in composer.json.

Feature request 3: be able to specify a bunch of extensions to download and
have a compat check be done (based on the package definitions), after which
they are bundled together

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Evil software architect at Wikimedia Germany
~=[,,_,,]:3

Re: [Wikitech-l] [RFC] Extension registration

2014-06-14 Thread Jeroen De Dauw
Hey,

> I don't see any reason to continue supporting register_globals at any
> level. It's been turned off by default in 4.2.0 (circa 2002), deprecated in
> 5.3.0 and removed in 5.4.0. There's no reason to keep supporting it, it's
> not a good use of our resources to maintain that support.
>

I very much agree with this. Having a check that aborts execution when
register_globals is on seems like a good way to address the issue with minimal
effort.

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Evil software architect at Wikimedia Germany
~=[,,_,,]:3

Re: [Wikitech-l] How to make MediaWiki easier to install: use cases

2014-06-11 Thread Jeroen De Dauw
Also: people with shell but no git.

Sent from my mobile device

Re: [Wikitech-l] Using Composer to manage libraries for mediawiki/core on Jenkins and Foundation cluster

2014-06-03 Thread Jeroen De Dauw
Hey,

> As for the accusation that the current approach favors the WMF, it's almost
> not worth responding to.
>

It seems like what James is getting at is the core community, not the WMF.
The problem being that several people seem to thing that the concerns
raised are "not relevant" and "not worth responding to".

> Composer is simply not meant to be used in the way it was shoe-horned into
> MediaWiki.
>

Strawman. I happen to have discussed the situation and the approach with
the main people behind Composer in person, as well as gone over details
with contributors on IRC. They did not seem to share your opinion.

> I’m not going to re-explain this every time because it is in multiple
> places on Bugzilla and in this mailing list.

Who is asking you to re-explain this? What I want is for you and others to stop
dismissing the concerns raised and to come up with a solution for the
problems caused, rather than repeating the same lines over and over again.

> We are not going to misuse libraries and hack together MediaWiki just so
> extension installation can be *slightly* easier.
>

In other words, you are fine with breaking existing support, and screwing
over both users and developers of extensions such as SMW. In case of SMW,
the difference is not slight, as it uses libraries itself.

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Evil software architect at Wikimedia Germany
~=[,,_,,]:3

Re: [Wikitech-l] Using Composer to manage libraries for mediawiki/core on Jenkins and Foundation cluster

2014-06-02 Thread Jeroen De Dauw
Hey,

> > To make things worse, I noticed on my development environment that our
> > own scap-equivalent will just go on to run composer update even if the
> > file conflicted. This causes it to remove the extensions and libraries
> > we currently install via composer, also breaking the site.
>
> I hope for the sake of all-non WMF users that already use Composer to
> install extensions that the proposed changes are not making things
> worse (well it doesn't seem so).
>

I would also like to express my disappointment at third party users being
thrown under the bus once again. Several people have been putting a lot of
effort into supporting third party users, so it really saddens me that this
is dismissed as an irrelevance by some so easily.

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Evil software architect at Wikimedia Germany
~=[,,_,,]:3

Re: [Wikitech-l] Jettisoning our history?

2014-05-31 Thread Jeroen De Dauw
Hey,

One thing I have noticed is that it is much faster for me to clone core
from GitHub than from WMF. Guess that having the thing also hosted in the
EU would help.

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Evil software architect at Wikimedia Germany
~=[,,_,,]:3

Re: [Wikitech-l] Config class and 1.23

2014-04-18 Thread Jeroen De Dauw
Hey,

Is there some kind of description of the responsibility of the context
source stuff anywhere? And the design vision behind it? I find the whole
thing extremely dubious, as it appears to try to make you bind to a whole
group of rather scary classes. Perhaps I am missing something?

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Evil software architect at Wikimedia Germany
~=[,,_,,]:3

Re: [Wikitech-l] SpecialPage::getTitle deprecated?!

2014-04-06 Thread Jeroen De Dauw
Hey,

> Sounds reasonable to me --> https://gerrit.wikimedia.org/r/124130

\o/ Thanks. This will save extension maintainers quite some hassle.

> Couldn't you just create a MyExtensionNameSpecialPage class like below
> and extend that for your special pages instead of regular SpecialPage

You sure can, and I have done such things many times in the past. This
however still costs effort and is in the end not so nice, especially if it
needs to happen for all extensions that care about it. And in case they
derive from different SpecialPage subclasses, they each have to do it
multiple times.
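For readers without the quoted mail, such a compatibility base class is a sketch along these lines (the class name is illustrative; getPageTitle() is the 1.23 replacement for getTitle()):

```php
// Illustrative per-extension base class: on MediaWiki < 1.23 it provides
// getPageTitle() by delegating to the old getTitle(); on 1.23+ the
// inherited implementation is used as-is.
class MyExtensionSpecialPage extends SpecialPage {

	public function getPageTitle( $subpage = false ) {
		if ( method_exists( 'SpecialPage', 'getPageTitle' ) ) {
			return parent::getPageTitle( $subpage );
		}

		return parent::getTitle( $subpage );
	}
}
```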

This all hints at a basic design guideline: only bind to frameworks where
you have to, and avoid putting any domain and application logic in
derivatives of their base classes. That's an entirely general guideline,
yet one often disregarded in MediaWiki extensions.

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Evil software architect at Wikimedia Germany
~=[,,_,,]:3

Re: [Wikitech-l] SpecialPage::getTitle deprecated?!

2014-04-05 Thread Jeroen De Dauw
Hey,

> Are you simply looking for a revert

Not of the whole commit. Just the wfDeprecated call.

I seem to remember that at one point the policy was to add a @deprecated
tag, and then two releases later or so add the wfDeprecated call. Actually
I made some commits which violated this policy and then got reverted.

With this approach extension authors can retain compatibility with the last
few versions without doing ugly things and without running into piles of
notices. Then when 1.25 comes they can do a search and replace and bump
their min mw version to 1.23. That saves both the extension devs a lot of
hassle, and does not throw the users of those extensions under the bus
either.
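Sketched in code, the two-step policy as I remember it looks like this (version numbers are illustrative):

```php
// Step 1 (e.g. 1.23): soft deprecation, documentation only.
/** @deprecated since 1.23, use getPageTitle() instead */
public function getTitle( $subpage = false ) {
	return $this->getPageTitle( $subpage );
}

// Step 2 (e.g. 1.25, two releases later): add the runtime notice.
/** @deprecated since 1.23, use getPageTitle() instead */
public function getTitle( $subpage = false ) {
	wfDeprecated( __METHOD__, '1.23' );
	return $this->getPageTitle( $subpage );
}
```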

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Evil software architect at Wikimedia Germany
~=[,,_,,]:3

Re: [Wikitech-l] SpecialPage::getTitle deprecated?!

2014-04-05 Thread Jeroen De Dauw
Hey,

> Yes. You're meant to:
>
> - write your extension for $CURRENTVERSION;
>

So does this mean extension authors are not supposed to write extensions
compatible with multiple MediaWiki versions?

> If you *also* want to backport bug fixes and new functionality to older
> branched versions, that's great, but a lot of work.
>

Yeah, maintaining multiple versions is more work than maintaining
compatibility with MediaWiki. Unless of course this task is made difficult,
which is the issue I referred to.

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Evil software architect at Wikimedia Germany
~=[,,_,,]:3

Re: [Wikitech-l] SpecialPage::getTitle deprecated?!

2014-04-05 Thread Jeroen De Dauw
Hey,

Is anyone working on resolving this issue? The current state means that
extension authors need to either

* use if-else checks based on MediaWiki versions
* live with having deprecation notices on 1.23
* break compatibility with 1.22 and earlier

Or am I missing something?
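For illustration, the first of those options amounts to something like this around every affected call (a sketch; workable, but not pretty):

```php
// Per-call version check against the global MediaWiki version.
if ( version_compare( $GLOBALS['wgVersion'], '1.23', '>=' ) ) {
	$title = $this->getPageTitle();
} else {
	$title = $this->getTitle();
}
```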

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Evil software architect at Wikimedia Germany
~=[,,_,,]:3

Re: [Wikitech-l] Data to improve our code review queue

2014-04-04 Thread Jeroen De Dauw
Hey,

> http://korma.wmflabs.org/browser/gerrit_review_queue.html

Is this just for core, or for all repos?

For Wikia, it sort of seems like they just created one commit that did not
get merged. The graph seems quite skewed. At least for WMDE contributors I
would find it strange that our typical commit takes over two weeks to get
merged if our most contributed-to repos were taken into account.

> http://korma.wmflabs.org/browser/scr.html

So here I noticed my name in the list on the right and clicked it.
http://korma.wmflabs.org/browser/people.html?id=127&name=jeroendedauw

These graphs seem very wrong. It is only showing commits starting in 2012,
rather than 2009. At the same time it lists emails from 2006, while I only
joined the list in 2009. It also has "jeroen_dedauw" at the top, which has
never been my commit name. So what's going on there?

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Evil software architect at Wikimedia Germany
~=[,,_,,]:3

Re: [Wikitech-l] Multimedia team architecture update

2014-03-07 Thread Jeroen De Dauw
Hey,

This overview seems quite reasonable to me until this point:

> on the other hand it [using ie dependency injection] would mean a lot of
> gadgets break every
> time we change things, and some possibly do even if we don't.
>

I am unsure how you are reaching that conclusion.

Dependency Injection, and more generally speaking Inversion of Control, is
an often useful approach in creating cohesive code with low coupling.
However in itself it does not magically make your architecture and code
quality good. You can still create low quality code and a bad architecture
when using Dependency Injection (granted, it's easier to do so when not
using DI).

If gadgets break whenever you change something, you are suffering from
what's called fragility. The main cause for this is coupling that should not
be there. On an architecture level this means there is something wrong with
your boundaries. That is the first thing I'd be scrutinizing in your
situation.

> Some sort of hook system (with try-catch blocks, strict
> validation etc) would be much more stable
>

I'm not sure either approach is inherently more stable. Stability depends a
lot on how you are implementing it. The main thing to hold in mind here for
either approach is to be careful with what you expose. Narrow well defined
interfaces are typically the way you want to go. When exposing to much, one
ends up in a situation where flexibility and stability are at odds - ie
either you cannot make a certain change in your code, or you need to break
the public interface.

> Decision: go with the closed model; reach out for potential plugin writers
> and collect requirements; do not guess, only add plugin functionality where
> it is actually requested by someone.
>

What I like about this is that you are not randomly adding a bunch of
hooks. Each hook decreases your flexibility a tiny bit and each hook is
additional maintenance cost. If you introduce them on an as-needed basis
and make sure your hook has a nice interface and well defined goal, then
you are well on your way to nicely implementing this.

This approach is however not disjoint with Inversion of Control and good
architecture, which you apparently agree with:

> most of it is just good architecture like
> services or dependency injection which would make sense even if we did not
> want plugins
>

So I do hope you will also be pursuing that road.

It is not clear to me from your email that you are taking the difference
between libraries and applications into account, so I'll briefly explain.

A component is either reusable or it is not. Having a component that mixes
reusable and non-reusable code, and then trying to reuse the reusable parts
from it, is bad practice. Reusable code typically ends up in libraries, and
these libraries have no state. Non-reusable components, such as
applications and extensions, typically do have their own state. If you want
some code to be truly reusable, going with the library approach, as the
Symfony Components do, is almost certainly the right choice. And indeed,
this too is simply good practice one should probably follow even without
having the explicit goal to make something reusable.
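As an illustration (the package name is made up), such a stateless library
declares nothing beyond its code, its dependencies and its autoloading in
its composer.json:

```json
{
    "name": "example/title-parsing",
    "type": "library",
    "require": {
        "php": ">=5.3.0"
    },
    "autoload": {
        "psr-0": { "Example\\TitleParsing": "src/" }
    }
}
```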

The problem with having this supposedly reusable functionality in one
stateful component that cannot be used as a library is that it forces its
users to work together. It is like a global variable or a static class.
When modifying the thing (i.e. via a hook), one needs to be careful not to
do so in a way that breaks the other users. When at some point two
different users have conflicting constraints on the state, you have a
problem. At that point you are faced with the choice between painful
refactoring and creating an even bigger mess.

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Evil software architect at Wikimedia Germany
~=[,,_,,]:3
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] SpecialPage::getTitle deprecated?!

2014-03-01 Thread Jeroen De Dauw
Hey,

Regarding
https://github.com/wikimedia/mediawiki-core/blob/793f17481ea937275898c91b9ba3c92c5a3e908b/includes/specialpage/SpecialPage.php#L467-488

So now all extensions that want to retain compatibility with MediaWiki 1.22
and at the same time not have deprecation notices on 1.23 need to do an
if-else for all calls to this method? That sucks big time. And for what?
The old method is just forwarding to the new one. All this hassle for a
rename? How about removing the wfDeprecated call and not annoying all
extension maintainers?

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Evil software architect at Wikimedia Germany
~=[,,_,,]:3
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Password Hash

2014-02-26 Thread Jeroen De Dauw
Hey,

I just stumbled across this wrapper [0] for the password functions
introduced in PHP 5.5. Figured this stuff is also relevant in the
discussion.

[0] https://github.com/ircmaxell/password_compat
[1] http://de1.php.net/password

Cheers

--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil. ~=[,,_,,]:3
--
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Unlicking the cookie: "ExtensionStatus" extension

2014-02-22 Thread Jeroen De Dauw
Hey,

Checking such extension status is essentially solved by dependency
management. And dependency management is solved by various tools, the one
relevant to MediaWiki and other PHP projects being Composer [0]. In
fact this capability already exists for several extensions, though there is
no nice GUI for it yet. An example of such version checking can be seen at
[1], which checks whether the versions a component requires are up to date
with the latest available releases. In that case the thing being checked
for out-of-date dependencies is a component itself, though the exact same
code can work for a wiki. Some relevant pages on the existing support are
[2, 3, 4].

Implementing the ideas I outlined in [4] is not terribly difficult and will
result in very powerful capabilities. Or rather, it will make them a lot
more visible and accessible, which will then likely result in wider
adoption of the underlying paradigm (dependency management) amongst
extension authors. I estimate that building a working prototype can take as
little as two days, as indeed most of the groundwork has already been done
[5].

That is in contrast to scraping things off gerrit, or relying on the info
on MediaWiki.org being comprehensive and up to date, combined with building
one's own solution. I anticipate such a solution would be far less
powerful, much harder to maintain, rather brittle, not interoperable at
all, and harder to create in the first place.

[0] http://getcomposer.org/
[1] https://www.versioneye.com/php/mediawiki:semantic-mediawiki
[2] https://www.mediawiki.org/wiki/Composer
[3]
http://www.bn2vs.com/blog/2014/02/15/introduction-to-composer-for-mediawiki-developers/
[4] http://sourceforge.net/mailarchive/message.php?msg_id=31680734
[5] If you think I'm full of shit, throw me 2k USD and I will prove you
wrong

Cheers

--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil. ~=[,,_,,]:3
--
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Making it possible to specify MediaWiki compatibility in extensions

2014-02-22 Thread Jeroen De Dauw
Hey,

As you are probably aware of, it has been possible for some time now to
install Composer compatible MediaWiki extensions via Composer.

Markus Glaser recently wrote an RFC titled "Extension management with
Composer" [0]. This RFC mentioned that it is not possible for extensions to
specify which version of MediaWiki they are compatible with. After
discussing the problem with some people from the Composer community, I
created a commit that addresses this pain point [1]. It's been sitting on
gerrit getting stale, so some input there is appreciated.

[0]
https://www.mediawiki.org/wiki/Requests_for_comment/Extension_management_with_Composer
[1] https://gerrit.wikimedia.org/r/#/c/105092/

For your convenience, a copy of the commit message:

~~

Make it possible for extensions to specify which version of MediaWiki
they support via Composer.

This change allows extensions to specify they depend on a specific
version or version range of MediaWiki. This is done by adding the
package mediawiki/mediawiki in their composer.json require section.

As MediaWiki itself is not a Composer package and is quite far away
from becoming one, a workaround was needed, which is provided by
this commit.

It works as follows. When "composer install" or "composer update"
is run, a Composer hook is invoked. This hook programmatically
indicates the root package provides MediaWiki, as it indeed does
when extensions are installed into MediaWiki. The package link
of type "provides" includes the MediaWiki version, which is read
from DefaultSettings.php.

This functionality has been tested and confirmed to work. One needs
a recent Composer version for it to have an effect. The upcoming
Composer alpha8 release will suffice. See
https://github.com/composer/composer/issues/2520

Tests are included. Composer independent tests will run always,
while the Composer specific ones are skipped when Composer is
not installed.

People that already have a composer.json file in their MediaWiki
root directory will need to make the same additions there as this
commit makes to composer-json.example. If this is not done, the
new behaviour will not work for them (though no existing behaviour
will break). The change to the json file has been made in such a
way to minimize the likelihood that any future modifications there
will be needed.

Thanks go to @beausimensen (Sculpin) and @seldaek (Composer) for
their support.

~~
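To illustrate the commit message above, an extension's composer.json could
then contain something like the following (the extension name and the
version range are made up; the mediawiki/mediawiki constraint is the part
this change enables):

```json
{
    "name": "example/some-extension",
    "type": "mediawiki-extension",
    "require": {
        "php": ">=5.3.2",
        "mediawiki/mediawiki": ">=1.22,<1.24"
    }
}
```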

I also wrote up a little blog post on the topic:
http://www.bn2vs.com/blog/2014/02/15/mediawiki-extensions-to-define-their-mediawiki-compatibility/

Cheers

--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil. ~=[,,_,,]:3
--
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Drop support for PHP 5.3

2014-02-20 Thread Jeroen De Dauw
Hey,

I would like to present Jamie with the official barn-kitten of useful data
brought to a wikitech thread.
https://pbs.twimg.com/media/Bg9lHl8CMAAhXz-.jpg

Congratulations Jamie!

I'd also like to take this opportunity to start an RFC on redesigning all
of MediaWiki's UI, with me doing the work. Clearly my Gimp skills can bring
us to the next level.

Cheers

--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil. ~=[,,_,,]:3
--
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] February '14 appreciation thread

2014-02-12 Thread Jeroen De Dauw
Hey,

Thanks to Tpt for doing a great job in implementing things in
WikibaseDataModelSerialization.

Thanks to MWJames for his huge amount of contributions to the SMW project.

Thanks to Jamie Thingelstad for providing very helpful early adoption
feedback on SMW and running the awesome resource that is WikiApiary. This
makes up for me having to look up how to spell that last name!

Thanks to Karsten (kgh) for providing great user support, enhancing
translations and doing many other things.

Thanks to addshore for improving the Wikibase code even when I am not
looking.

Thanks to those who are trying to make the MediaWiki dev community a more
friendly and open minded place.

Cheers

--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil. ~=[,,_,,]:3
--
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Password Hash

2014-02-05 Thread Jeroen De Dauw
Hey,

Is the 72-byte truncation a general bcrypt problem or specific to
> password_hash()? Any concerns or a non-issue? Note that some non-Latin
> strings can only fit 24 chars in 72 bytes of UTF-8. Long enough for most
> passwords, but some people like passphrases. :)
>

I have a 100 char password.

The Whirlpool algorithm by Tim would force password cracking software to do
> a custom implementation for our hashes. It has very similar work effort to
> bcrypt, and should keep our passwords as safe as using bcrypt. The theory
> behind it seems good, but obviously, we might discover a gaping hole in it
> at some point.
>

I'm very concerned about implementing our own crypto. After all, the first
rule of crypto is to not roll your own.

Cheers

--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil. ~=[,,_,,]:3
--
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] TitleValue

2014-01-24 Thread Jeroen De Dauw
Hey,

Java is a technology with strong and weak sides, like any other.
Religiously labeling anything that resembles something from it as evil
because you do not like it is perhaps not the most constructive approach
one can take. That is quite obvious of course. From my vantage point, it
definitely seems like a lot of this is going on in the MediaWiki community
with regards to this topic and several others. Do you think this is not the
case - good for you. It is however the reason I am very reluctant to join
any such discussions, and I know this is the case for many other people as
well.

Cheers

--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil. ~=[,,_,,]:3
--
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] PHPUnit versioning

2014-01-14 Thread Jeroen De Dauw
Hey,

> can you use a phar file for loading a library and not just executing a
script?

Yeah, you can include the phar (with a PHP include statement).

> Can we use the phar in core?

Sure. One reason I've seen brought forward to bundle such a phar with a
project is that then everyone runs the same version of PHPUnit.

Cheers

--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil. ~=[,,_,,]:3
--
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] PHPUnit versioning

2014-01-14 Thread Jeroen De Dauw
Hey,

What version is available via PEAR? Installing via that is no more
> manual than apt.
>

Sebastian recommends that you use the phar, which is a lot easier than
PEAR. Instructions on how to use it can be found at:

https://github.com/sebastianbergmann/phpunit#installation

This morning I tried to run some unit tests, and to my surprise it failed
> with an error that PHPUnit 3.7.0 is now required.
>

For Diff this has been required for close to a year I think.

Cheers

--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil. ~=[,,_,,]:3
--
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Watchlist and RC refactor, and why you will like it

2014-01-09 Thread Jeroen De Dauw
Hey,

Some suggestions regarding code quality and design:

* The MediaWiki SpecialPage system has serious design issues and is best
treated as legacy API. I recommend not binding domain or application logic
to it where this can be avoided. This means not having most code in a
subclass of SpecialPage.

* Creating inheritance hierarchies to share code is very likely a bad idea.
This is done often in MediaWiki, and in all cases I'm aware of has very
negative consequences. Try separating concerns and using composition.

* Use hooks with caution. They can easily lead to lots of tight coupling
and harmful procedural control flows. I suggest only introducing hooks that
are clearly needed, and keeping argument lists of these to a minimum. This
means both argument count and argument size (passing in a whole context
object is likely bad for instance).

That being said, great you are working on improving this code! Also note
how these are general suggestions/concerns that came to my mind after
reading your email, not feedback on specific code or changes.

Cheers

--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil. ~=[,,_,,]:3
--
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Wikibase DataModel released!

2013-12-23 Thread Jeroen De Dauw
Hey,

I’m happy to announce the 0.6 release of Wikibase DataModel. This is the
first real release of this component. The awesome thing about this release
is that it finally makes the component ready for third party usage.

You can find a short description of Wikibase DataModel, some history and
highlights for this release at
http://www.bn2vs.com/blog/2013/12/23/wikibase-datamodel-released/

Cheers

--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil. ~=[,,_,,]:3
--
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Mailing list etiquette and trolling

2013-12-11 Thread Jeroen De Dauw
Hey,

In recent months I've come across a few mails on this list that only
contained accusations of trolling. Those are very much not constructive and
only serve to antagonize. I know some forums that have an explicit rule
against this, which results in a ban on second violation. If there is a
definition of the etiquette for this list somewhere, I suggest having a
similar rule be added there. Thoughts?

(I'm now half expecting someone to claim this mail is a troll. Perhaps we
ought to make a contest out of making the accusation first, at least then
it will have general amusement value :D)

Cheers

--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil. ~=[,,_,,]:3
--
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Wiki -> Gerrit was Re: FWD: [Bug 58236] New: No longer allow gadgets to be turned on by default for all users on Wikimedia sites

2013-12-11 Thread Jeroen De Dauw
Hey,

Has there been thought on how GitHub can potentially help here? I'm not
sure it fits the workflow well, though can make the following observations:

* People can click an "edit" button on GH to edit the code, much like on
wiki.
* If the GH web UI is used, people do not have to install git
* They do not even need to understand git or know what it is
* A workflow only involving code in actual source control can potentially
be more streamlined and rely less on custom written solutions that also
need to be maintained
* Having such code in an "easy to use" (compared to git+gerrit) system that
nevertheless provides a way to move to doing it more "professionally" might
well have more people make the jump at some point

Cheers

--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil. ~=[,,_,,]:3
--
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Do we have MediaWiki activities at 30C3 (Chaos Communication Congress) Hamburg 27.-30.12.2013 ?

2013-12-10 Thread Jeroen De Dauw
Hey,

At least half a dozen people from WMDE will be there. Unlike last year, we
will not have a stand. MediaWiki as a whole does seem out of scope for the
congress, if one considers the topics covered there. I did hear something
about the FOSDEM wiki dev room needing more input and participation :)

Cheers

--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil. ~=[,,_,,]:3
--
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Making MediaWiki extensions installable via Composer

2013-12-09 Thread Jeroen De Dauw
Hey,

On 9 December 2013 04:22, Rob Lanphier  wrote:

> > That'd be fantastic if it happened, for many other reasons as well. For
> > all intents and purposes it is a big castle in the sky though. I expect
> > this to not happen in the coming years, unless there is a big shift in
> > opinion on what constitutes good software design and architecture in
> > the community.
>

>
It sounds like you're retreating from an argument you haven't even started
> yet.
>

I outlined something I expect to happen, a prediction of the probable
future. It is not an argument, not a suggestion, not a wish, and not
something I'm saying should be done. Do you disagree that it is unlikely we
will get to the point where MediaWiki can behave library-like (where
applicable) in the near future?

On 9 December 2013 04:22, Rob Lanphier  wrote:

> A successful proposal will likely be one that can be executed
> incrementally without huge workflow shifts, so there may have been
> resistance in the past to a particular "blow it all up and start over"
> strategy.
>

Exactly. Incrementalism is generally the way to go. See also: agile. I for
one think it is a very bad idea to rewrite huge chunks of a codebase at a
time, due to high risk, high cost and sometimes plain infeasibility. Though
note that there are many people who tend to scream "why did you not fix the
whole of x" whenever an incremental solution is created, or who demand one
goes all the way when it is proposed.

On 9 December 2013 04:22, Rob Lanphier  wrote:

> However, I haven't yet heard anyone put forward the argument
> that MediaWiki's monolithic architecture is the correct long-term
> architecture.
>

The argument is definitely there. People that take this point of view are
obviously not phrasing it as "I like big monolithic architectures" though.

Cheers

--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil. ~=[,,_,,]:3
--
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Making MediaWiki extensions installable via Composer

2013-12-08 Thread Jeroen De Dauw
Hey,

Finding a way to separate MW the library from MW the application may be a
> solution to this conflict. I don't think this would be a trivial
> project, but it doesn't seem impossible either.
>

That'd be fantastic if it happened, for many other reasons as well. For all
intents and purposes it is a big castle in the sky though. I expect this to
not happen in the coming years, unless there is a big shift in opinion on
what constitutes good software design and architecture in the community.

The side effect is that it removed the ability to use Composer to
> manage external components used by MW the library which is Tyler's
> proposed use case [0].
>

If the core community gets to a point where potential usage of third party
libraries via Composer is actually taken seriously, this will indeed need
to be tackled. I do not think we are quite there yet. For instance, if we
go down this road, getting a clone of MW will no longer be sufficient to
have your wiki run, as some libraries will first need to be obtained. This
forces a change to a very basic workflow. People who dislike Composer will
thus scream murder. Hence I tend to regard this as a moot point for now.
Though let's pretend this has already been taken care of and look at
solutions to the technical problem.

One approach, one that is a lot more feasible than making MediaWiki
(partially) library-like, is to specify the MW dependencies somewhere else,
not in composer.json. Then when you install MediaWiki, these dependencies
get added automatically to composer.json. And when you update it, new ones
also get added. In a way, this is similar to the generation of
LocalSettings. This is the approach I'd be investigating further if we were
actually at a point where a technical solution is needed.

Cheers

--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil. ~=[,,_,,]:3
--
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] MessageBuilder interface

2013-12-07 Thread Jeroen De Dauw
Hey,

A while back I created a properly segregated interface for turning message
keys into messages. Since I've seen someone create almost the exact same
thing, and now ended up needing such an interface in another project as
well, I decided to put this into its own little library.

https://github.com/JeroenDeDauw/i18n

At this time, only msgText is exposed in the general interface, as this is
the only thing I've needed so far. This can easily be extended for other
use cases though. Just released version 0.1 of this library, which is fully
tested, and is installable via Composer.

Hope this is of use to some people, and of great amusement to those who do
not understand interface segregation and IoC.

Cheers

--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil. ~=[,,_,,]:3
--
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Making MediaWiki extensions installable via Composer

2013-12-07 Thread Jeroen De Dauw
Hey,

Is adding Composer support orthogonal to the idea of making it trivial to
> install MediaWiki extensions?
>

It is not.

 Strictly speaking one thing did get past the "wouldn't it be nice
> if" stage. It used to be possible to install PHPUnit and run our
> tests with that simply by running `{composer} install`, until someone
> went and deleted composer.json as a "feature".


This comment seems to be based on a misunderstanding of both the past and
present situation. Let me clarify. Before the mentioned change:

* PHPUnit would be installed by default whenever doing an install via
Composer
* People could not install anything else via Composer as this would cause
conflicts in composer.json

Present situation:

* By default nothing is installed
* Users can choose whatever they want to add, _including_ PHPUnit

It seems that the idea of using Composer more in MediaWiki generally comes
> up at least once every 2-3 months, but we never get beyond the "wouldn't it
> be nice if" stage.
>

Several extensions currently support installation via Composer, and this
capability is already utilized by users of those extensions. For the next
release of Semantic MediaWiki, installation via Composer will be the
recommended approach. So we are not only past the "wouldn't it be nice
if" stage, we also are past the initial implementation stage. There are
still many additional "wouldn't it be nice if" things that can be done
on top, such as creating a GUI for installing extensions. Perhaps writing
an RFC for those would be of actual use, though in a lot of cases creating
a proof of concept and discussing that might be a better use of one's time.

Cheers

--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil. ~=[,,_,,]:3
--
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Uninstalling hooks for tests?

2013-12-05 Thread Jeroen De Dauw
Hey,

I am not very happy about this but we came to the case
> where it might be useful to explicitly uninstall some
> hook(s) for out unit tests.
>
> You might want to checkout MediaWikiTestCase::
> uninstallHook
>
> https://gerrit.wikimedia.org/r/#/c/99349/
>
> I am not happy about blurring differences between unit
> and integration testing, but breaking core with extensions
> and vice versa is sometimes useful.
>

If you have a test that is impacted by the hook system, it is not a real
unit test. Making real unit tests for most MediaWiki code is easier said
than done, though this is caused by the code, not the testing framework. To
quote Miško Hevery, "the secret in testing is writing testable code". I
recommend trying to minimize unneeded binding in both production code and
tests. For instance, deriving from MediaWikiTestCase should not be needed
often. Bad test code is as much a liability as bad production code.

Cheers

--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil. ~=[,,_,,]:3
--
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Making MediaWiki extensions installable via Composer

2013-11-24 Thread Jeroen De Dauw
Hey all,

Several MediaWiki extensions, such as Maps and Semantic MediaWiki, are now
installable via Composer. Lately I've been getting quite a few questions
from developers on how to have their extensions support this awesome
dependency manager, and decided to write down the most important points
that are not obviously evident in the Composer documentation.

http://www.bn2vs.com/blog/2013/11/24/introduction-to-composer-for-mediawiki-developers/
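As a rough sketch of what such support involves (names and paths here are
illustrative; the blog post has the actual details), an extension's
composer.json ends up looking something like:

```json
{
    "name": "example/some-extension",
    "type": "mediawiki-extension",
    "license": "GPL-2.0+",
    "require": {
        "php": ">=5.3.0",
        "composer/installers": ">=1.0.1"
    },
    "autoload": {
        "files": [ "SomeExtension.php" ]
    }
}
```

The "mediawiki-extension" type tells composer/installers to place the
package in the extensions directory rather than in vendor/.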

Cheers

--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil. ~=[,,_,,]:3
--
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] New project on labs for unit tests

2013-11-08 Thread Jeroen De Dauw
Hey,

What about using TravisCI or any of the many alternatives? I'm not sure
what the benefit would be of setting up and maintaining something like that
yourself.

Cheers

--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil. ~=[,,_,,]:3
--
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Architectural leadership in Wikimedia's technical community

2013-11-06 Thread Jeroen De Dauw
Hey,

> * MediaWiki is not a shining beacon of good architecture. I'd argue it
> > should mostly be considered as a big blob of generally badly designed
> > legacy code
>
> This has nothing to do with the subject at hand.
>

It does, as it implies serious mistakes were made in the past, and that
one should not be religious about existing practices being the one true
way. And it is relevant for some of the other thoughts I listed when
thinking through their implications.

Cheers

--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil. ~=[,,_,,]:3
--
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Architectural leadership in Wikimedia's technical community

2013-11-06 Thread Jeroen De Dauw
Hey,

Some thoughts, in no particular order:

* Titles do not reflect ability, they reflect how titles are assigned in an
organization
* Some people see titles such as "software architect" as a stamp of
superiority
* Some people abuse their titles
* We would indeed be advised to keep the distinction between WMF and
MediaWiki in mind
* MediaWiki is not a shining beacon of good architecture. I'd argue it
should mostly be considered as a big blob of generally badly designed
legacy code
* Length of participation in a community, or the number of contributions,
does not linearly relate to ability
* Having a community process to decide who gets $fancyTitle seems more
likely to result in the most popular people of the strongest subgroups
getting it than in the most qualified ones getting it. Same as what happens
in politics.

Cheers

--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil. ~=[,,_,,]:3
--
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Wikibase Database 0.1 released!

2013-11-01 Thread Jeroen De Dauw
Hey all,

I'm happy to announce the immediate availability of Wikibase Database 0.1.

https://github.com/wmde/WikibaseDatabase/releases

An overview of the functionality, installation instructions, usage
instructions and technical documentation can be found at

https://github.com/wmde/WikibaseDatabase/blob/master/README.md

For a writeup on why this library was created and some technical notes, see

http://www.bn2vs.com/blog/2013/11/01/new-database-abstraction-layer-for-mediawiki/

Cheers

--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil. ~=[,,_,,]:3
--
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Code Climate for Ruby and JavaScript

2013-10-25 Thread Jeroen De Dauw
Hey,

That's very interesting, thanks for sharing.

This service looks very similar to Scrutinizer, which we are using for some
of the Wikidata related PHP components.

https://scrutinizer-ci.com/g/wikimedia/mediawiki-extensions-WikibaseDatabase/

Cheers

--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil. ~=[,,_,,]:3
--
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] RFC: Refactoring the Title object

2013-10-24 Thread Jeroen De Dauw
Hey,

Proposed:
>
> $tp = new TextTitleParser();
> try {
> $title = $tp->parse( $text );
> $tf = new UrlTitleFormatter( $title, 'foo=bar' );
> return $tf->format();
> } catch ( MWException $ex ) {
> return null;
> }
>

I hope this is your own interpretation of what would happen on top of what
is proposed and not what is actually proposed, since this code snippet has
some issues.

Those parser and formatter objects contain configuration-specific state. If
you instantiate them like this, you are creating code that is just as bad
as Title::newFromText, where this newFromText method relies on similar
config-specific state. In both cases you are using global state and
programming against concretions.

Does the RFC also state exceptions should be used rather than returned
error values? Seems like a quite disjoint concern to me. And if you use
exceptions, please use them properly and do not copy-paste MWException all
over.

This example also seems rather fabricated and not representative of most
use cases. Ignoring the points above, it strikes me as only making sense in
a transaction script, i.e. something which goes from handling user input
(such as parsing a title), through domain logic (altering a title), to
presentation (i.e. formatting a title). Given what MW does, it is much more
likely that any particular piece of code resides in only one layer, rather
than in all three. Though I can imagine we have lots of code that does this
anyway for no good reason, quite plausibly prompted by the fact that the
Title object caters to all three layers.

I read that as namespaces other than the one you're obviously referring
> to with a TitleValue--so Category:Foo only knows about Category:, not
> Project: or User:.
>
> Maybe someone can clarify here?
>

It is good to have a value object to not be dependent on configuration, so
instances of it do not get messed up if the config is changed. So
presumably in this case it'd contain an int, rather than an
internationalized string or whatever.

Also,
> http://steve-yegge.blogspot.com/2006/03/execution-in-kingdom-of-nouns.html
>

I stopped reading at

All Java people love "use cases"
>

Perhaps you could summarize what the RFC in question does wrong according
to the post you linked?

Cheers


Re: [Wikitech-l] MediaWiki core code coverage report

2013-10-18 Thread Jeroen De Dauw
Hey,

For those interested in having coverage reports for their extensions, you
can look at the source of any of the extensions listed here:

https://coveralls.io/r/wikimedia

These extensions allow running their tests by executing "phpunit" in their
root directory. That makes creating the coverage for those as simple as
running "phpunit --coverage-html". All those extensions have their tests
run on TravisCI, which on successful build submits the coverage report to
the coveralls.io service, which keeps track of coverage changes over time.
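
As a rough sketch, the TravisCI configuration for such an extension looks
something like the following. The coverage submission step and its paths vary
per repository, so treat this as illustrative and check the actual .travis.yml
files; the "coveralls" binary is assumed to come from the php-coveralls
package on Packagist.

```yaml
language: php

php:
  - "5.4"
  - "5.5"

install:
  - composer install

script:
  # Write a clover-format coverage report alongside the test run
  - phpunit --coverage-clover build/logs/clover.xml

after_success:
  # Submit the report to coveralls.io (binary provided by php-coveralls;
  # verify the exact invocation against that package's documentation)
  - vendor/bin/coveralls -v
```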

One thing that is lacking in this setup is having the project risk and CRAP
reports, which are in themselves also very useful. It'd be very cool if
someone set up something generally usable for getting those reports for
extensions.

Cheers


Re: [Wikitech-l] MediaWiki core code coverage report

2013-10-18 Thread Jeroen De Dauw
Hey,

Uh, what does test coverage on tests/ directory even mean?  Who
> watches the watchers tests the tests?
>

Good point. Non-production code typically ought to be excluded. This is
pretty simple to do with some PHPUnit config:

https://github.com/wikimedia/mediawiki-extensions-Diff/blob/master/phpunit.xml.dist

The relevant section there is "filter". This makes sure only the files in
the "src" directory get taken into account for the coverage report. This
config also makes sure that all such files get taken into account, and
prevents the ones with no coverage at all from being omitted (which is the
default behavior).
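
Put differently, a minimal phpunit.xml.dist along those lines could look like
this (directory names are illustrative; the linked Diff config is the
authoritative example):

```xml
<phpunit bootstrap="vendor/autoload.php">
    <testsuites>
        <testsuite name="Extension tests">
            <directory>tests</directory>
        </testsuite>
    </testsuites>
    <filter>
        <!-- Only files under src/ count towards coverage, and files no
             test touches are still reported (at 0%) rather than being
             silently omitted -->
        <whitelist addUncoveredFilesFromWhitelist="true">
            <directory suffix=".php">src</directory>
        </whitelist>
    </filter>
</phpunit>
```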

Cheers


Re: [Wikitech-l] MediaWiki core code coverage report

2013-10-18 Thread Jeroen De Dauw
Hey,

Thanks for setting this up hashar! It's a very useful metric to have.

I'd like to bring some attention to the "dashboard" link at the top of
coverage pages, which leads to for instance

https://integration.wikimedia.org/cover/mediawiki-core/master/php/index.dashboard.html

This page is very useful in finding the things most in need of attention.
It also lists classes by complexity, in this particular case making it very
clear that classes such as Parser are doing way too much stuff.

Cheers


[Wikitech-l] Removing oneself from commits on gerrit

2013-10-13 Thread Jeroen De Dauw
Hey,

I have a big pile of commits to which I have been added as reviewer on Gerrit,
most of which I really do not care about. As a result, the entire list
becomes pretty useless and is just a big waste of screen space. I
could go through all those commits one by one and remove myself from them,
though that is rather tedious. Is there a way to remove such commits
quickly from my review list by just clicking them in the list, or doing a
"select, select, select, remove selected" thing?

Cheers


[Wikitech-l] SubPageList 1.0 released

2013-10-13 Thread Jeroen De Dauw
Hey,

I am happy to announce the immediate release of SubPageList 1.0.

This release packs a ton of improvements over the last one, which was
almost two years ago. Particular attention has been paid to making the
extension more maintainable, solid and flexible. The documentation has also
been updated and now includes descriptions for all supported parameters.

Installation and usage instructions can be found via the README:
https://github.com/wikimedia/mediawiki-extensions-SubPageList/blob/master/README.md

Enjoy!


Re: [Wikitech-l] include google api with ResourceLoader

2013-10-11 Thread Jeroen De Dauw
Hey,

I don't know what the current state of affairs is, but when I looked into
this for the Maps extension shortly after RL was introduced, it was not
possible.

Cheers


Re: [Wikitech-l] Exceptions, return false/null, and other error handling possibilities.

2013-10-07 Thread Jeroen De Dauw
Hey,

> What are the actual issues with Status, and how is this proposal
different?

The Status object in many ways is a fancy error code. It causes the same
issues as returning error codes. It differs from the proposal in that the
Status object deals with error cases, while the proposal does not.

Cheers


Re: [Wikitech-l] Exceptions, return false/null, and other error handling possibilities.

2013-10-07 Thread Jeroen De Dauw
Hey,

We use lots of libraries that happen to use composer. We just don't
> use composer to deploy them.
>

Oh? Lots? Is there a list somewhere? Are most of those libraries forked?
Are a good portion of them semi-assimilated into core? I hope the answer to
the latter two is "no".

Cheers


Re: [Wikitech-l] Exceptions, return false/null, and other error handling possibilities.

2013-10-07 Thread Jeroen De Dauw
Hey,

Returning error codes or null forces the handler to deal with the error case.
Often this is for no good reason, and often the handler cannot decide how
to deal with such an error, and is forced to simply pass it on. In such
cases, it is just silly to have in-band error handling. This is what
exceptions are for.
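
As a small sketch of the difference (all names here are made up for
illustration):

```php
// In-band: every caller must check for null, including callers that
// cannot sensibly handle the failure and just pass it up by hand.
function getUserRank( UserRepository $repo, $name ) {
    $user = $repo->findUser( $name ); // returns a User or null

    if ( $user === null ) {
        return null; // forced to propagate the error case manually
    }

    return $user->getRank();
}

// Out-of-band: the happy path reads straight through, and the error
// propagates automatically to whatever layer can actually handle it.
function getRankOfUser( UserRepository $repo, $name ) {
    return $repo->getUser( $name )->getRank(); // throws UserNotFoundException
}
```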

The linked library has documentation that describes some of the problems of
using such return values. Based on those docs, it seems like a good way to
get around these problems for the null/false case. So if the choice is
between returning null/false or using this library, I'd at least give the
library a try.

Some further remarks:

* Exceptions should not be used for control flow of non-error cases
* Returning error codes is problematic and is not solved by this library;
exceptions should typically be used instead

Remarks on the usage of this library:

* Is a WMF team really going to use a third-party library distributed via
Composer? :)
* It has over half a million downloads on Packagist, so is probably pretty
solid
* It has tests and runs them on TravisCI \o/

Cheers


[Wikitech-l] Escaping for field and index names

2013-10-07 Thread Jeroen De Dauw
Hey,

When constructing an SQL string, how should the following things be
escaped, if at all?

* Field names
* Index names

It looks like when doing a select using MediaWiki's Database class, the field
names provided do not get escaped at all.

Cheers

