Re: [Wikitech-l] pt.wikimedia.org - database naming

2016-02-24 Thread This, that and the other
Why not just delete the old ptwikimedia site and put the new one in its place, 
using the same dbname?


The old wiki is inaccessible, since pt.wikimedia.org redirects offsite, so it's 
unclear if the old DB even needs to be preserved. And presumably any 
configuration bits that refer to 'ptwikimedia' will still be relevant to the new 
site.


If for some reason that is not feasible, I guess 'pt2wikimedia' is acceptable, 
though only as a last resort. As I've said before, there really needs to be a 
better way to rename wikis without wasting hours of everyone's time...


TTO

--
"Alex Monk"  wrote in message 
news:CALMPGzX_E4ML0xD2A_2vTRZ+2a+nWtpC9KkJe58x=mdg-uo...@mail.gmail.com...


Hi all,

A request has come up (https://phabricator.wikimedia.org/T126832) to
re-create pt.wikimedia.org on the wikimedia cluster. Unfortunately it was
previously hosted there and so the 'ptwikimedia' database name is already
taken.
Since database renaming does not really appear to be an option, does anyone
have any objections to using 'pt2wikimedia' (or similar, suggestions
welcome) instead for the new wiki? I know this doesn't fit the existing
pattern so I'm unsure about just going ahead without asking for input from
a wider audience.

Alex
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l 





Re: [Wikitech-l] renaming Wikimedia domains

2015-08-26 Thread This, that and the other

Thanks for bringing this up, Amir.

I would point out that since there are so many wikis waiting to be renamed, 
there is an opportunity for economies of scale here. If all the 
departments/people you list were able to set aside a couple of days to sit down 
together and rename the 15+ wikis in the queue, having figured out a 
process and renamed a trial wiki beforehand, I think it could be made 
worthwhile.


I also think many of the communities, especially the small ones, would view a brief 
period of downtime as an acceptable tradeoff for having their domain name 
corrected. This goes especially for be-x-old: I've always thought that one was pretty 
ugly, and I wouldn't be surprised if the Taraškievica Belarusian Wikipedia community 
felt the same way.


I agree that getting community engagement/community liaisons involved and 
talking with relevant developers/ops folks and the affected editing communities 
would be a good next step.


TTO

(Sorry for not replying inline: my news client is pretty dumb, as you can 
probably guess from the header line below.)


--
Jaime Crespo  wrote in message 
news:cabassrl0hm1t9shd-qh1npk8j-opupttbr6_mudan3c5bwu...@mail.gmail.com...


(this is not an official response, just my opinion after some research on
the topic)

Due to the internal (and growing) complexity of the MediaWiki software and of
the WMF installation (with its numerous plugins and services/servers), this is a
non-trivial task. It also involves many moving pieces and many people: network
admins (DNS), general operations (load control/downtime), DBAs
(import/export), services, deployment engineers and developers (MediaWiki
configuration changes, patches).

What's worse is that it would almost certainly create downtime for the wikis
involved (not being able to edit), especially given that it is not a common
operation, and some of them are smaller communities; I would be worried about
annoying or discouraging editing on those wikis (when we want the opposite!).

It would be great to have someone in contact with the communities so that we
can identify which sites have broad consensus for renaming their wiki, are
fully informed about the potential problems, and are still OK to go forward.
Maybe someone in Community Engagement can evaluate risk vs. return?

On Wed, Aug 26, 2015 at 9:53 AM, Antoine Musso hashar+...@free.fr wrote:


On 26/08/2015 07:20, Amir E. Aharoni wrote:
 In the past when requests to rename such domains were raised, the usual
 replies were along the lines of "it's impossible" or "it's not worth the
 technical effort", but I don't know the details.

 Is this still correct in 2015?

As pointed out: https://phabricator.wikimedia.org/T21986

For what it is worth, in 2011 JeLuF wrote a list of actions needed to
rename a wiki.  It is outdated nowadays, but it is sufficient to show that
renaming a wiki is a non-trivial task:
https://wikitech.wikimedia.org/wiki/Rename_a_wiki

It would surely consume a lot of engineering time to come up with a
proper migration plan and actually carry it out.  I am not sure it is
worth the time and money, unfortunately.

--
Antoine hashar Musso







--
Jaime Crespo
http://wikimedia.org





Re: [Wikitech-l] Post to a talk page without knowing whether it's old-style or Flow

2015-06-03 Thread This, that and the other

Thanks for thinking of tools like Twinkle!

I wonder why this couldn't be implemented directly as an API action, though. 
That would probably be helpful to bots and non-JS-based tools, as well as to 
Twinkle itself (since we use our own XML-based MediaWiki API wrapper for 
consistent error handling and user experience).
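For what it's worth, a client-side fallback (absent such an API action) might branch on the target page's content model. The sketch below is purely illustrative: the "flow-board" value and the routing keys are my assumptions, not a documented interface.

```python
# Illustrative sketch only: absent a dedicated API action, a client could
# branch on the target page's content model (as reported by
# action=query&prop=info) and pick the posting route itself.
def choose_post_action(contentmodel):
    """Map a talk page's content model to the API call a client would make."""
    if contentmodel == "flow-board":
        # Assumed: Flow boards take new topics via Flow's own API module.
        return {"action": "flow", "submodule": "new-topic"}
    # Old-style wikitext talk pages take a new section via action=edit.
    return {"action": "edit", "section": "new"}
```

A bot would first fetch prop=info for the page, then dispatch on the result.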


TTO

Matthew Flaschen  wrote in message news:556e561f.9090...@wikimedia.org...

Many user scripts and gadgets, such as Twinkle, UserMessages, etc., need
to post on various talk pages.

With the development of Flow, there are now two possible types of talk
pages to deal with (not counting LQT, which this solution does not
handle), old-style and Flow.  User talk pages are already starting to
use Flow on a very limited opt-in basis (I think the number is still
less than I have fingers).

Now, you can use MessagePoster to post to a page without knowing this
information ahead of time.  The documentation is at
https://doc.wikimedia.org/mediawiki-core/master/js/#!/api/mw.messagePoster.factory
, and you can see an example of how it's used at
https://git.wikimedia.org/blob/mediawiki%2Fcore.git/977f7ad8ade23a7ce5326a993bdf48c8beb42db0/resources%2Fsrc%2Fmediawiki%2Fmediawiki.feedback.js
.

Feel free to reply, or ask on #wikimedia-collaboration, if you have any
questions.

Matt Flaschen






Re: [Wikitech-l] Google thinks the Anonymous entry on enwp is hacked

2015-02-13 Thread This, that and the other
See https://phabricator.wikimedia.org/T75305 for a previous instance. The VPT 
thread linked there (now archived at [1]) has some other relevant discussions, 
including an IRC log.


TTO

--
[1] 
https://en.wikipedia.org/wiki/Wikipedia:Village_pump_(technical)/Archive_132#Google_thinks_Neuron_may_be_hacked


==
John Mark Vandenberg  wrote in message 
news:cao9u_z5v0agwurjnu1eedcgy3qi4hea5juhd15zu-muawth...@mail.gmail.com...


Thanks James.  Do you recall what the previous articles were?

--
John Vandenberg






Re: [Wikitech-l] New feature: tool edit

2015-02-11 Thread This, that and the other
It's funny, it just so happens that Anomie and I are working on something [1] 
right now, based on the existing change tagging infrastructure, which is quite 
similar to what you are asking for, and with much the same purpose in mind.


There have been discussions at [2] and [3] relating to this topic, both of which 
contain eerily similar ideas and questions to some of the messages in this 
thread. It just goes to show that great minds think alike...


TTO

--
[1] https://gerrit.wikimedia.org/r/#/c/188543/
[2] 
https://en.wikipedia.org/wiki/Wikipedia:Village_pump_(proposals)/Archive_117#Bot_tagging_of_edits

[3] https://phabricator.wikimedia.org/T20670
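As a rough client-side illustration of how a "tool edit" flag could ride on change tagging, here is a sketch of building an action=edit request that carries a tag. The helper, the "tool-edit" tag name, and the assumption that the edit module accepts a tags parameter are all mine, not taken from [1].

```python
# Hypothetical sketch: attaching a change tag to an edit made through the
# action API. The helper and the "tool-edit" tag name are illustrative.
def build_edit_params(title, text, summary, csrf_token, tool_tag=None):
    """Build action=edit parameters, optionally attaching a change tag."""
    params = {
        "action": "edit",
        "title": title,
        "text": text,
        "summary": summary,
        "token": csrf_token,
        "format": "json",
    }
    if tool_tag:
        # Assumed: a "tags" parameter naming the change tag(s) to apply.
        params["tags"] = tool_tag
    return params
```

Presumably the wiki would still have to whitelist which tags users may apply manually.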

Petr Bena  wrote in message 
news:CA+4EQ5fO=tom0-gnau1iimtht1ozaczlvv+8-um0c9qqx7e...@mail.gmail.com...


Hi,

I think I proposed this once but I forgot the outcome.

I would like to implement a new feature called "tool edit". It would be
pretty much the same as a bot edit, but with the following differences:

-- Every registered user would be able to flag an edit as a tool edit (a bot
needs a special user group)
-- The flag wouldn't be intended for use by robots, but by regular users
who used some automated tool to make the edit
-- Users could optionally mark any edit as a tool edit, through the API only

The rationale is pretty clear: there are a number of tools, like AWB
and many others, that produce incredible numbers of edits every day.
They are spamming the recent changes page
(https://en.wikipedia.org/wiki/Special:RecentChanges), they can't be filtered
out, and most regular users are not interested in them. This would
make it possible to filter them out, and it would also make it easier
to figure out how many real edits a user has made, compared to
automated edits made by tools.

Is it worth implementing? I think yes, but I'm not so sure.

Thanks






Re: [Wikitech-l] New feature: tool edit

2015-02-11 Thread This, that and the other
Chris Grant  wrote in message 
news:caf_zkbp-abgzgcy4lqqvbtxur-2tjo8opmbwxtrosfvihuc...@mail.gmail.com...



On 11 Feb 2015 17:57, Petr Bena benap...@gmail.com wrote:

 As I said, I believe that any registered user should be able to use it,
 with no need for permissions, as I see no way to abuse it.

If anyone can use it, wouldn't the smarter vandals just use it to avoid the
RC patrollers?


How does a user prove that they're using a particular tool in a way that can't be 
faked? Something like OAuth comes to mind. All edits made via an OAuth consumer 
are already tagged with a unique tag, and I would assume that it is not possible 
to falsely impersonate an OAuth consumer.


I'm not sure whether this could work for common tools like AWB or Twinkle, 
though:


* I don't know whether OAuth works for client-side downloadable programs like 
AWB.
* JavaScript tools edit as the user from the user's browser, and as such, OAuth 
is not relevant to them. In any case, anything they do (like adding a specific 
string to edit summaries, adding a tag to their edits, or the like) can be 
easily spoofed or faked by a tech-savvy user.


Before change tagging could be used as a way to *filter out* particular tool 
edits (as opposed to being simply a way of identifying revisions that satisfy 
some criterion) the RC tag filter would need to be improved.


(I'm not pretending that change tagging is the only solution for Petr's tool 
edits idea: I just think it is the most likely candidate for implementing 
something like this.)


TTO 





Re: [Wikitech-l] Permanently deleting change tags available by default

2015-02-05 Thread This, that and the other
I would suggest that the potential danger of such a right is mitigated by 
the following:


* It is not possible to delete change tags defined by an extension unless 
the extension specifically allows it (which none currently do). That means 
that active AbuseFilter tags, tags like `visualeditor`, etc. cannot be 
deleted. The exception to this is OAuth, which should really be fixed not to 
allow its tags to be deleted.


* The feature is currently limited to deleting tags used by at most 5,000 
revisions. (This may change in the future.)


On enwiki, this would allow old, inactive AbuseFilter tags to be cleaned up 
(like "michael jackson"). These tags are in most cases applied to old 
revisions, and seeing as the intention of tags is to help with patrolling 
recent edits, I don't think this is a big enough deal to require logging 
every single revision to which the tag was (in many cases, probably 
erroneously) applied.


TTO

Jackmcbarn  wrote in message 
news:CAOx5P=+gAkH_-k3uUAFF4fCTy16BQH9ibihV57t56x=h4QB=b...@mail.gmail.com...


Gerrit change 181958[1] was recently merged, which allows (among other
things) the ability for sysops to irrecoverably delete change tags. Since
irrecoverable deletion of anything from on-wiki is rather unprecedented, I
think we should stop granting it to all sysops in DefaultSettings.php, so
that wikis have to opt-in for this feature to be enabled. Thoughts?

[1]https://gerrit.wikimedia.org/r/#/c/181958/





Re: [Wikitech-l] Invitation to beta-test HHVM

2014-09-19 Thread This, that and the other

I should point out that any sysops, importers, stewards etc. who use
Special:Import should not enable the HHVM beta feature, as importing does not
currently work on HHVM [1].

Only once that issue is fixed, and the API is running via HHVM, will I be able
to get excited...

TTO

[1] https://bugzilla.wikimedia.org/show_bug.cgi?id=66023

--
Ori Livneh  wrote...

(apologies for cross-posting)

I'm happy to announce that HHVM is available on all Wikimedia wikis for
intrepid beta testers. HHVM, you'll recall, is an alternative runtime for
PHP that provides substantial performance improvements over the standard
PHP interpreter. Simply put: HHVM is software that runs on Wikimedia's
servers to make your reading and editing experience faster.

You can read more about HHVM here: https://www.mediawiki.org/wiki/HHVM

* How do I enable HHVM?

You can enable HHVM by opting in to the beta feature. This short animated
gif will show you how: http://people.wikimedia.org/~ori/hhvm_beta.gif.

Enabling the beta feature will set a special cookie in your browser. Our
servers are configured to route requests bearing this cookie to a pool of
servers that are running HHVM.

* How do I know that it's working?

Opting-in to the beta feature does not change the user interface in any
way. If you like, you can copy the following code snippet to the global.js
subpage of your user page on Meta-Wiki:

https://meta.wikimedia.org/wiki/User:Ori.livneh/global.js

If you copy this script to your global.js, the personal bar will be
annotated with the name of the PHP runtime used to generate the page and
the backend response time. It looks like this:

http://people.wikimedia.org/~ori/hhvm_script.png

Edits made by users with HHVM enabled will be tagged with 'HHVM'. The tag
is there as a precaution, to help us clean up if we discover that HHVM is
mangling edits somehow. We don't expect this to happen.

* What sort of performance changes should I expect?

We expect HHVM to have a substantial impact on the time it takes to load,
preview, and save pages.

At the moment, API requests are not being handled by HHVM. Because
VisualEditor uses the API to save articles, opting in to the HHVM beta
feature will not impact the performance of VisualEditor. We hope to have
HHVM handling API requests next week.

* What sort of issues might I encounter?

Most of the bugs that we have encountered so far resulted from minute
differences in how PHP5 and HHVM handle various edge-cases. These bugs
typically cause a MediaWiki error page to be shown.

If you encounter an error, please report it on Bugzilla and tag it with the
'HHVM' keyword.


We're not done yet, but this is an important milestone. The roll-out of
HHVM as a beta feature caps many months of hard work from many developers,
both salaried and volunteer, from the Wikimedia Foundation, Wikimedia
Deutschland, and the broader Wikimedia movement.  I want to take this
opportunity to express my appreciation to the following individuals, listed
in alphabetical order:

Aaron Schulz, Alexandros Kosiaris, Brad Jorsch, Brandon Black, Brett
Simmers, Bryan Davis, Chad Horohoe, Chris Steipp, Erik Bernhardson, Erik
Möller, Faidon Liambotis, Filippo Giunchedi, Giuseppe Lavagetto, Greg
Grossmeier, Jack McBarn, Katie Filbert, Kunal Mehta, Mark Bergsma, Max
Semenik, Niklas Laxström, Rob Lanphier, and Tim Starling.

More good things to come! :)
___
Wikimedia-l mailing list, guidelines at:
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines
wikimedi...@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l,
mailto:wikimedia-l-requ...@lists.wikimedia.org?subject=unsubscribe 





Re: [Wikitech-l] Help: Upload without text revision - broken

2014-02-23 Thread This, that and the other
There are currently 96 pages on commonswiki with page_latest = 0 (i.e. they are 
missing a revision to display). Some go back to 2013 (the oldest is [1]), but a spate 
of about 40 or so has appeared in the last few days.


Very odd... I guess they all need to be fixed and the underlying issue checked 
out.

TTO

--
[1] https://commons.wikimedia.org/wiki/File:Montreal_Place_Londres_2013.JPG
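For anyone wanting to reproduce the check, it amounts to scanning the page table for rows whose latest-revision pointer is zero. Here is a toy sketch against an in-memory SQLite stand-in (the table layout is heavily simplified, not MediaWiki's real schema):

```python
# Toy reproduction of the check: find pages whose page_latest pointer is 0,
# i.e. pages with no revision to display. Uses an in-memory SQLite table
# as a simplified stand-in for MediaWiki's `page` table.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE page (page_id INTEGER, page_title TEXT, page_latest INTEGER)"
)
conn.executemany("INSERT INTO page VALUES (?, ?, ?)", [
    (1, "File:Ok.webm", 12345),    # healthy page
    (2, "File:Broken.webm", 0),    # missing its latest revision
])
broken = conn.execute(
    "SELECT page_id, page_title FROM page WHERE page_latest = 0"
).fetchall()
```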

Manuel Schneider  wrote in message news:5309b3b2.9040...@wikimedia.ch...

Hi,

I need help with an uploaded file on Commons which is broken:
https://commons.wikimedia.org/wiki/File:Caspary,_Daniel_%28de%29.webm

This file was - among many others - uploaded to Commons using the
Commonist. Now as you see the wikitext is completely missing and there
is no way to add it. The wikitext revision seems to be missing,
rendering the database broken.

* the history is empty
* trying to edit the page results in an edit conflict which is not
resolvable (trying to overwrite just triggers the next edit conflict)

Can someone please have a look at the database and fix this issue?

Thanks,


Manuel
--
Manuel Schneider - Chief Information Officer
Wikimedia CH - Verein zur Förderung Freien Wissens
Lausanne, +41 (21) 340 66 22 - www.wikimedia.ch






[Wikitech-l] The "poem" tag to be known as "lines"

2014-02-07 Thread This, that and the other
For some time, work has been ongoing on a merge of the Poem extension into MediaWiki 
core [1] [2]. (For those not aware, this extension [3] implements a simple "poem" 
tag which, among other things, preserves newlines.)


Several developers have expressed the desire for an alternative name for this tag, 
alongside "poem" (which of course will be kept for backward compatibility). This is 
because the tag is sometimes used for various purposes besides poetry.


There were many suggestions (see the bug report [1]), but it was eventually agreed 
to use Michael M.'s suggestion of "lines". This name puts the focus on the 
individual lines of the content, which is exactly what the tag is doing.


We almost had a collision with a previous proposal ("verbatim" conflicted with a tag 
in use on Wikia), so we wish to ensure that no-one else is using "lines". No-one is 
yet aware of any MediaWiki extensions or other code using a tag named "lines" in 
wikitext.


If you think the name "lines" will be an issue, or if you have any other concerns 
about this merge, please speak up, either here or at the bug report.


TTO

--
[1] https://bugzilla.wikimedia.org/show_bug.cgi?id=52061
[2] https://gerrit.wikimedia.org/r/#/c/106861/
[3] https://www.mediawiki.org/wiki/Extension:Poem 





Re: [Wikitech-l] Revamping interwiki prefixes

2014-01-17 Thread This, that and the other

Tim Starling  wrote in message news:lba9ld$8pj$1...@ger.gmane.org...


I think the interwiki map should be retired. I think broken links
should be removed from it, and no new wikis should be added.

Interwiki prefixes, local namespaces and article titles containing a
plain colon intractably conflict. Every time you add a new interwiki
prefix, main namespace articles which had that prefix in their title
become inaccessible and need to be recovered with a maintenance script.

There is a very good, standardised system for linking to arbitrary
remote wikis -- URLs. URLs have the advantage of not sharing a
namespace with local article titles.

Even the introduction of new WMF-to-WMF interwiki prefixes has caused
the breakage of large numbers of article titles. I can see that it is
convenient, but I think it should be replaced even in that use case.
UI convenience, link styling and rel=nofollow can be dealt with in
other ways.

-- Tim Starling


The main advantage of interwiki mapping is the convenience you mention: it saves a 
great amount of unnecessary typing and remembering of URLs.  Whenever we go 
to any WMF wiki, we can simply type [[gerrit:12345]] and know that the link will 
point where we want it to.


Some possible alternatives to our current system would include:
* to make people manually type out URLs everywhere (silly)
* to use cross-wiki linking templates instead of interwikis.  This has its own set 
of problems: cross-wiki transclusion is another area in sore need of attention (see 
bug 4547); we need to decide which wikis get their own linking templates; how do we 
deal with collisions between local and global (cross-wiki) templates?  etc.  To me, 
it doesn't seem worth the effort.
* to introduce a new syntax for interwiki links that does not collide with internal 
links (too ambitious?)


I personally favour keeping interwikis as we know them, as collisions are very rare, 
and none of the alternatives seem viable or practical.  Maybe the advent of 
interactive editing systems like VisualEditor and Flow will make them obsolete, but 
until then, editors need the convenience and flexibility that they offer when 
writing wikitext.


It seems as though your proposal, Tim, relates to the WMF cluster.  I'd be 
interested to know what your thoughts are with relation to the interwiki table in 
external MediaWiki installations.


TTO





Re: [Wikitech-l] Revamping interwiki prefixes

2014-01-17 Thread This, that and the other
Nathan is right that I am contradicting myself a bit.  It's true that if you don't 
look at the interwiki map, you'll never know what's there: you'll never know that 
WMF is stuffing the default map full of its own junk.  What I really meant to say is 
that external users will feel short-changed that we get to add our internal 
interwikis to the global map, yet they aren't allowed to add their internal wikis 
(equivalent to our strategy, outreach, etc.) to the global map, for whatever reason.


I'm not getting a coherent sense of a direction to take.  Do we split the existing 
interwiki map into a local and a global map (as I originally proposed)?  Do we start 
from scratch, rewriting the interwiki map from a blank slate, or do we start with 
what we've got?  Do we flood external MW users with a ton of new prefixes, or do we 
ship a mostly empty table to new MW installations?  Do we scale right back and limit 
ourselves to a small core of interwiki prefixes?  Do we take up Tim's idea and toss 
interwikis altogether? 





[Wikitech-l] Revamping interwiki prefixes

2014-01-16 Thread This, that and the other

Sorry about the borked line wrapping in the previous message - I'm
resending it so you can read it properly!



This is a proposal to try and bring order to the messy area of interwiki
linking and interwiki prefixes, particularly for non-WMF users of
MediaWiki.

At the moment, anyone who installs MediaWiki gets a default interwiki
table that is hopelessly out of date.  Some of the URLs listed there
have seemingly been broken for 7 years [1].  Meanwhile, WMF wikis have
access to a nice, updated interwiki map, stored on Meta, that is
difficult for anyone else to use.  Clearly something needs to be done.

What I propose we do to improve the situation is along the lines of
bug 58369:

1. Split the existing interwiki map on Meta [2] into a global
   interwiki map, located on MediaWiki.org (draft at [3]), and a
   WMF-specific interwiki map on Meta (draft at [4]).
   Wikimedia-specific interwiki prefixes, like bugzilla:, gerrit:, and
   irc: would be located in the map on Meta, whereas general-purpose
   interwikis, like orthodoxwiki: and wikisource: would go to the
   global map at MediaWiki.org.

2. Create a bot, similar to l10n-bot, that periodically updates the
   default interwiki data in mediawiki/core based on the contents of
   the global map. (Right now, the default map is duplicated in two
   different formats [5] [6], which is quite messy.)

3. Write a version of the rebuildInterwiki.php maintenance script [7]
   that can be bundled with MediaWiki, and which can be run by server
   admins to pull in new entries to their interwiki table from the
   global map.

This way, fresh installations of MediaWiki get a set of current, useful
interwiki prefixes, and they have the ability to pull in updates as
required.  It also has the benefit of separating out the WMF-specific
stuff from the global MediaWiki logic, which is a win for external users
of MW.
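Step 3's pull-in is essentially a merge in which local definitions win. A minimal sketch, with assumed data shapes (a plain prefix-to-URL map, not the real interwiki table schema):

```python
# Hypothetical sketch of step 3: merge entries from a fetched global
# interwiki map into a local one, keeping any prefix the local wiki has
# already defined. Data shapes are assumptions for illustration.
def merge_interwiki(local, global_map):
    """Return a new prefix->URL map: local entries win, new globals added."""
    merged = dict(global_map)
    merged.update(local)  # local definitions take precedence
    return merged

local = {"wikipedia": "https://en.wikipedia.org/wiki/$1"}
global_map = {
    "wikipedia": "https://www.wikipedia.org/wiki/$1",
    "orthodoxwiki": "https://orthodoxwiki.org/$1",
}
```

A real script would of course read and write the interwiki DB table rather than dicts.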

Two other things it would be nice to do:

* Define a proper scope for the interwiki map.  At the moment it is a
  bit unclear what should and shouldn't be there.  The fact that we
  currently have a Linux users' group from New Zealand and someone's
  personal blog on the map suggests the scope of the map has not been
  well thought out over the years.
  My suggested criterion at [3] is:

Most well-established and active wikis should have interwiki
prefixes, regardless of whether or not they are using MediaWiki
software.
Sites that are not wikis may be acceptable in some cases,
particularly if they are very commonly linked to (e.g. Google,
OEIS).

* Take this opportunity to CLEAN UP the global interwiki map!
** Many of the links are long dead.
** Many new wikis have sprung up in the last few years that deserve to
   be added.
** Broken prefixes can be moved to the WMF-specific map so existing
   links on WMF sites can be cleaned up and dealt with appropriately.
** We could add API URLs to fill the iw_api column in the database
   (currently empty by default).

I'm interested to hear your thoughts on these ideas.

Sorry for the long message, but I really think this topic has been
neglected for such a long time.

TTO



PS. I am aware of an RFC on MediaWiki.org relating to this, but I can't
see that gaining traction any time soon.  This proposal would be a more
light-weight way of dealing with the problem at hand.

[1] https://gerrit.wikimedia.org/r/#/c/84303/
[2] https://meta.wikimedia.org/wiki/Interwiki_map
[3] 
https://www.mediawiki.org/wiki/User:This,_that_and_the_other/Interwiki_map
[4] 
https://meta.wikimedia.org/wiki/User:This,_that_and_the_other/Local_interwiki_map
[5] 
http://git.wikimedia.org/blob/mediawiki%2Fcore.git/master/maintenance%2Finterwiki.list
[6] 
http://git.wikimedia.org/blob/mediawiki%2Fcore.git/master/maintenance%2Finterwiki.sql
[7] 
https://git.wikimedia.org/blob/mediawiki%2Fextensions%2FWikimediaMaintenance.git/master/rebuildInterwiki.php 





Re: [Wikitech-l] Revamping interwiki prefixes

2014-01-16 Thread This, that and the other
Nathan Larson  wrote in message 
news:CAF-JeUxsM-jQ85nij+OALA=rlolnppmhx7yhka1_hiz7m0a...@mail.gmail.com...



 Why is it worth the trouble of maintaining two separate lists? Do the
 Wikimedia-specific interwiki prefixes get in people's way, e.g. when
 they're reading through the interwiki list and encounter what is, to
 them, useless clutter?


I can't say I care about people reading through the interwiki list. 
It's just that with the one interwiki map, we are projecting our 
internal interwikis, like strategy:, foundation:, sulutil:, wmch: onto 
external MediaWiki installations.  No-one needs these prefixes except 
WMF wikis, and having these in the global map makes MediaWiki look too 
WMF-centric.



 Sometimes I do use those Wikimedia-specific prefixes on third-party
 wikis (e.g. if I'm talking about MediaWiki development issues)


This is a good argument to include gerrit:, rev:, mediazilla: etc. on 
the global interwiki map.



and they might also end up getting used if
people import content from Wikimedia wikis.


They're mainly used in meta-discussions, so I doubt this is a concern.

 People will say we should keep those interwikis for historical reasons.
 So, I think we should have a bot ready to go through the various wikis
 and make edits converting those interwiki links to regular links. We
 should make this tool available to the third-party wikis too. Perhaps it
 could be a maintenance script.

Amen to this.  https://bugzilla.wikimedia.org/show_bug.cgi?id=60135

 Can we come up with numerical cutoffs for what counts as
 "well-established", "active", and "very commonly linked to", so that
 people know what to expect before they put a proposal forth? Or will it
 be like notability debates, and come down to people's individual
 opinions of what should count as "very commonly linked to" (as well as a
 certain amount of ILIKEIT
 (https://en.wikipedia.org/wiki/Wikipedia:Arguments_to_avoid_in_deletion_discussions#I_like_it)
 and IDONTLIKEIT, even if users deny that's the basis for their
 decision)? We might get the help of WikiIndex and (especially)
 WikiApiary in getting the necessary statistics.


I don't see the need for instruction creep here.  I'm for an inclusive 
interwiki map.  Inactive wikis (e.g. RecentChanges shows only sporadic 
non-spam edits) and non-established wikis (e.g. AllPages shows little 
content) should be excluded.  So far, there have been no issues with 
using subjective criteria at meta:Talk:Interwiki map.


 It's okay, it's a complicated subject with a lot of tricky
 implementation decisions that need to be made (which is probably part of
 why it's been neglected). Thanks for taking the time to do a thorough
 analysis.


And thank you, Nathan, for your contributions.



