Re: [Wikitech-l] IE7 tax

2012-06-14 Thread Andre Engels
On Thu, Jun 14, 2012 at 1:46 PM, Chad innocentkil...@gmail.com wrote:

 Absolutely not. We have debated the show notice to broken browsers
 thing multiple times--and the answer is always it's annoying as hell
 when sites do it and it's not our place to do so.

 The stance on supporting crappy old browsers has largely over time
 turned into--continue supporting all browsers with at least 1% of our
 readers (roughly, I don't believe that number's ever been set in stone).
 Once they are less than 1%, continue supporting unless it's a burden
 to do so and/or makes support for newer browsers impossible. And lastly,
 never purposefully break a browser if you can help it.

Just to give some data: looking at May 2012, this 1% limit would mean
supporting the following browser versions:

* Chrome 18.0 and 19.0
* MSIE 6.0, 7.0, 8.0 and 9.0
* Firefox 3.6, 11.0 and 12.0
* Safari 534.55 (desktop), 6533.18 and 7534.48 (iOS)
* Opera 11.62 and 11.64
* Safari 533.1 (Android browser)

Furthermore, the following have no single version at or over 1%, but do get
there, or at least close, when all versions are combined:
* Opera Mini
* WikipediaMobile (our own mobile app)
* BlackBerry browser
* Apple PubSub (rss reader)
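
As a rough illustration of how that informal 1% rule works as a filter - the
share numbers below are invented for the example, not the real May 2012
statistics:

# Sketch only: keep browser versions with an individual share of at least 1%,
# then check which families only reach 1% when all their versions are combined.
browser_shares = {
    ("MSIE", "8.0"): 15.4, ("MSIE", "6.0"): 1.8,
    ("Firefox", "12.0"): 10.7, ("Firefox", "3.6"): 1.2,
    ("Opera Mini", "4.x"): 0.4, ("Opera Mini", "7.x"): 0.7,
}
THRESHOLD = 1.0  # percent of readers

supported = {k: v for k, v in browser_shares.items() if v >= THRESHOLD}

family_totals = {}
for (family, _version), share in browser_shares.items():
    family_totals[family] = family_totals.get(family, 0.0) + share
combined_only = sorted(
    f for f, total in family_totals.items()
    if total >= THRESHOLD and not any(k[0] == f for k in supported))

print(sorted(supported))   # versions individually at or over 1%
print(combined_only)       # families that only get there combined (e.g. Opera Mini)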

-- 
André Engels, andreeng...@gmail.com

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] SOPA banner implementation

2012-01-17 Thread Andre Engels
On Tue, Jan 17, 2012 at 7:23 AM, Roan Kattouw roan.katt...@gmail.com wrote:
 When the Italians did their blackout, Google asked us to block them
 (!) from bits.wm.o (our JS/CSS domain), which in the specific case of
 the Italian blackout had the effect of not honoring the blackout at
 all.

 I'm guessing we'd probably have to block search engines from enwiki
 entirely to avoid the blackout screwing things up, yeah.

It seems Google has advised serving a 503 (Service Unavailable) HTTP status
code during blackouts, to avoid them influencing its search engine.
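
For illustration, a minimal sketch (in Python, and certainly not Wikimedia's
actual setup) of what "serve a 503 during the blackout" amounts to: every
request gets a temporary-failure status plus a Retry-After hint, so crawlers
keep their old index instead of indexing the blackout page.

from wsgiref.simple_server import make_server

BLACKOUT_NOTICE = b"<html><body><h1>Temporarily unavailable (blackout)</h1></body></html>"

def blackout_app(environ, start_response):
    # 503 tells search engines the outage is temporary; Retry-After suggests
    # when to come back.
    start_response("503 Service Unavailable", [
        ("Content-Type", "text/html; charset=utf-8"),
        ("Retry-After", "86400"),
    ])
    return [BLACKOUT_NOTICE]

if __name__ == "__main__":
    make_server("localhost", 8000, blackout_app).serve_forever()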


-- 
André Engels, andreeng...@gmail.com

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] SOPA banner implementation

2012-01-17 Thread Andre Engels
On Tue, Jan 17, 2012 at 3:33 PM, Tei oscar.vi...@gmail.com wrote:

 *cough*

 USA can take over hostnames .com from other countries.

 Then blackout the frontpage of these websites with this image:
 http://rojadirecta.com/IPRC_Seized_2011_02_NY.gif
 http://rojadirecta.org/IPRC_Seized_2011_02_NY.gif

 So SOPA is not just a US visitor concern, but worldwide.

One more chilling case, ongoing at the moment, is that of
Richard O'Dwyer (http://en.wikipedia.org/wiki/Richard_O%27Dwyer). He
hosted a site where people would share links to (usually pirated)
videos of television shows. He is to be extradited from the United
Kingdom to the United States for having that site, even though the
servers were in the Netherlands and he was in the UK. In Europe he
would have good chances with a defense that linking to pirated
material is not in itself illegal, but there is no such defense in the
United States.


-- 
André Engels, andreeng...@gmail.com

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Files with unknown copyright status on MediaWiki wiki

2012-01-05 Thread Andre Engels
On Thu, Jan 5, 2012 at 1:47 AM, K. Peachey p858sn...@gmail.com wrote:
 Commons folk don't magically delete anything unless there is a
 reason, nor do they instantly delete in most cases. If people's uploads
 are consistently getting tagged for deletion, they should look at why
 they are getting tagged.

For one example, recently a fellow Wikipedian got a picture he had
taken himself deleted because someone else had uploaded it somewhere
else, so they didn't believe it was his. I myself have had pictures
from the US government almost deleted because I only specified which
government institution they came from, not on what page they could be
found (they were saved because I happened to find the pages after being
notified of my terrible crime). Copyright diligence is good, copyright
paranoia is not.

-- 
André Engels, andreeng...@gmail.com

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Pywikipedia-l] Removing wrong iw

2011-12-28 Thread Andre Engels
Sending a Cc: to wikitech-l, hoping that someone knows something about it there.

What is the actual situation with respect to this extension? Who
decides whether and when it will be put into effect on Wikipedia?

André Engels

On Tue, Dec 27, 2011 at 6:20 PM, Tisza Gergo gti...@gmail.com wrote:
 Andre Engels andreengels at gmail.com writes:

 A more far-reaching and better solution has already been discussed for
 years, namely porting the interwikis to a separate (wiki) site, so
 that such changes can be made at once for all languages rather than
 having to be done separately at each. Maybe that will be worked on
 with the data project the Germans are setting up.

 You mean the Interlanguage extension[1], right? The extension page says it is
 stable, so maybe the projects just need to start actually using it?


 [1] https://www.mediawiki.org/wiki/Extension:Interlanguage


 ___
 Pywikipedia-l mailing list
 pywikipedi...@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/pywikipedia-l



-- 
André Engels, andreeng...@gmail.com

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Overzealous Commons deletionists

2011-11-14 Thread Andre Engels
On Mon, Nov 14, 2011 at 3:44 AM, John Erling Blad jeb...@gmail.com wrote:

 If someone reports something (s)he thinks is an error, even if the
 wording seems insulting, there is usually something important in the
 report. Don't attack what you think is wrong about the report; try to
 figure out what the root cause behind it is, ignoring the insults.

 Something happened and the outcome was less than satisfactory for at least
 one of the involved users. Why was that so, and how can things be changed?
 In this situation there are known errors that occur fairly often. A
 serious company would ensure that such errors would not impact normal
 operation, especially if those errors have an impact on their
 customers.

 The uploader is the customer in this situation, and as such the
 likelihood of this uploader returning to fix the situation or
 re-upload the same or a similar picture at a later time drops very fast
 when the communication is harsh and unfriendly.

 Sorry, but going for a flamewar against the messenger in a situation
 like this is a loser's game. Forget mistakes, forget insults, find
 solutions!


I'm sorry, but if "hey, you'd better go there-and-there" is already a flame
war in your opinion, then I really wonder how you ever survived your first
week at Wikipedia.


-- 
André Engels, andreeng...@gmail.com
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Northern Soto Wikipedia

2011-11-05 Thread Andre Engels
There seems to be a Northern Soto Wikipedia at http://nso.wikipedia.org, at
least that's what http://incubator.wikimedia.org/wiki/Wp/nso claims.
However, when I go to that site I see the following text:

Unstub loop detected on call of $wgLang->getCode from MessageCache::get

Backtrace:

#0 /usr/local/apache/common-local/php-1.18/includes/StubObject.php(57): StubObject->_unstub('getCode', 5)
#1 /usr/local/apache/common-local/php-1.18/includes/StubObject.php(147): StubObject->_call('getCode', Array)
#2 [internal function]: StubUserLang->__call('getCode', Array)
#3 /usr/local/apache/common-local/php-1.18/includes/cache/MessageCache.php(611): StubUserLang->getCode()
#4 /usr/local/apache/common-local/php-1.18/includes/GlobalFunctions.php(1339): MessageCache->get('gadgets-definit...', true, false)
#5 /usr/local/apache/common-local/php-1.18/extensions/Gadgets/Gadgets_body.php(510): wfEmptyMsg('gadgets-definit...', '<gadgets-def...')
#6 /usr/local/apache/common-local/php-1.18/extensions/Gadgets/Gadgets_body.php(38): Gadget::loadStructuredList()
#7 [internal function]: GadgetHooks::userGetDefaultOptions(Array)
#8 /usr/local/apache/common-local/php-1.18/includes/Hooks.php(216): call_user_func_array('GadgetHooks::us...', Array)
#9 /usr/local/apache/common-local/php-1.18/includes/GlobalFunctions.php(3621): Hooks::run('UserGetDefaultO...', Array)
#10 /usr/local/apache/common-local/php-1.18/includes/User.php(1211): wfRunHooks('UserGetDefaultO...', Array)
#11 /usr/local/apache/common-local/php-1.18/includes/User.php(2131): User::getDefaultOptions()
#12 /usr/local/apache/common-local/php-1.18/includes/RequestContext.php(213): User->getOption('language')
#13 /usr/local/apache/common-local/php-1.18/includes/StubObject.php(151): RequestContext->getLang()
#14 /usr/local/apache/common-local/php-1.18/includes/StubObject.php(103): StubUserLang->_newObject()
#15 /usr/local/apache/common-local/php-1.18/includes/StubObject.php(57): StubObject->_unstub('getCode', 5)
#16 /usr/local/apache/common-local/php-1.18/includes/StubObject.php(147): StubObject->_call('getCode', Array)
#17 [internal function]: StubUserLang->__call('getCode', Array)
#18 /usr/local/apache/common-local/php-1.18/includes/cache/MessageCache.php(611): StubUserLang->getCode()
#19 /usr/local/apache/common-local/php-1.18/includes/GlobalFunctions.php(1173): MessageCache->get('titleblacklist-...', true, false)
#20 /usr/local/apache/common-local/php-1.18/includes/GlobalFunctions.php(1242): wfMsgGetKey('titleblacklist-...')
#21 /usr/local/apache/common-local/php-1.18/extensions/TitleBlacklist/TitleBlacklist.hooks.php(85): wfMsgWikiHtml('titleblacklist-...', '.* <noedit> # T...', 'Andre Engels')
#22 /usr/local/apache/common-local/php-1.18/extensions/TitleBlacklist/TitleBlacklist.hooks.php(106): TitleBlacklistHooks::acceptNewUserName('Andre Engels', Object(User), '')
#23 [internal function]: TitleBlacklistHooks::centralAuthAutoCreate(Object(User), 'Andre Engels')
#24 /usr/local/apache/common-local/php-1.18/includes/Hooks.php(216): call_user_func_array('TitleBlacklistH...', Array)
#25 /usr/local/apache/common-local/php-1.18/includes/GlobalFunctions.php(3621): Hooks::run('CentralAuthAuto...', Array)
#26 /usr/local/apache/common-local/php-1.18/extensions/CentralAuth/CentralAuthHooks.php(469): wfRunHooks('CentralAuthAuto...', Array)
#27 /usr/local/apache/common-local/php-1.18/extensions/CentralAuth/CentralAuthHooks.php(253): CentralAuthHooks::attemptAddUser(Object(User), 'Andre Engels')
#28 [internal function]: CentralAuthHooks::onUserLoadFromSession(Object(User), NULL)
#29 /usr/local/apache/common-local/php-1.18/includes/Hooks.php(216): call_user_func_array('CentralAuthHook...', Array)
#30 /usr/local/apache/common-local/php-1.18/includes/GlobalFunctions.php(3621): Hooks::run('UserLoadFromSes...', Array)
#31 /usr/local/apache/common-local/php-1.18/includes/User.php(930): wfRunHooks('UserLoadFromSes...', Array)
#32 /usr/local/apache/common-local/php-1.18/includes/User.php(272): User->loadFromSession()
#33 /usr/local/apache/common-local/php-1.18/includes/User.php(3928): User->load()
#34 /usr/local/apache/common-local/php-1.18/includes/User.php(2125): User->loadOptions()
#35 /usr/local/apache/common-local/php-1.18/includes/RequestContext.php(213): User->getOption('language')
#36 /usr/local/apache/common-local/php-1.18/includes/StubObject.php(151): RequestContext->getLang()
#37 /usr/local/apache/common-local/php-1.18/includes/StubObject.php(103): StubUserLang->_newObject()
#38 /usr/local/apache/common-local/php-1.18/includes/StubObject.php(57): StubObject->_unstub('getCode', 5)
#39 /usr/local/apache/common-local/php-1.18/includes/StubObject.php(147): StubObject->_call('getCode', Array)
#40 [internal function]: StubUserLang->__call('getCode', Array)
#41 /usr/local/apache/common-local/php-1.18/includes/cache/MessageCache.php(611): StubUserLang->getCode()
#42 /usr/local/apache/common-local/php-1.18/includes/GlobalFunctions.php(1173): MessageCache->get('pagetitle', true, false)
#43 /usr/local/apache/common

Re: [Wikitech-l] Northern Soto Wikipedia

2011-11-05 Thread Andre Engels
On Sat, Nov 5, 2011 at 3:51 PM, Schneelocke schneelo...@gmail.com wrote:

 On Sat, Nov 5, 2011 at 15:46, Andre Engels andreeng...@gmail.com wrote:
  Same data for me, plus that pywikipediabot does not get something good
  either.

 I've checked with both Opera 11 and Firefox 3.6 on Windows Vista; I'm
 getting these errors when I'm logged in (globally), but not when I'm
 logged out. Presumably, it's got to do with the auto-creation of a
 local user account.


Good call. Attempting to log in from the wiki itself gives more information:

<p>The user name Robbot has been banned from creation. It matches the
following blacklist entry: <code>.* <noedit> # This wiki will be
re-imported</code></p>

(I used Robbot's account instead of my own because the password is simpler)

Thus, what might be going on is that every user name is blacklisted.
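
A quick, simplified illustration of why: the pattern part of that entry is
".*", which matches any account name at all (TitleBlacklist effectively
checks the pattern against the whole name):

import re

pattern = r".*"   # pattern part of ".* <noedit> # This wiki will be re-imported"
for name in ["Robbot", "Andre Engels", "Some new reader"]:
    print(name, "blocked?", bool(re.fullmatch(pattern, name)))   # True for all of them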

-- 
André Engels, andreeng...@gmail.com
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] There will be no other notifications in case of further changes unless you visit this page.

2011-11-04 Thread Andre Engels
On Fri, Nov 4, 2011 at 3:27 AM, jida...@jidanni.org wrote:

 Does anybody else feel a pang of spite in
 There will be no other notifications in case of further changes unless
 you visit this page.
 especially when other software doesn't force the user to prove his
 loyalty each time like something from
 * Your World As I See It - Guilt
 http://www.youtube.com/watch?v=qhwKqq0j1S8list=PL6E40919035151385

 Why can't he be given a choice between subscriptions that keep dying
 and subscriptions that act like ones everybody else is used to?


Is this 'what everybody else is used to'? I myself get such notifications
from just one place, which is a phpBB forum, and that one behaves the same
way: I get a notification when there is a new message in the discussion,
but after that I get no email until I have read that discussion.


-- 
André Engels, andreeng...@gmail.com
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Temporary password too short

2011-10-30 Thread Andre Engels
On Sun, Oct 30, 2011 at 2:20 PM, Antoine Musso hashar+...@free.fr wrote:

 On 30/10/11 12:28, William Allen Simpson wrote:
It might perhaps be worth adding one more character,
  Really, how *hard* is it to generate a longer string?

 Have a look at the method User::randomPassword() in the file
 includes/User.php :

 Password is at least 7 characters long and can be made longer with the
 global $wgMinimalPasswordLength (which will require longer password for
 everyone).

 So we could just change that 7 to 10 and we will get longer temporary
 passwords.


We could, but why would we? As has been shown by me and others in this
thread, any brute force attempt that has a reasonable chance of cracking the
current passwords would already involve an amount of traffic to the
Wikimedia servers amounting to a denial-of-service attack.

-- 
André Engels, andreeng...@gmail.com
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Temporary password too short

2011-10-26 Thread Andre Engels
On Wed, Oct 26, 2011 at 1:55 PM, William Allen Simpson 
william.allen.simp...@gmail.com wrote:

 # Temporary password: YH2MnDD
 #
 # This temporary password will expire in 7 days.
 # You should log in and choose a new password now. If someone else made
 this
 # request, or if you have remembered your original password, and you no
 longer
 # wish to change it, you may ignore this message and continue using your
 old
 # password.
 #
 I use fairly long passwords with special characters (a 96 character set
 including space).  This replacement password is much more easily guessed.
 The account could have been stolen within minutes or hours.

   https://secure.wikimedia.org/wikipedia/en/wiki/Password_strength

 (Merely 7 case insensitive alphanumeric characters is equivalent to only
 40-bits of strength.)



I do seriously wonder whether it is possible to steal such a password
'within minutes or hours'. My calculation says that to do it within 24
hours, one needs to test 40 million passwords per second. And remember that
'testing' in this case means sending a message to the Wikimedia servers and
waiting for an answer. Surely getting over 1000 times the normal number of
requests per second (I have no figure for the total number of requests, but
the number of page requests seems to be around 6000 per second) is something
that would not go unnoticed at the Wikimedia servers for 24 hours.
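
For reference, the back-of-the-envelope arithmetic behind that figure,
assuming a 62-symbol alphabet (upper case, lower case, digits) as in the
example password above:

keyspace = 62 ** 7                     # ~3.5e12 possible temporary passwords
guesses_per_second = keyspace / 86400  # exhaust the keyspace within 24 hours
print("%.1e guesses/second" % guesses_per_second)       # ~4.1e7, i.e. ~40 million
print("%.0f times the ~6000 page requests/second" % (guesses_per_second / 6000))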



 Please update the password generator to use at least 17 characters, with
 at least some punctuation!  (Users reading the text might have trouble
 noticing blanks, so don't use the space character.)


The more sensible way of working, in my opinion, would be to invalidate the
temporary password after a certain low number of tries, and to allow a
temporary password to be issued only a restricted number of times within a
certain period. For example, if the password expires after 5 failed login
attempts, and a new temporary password is only sent once a minute, an
attacker is effectively reduced to one attempt per 12 seconds, making
cracking a 62-alphabet, 7-character key such as this one a task which takes
on the order of one million years.
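
The same arithmetic for this rate-limited scheme - 5 tries per temporary
password and at most one new temporary password per minute, i.e. effectively
one guess every 12 seconds:

keyspace = 62 ** 7                 # 62-symbol alphabet, 7 characters
seconds_per_guess = 60 / 5         # 12 seconds per attempt
years = keyspace * seconds_per_guess / (3600 * 24 * 365)
print("%.1e years to exhaust the keyspace" % years)   # ~1.3e6 years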


-- 
André Engels, andreeng...@gmail.com
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] serious interwiki.py issues on MW 1.18 wikis

2011-09-30 Thread Andre Engels
On Fri, Sep 30, 2011 at 1:06 AM, Platonides platoni...@gmail.com wrote:

 Merlijn van Deen wrote:
  Hello to both the wikitech and pywikipedia lists -- please keep both
  informed when replying. Thanks.
 
  A few days ago, we - the pywikipedia developers - received alarming
  reports of interwiki bots removing content from pages. This does not
  seem to happen often, and we have not been able to reproduce the
  conditions in which this happens.
 
  However, the common denominator is the fact it seems to be happening
  only on the wikipedia's that run MediaWiki 1.18 wikis. As such, I
  think this topic might be relevant for wikitech-l, too. In addition,
  there is no-one in the pywikipedia team with a clear idea of why this
  is happening. As such, we would appreciate any ideas.
 
  1. What happens?
  Essentially, the interwiki bot does its job, retrieves the graph and
  determines the correct interwiki links.

 Does it use the page content to retrieve the interwiki links? Or is it
 retrieved eg. by doing a different query to the API?


The interwiki links are retrieved from page content. The page content has
been received through a call to Special:Export.


 I.e. would receiving no content (from the bot POV) produce that behavior?


Yes, the only reasonable explanation seems to be that the bot interprets
what it gets from the server as an empty page.
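
For illustration only - this is a sketch, not the actual pywikipedia code -
the failure mode being discussed looks roughly like this: page text fetched
via Special:Export, with a guard so that an empty or malformed response is
treated as an error rather than as a page whose interwiki links have all
disappeared.

import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

def fetch_wikitext(site, title):
    url = "https://%s/wiki/Special:Export/%s" % (site, urllib.parse.quote(title))
    with urllib.request.urlopen(url) as resp:
        data = resp.read()
    if not data.strip():
        raise RuntimeError("Empty response from Special:Export; "
                           "refusing to treat it as an empty page")
    root = ET.fromstring(data)
    # Find the <text> element of the exported revision, ignoring the XML namespace.
    text_node = next((el for el in root.iter() if el.tag.endswith("}text")), None)
    if text_node is None or text_node.text is None:
        raise RuntimeError("No revision text found for %r; skipping the page "
                           "instead of overwriting it" % title)
    return text_node.text

print(fetch_wikitext("nl.wikipedia.org", "Blankenbach")[:200])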

-- 
André Engels, andreeng...@gmail.com
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] serious interwiki.py issues on MW 1.18 wikis

2011-09-30 Thread Andre Engels
On Fri, Sep 30, 2011 at 9:39 AM, Ariel T. Glenn ar...@wikimedia.org wrote:

 Out of curiosity... If the new revisions of one of these badly edited
 pages are deleted, leaving the top revision as the one just before the
 bad iw bot edit, does a rerun of the bot on the page fail?


I did a test, and the result was very interesting; it might point to the
cause of this bug:

I deleted the page [[nl:Blankenbach]], then restored the 2 versions before
the problematic bot edit. When I now look at the page, instead of the page
content I get (translated from Dutch):

No content was found in the database for the page with .

This can happen when you follow an outdated link to the difference between
two versions of a page, or request a version that has been deleted.

If this is not the case, you may have found a bug in the software. Please
report it to a system administrator
(http://nl.wikipedia.org/wiki/Speciaal:Gebruikerslijst/sysop) of Wikipedia
and mention the URL of this page.


Going to the specific version that should be the newest after the
deletion-and-partial-restore
(http://nl.wikipedia.org/w/index.php?title=Blankenbach&oldid=10676248), it
claims that there is a newer version, but going to the newer version or the
newest version, I get the abovementioned message again.

As an extra test, I did the
delete-then-restore-some-versions-but-not-the-most-recent action with
another page (http://nl.wikipedia.org/wiki/Gebruiker:Andre_Engels/Test), and
there I found no such problem. From this I conclude that the bug has not
been caused by that process, but that for some reason the page had a wrong
(or empty) version number for its 'most recent' version, or something like
that.




-- 
André Engels, andreeng...@gmail.com
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] serious interwiki.py issues on MW 1.18 wikis

2011-09-30 Thread Andre Engels
On Fri, Sep 30, 2011 at 11:12 AM, Max Semenik maxsem.w...@gmail.com wrote:

 On Fri, Sep 30, 2011 at 12:56 PM, Andre Engels andreeng...@gmail.com
 wrote:

 
  The interwiki links are retrieved from page content. The page content has
  been received through a call to Special:Export.
 
 
   I.e. would receiving no content (from the bot POV) produce that
 behavior?
  
 
  Yes, the only reasonable explanation seems to be that the bot interprets
  what it gets from the server as an empty page.
 

 So you screen-scrape? No surprise it breaks. Why? For example, due to
 protocol-relative URLs. Or some other changes to HTML output. Why not just
 use API?


Basically, because most of the core functionality comes from before the API
came into existence. At least, that would be my explanation.

-- 
André Engels, andreeng...@gmail.com
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] serious interwiki.py issues on MW 1.18 wikis

2011-09-30 Thread Andre Engels
On Fri, Sep 30, 2011 at 11:13 AM, Andre Engels andreeng...@gmail.comwrote:

 On Fri, Sep 30, 2011 at 9:39 AM, Ariel T. Glenn ar...@wikimedia.orgwrote:

 Out of curiosity... If the new revisions of one of these badly edited
 pages are deleted, leaving the top revision as the one just before the
 bad iw bot edit, does a rerun of the bot on the page fail?


 I did a test, and the result was very interesting, which might point to the
 cause of this bug:

 I deleted the page [[nl:Blankenbach]], then restored the 2 versions before
 the problematic bot edit. When I now look at the page, instead of the page
 content I get (translated from Dutch):

 No content was found in the database for the page with .

 This can happen when you follow an outdated link to the difference between
 two versions of a page, or request a version that has been deleted.

 If this is not the case, you may have found a bug in the software. Please
 report it to a system administrator
 (http://nl.wikipedia.org/wiki/Speciaal:Gebruikerslijst/sysop) of Wikipedia
 and mention the URL of this page.


 Going to the specific version that should be the newest after the
 deletion-and-partial-restore
 (http://nl.wikipedia.org/w/index.php?title=Blankenbach&oldid=10676248), it
 claims that there is a newer version, but going to the newer version or the
 newest version, I get the abovementioned message again.

 As an extra test, I did the
 delete-then-restore-some-versions-but-not-the-most-recent action with
 another page (http://nl.wikipedia.org/wiki/Gebruiker:Andre_Engels/Test),
 and there I found no such problem. From this I conclude that the bug has not
 been caused by that process, but that for some reason the page had a wrong
 (or empty) version number for its 'most recent' version, or something like
 that.

Curiouser and curiouser... I now see that when I click the edit button from
the abovementioned page, I do get to edit the page as it is shown, even
though that version is not in the history (the page is a copy of
[[MediaWiki:Missing-article]]
(http://nl.wikipedia.org/wiki/MediaWiki:Missing-article) with the empty
string filled in for $2).

-- 
André Engels, andreeng...@gmail.com
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Mediawiki and a standard template library

2011-08-25 Thread Andre Engels
On Sat, Aug 20, 2011 at 11:31 PM, go moko gom...@yahoo.com wrote:


 So, if I can express a simple opinion, perhaps what would be useful is a
 means to easily export/import a template and all those which are necessary
 for it to work properly.
 I don't know if such a tool exists, but could it be a (partial) solution?


This should not be too hard to program - what is needed is that one
exports/imports not only the template itself but also all templates included
in it. There is already functionality to get this list - they are shown on
the edit page. Thus, all that is needed is some kind of interface to export
this page and all templates directly or indirectly included in it. That would
also be useful in case someone wants to fork with only a subset of the pages.
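
A rough sketch of such an interface: ask the API which templates a page
transcludes and follow the inclusions until the set stops growing (API result
continuation is left out to keep the example short):

import json
import urllib.parse
import urllib.request

API = "https://en.wikipedia.org/w/api.php"

def templates_used(title):
    # Templates directly transcluded by `title` (the list shown on the edit page).
    params = urllib.parse.urlencode({
        "action": "query", "prop": "templates", "titles": title,
        "tllimit": "max", "format": "json",
    })
    with urllib.request.urlopen("%s?%s" % (API, params)) as resp:
        data = json.load(resp)
    return [t["title"]
            for page in data["query"]["pages"].values()
            for t in page.get("templates", [])]

def all_templates(title):
    # Follow template-in-template inclusions until nothing new turns up.
    todo, seen = {title}, set()
    while todo:
        page = todo.pop()
        seen.add(page)
        todo.update(t for t in templates_used(page) if t not in seen)
    seen.discard(title)
    return sorted(seen)

print(all_templates("Template:Infobox"))

The resulting list could then be fed to Special:Export (or its "Include
templates" checkbox) to get everything needed for an import elsewhere.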

-- 
André Engels, andreeng...@gmail.com
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Mediawiki and a standard template library

2011-08-25 Thread Andre Engels
On Thu, Aug 25, 2011 at 9:23 AM, Petr Kadlec petr.kad...@gmail.com wrote:


 You mean like… checking the “Include templates” box on Special:Export?


I guess that's what I mean, it's been a few years since I last saw
Special:Export - long enough that I have no idea whether that box is from
before or after that time.


 But the problem is more difficult than that. MediaWiki has no chance
 of knowing which templates are “necessary for it to work properly”, it
 can only detect those that are _actually used_ in a specific use case
 (as you say, those which are shown on the edit page), which is just a
 subset of those required in general.


No, but it does at least mean that exporting and importing any single
template should be possible in a very short time indeed.

-- 
André Engels, andreeng...@gmail.com
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] XKCD: Extended Mind

2011-05-25 Thread Andre Engels
On Wed, May 25, 2011 at 10:36 AM, Svip svi...@gmail.com wrote:
 On 25 May 2011 08:35, K. Peachey p858sn...@gmail.com wrote:

 http://xkcd.org/903/

 Wikipedia trivia: if you take any article, click on the first link in the 
 article text not
 in parentheses or italics, and then repeat, you will eventually end up at 
 Philosophy.

 This is also true.  I came from Groom Range to Philosophy.

No, it's not. The exceptions may be rare, but they do exist. My third
attempt was Masquerade (theatre group), which quickly ended up in a
loop of Renaming of cities in India and Mumbai. Philosophy
itself has Reason and Rationality as the other members of its
loop.
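
For fun, the loop-versus-Philosophy distinction as a few lines of code.
first_link() is a hypothetical stand-in here, pre-filled with the loop
mentioned above; extracting the real first non-parenthesised, non-italic
link from a live article is left out:

def first_link(title):
    demo = {   # compressed stand-in data, not real article parsing
        "Masquerade (theatre group)": "Renaming of cities in India",
        "Renaming of cities in India": "Mumbai",
        "Mumbai": "Renaming of cities in India",
    }
    return demo.get(title)

def follow(start):
    seen, current = [], start
    while current and current != "Philosophy" and current not in seen:
        seen.append(current)
        current = first_link(current)
    # current is now "Philosophy", a page already seen (a loop), or None
    return seen, current

print(follow("Masquerade (theatre group)"))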


-- 
André Engels, andreeng...@gmail.com

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] some diff lines appear not to be different

2011-04-28 Thread Andre Engels
On Thu, Apr 28, 2011 at 9:40 PM,  jida...@jidanni.org wrote:
 It doesn't seem clear to the user just what changed on many lines of
 http://en.wikipedia.org/w/index.php?title=Allen_Swiftaction=historysubmitdiff=426037925oldid=426037103
 E.g.,
 |-+--+-+--|
 |-|| ALTERNATIVE NAMES =                                     |+|| ALTERNATIVE NAMES =                                     |
 |-+--+-+--|
 |-|| SHORT DESCRIPTION =                                     |+|| SHORT DESCRIPTION =                                     |
 |-+--+-+--|
 Maybe it is removing or adding invisible characters or something.
 True I could use action=raw to see what was really in there, but to the
 average user it looks like one big head scratcher.

Checking the edit page before and after, I found that the difference
in those lines is the addition of a space after the '='. Unfortunately, such
end-of-line spaces are not shown in the diff tool.
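
One way to make such an invisible change visible is to compare the lines in
their repr() form, where a trailing space shows up inside the quotes - a
small self-contained demonstration:

import difflib

before = "| ALTERNATIVE NAMES ="
after  = "| ALTERNATIVE NAMES = "   # same line plus a trailing space

for line in difflib.unified_diff([repr(before)], [repr(after)], lineterm=""):
    print(line)
# The "+'| ALTERNATIVE NAMES = '" output line makes the added space visible.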


-- 
André Engels, andreeng...@gmail.com

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Pywikipedia-l] Humans vs bots: 0:1

2011-02-03 Thread Andre Engels
The first issue is indeed that the wrong interwiki has to be removed
on _all_ languages to stop it from returning, but even with that one
could still get into problems because there might be bots that visited
some languages _before_ your removal, and others _after_ it. They
would then consider the wrong interwiki to be a missing one on the
languages visited afterward, and re-add them there.

Working with {{nobots}} as you have done is not a good solution, I
think. Adding it on the Polish page could be justified, but on the
English one it also stops a good amount of correct edits.

This particular issue I have now resolved by finding that there is a
Dutch page on the same subject as the Polish one, and adding an
interwiki to that one. This way, even if someone mistakenly adds the
incorrect link again, for the bots this will lead to an interwiki
conflict, so they will not automatically propagate the wrong link any
more.

-- 
André Engels, andreeng...@gmail.com

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Ratio of interwiki bots' edits

2011-01-14 Thread Andre Engels
On Thu, Jan 13, 2011 at 1:23 PM, Amir E. Aharoni
amir.ahar...@mail.huji.ac.il wrote:

 That makes it:
 * 1 human edit in sl.wp.
 * 1 human edit in en.wp.
 * 20 bot edits in other Wikipedias.

 After the Interlanguage extensions will be enabled, it will be:
 * 1 human edit in sl.wp.
 * 1 human edit in en.wp.
 * 0 bot edits (some behind-the-scenes magic pushes the changes to 20
 wikis, but it's not seen in Recent Changes.)

That second human edit will be on the special interwiki wiki, not on en:.

 This is a major reason to have the Interlanguage extension finally
 enabled. Besides a MAJOR cleaning-up in Recent Changes in all
 Wikipedias, it will give a somewhat clearer picture of the activity in
 the ones.

Another one is that it will be easier to handle cases where interwiki
is not simply one-to-one. On the other hand, it would probably become
harder to notice the real interwiki mistakes, although easier to
resolve them once they _are_ noticed.


-- 
André Engels, andreeng...@gmail.com

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Must 3rd party skins and extensions be distributed under GPL?

2010-07-26 Thread Andre Engels
On Mon, Jul 26, 2010 at 4:22 AM, Tim Starling tstarl...@wikimedia.org wrote:

 The two legal theories which would allow the use of an interface are:

 * A function name or class name are not creative enough or substantial
 enough to be eligible for copyright.

 * A function name or class name is a small excerpt from a work, used
 for the purposes of precisely referring to a larger part of the work.
 Such reproduction of an excerpt is fair use. This is the same legal
 theory which allows us to reproduce things like chapter titles and
 character names in Wikipedia articles about copyright novels.

I think that is disingenuous. If one considers a skin or extension to
be a derived work, that is not because it uses function and class
names from the original product, but because they do not have any
meaning without the original code. The product, people would argue, is
not extension but mediawiki+extension, which clearly _is_ a
derivative of mediawiki.

Line numbers are even less creative than function or class names, but
if someone took a series of instructions like

* Take Mediawiki version 11.0
* In such-and-such-file replace lines so-and-so to so-and-so by (my own code)
* same with several other files/lines

I would argue that what you have is a derived work from MediaWiki.
Whether the same holds for extensions and skins, I would not argue
from either side, not knowing enough about either the code or the
legal side of the matter, but I think your rejection is too easy.

-- 
André Engels, andreeng...@gmail.com

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Take me back too hip

2010-07-21 Thread Andre Engels
On Wed, Jul 21, 2010 at 5:30 PM, Mike.lifeguard
mike.lifegu...@gmail.com wrote:
 -BEGIN PGP SIGNED MESSAGE-
 Hash: SHA1

 On 37-01--10 03:59 PM, Aryeh Gregor wrote:
 Or just get rid of it entirely.  At this point, it's been the default
 skin for some time, and almost anyone who wants to switch back will
 have done so.

 Are all wikis migrated? Maybe it has been the default for enwiki for a
 while, but I'm not sure about most of our other wikis.

The bigger wikis switched some time ago, although later than en:, but
the smaller ones still have MonoBook as the default.

-- 
André Engels, andreeng...@gmail.com

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Uploads on small wikis

2010-03-12 Thread Andre Engels
On 3/11/10, Gerard Meijssen gerard.meijs...@gmail.com wrote:
 Hoi,
 I am really happy that we want to help the smaller Wikipedias with uploading
 their pictures. It helps when it is made easy. One of the ways in which we
 can make it easy is to reduce the number of choices involved. When the only
 choice is cc-by-sa, we only need to explain one license.

 When the software involved is properly localised in their language,
 confusion will be a lot less.  One of the ways that makes a big difference
 to people is, when they know that a small number of messages provide exactl
 functionality.. The WikiReader and the mobile Wikimedia experience point in
 that direction. There are however two observations; it produces overhead at
 translatewiki.net and it helps when you target people who are actively
 translating for that language.

 What you can do is make the local upload facility available once these
 limited number of messages have been localised. In this way it is a reward
 and not depended on an arbitrary number.

I think that's a bad proposal, in at least two ways:
1. It does not address the _reason_ for this restriction. The problem
is people uploading many copyright violations. Having the interface in
their own language is not going to solve that problem, as is shown by
the simple fact that this policy was started based on experience on the
English and other large Wikipedias, which have had full localization
for years.
2. The reward that you mention does not exist: getting autoconfirmed
is in general a lesser issue than translating a large part of the
interface, and furthermore the first will happen automatically when
one is active, while the second can only be done in a special place that a
random newbie will not be looking for or stumbling upon.

If we're going to make this dependent on something at all, it should
be the presence of active sysops who are willing to check for
copyright violations. However, personally I would prefer some people
from Commons and/or the stewards to take on such a role for wikis which do
not have such sysops. Localizing this part of the interface is definitely
a useful thing to do, but its connection with the problem at hand is
very tenuous.

-- 
André Engels, andreeng...@gmail.com

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Disambiguation while editing

2009-10-01 Thread Andre Engels
On Thu, Oct 1, 2009 at 8:55 AM, Tei oscar.vi...@gmail.com wrote:
 On Thu, Oct 1, 2009 at 12:37 AM, Lars Aronsson l...@aronsson.se wrote:

 In the edit box, when I type [[John Doe]], I want some chance to
 verify that I'm linking to the right article,

 Humm?

 I don't know the wikipedia, but on other wikis is like that:

(snip)

The issue was about disambiguation, that is, the case where one term
has two meanings. [[John Doe]] may lead you to an article about
a different person than the one you intended, but with the same name. Or to a
page that only notes that there are several people by that name and
gives you links to the various articles.


-- 
André Engels, andreeng...@gmail.com

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Wikistats

2009-08-31 Thread Andre Engels
On Mon, Aug 31, 2009 at 11:03 AM, Domas Mituzasmidom.li...@gmail.com wrote:

 en2 is, um, http://en2.wikipedia.org/ ;-) it used to exist once upon a
 time, and apparently there're some referrals.

Wikimedia news, October 2003:
--
A portion of traffic to www.wikipedia.org will be diverted to
en2.wikipedia.org, while most of it will go to en.wikipedia.org,
where all logins will be directed. Until the server configuration is
more stable and transparent load-sharing is set up, this should help
share some of the traffic without burdening the other wikis too
greatly.
--

I think the reason that en got the lion's share is that en2 was on one
machine with the other languages whereas en was on a machine on its
own. At that time apparently en: still had significantly more traffic
than all other languages taken together.

-- 
André Engels, andreeng...@gmail.com

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Two ways to grant rollback and patrol right ?

2009-02-18 Thread Andre Engels
On Wed, Feb 18, 2009 at 11:34 AM, Jason Wong jwongw...@ymail.com wrote:
 The discussions about opening up the rollback and patrol rights are in
 progress at the Chinese Wikipedia (zh.wikipedia.org). Some suggested that we
 could have two ways to grant the rights to qualified users. One is
 auto-promotion (by the system), and one is by hand (by an admin). That means
 two requirements are needed. The requirement for auto-promotion is higher,
 and if a user meets this requirement, he is promoted automatically by the
 system. Before that, if a user meets the other requirement, he can make an
 application, and an admin can consider whether or not to grant it. I would
 like to know: is this workable in practice?

Yes, it is possible. The German Wikipedia is using this exact same
system for allowing people to mark sighted versions.

-- 
André Engels, andreeng...@gmail.com

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] second-class wikis

2009-02-06 Thread Andre Engels
On Fri, Feb 6, 2009 at 7:32 AM, Lars Aronsson l...@aronsson.se wrote:

 We could go further: People complain about uncivilized admins,
 scaring newcomers away.  But the fundraiser doesn't pay for
 keeping Wikipedia civilized.  Perhaps it should, and that could
 need an annual budget of $60 million rather than $6 million, a
 staff of 200 rather than 20.

That nothing is done about incivility (unfortunately - I really
believe it is chasing some good contributors away) is not because of
financial issues, but because nobody really has an idea _what_ to do
about it.


-- 
André Engels, andreeng...@gmail.com

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] 403 with content to Python?

2009-01-23 Thread Andre Engels
Through a message on another list, I found that when one tries to
reach Wikipedia (or at least the English Wikipedia) specifying the User-Agent
as Python-urllib/1.17, the server gives a 403 Forbidden response,
together with the content of the page.

Two questions:
1. Why is this User Agent getting this response? If I remember
correctly, this was installed in the early days of the pywikipediabot,
when Brion wanted to block it because it had a programming error
causing it to fetch each page twice (sometimes even more?). If that is
the actual reason, I see no reason why it should still be active years
afterward...
2. If this User Agent is really to be blocked, why do we still provide
the content of the page that is forbidden?
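
For reference, this is roughly what the behaviour looks like from the client
side (a sketch; the URL and the descriptive User-Agent string are just
examples):

import urllib.request
from urllib.error import HTTPError

url = "http://en.wikipedia.org/wiki/Main_Page"

# 1. Default urllib User-Agent ("Python-urllib/x.y") - the case described above.
try:
    resp = urllib.request.urlopen(url)
    print("Not blocked, status", resp.status)
except HTTPError as e:
    # e.code is 403, yet e.read() may still contain the page body -
    # the second oddity mentioned above.
    print("Blocked:", e.code, "- body length:", len(e.read()))

# 2. The same request with a descriptive User-Agent.
req = urllib.request.Request(
    url, headers={"User-Agent": "ExampleBot/0.1 (someone@example.org)"})
print("With a descriptive User-Agent:", urllib.request.urlopen(req).status)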

-- 
André Engels, andreeng...@gmail.com

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Interwiki conflicts

2009-01-06 Thread Andre Engels
On Wed, Jan 7, 2009 at 6:08 AM, Lars Aronsson l...@aronsson.se wrote:

 The Category:Politicians in many languages has an interwiki link
 to the Armenian (hy:) category for political scientists.  I fixed
 the English Wikipedia (manually) and the North European languages
 (by bot), but some 50 languages remain to be edited.

 If interwiki.py supported SUL and if I had a truly global bot
 flag, I could do it. But I'm reluctant to edit 50 languages
 manually, especially since there are hundreds of such conflicts.

You can do it by bot as things are. I myself use Robbot on all
languages; the only thing that could be improved regarding SUL is that
I have to type in its password once for each language rather than one
time for all, and as regards bot flags - it seems it has one on every
language where it needs it.

 One problem here is that interwiki.py only adds links.  Both
 correct ones and errors are quickly propagated.  But corrections
 are not propagated, because the conflicts make it give up.  An
 easy way to remove that hy: interwiki link would be a great help.

Well, as said, I use Robbot on all languages, the code I use for that is:

from family import Family
for lang in Family().alphabetic:
    usernames['wikipedia'][lang] = 'Robbot'

This gives me 2 warnings every time I start the bot, but I just ignore
them. With such a setting, whenever I get to a conflict of which I
know the resolution, I start a separate interwiki.py with the
necessary -ignore or -neverlink and -force, and the bot will remove at
least that problem everywhere it exists.


-- 
André Engels, andreeng...@gmail.com
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Interwiki conflicts

2009-01-06 Thread Andre Engels
There's one problem with these interwiki links that has not yet been
mentioned in this thread: not infrequently, when I have finally sorted out
two subjects and kept only those interwiki links that point to the same
subject, someone comes around and tells me that I should not be
removing correct interwiki links.


-- 
André Engels, andreeng...@gmail.com
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l