Re: [Wikitech-l] [Selenium] How to use?
Thanks! The reason I asked is that SMW is currently tested for MW versions below 1.18.

Best,
Benedikt

--
Karlsruhe Institute of Technology (KIT)
Institute of Applied Informatics and Formal Description Methods (AIFB)
Benedikt Kämpgen, Research Associate
Kaiserstraße 12, Building 11.40, 76131 Karlsruhe, Germany
Phone: +49 721 608-47946 (new since 1 January 2011)
Fax: +49 721 608-46580 (new since 1 January 2011)
Email: benedikt.kaemp...@kit.edu
Web: http://www.kit.edu/
KIT - University of the State of Baden-Wuerttemberg and National Research Center of the Helmholtz Association

-----Original Message-----
From: Markus Glaser
Sent: Thursday, February 03, 2011 4:57 PM
To: Wikimedia developers
Subject: Re: [Wikitech-l] [Selenium] How to use?

Hi Benedikt,

at the moment the framework is still a work in progress, so it is not shipped with any current release (AFAIK). Using it also requires some changes in the includes folder as well as the new maintenance class, which is not available until MW 1.16. But there is hope for you: I know of at least one implementation of the framework with MW 1.15.3 ;) I have put some notes on backporting the framework at http://www.mediawiki.org/wiki/Selenium_Framework#Backporting, although they may not yet be exhaustive.

Cheers,
Markus

-----Original Message-----
From: Benedikt Kaempgen
Sent: Thursday, February 3, 2011 4:17 PM
To: Janesh Kodikara; Wikimedia developers
Subject: Re: [Wikitech-l] [Selenium] How to use?

Thanks for the quick answer. Unfortunately, I still don't know how to apply testing to older MW versions. I am familiar with the documentation; it is good, but it does not answer all the relevant questions. I will figure it out. Keep up the good work!
Best,
Benedikt

-----Original Message-----
From: Janesh Kodikara
Sent: Thursday, February 03, 2011 9:11 AM
To: Wikimedia developers
Subject: Re: [Wikitech-l] [Selenium] How to use?

----- Original Message -----
From: Benedikt Kaempgen benedikt.kaemp...@kit.edu
Newsgroups: gmane.science.linguistics.wikipedia.technical
To: wikitech-l@lists.wikimedia.org
Sent: Wednesday, February 02, 2011 6:27 PM
Subject: [Selenium] How to use?

I got the following as an answer:

Hi Janesh,

We checked with the latest trunk, and the test scripts are available only at tests/selenium. The tests were earlier located at maintenance/tests/selenium but were later moved one level up, so they should now be found only at the tests/selenium level. The tests were written against the latest code, because the idea is to regression-test the system after the latest changes. The scripts can be used against older versions if there are no major changes that would break them. Details of the Selenium framework are available at http://www.mediawiki.org/wiki/SeleniumFramework, and there is a README file which describes the behavior of the installer test scripts.
Regards,
Jinesh De Silva

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Captchas and non-English speakers
On 5 February 2011 05:19, MZMcBride z...@mzmcbride.com wrote:
> This is the subject of bug 5309, "Localize captcha images":
> https://bugzilla.wikimedia.org/show_bug.cgi?id=5309
> This is the subject of bug 14230, "Add a button to request a new fancy captcha (code)":
> https://bugzilla.wikimedia.org/show_bug.cgi?id=14230
> Generally it's a good idea to search Bugzilla before mailing this list. More often than not, Bugzilla will contain the relevant problem and a discussion of it.

This comes across as dismissive. Saying "we have old bugs filed that no one is working on" is not a reason to dismiss discussion of a real problem. Tim has noted how badly our captcha solutions suck. (It's a real pity reCAPTCHA is third-party and proprietary.)

- d.
Re: [Wikitech-l] Captchas and non-English speakers
2011/2/5 David Gerard dger...@gmail.com:
> (It's a real pity reCAPTCHA is third-party and proprietary.)

Well, we it.source fellows are writing up our communication about this (it will be published on wikisource-l), but a brief mention of the good news is mandatory here. We have a simple script that extracts word images corresponding to doubtful OCR interpretations from any DjVu file with a text layer; the scripts to upload fixed words back into the DjVu layer are simple too. We posted a first communication on John Vandenberg's en.source user talk page, and a "wikicaptcha" is now a real possibility. See John's talk page here:

http://en.wikisource.org/wiki/User_talk:John_Vandenberg#reCAPTCHA_for_source

Alex Brollo
Re: [Wikitech-l] Captchas and non-English speakers
I have just checked how they do it at Baidu Baike (the well-known Chinese online encyclopedia): http://baike.baidu.com/page/userlogin.html#reg

Their captcha is a set of four characters, either Arabic numerals or Latin capital letters. It looks easier than our current MediaWiki captcha, and they have a "Can't see it?" button providing another try.

2011/2/5 Liangent liang...@gmail.com:
> I hate the case where I'm asked for a Chinese captcha while surfing Chinese websites without an IME available. Besides, I don't like Chinese captchas personally, because Chinese characters usually require more keystrokes.
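The Baidu-style scheme described above is easy to sketch. This is a hypothetical illustration, not Baidu's actual code; the only detail taken from the mail is the alphabet (Arabic numerals plus Latin capital letters) and the four-character length:

```python
import random
import string

# Script-neutral alphabet as described above: digits plus Latin capital
# letters, so no local-language keyboard or IME is required.
ALPHABET = string.digits + string.ascii_uppercase

def make_challenge(length=4, rng=random):
    """Return a random challenge string, e.g. '7KQ2'."""
    return "".join(rng.choice(ALPHABET) for _ in range(length))
```

A scheme like this trades localization problems for a smaller challenge space (36^4 combinations), so it relies more heavily on the visual obfuscation step.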
Re: [Wikitech-l] Captchas and non-English speakers
2011/2/5 MZMcBride z...@mzmcbride.com:
> While I sympathize with non-English speakers, I must confess that it's been quite a while since I filled out a CAPTCHA on a Wikimedia wiki. Surely most users use unified login, requiring a CAPTCHA to be filled out only once for all Wikimedia wikis?

I was thinking that this first step into entering a wiki (as a registered user) might be the most difficult thing people will ever be required to perform in their life as a Wikimedia user. It is as if we require our users to have an IQ above 130, while very simple on-wiki tasks such as correcting typing mistakes don't require more than an average IQ.
Re: [Wikitech-l] Captchas and non-English speakers
Being technically able to type the local script somehow is a prerequisite for participation in the wiki, so it should be okay to have the captcha in the local script. It won't impede users familiar with the wiki's local language, but it will potentially impede foreign users. It would therefore be useful to provide a drop-down menu that lets you choose the script of the captcha, so that every user can choose the script that fits best.

Instead of captchas like "shipsneeds" we of course need words in the local language. It shouldn't be hard to do some statistical analysis of existing articles on the wiki and collect a sample of common words of limited length that can be combined to form local captchas. (I guess the above-mentioned script drop-down should then be a script/language combination drop-down.)

Marcus Buck
User:Slomox
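The statistical analysis Marcus describes can be sketched as a word-frequency pass over article text. This is a rough illustration under assumed parameters; the `\w+` tokenizer and the thresholds are placeholders, and a real implementation would need a script-aware tokenizer for each language:

```python
import re
from collections import Counter

def common_short_words(texts, max_len=8, min_count=2, sample_size=2000):
    """Collect frequent short words from a corpus of article texts.

    Words longer than max_len or seen fewer than min_count times are
    dropped; the most frequent survivors form the candidate wordlist.
    """
    counts = Counter()
    for text in texts:
        # Crude tokenizer; \w matches letters in any script in Python 3.
        counts.update(w.lower() for w in re.findall(r"\w+", text))
    candidates = [w for w, c in counts.most_common()
                  if len(w) <= max_len and c >= min_count]
    return candidates[:sample_size]
```

Running this over a dump of the wiki's articles would yield the per-language wordlist the captcha generator could then draw from.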
Re: [Wikitech-l] Captchas and non-English speakers
2011/2/5 Marcus Buck w...@marcusbuck.org:
> Instead of captchas like "shipsneeds" we of course need words in the local language. It shouldn't be hard to do some statistical analysis of existing articles on the wiki and collect a sample of common words of limited length that can be combined to form local captchas. (I guess the above-mentioned script drop-down should then be a script/language combination drop-down.)

Just to let you know that Aubrey has just presented the it.source idea for a wikicaptcha on wikisource-l :-) Obviously, if a wikicaptcha tool is built and running, we can do anything: while interpreting words (in any language), every user will contribute to source transcriptions in a very valuable way.

Alex Brollo
Re: [Wikitech-l] Captchas and non-English speakers
2011/2/5 River Tarnell r.tarn...@ieee.org:
> In article AANLkTikWLU5Y8C2UokYRN=v1-zwhb1kthnxi4xtbm...@mail.gmail.com, David Gerard dger...@gmail.com wrote:
>> On 5 February 2011 15:12, Alex Brollo alex.bro...@gmail.com wrote:
>>> Just to let you know that Aubrey has just presented the it.source idea for a wikicaptcha on wikisource-l
>> What would it take to get this into place? What's the captcha load on WMF sites? Would e.g. the toolserver melt under the load? Perhaps one project at a time?
> I don't think this should be hosted on the Toolserver; as CAPTCHAs are a core part of the site, they should not rely on the TS to work.
> - river.

IMHO, it could be an opportunity to think again about the role of Commons as a central library. I imagine something like this:

1. As soon as a DjVu file with a text layer is uploaded, a complete set of per-page text layers is extracted, saving word coordinates too.
2. Such text layers can be browsed by a script, extracting all words marked as doubtful (usually with a ^ character), and also words which don't match a good dictionary.
3. A dynamic recaptcha database is updated and word images are submitted to wiki contributors, both as a formal captcha for edits by unlogged-in users and as a volunteer job to help Wikisource projects; the updates will fix the text files.
4. A tool should be built to upload pure text from those text files into any Wikisource project.
5. Finally, the refined text can be re-uploaded into the DjVu file, converting it into a DjVu file with a wiki text layer.

Alex
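Step 2 of the workflow above can be sketched in a few lines. The '^' doubtful-word convention comes from the mail itself; the dictionary check, the tokenization, and the function name are assumptions for illustration only:

```python
def words_for_review(text_layer, dictionary):
    """Return words from a DjVu text layer that need human eyes: words the
    OCR engine marked as doubtful (containing '^', per the convention
    described above) or words absent from a known-good dictionary."""
    flagged = []
    for token in text_layer.split():
        # Strip surrounding punctuation but keep the doubtful marker '^'.
        word = token.strip(".,;:!?\"'()")
        if not word:
            continue
        if "^" in word or word.lower() not in dictionary:
            flagged.append(word)
    return flagged
```

Each flagged word, together with its saved coordinates from step 1, would identify the word image to crop and feed into the captcha pool.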
Re: [Wikitech-l] Captchas and non-English speakers
MZMcBride wrote:
> My intention wasn't to come across as dismissive. On the other hand, if people begin new conversations without having read the old conversations, it sets back progress dramatically. The opening post didn't make any mention of the old bugs or their progress, so I was trying to point out that these issues were already known and there were already forums in which they could and should be discussed.

There are such misunderstandings; that's the internet sometimes.

> I think it's a real pity that CAPTCHAs are needed at all. They're a pain in the ass and their effectiveness against coordinated or sophisticated attacks is dubious at best.

And then you have projects like ptwiki, which permanently makes IPs pass captchas because of a bot attack from three years ago [1]. After running this way for three years, it probably needs community consensus to change the status quo now.

1 - http://pt.wikipedia.org/wiki/Wikip%C3%A9dia:Esplanada/Arquivo/2008/Janeiro#Activa.C3.A7.C3.A3o_de_Captcha_para_edi.C3.A7.C3.B5es_por_IP

> If you looked at all of the CAPTCHA-related bugs as a group (including possibly removing the Python dependency), there's more than enough to be at least considered for Summer of Code 2011.

We should create a captcha tracking bug.
Re: [Wikitech-l] WMF and IPv6
Wouldn't it be nice to just set up an IPv6 test? Something similar to https://secure.wikimedia.org/. That way I could just open https://ipv6.wikimedia.org/wikipedia/en/wiki/Main_Page if I wanted to browse Wikipedia over IPv6.

Maarten

PS: Of course https://secure6.wikimedia.org/ would be even better ;-)
Re: [Wikitech-l] WMF and IPv6
On 3 February 2011 21:04, Robert Leverington rob...@rhl.me.uk wrote:
> [2] http://ipv6and4.labs.wikimedia.org/
> [3] http://wikitech.wikimedia.org/view/IPv6_deployment

Someone actually emailed the press queue today asking if we were participating in IPv6 Day. I passed them those two links and said the issues were under discussion on wikitech-l. As soon as the sysadmins have some idea what's happening and what the likely effects are, a techblog post would probably be a good idea; with IPv4 addresses running out, the rest of the world is starting to wonder about this.

- d.
Re: [Wikitech-l] Captchas and non-English speakers
On 05/02/11 22:36, Platonides wrote:
> praveenp wrote:
>> There is at least one successful captcha PHP script for the Malayalam language (http://sourceforge.net/projects/mlcaptcha/, http://mlcaptcha.blogspot.com/2010/02/blog-post_24.html). I don't know whether it can work with MediaWiki.
> It could be added, although I find that particular captcha easier for bots than for humans. And an attacker can easily play with the parameters to weaken it even more.

Alternatively, you could add a Malayalam wordlist to the current Wikipedia captcha, which would have the same effect and has (I hope) a better visual obfuscation method than this one, designed specifically to resist some of the most recent bot decoding methods.

Perhaps we could have a place to add these wordlists on Meta-Wiki or on translatewiki, to allow people without commit rights to create them? All that is needed is about 2,000 short words for each language, which can be used to create around 4,000,000 possible challenge words, which in turn will be used to create an endless stream of captcha images, no two of which should ever be alike. The wordlists themselves need not be secret: they are only needed to create easily-typed strings that are sufficiently numerous to provide a moderate challenge to brute-force guessing.

-- Neil
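Neil's arithmetic follows from pairing words: with a 2,000-word list there are 2,000 x 2,000 = 4,000,000 ordered pairs. A minimal sketch of the idea (an illustration, not the actual ConfirmEdit generator code):

```python
import random

def make_challenge_word(wordlist, rng=random):
    """Concatenate two common words into one challenge string.

    With a 2,000-word list there are 2000 ** 2 = 4,000,000 ordered
    pairs, which is why the wordlist itself need not be secret: the
    pair space is large enough to resist brute-force guessing anyway.
    """
    return rng.choice(wordlist) + rng.choice(wordlist)
```

Each generated challenge word would then be rendered through the image-distortion step, so no two captcha images are alike even when the underlying word pair repeats.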
[Wikitech-l] CAPTCHA spell checker
To help both non-English speakers and people who can't type very well, I've created a GreaseMonkey script which checks your response to the FancyCaptcha challenges seen on Wikimedia and elsewhere:

http://userscripts.org/scripts/show/96233

The script is not specific to GreaseMonkey and could easily be provided in some other way, such as a gadget or site JS. The check is done as you type, and a tick icon is shown instantly if the response you have given matches a pair of words in the dictionary. This helps you spot typos before you press submit. Restrictions in FancyCaptcha mean that the dictionary only needs to have about 8,500 words, so it's easily embedded in the script.

My original idea was to search for near matches and provide an autocomplete drop-down, but the necessary UI code for that seemed a bit too complicated for a quick weekend project. Maybe later.

-- Tim Starling
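The as-you-type check Tim describes can be sketched as follows. This is a rough Python rendering of the idea (the actual userscript is JavaScript), assuming the embedded dictionary is a set of single words and a valid response is any concatenation of two of them:

```python
def matches_word_pair(response, dictionary):
    """True if the typed response splits into two dictionary words.

    This mirrors the userscript's check: as the user types, show a tick
    as soon as some split point yields two known words, so typos are
    caught before the form is submitted.
    """
    response = response.strip().lower()
    return any(response[:i] in dictionary and response[i:] in dictionary
               for i in range(1, len(response)))
```

Because the check only confirms membership in the word-pair space rather than revealing the answer, embedding the dictionary client-side does not weaken the captcha beyond what a determined attacker could already do.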
Re: [Wikitech-l] NNTP access for Wikimedia mailing lists
"RT" == River Tarnell r.tarn...@ieee.org writes:

RT> and retention is set to forever, so it also acts as an archive.

Fetch from 'news.tcx.org.uk'
Connected to 77.75.105.169:119
Updating groupinfo
Could not change to group wikimedia.gendergap
Retrieving article list
Fetch from 'news.tcx.org.uk' finished

RT> I've now added a basic web interface:
RT> http://news.tcx.org.uk/group/wikimedia.

$ w3m -dump http://news.tcx.org.uk/group/wikimedia.
# which you also use on http://news.tcx.org.uk/wikimedia.html
# Nothing.

Are you sure you want the dot appended to the URL? Let's try it with none:

$ w3m -dump http://news.tcx.org.uk/group/wikimedia
Status: 500
Content-type: text/html
Software error:
Cannot find group: wikimedia. at /aux0/srv/news.tcx.org.uk/fcgi-bin/group.fcgi line 141.
For help, please send mail to the webmaster (r.tarn...@ieee.org), giving this error message and the time and date of the error.

RT> as well as a search index:
RT> http://news.tcx.org.uk/search
RT> (since lists.wikimedia.org disables Google searching and doesn't provide
RT> its own search interface).

Better not tell the people who made it that way, lest they try to shut you down. I would quote the last discussion thread about that, but of course I can't find it, just as they intended. They can save the URLs for discussions favorable to their position, but nobody can search for opposing articles. Or maybe they have softened their views by now. Or maybe they have moved to Egypt or North Korea.

RT> Obviously this will become more useful once the archives are imported.

Oh no, that will just add one more nail to the coffin of thinking there are no other sites which already archive and index Wikimedia groups. (And don't tell me to do my aforementioned search there, as they don't exist, right?)
Re: [Wikitech-l] NNTP access for Wikimedia mailing lists
"RT" == River Tarnell r.tarn...@ieee.org writes:

RT> Unlike email

Ah, young man, you just _assume_ the reader is using certain software. Well, I'm here to tell you that I am using a certain Lars Magne Ingebrigtsen creation that makes mail look like news and news look like mail. Plus, what subscriber in their right mind posts a reply to any mailing list via NNTP, when years and years of experience tell one that it would probably get posted mangled, if at all?
[Wikitech-l] upcoming 1.17 deployment and the xml dumps
A little before the scheduled deployment of the 1.17 branch on our production servers, I will be halting production of the XML dumps. Deployment is set for Tuesday, February 8 at 07:00 UTC, so a few hours before that I'll start shutting down processes.

This is a precautionary measure; after the deployment and any hasty fixes that may be needed, I will do some testing to ensure that the dumps are not impacted before we restart them. Barring some bizarre problem, we should be back up and running within a day or two.

Ariel