Re: [Pywikipedia-l] [Pywikipedia-svn] SVN: [8539] trunk/pywikipedia/wikipedia.py
Does it mean that trunk users may now use asLink instead of page.aslink(), or is this only a formal parameter? 2010/9/12 x...@svn.wikimedia.org --- wikipedia.Page(asLink=False) is implemented for easier merging to rewrite branch ___ Pywikipedia-l mailing list Pywikipedia-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/pywikipedia-l
Re: [Pywikipedia-l] category.py has a problem with big categories
I already wanted to write that I had a problem with a big category: I had to stop the script after an hour with recurse=1 and after 8 hours with recurse=True; possibly an infinite loop, or a very slow run. I will write up the details later; now I have to leave. 2010/9/11 Patrol110 patrol110.wikipe...@gmail.com I've recently noticed that category.py has a problem with big categories and went down while getting categories or just before saving results.
Re: [Pywikipedia-l] [Pywikipedia-svn] SVN: [8499] trunk/pywikipedia/wikicomserver.py
2010/9/8 x...@svn.wikimedia.org Revision: 8499 pywikipediaDir = c:\\Projects\\Personal\\wiki\\pywikipedia I have never used this script, but is this really OK for everyone?
Re: [Pywikipedia-l] How to read special pages?
As far as I understand, the existing special-page reader methods do the same, i.e. they read HTML. They belong to the Site object rather than to the Page. But there is a terrible regex in them which I had to copy and modify. I copied from the reader of another special page, but I could not build the urlpath properly, so it is now a constant that points to huwiki. I will upload it for those who are not looking for elegance in the scripts. :-) It may be useful for others: it looks for red (nonexistent) categories in user subpages and wraps them in 'nowiki', while sending a message to the owner. This is useful because Special:WantedCategories is flooded with these, and people don't feel like correcting the others if they have to choose from so many redlinks. 2010/9/6 Alex Brollo alex.bro...@gmail.com I'm a very rough user of APIs... so when I can't find what I'm searching for, I simply go back to the old system: I read HTML. So I did when I needed a list of WantedCategories. I guess that this horrible confession will elicit some reaction, and the solution you are all searching for. ;-) Alex -- Bináris
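The HTML-scraping approach described above can be sketched with a plain `re` pattern. The markup below is a hypothetical sample, not real huwiki output: MediaWiki renders links to nonexistent ("red") pages with class="new", but the exact attributes depend on the skin and MediaWiki version, so treat the pattern as an illustration only.

```python
import re

# Red links carry class="new" in rendered MediaWiki HTML (an assumption
# about the skin in use); the title attribute holds the page name.
RED_LINK = re.compile(r'<a [^>]*class="new"[^>]*title="([^"]+)"')

# Hypothetical sample of one WantedCategories list item:
sample = ('<li><a href="/w/index.php?title=Kateg%C3%B3ria:Foo&redlink=1" '
          'class="new" title="Kategória:Foo (page does not exist)">'
          'Kategória:Foo</a></li>')

found = RED_LINK.findall(sample)
print(found)  # one match: the title attribute of the red link
```

This is exactly the kind of fragile regex the thread complains about: a change of skin or MediaWiki version can break it, which is why the API-based generators are preferable when they exist.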
Re: [Pywikipedia-l] [ pywikipediabot-Bugs-3060262 ] commonscat.py addition for sv.wikipedia
So this is the way? If it works, I will use your idea. :-) Bugs item #3060262, was opened at 2010-09-06 11:25 Message generated for change (Settings changed) made by wikimercy Priority: 9 ignoreTemplates = { 'sv' : [u'commonscatbox'], -- Bináris
[Pywikipedia-l] How to read special pages?
I want to read a special page with Page.get(). The message is: File C:\Program Files\Pywikipedia\wikipedia.py, line 601, in get raise NoPage('%s is in the Special namespace!' % self.aslink()) pywikibot.exceptions.NoPage What is the solution? -- Bináris
Re: [Pywikipedia-l] How to read special pages?
Well, I wrote it for myself during Star Wars III. Not easily, but I learnt something. Nice conversation with myself. :-) I don't understand why only some special pages are covered in wikipedia.py while others are not. 2010/9/5 Bináris wikipo...@gmail.com I have read wikipedia.py, and I found that class Site has some methods for reading special pages, but not for all of them! I need *Special:WantedCategories*, which is not listed there. Please help!
Re: [Pywikipedia-l] Need some help sorting out a couple of issues
Please give the exact command you use, and the error dump on your screen. Have you tried the same action on a real wiki? -- Bináris
Re: [Pywikipedia-l] Request for feedback on rewrite branch
A few questions about the rewrite branch, returning to an old thread: 2010/4/15 i...@gno.de Hi Russel, the main reason not to join the rewrite branch is that I have not got it running yet. I get an ImportError for simplejson. And I have no idea about setting PYTHONPATH while playing with IDLE. Whereas the trunk is easy to use: install Python, download the bot and expand it, run it. This is the usability I would expect. 1. What is the actual situation, can we easily install the rewrite branch? 2. Is it a working bot, is it worth using? 3. What ideas do you have about the date of the change to the new version? 4. Is http://www.botwiki.sno.cc/wiki/Rewrite/Conversion_HOWTO#Installation still correct and up to date? Regards, -- Bináris
[Pywikipedia-l] Which is official?
http://www.mediawiki.org/wiki/Special:Code/pywikipedia http://svn.wikimedia.org/viewvc/pywikipedia/trunk/pywikipedia/ What is the difference, or which is to be regarded as more up-to-date? -- Bináris
Re: [Pywikipedia-l] Which is official?
Thank you! -- Bináris
[Pywikipedia-l] Please add translation to catlib.py
In catlib.py, msg_created_for_renaming = { 'hu':u'Bottal áthelyezve innen: %s. Eredeti szerzők: %s', Thx -- Bináris
Re: [Pywikipedia-l] How to mass rename articles?
I tried it, and there is a funny thing. I was wondering how this factory yields articles with more than 3 digits in the title. I modified your expression first: gen = RegexFilterPageGenerator(PrefixingPageGenerator("kkStB"), r"kkStB [0-9]") and after that: gen = RegexFilterPageGenerator(PrefixingPageGenerator("kkStB"), r"kk") and the same list was generated in every case as with yours. So the outer generator seems useless. 2010/8/10 Merlijn van Deen valhall...@arctus.nl gen = RegexFilterPageGenerator(PrefixingPageGenerator("kkStB"), r"kkStB [0-9][0-9]?[0-9]?") for page in gen: ... print page ... [[KkStB 1]] [[KkStB 10]] [[KkStB 105]] [[KkStB 106]] [[KkStB 108]] [[KkStB 110]] [[KkStB 110.500]] [[KkStB 129]] [[KkStB 14]] [[KkStB 151]] [[KkStB 166]] [[KkStB 17]] [[KkStB 179]] [[KkStB 180]] [[KkStB 180.500]] [[KkStB 205]] [[KkStB 206]] [[KkStB 207]] [[KkStB 210]] [[KkStB 211]] [[KkStB 229]] [[KkStB 231]] [[KkStB 27]] [[KkStB 270]] [[KkStB 306]] [[KkStB 31.01–11]] [[KkStB 310]] [[KkStB 310.300]] [[KkStB 32]] [[KkStB 329]] [[KkStB 378]] [[KkStB 393]] [[KkStB 4]] [[KkStB 406]] [[KkStB 46]] [[KkStB 5]] [[KkStB 506]] [[KkStB 571]] [[KkStB 6]] [[KkStB 60]] [[KkStB 7]] [[KkStB 76 sorozatú szerkocsi]] -- Bináris
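The "both filters give the same list" observation has a plausible explanation: if the filter matches anywhere in the title (unanchored, and effectively case-insensitive, since "kkStB" matched titles starting with "KkStB"), then every title the prefix generator yields already contains the filter text, so the filter passes everything. This is a sketch under the assumption that RegexFilterPageGenerator behaves like `re.search` on the title, which is an inference from the session above, not a documented fact:

```python
import re

# A few titles as the prefix generator would yield them:
titles = ["KkStB 1", "KkStB 110.500", "KkStB 76 sorozatú szerkocsi"]

def regex_filter(pages, pattern):
    # Assumed behavior: unanchored search on the title.
    rx = re.compile(pattern)
    return [p for p in pages if rx.search(p)]

# With case-insensitive matching, every prefixed title contains both
# "kkStB <digit>" and "kk", so the two filters are equally permissive:
a = regex_filter(titles, r"(?i)kkStB [0-9]")
b = regex_filter(titles, r"(?i)kk")
print(a == b == titles)  # True: both filters pass every title
```

Anchoring the pattern (e.g. `r"^KkStB [0-9]"`) would be the way to make the filter actually discriminate.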
Re: [Pywikipedia-l] Strange behavior of wikipedia.py when protecting a page
I have put a time.sleep(20) between the page creation and the page protection, just in case. Perhaps the script was too fast. But nothing happened, the result is the same error as mentioned above. -- Bináris
[Pywikipedia-l] userlib.py -- outdated parameters
In userlib.py, under the header: def block(self, expiry=None, reason=None, anon=True, noCreate=False, onAutoblock=False, banMail=False, watchUser=False, allowUsertalk=True, reBlock=False): the parameters given in the comment are outdated (inconsistent with the real parameter names). Could a developer please rewrite them? This is NOT a bug; it is only annoying, not very urgent. -- Bináris
Re: [Pywikipedia-l] userlib.py -- outdated parameters
Oh, I cannot give the block, poor vandal! THIS is a bug: File C:\Program Files\Pywikipedia\userlib.py, line 377, in block self.site()._getActionUser('block', sysop=True) AttributeError: 'Site' object has no attribute '_getActionUser' -- Bináris
[Pywikipedia-l] Strange behavior of wikipedia.py when protecting a page
The idea is to create a user page and a talk page for a blocked user with a template, and immediately after creating them to protect them. The bot creates the pages and gives an error: Creating page [[Szerkesztö:SZDSZ]] via API Sleeping for 6.7 seconds, 2010-08-19 23:04:37 Creating page [[Szerkesztövita:SZDSZ]] via API Note: Your sysop account on wikipedia:hu does not have a bot flag. Its edits will be visible in the recent changes. Sleeping for 4.8 seconds, 2010-08-19 23:04:49 Do you want to change the protection level of [[hu:Szerkesztö:SZDSZ]]? ([Y]es, [N]o, [A]ll) a {u'error': {u'info': u"Existing titles can't be protected with 'create'", u'code': u'create-titleexists'}} Sleeping for 6.9 seconds, 2010-08-19 23:04:57 {u'error': {u'info': u"Existing titles can't be protected with 'create'", u'code': u'create-titleexists'}} Now comes the funny thing: I run the script again. It saves the page with the same text (of course, this will not appear in the page history), and now it protects the pages successfully! Updating page [[Szerkesztö:SZDSZ]] via API Sleeping for 7.9 seconds, 2010-08-19 23:14:28 Updating page [[Szerkesztövita:SZDSZ]] via API Note: Your sysop account on wikipedia:hu does not have a bot flag. Its edits will be visible in the recent changes. Sleeping for 8.9 seconds, 2010-08-19 23:14:37 Do you want to change the protection level of [[hu:Szerkesztö:SZDSZ]]? ([Y]es, [N]o, [A]ll) a Changed protection level of page [[Szerkesztö:SZDSZ]]. Sleeping for 7.1 seconds, 2010-08-19 23:14:49 Changed protection level of page [[Szerkesztövita:SZDSZ]]. How is it possible that the second time the pages could be protected, although they were existing titles, too? I tried to change the order of script lines: first protection, and saving afterwards. This works! But this way the bot creates the pages under my sysop name, because they are already protected. 
I have just used the lines: userlap.protect(reason=self.summary) vitalap.protect(reason=self.summary) In wikipedia.py, protection stands as follows: def protect(self, editcreate = 'sysop', move = 'sysop', etc. So it has only a common editcreate parameter. Please help: what is wrong in my solution, and how is it possible that the first time a page cannot be protected, and the second time it can? -- Bináris
Re: [Pywikipedia-l] Strange behavior of wikipedia.py when protecting a page
2010/8/19 Bináris wikipo...@gmail.com I tried to change the order of script lines: first protection, and saving afterwards. This works! But this way the bot creates the pages under my sysop name, because they are already protected. ... and it removes the protection, because the protection was only against creating, not against editing. I had to protect it again. So not good at all. -- Bináris
[Pywikipedia-l] How to mass rename articles?
Hello, we got a request to rename a lot of articles on huwiki. See http://hu.wikipedia.org/w/index.php?title=Wikip%C3%A9dia:Botgazd%C3%A1k_%C3%BCzen%C5%91fala&curid=69093&diff=8179481&oldid=8178513#G.C5.91zmozdonyok_sz.C3.B3cikk-c.C3.ADmei All the 0 chars in the titles mean any digit. As far as I see, move.py does not handle regexps the way replace.py does, although I would need just this feature. How would you solve this problem? Earlier, for another problem, I generated move X Y lines with Excel and put them in a batch file, calling move.py as many times as the number of articles to rename, which is not too nice, but now I don't know all the original titles, only patterns. -- Bináris
Re: [Pywikipedia-l] How to mass rename articles?
Thank you, I know this syntax, but the problem is that I don't know the old names, because I have only a pattern of the old names that could be handled by regexps. I don't even know the approximate number of the old names. The real solution would be to make movepages.py recognize regexps. 2010/8/10 Alex Brollo alex.bro...@gmail.com Use movepages.py with the -pairs parameter. It will read a text file where old names and new names are paired like this: [[old name1]] [[new name1]] [[old name2]] [[new name2]] http://meta.wikimedia.org/wiki/Pywikipediabot/movepages.py Alex -- Alex -- Bináris
Re: [Pywikipedia-l] How to mass rename articles?
2010/8/10 Merlijn van Deen valhall...@arctus.nl Use the framework instead of relying on ready-to-use bots. Use the PrefixingPageGenerator combined with the RegexFilterPageGenerator (from pagegenerators.py) to yield the correct pages to move. Something like: Thank you, this is really useful, I will try it! Is your name from Merlin the wizard? :-) I have read movepages.py, and I have found something interesting. movepages.py can handle regexes in some way, but it is not documented at all. Press Ctrl+F and search for regex in the source. 'r' is an option that may be used while the program is running, not as a command-line parameter, I guess. -- Bináris
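The generator-plus-regex idea above can be completed by computing each target title with `re.sub`. This is a minimal sketch: the rename rule here (appending " sorozat") is a made-up placeholder, not the actual huwiki request, and the final `page.move(...)` call is only mentioned in a comment because it needs a live pywikibot session.

```python
import re

# Hypothetical rename rule: old pattern -> replacement template.
# Substitute the real pattern from the huwiki request here.
RULE = (re.compile(r"^KkStB (\d+(?:\.\d+)?)$"), r"KkStB \1 sorozat")

def target_title(title):
    """Return the new title for a matching page, or None to skip it."""
    pattern, repl = RULE
    new, n = pattern.subn(repl, title)
    return new if n else None

pairs = [(t, target_title(t))
         for t in ["KkStB 110.500", "KkStB 4", "Other article"]]
print(pairs)
# In a real run, for each (old, new) with new is not None, one would
# call page.move(new, reason=...) from the framework (not executed here).
```

This keeps the page selection (generators) and the title rewriting (one regex rule) in a single short script instead of a hand-built -pairs file.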
[Pywikipedia-l] How to get an orange stripe?
My bot edited my talk page with page.put(blabla..., botflag=False), and I got no orange stripes. How can I solve it? -- Bináris
Re: [Pywikipedia-l] How to get an orange stripe?
Solved. Sorry for asking before searching. I already asked this two years ago, and the answer is minorEdit=False. 2010/8/2 Bináris wikipo...@gmail.com My bot edited my talk page with page.put(blabla..., botflag=False), and I got no orange stripes. How can I solve it? -- Bináris -- Bináris
[Pywikipedia-l] Using more in replace.py?
Dear developers! In solve_disambiguation.py I can use the 'more' option to see a wider environment of the selected text. Repeating this, I see more and more of the text. Would it be possible to implement this useful option in replace.py, too? Sometimes I need it very much, when the text is in a short line. I can open the article in an editor or browser, but that is too slow, and not always necessary. Pressing m in solve_disambiguation.py is much faster. -- Bináris
[Pywikipedia-l] Replacing texts in namespace 6
I usually collect the suspicious articles on my subpage ( http://hu.wikipedia.org/wiki/Szerkeszt%C5%91:BinBot/try ), then I run replace.py with the option -links:User:BinBot/try. This does not work if the article is in the File (in Hungarian: Fájl) namespace. I tried to put colons into the links (*[[:Fájl:something]] instead of *[[Fájl:something]]); nothing happened. Is there an issue that -links doesn't work with files? It works properly directly, e.g. with the -xml:... or -transcludes:... options. -- Bináris
Re: [Pywikipedia-l] replace.py - annoying characters on display
Thank you very much, André! -- Bináris
Re: [Pywikipedia-l] Hungarian localisation update
2010/2/4 Bence Damokos bdamo...@gmail.com And also selflink.py? Add to msg after line 48: 'hu':u'Bot: Önmagára mutató hivatkozások eltávolítása', The correct version instead of the above one is: 'hu':u'Bot: Önmagukra mutató hivatkozások eltávolítása', Sorry, Dami. :-) -- Bináris
Re: [Pywikipedia-l] Need help with page histories
Although nobody replied, I share the result of my experiments. You have to know that on huwiki we introduced FlaggedRevs ( http://www.mediawiki.org/wiki/Extension:FlaggedRevs ). Now, getVersionHistoryTable() and getVersionHistory() give back only the unrevised versions, and none of the revised ones! fullVersionHistory() seems to work correctly for each version IF the page has ANY unrevised versions; but it results in an out-of-range error if the page has only revised versions. None of these methods gives the edit comments. In getVersionHistoryTable() they appear as <nowiki></nowiki>, and in getVersionHistory() as u''. -- Bináris
[Pywikipedia-l] Need help with page histories
Hi, I have modified the get.py module of the pywikibot. The purpose is to display the date of creation of the article. I guessed from wikipedia.py that I need either page.getVersionHistory(), which is a list, or page.getVersionHistoryTable(), which gives the history as a wikitable. Now it seems to work, but sometimes it gives back a shorter page history than the real one, and, unfortunately, sometimes both methods give an empty page history, and thus an error in the program flow. Here are some concerned pages: [[hu:Ehrenfeld Samu]] (http://hu.wikipedia.org/wiki/Ehrenfeld_Saul) and [[hu:TCSEC]] (http://hu.wikipedia.org/wiki/TCSEC) seem to have a completely empty page history, which is not true and results in an error. (Theoretically, no article could exist with an empty history.) [[hu:Mark VII]] (http://hu.wikipedia.org/wiki/Mark_VII) displays only one line, just as if it had a single-edit history. Where is the mistake? I attach proba.py, which is the modified version of get.py. I hope you will not take my head off for that; it's very small. :-) The Hungarian comments are not relevant; the essence is written here. An article can be given as a command-line parameter. Thanks! -- Bináris
Re: [Pywikipedia-l] skip to next occurrence in a page?
2009/11/26 Chris Watkins chriswater...@appropedia.org Notice that the -regex parameter is used, and the search text ends with (.*$), which matches the entire rest of the article. Not bad, not bad. :-) Nice solution. \\2 is strange to me, because it should be \2, yet it does work that way. I thought \\2 should be interpreted as a backslash followed by the digit 2, not as \2 (the second group). So I don't understand again. :-) -- Bináris
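The \\2 puzzle above dissolves once you separate string-literal escaping from regex escaping. In an ordinary (non-raw) Python string, "\\2" is just two characters, backslash then 2, which is exactly what the regex engine receives as a group-2 reference; a raw string r"\2" spells the same two characters:

```python
import re

# "\\2" (escaped) and r"\2" (raw) are the same two-character string:
assert "\\2" == r"\2"
assert len("\\2") == 2

# So both forms behave identically as a replacement template --
# each swaps the two captured groups:
s = re.sub(r"(a)(b)", "\\2\\1", "ab")
t = re.sub(r"(a)(b)", r"\2\1", "ab")
print(s, t)  # ba ba
```

The confusion only arises when the string passes through a layer (a shell, a non-raw literal) that consumes one level of backslashes before the regex engine ever sees it.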
[Pywikipedia-l] Regex help, please!
Good evening or something, In fixes.py I would like to change something this way: repeat the first group with a trailing 0 (zero) character. Something like (ur'(x|y|z)', ur'\10') This results in the tenth group, and execution stops with a "no such group" message. How can I explain to the bot that I want to copy \1 and a zero character immediately after it? Thank you, -- Bináris
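Python's replacement-template syntax has an unambiguous group reference for exactly this case: \g<1> names group 1 explicitly, so a literal 0 can follow it. (r'\10' is read as group 10, hence the "no such group" error.) A minimal demonstration:

```python
import re

# \g<1> is group 1; the 0 after the closing bracket is a literal zero.
result = re.sub(r"(x|y|z)", r"\g<1>0", "axbyc")
print(result)  # ax0by0c -- each of x, y gets a trailing zero
```

The same r'\g<1>0' template should work directly as the replacement string of a fixes.py entry, since replace.py hands it to re.sub.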
Re: [Pywikipedia-l] Regex help, please!
Thank you for the answers. Francesco's version works, and Pathoschild's version is for some reason not very happy with the idea of working. :-) That's what I tried previously. Now I have quite a pretty package of Hungarian spellcheck fixes. I am sorry I can't share it with you. :-) One more question: how can I substitute a new line? \n works in the query, but not in the expression to be substituted. -- Bináris
Re: [Pywikipedia-l] -excepttext failure in replace.py
Since no useful answer has arrived, and I have spent plenty of time on experiments, I finally took excepttext out of the replace command and inserted the exceptions into the fix itself. Now it works. Is it possible that an 'exceptions': part in a fix totally overrides the command-line parameters? -- Bináris
Re: [Pywikipedia-l] -excepttext failure in replace.py
I am having a nice conversation with myself. :-) Experiments show that settings within the fix itself suppress any parameter on the command line. The -summary parameter works only if I completely delete the 'msg': section from the fix. This is true even if I change the 'hu': in the 'msg': of the fix to any other language code while working on huwiki! 2009/10/10 Bináris wikipo...@gmail.com Since no useful answer has arrived, and I have spent plenty of time on experiments, I finally took excepttext out of the replace command and inserted the exceptions into the fix itself. Now it works. Is it possible that an 'exceptions': part in a fix totally overrides the command-line parameters? -- Bináris
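For readers hitting the same thing, here is the shape of a fixes.py entry with the exceptions moved inside the fix, as the thread describes. Everything here is a placeholder (fix name, message, replacement pair), not the real vegyesjav4 fix, and the 'text-contains' key is my assumption about the exceptions dictionary replace.py of that era accepted; the point is only the structure, and that per-fix settings then win over -excepttext/-summary on the command line:

```python
# Hypothetical fixes.py fragment -- names and patterns are examples only.
fixes = {
    'example-fix': {
        'regex': True,
        # Once 'msg' exists here, the session above suggests the
        # command-line -summary is ignored.
        'msg': {'en': u'Robot: example replacements'},
        'replacements': [
            (u'colour', u'color'),
        ],
        # Exceptions declared in the fix instead of -excepttext:
        'exceptions': {
            'text-contains': [
                r'\{\{[Ss]zinnyei\}\}',
                r'\{\{[Pp]allas\}\}',
            ],
        },
    },
}
print(sorted(fixes['example-fix'].keys()))
```

In other words: once you opt into per-fix configuration, declare everything there, and do not expect the command line to override it.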
[Pywikipedia-l] -excepttext failure in replace.py
Hi folks, I have a batch file that is: replace.py -xml:e:\huwiki-latest-pages-articles.xml -summary:x -namespace:0 -fix:vegyesjav4 -regex -excepttext:\{\{[Ss]zinnyei\}\} -excepttext:\{\{[Pp]allas\}\} -excepttext:V.lyi Andr.s -excepttext:F.nyes Elek Now the exceptions don't work, and I keep getting pages with the would-be excepted texts. What is the mistake? V.lyi Andr.s and F.nyes Elek stand for Vályi András and Fényes Elek, because there is a difference between DOS and Windows characters and I wanted to avoid this problem. The replace.py id is: __version__='$Id: replace.py 7309 2009-09-25 00:18:58Z siebrand $' But the same error appeared earlier, too. -- Bináris