Re: [Wikitech-l] [Xmldatadumps-l] XML dumps/Media mirrors update
Eh, mirrors rsync directly from dataset1001.wikimedia.org; see "rsync dataset1001.wikimedia.org::". However, the system limits rsync access to approved mirrors only, to prevent others from rsyncing directly from Wikimedia.

On Wed, May 30, 2012 at 4:52 PM, Huib Laurens sterke...@gmail.com wrote:
> I'm still interested in running a mirror too, as noted on Meta and also sent out earlier by mail. I'm just wondering: why is there no rsync possibility from the main server? It's strange that we need to rsync from a mirror.
>
> --
> Kind regards,
> Huib Laurens
> Certified cPanel Specialist
> Certified Kaspersky Specialist
> WickedWay Webhosting, webhosting the wicked way!
> www.wickedway.nl - www.wickedway.be

--
Regards,
Hydriz

We've created the greatest collection of shared knowledge in history. Help protect Wikipedia. Donate now: http://donate.wikimedia.org

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] [Xmldatadumps-l] XML dumps/Media mirrors update
Ariel will do that :) BTW, just dig around inside the puppet configuration repository on Gerrit and you can find out more :)

On Wed, May 30, 2012 at 4:58 PM, Huib Laurens sterke...@gmail.com wrote:
> Ok, cool. And how will I get Wikimedia to allow our IP to rsync?
>
> Best,
> Huib
>
> On Wed, May 30, 2012 at 10:54 AM, Hydriz Wikipedia ad...@alphacorp.tk wrote:
>> Eh, mirrors rsync directly from dataset1001.wikimedia.org; see "rsync dataset1001.wikimedia.org::". However, the system limits rsync access to approved mirrors only, to prevent others from rsyncing directly from Wikimedia.
>
> [earlier quoted text trimmed]
>
> --
> Kind regards,
> Huib Laurens
> WickedWay.nl
> Webhosting the wicked way.

--
Regards,
Hydriz
Re: [Wikitech-l] [Xmldatadumps-l] XML dumps/Media mirrors update
Do you have a URL that you can reveal so that some of us can have a sneak peek? :P

On Wed, May 30, 2012 at 5:16 PM, Huib Laurens sterke...@gmail.com wrote:
> Ok. I mailed Ariel about this; if all goes well I can have the mirror up and running by Friday.
>
> Best,
> Huib
>
> [earlier quoted text trimmed]
>
> --
> Kind regards,
> Huib Laurens
> WickedWay.nl
> Webhosting the wicked way.

--
Regards,
Hydriz
Re: [Wikitech-l] [Xmldatadumps-l] XML dumps/Media mirrors update
This is quite nice, though the item's metadata is rather sparse :)

On Tue, May 29, 2012 at 3:40 AM, Mike Dupont jamesmikedup...@googlemail.com wrote:
> The first version of the script is ready. It gets the versions, puts them in a zip, and puts that on archive.org:
> https://github.com/h4ck3rm1k3/pywikipediabot/blob/master/export_deleted.py
>
> Here is an example of the output:
> http://archive.org/details/wikipedia-delete-2012-05
> http://ia601203.us.archive.org/24/items/wikipedia-delete-2012-05/archive2012-05-28T21:34:02.302183.zip
>
> I will cron this, and it should give a start on saving deleted data. Articles will be exported once a day, even if they were exported yesterday, as long as they are in one of the categories.
>
> mike
>
> On Mon, May 21, 2012 at 7:21 PM, Mike Dupont jamesmikedup...@googlemail.com wrote:
>> Thanks! I'll run that one time per day; they don't get deleted that quickly.
>>
>> mike
>>
>> On Mon, May 21, 2012 at 9:11 PM, emijrp emi...@gmail.com wrote:
>>> Create a script that makes a request to Special:Export using this category as feed:
>>> https://en.wikipedia.org/wiki/Category:Candidates_for_speedy_deletion
>>> More info: https://www.mediawiki.org/wiki/Manual:Parameters_to_Special:Export
>>>
>>> 2012/5/21 Mike Dupont jamesmikedup...@googlemail.com
>>>> Well, I would be happy for items like this: http://en.wikipedia.org/wiki/Template:Db-a7
>>>> Would it be possible to extract them easily?
>>>>
>>>> mike
>>>>
>>>> On Thu, May 17, 2012 at 2:23 PM, Ariel T. Glenn ar...@wikimedia.org wrote:
>>>>> There are a few other reasons articles get deleted: copyright issues, personally identifying data, etc. This makes maintaining the sort of mirror you propose problematic, although a similar mirror is here: http://deletionpedia.dbatley.com/w/index.php?title=Main_Page
>>>>>
>>>>> The dumps contain only data publicly available at the time of the run, without deleted data. The articles aren't permanently deleted, of course. The revision texts live on in the database, so a query on the Toolserver, for example, could be used to get at them, but that would need to be for research purposes.
>>>>>
>>>>> Ariel
>>>>>
>>>>> On Thu, 17-05-2012, at 13:30 +0200, Mike Dupont wrote:
>>>>>> Hi, I am thinking about how to collect articles deleted based on the "not notable" criteria. Is there any way we can extract them from the mysql binlogs? How are these mirrors working? I would be interested in setting up a mirror of deleted data, at least that which is not spam/vandalism, based on tags.
>>>>>>
>>>>>> mike
>>>>>>
>>>>>> On Thu, May 17, 2012 at 1:09 PM, Ariel T. Glenn ar...@wikimedia.org wrote:
>>>>>>> We now have three mirror sites, yay! The full list is linked to from http://dumps.wikimedia.org/ and is also available at http://meta.wikimedia.org/wiki/Mirroring_Wikimedia_project_XML_dumps#Current_Mirrors
>>>>>>>
>>>>>>> Summarizing, we have:
>>>>>>> - C3L (Brazil), with the last 5 known good dumps
>>>>>>> - Masaryk University (Czech Republic), with the last 5 known good dumps
>>>>>>> - Your.org (USA), with the complete archive of dumps and, for the latest version of uploaded media, http/ftp/rsync access
>>>>>>>
>>>>>>> Thanks to Carlos, Kevin and Yenya respectively at the above sites for volunteering space, time and effort to make this happen.
>>>>>>>
>>>>>>> As people noticed earlier, a series of per-project media tarballs (excluding Commons) is being generated. As soon as the first run of these is complete we'll announce its location and start generating them on a semi-regular basis.
>>>>>>>
>>>>>>> As we've been getting the bugs out of the mirroring setup, it is getting easier to add new locations. Know anyone interested? Please let us know; we would love to have them.
>>>>>>>
>>>>>>> Ariel

--
James Michael DuPont
Member of Free Libre Open Source Software Kosova http://flossk.org
Contributor FOSM, the CC-BY-SA map of the world http://fosm.org
Mozilla Rep https://reps.mozilla.org/u/h4ck3rm1k3

--
Emilio J. Rodríguez-Posada. E-mail: emijrp AT gmail DOT com
Pre-doctoral student at the University of Cádiz (Spain)
Projects: AVBOT | StatMediaWiki | WikiEvidens | WikiPapers | WikiTeam
Personal website: https://sites.google.com/site/emijrp/

___
Xmldatadumps-l mailing list
xmldatadump...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/xmldatadumps-l
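emijrp's suggestion in the thread above (feed a category to Special:Export) can be sketched roughly as below. This is an illustrative sketch, not the actual export_deleted.py: the parameter names follow the Special:Export form described in Manual:Parameters_to_Special:Export, and the network call itself is left as a comment so the snippet stays self-contained.

```python
from urllib.parse import urlencode

def build_export_request(category, history=False):
    """Build the Special:Export URL and POST payload for one category."""
    params = {
        "catname": category,  # category whose member pages should be exported
        "addcat": "Add",      # ask Special:Export to expand catname into titles
    }
    if not history:
        params["curonly"] = "1"  # latest revision only, not full history
    url = "https://en.wikipedia.org/wiki/Special:Export"
    return url, urlencode(params)

url, payload = build_export_request("Candidates_for_speedy_deletion")
# A cron-driven exporter would POST `payload` to `url` (e.g. with
# urllib.request), zip the returned XML, and upload it to archive.org.
```

Running it daily, as Mike describes, re-exports anything still in the category, so nothing is lost if a page lingers before deletion.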
Re: [Wikitech-l] Etherpad bug
You are probably using the HTTPS version of this service; use the insecure gateway for now (if that is the error you are encountering).

On Wed, Mar 21, 2012 at 10:28 PM, Samat sama...@gmail.com wrote:
> If I am writing to a wrong address, please let me know. I used Etherpad for a long time without any problem. Now, if I open an Etherpad window (etherpad.wikimedia.org), I get this message after a few seconds:
>
> Disconnected. "Lost connection with the EtherPad synchronization server." This may be due to a loss of network connectivity. If this continues to happen, please let us know (http://etherpad.wikimedia.org/ep/support, opens in new window).
>
> If I reconnect, this happens again after a few seconds. My internet connection works properly. If I click "let us know", I get another error message:
>
> Oops! A server error occurred. It's been logged. Please email supp...@etherpad.com if this persists.
>
> Does this problem come from my computer (which is a new one and I have never used Etherpad on it)? Should I write a message to supp...@etherpad.com, or how else could I use Etherpad? Thank you for your help.
>
> Best regards,
> Samat

--
Regards,
Hydriz
Re: [Wikitech-l] 403: User account expired toolserver.org/~soxred93
Tparis has the full source code of those tools, and it looks like he has already brought them up on his own account; see https://toolserver.org/~tparis. I am not sure if he has everything up.

On Mon, Mar 12, 2012 at 10:33 PM, Nasir Khan nasir8...@gmail.com wrote:
> Hi, I used to use some of the tools of the user soxred93, but for the last few days it says that the user account has expired. Is there any way to bring those tools back up?
>
> thanks
> nasir
>
> --
> Nasir Khan Saikat
> http://profiles.google.com/nasir8891

--
Regards,
Hydriz
Re: [Wikitech-l] Bugzilla Weekly Report
Nice one, a vandal topped the list of bug resolvers...

Regards,
Hydriz

Date: Mon, 28 Nov 2011 03:00:01 +
To: wikitech-l@lists.wikimedia.org
From: repor...@kaulen.wikimedia.org
Subject: [Wikitech-l] Bugzilla Weekly Report

> MediaWiki Bugzilla Report for November 21, 2011 - November 28, 2011
>
> Status changes this week:
>   Bugs NEW:      411
>   Bugs ASSIGNED:   9
>   Bugs REOPENED:  61
>   Bugs RESOLVED: 137
>
> Total bugs still open: 6600
>
> Resolutions for the week:
>   Bugs marked FIXED:      79
>   Bugs marked REMIND:      0
>   Bugs marked INVALID:    35
>   Bugs marked DUPLICATE:   7
>   Bugs marked WONTFIX:    15
>   Bugs marked WORKSFORME:  4
>   Bugs marked LATER:       2
>   Bugs marked MOVED:       0
>
> Specific product/component resolutions: User Metrics
>
> New bugs per component:
>   General/Unknown     7
>   MobileFrontend      6
>   Site requests       6
>   Semantic MediaWiki  5
>   General/Unknown     4
>
> New bugs per product:
>   MediaWiki             21
>   Wikimedia             21
>   MediaWiki extensions  33
>   Wikimedia Mobile       6
>   CiviCRM                1
>
> Top 5 bug resolvers:
>   tim.starling [AT] rocketmail.com   77
>   innocentkiller [AT] gmail.com      22
>   sam [AT] reedyboy.net               8
>   john [AT] compwhizii.net            8
>   jeroen_dedauw [AT] yahoo.com        7
Re: [Wikitech-l] Northern Soto Wikipedia
The wiki is being re-imported, as there was an encoding error during the import from Incubator to this new wiki.

Regards,
Hydriz
http://simple.wikipedia.org/wiki/User:Hydriz

From: andreeng...@gmail.com
Date: Sat, 5 Nov 2011 16:10:05 +0100
Subject: Re: [Wikitech-l] Northern Soto Wikipedia

> On Sat, Nov 5, 2011 at 3:51 PM, Schneelocke schneelo...@gmail.com wrote:
>> On Sat, Nov 5, 2011 at 15:46, Andre Engels andreeng...@gmail.com wrote:
>>> Same data for me, plus pywikipediabot does not get anything good either. I've checked with both Opera 11 and Firefox 3.6 on Windows Vista; I'm getting these errors when I'm logged in (globally), but not when I'm logged out.
>>
>> Presumably, it's got to do with the auto-creation of a local user account.
>
> Good call. Attempting to log in from the wiki itself gives more information:
>
>   The user name "Robbot" has been banned from creation. It matches the following blacklist entry: .* <noedit> # This wiki will be re-imported
>
> (I used Robbot's account instead of my own because the password is simpler.) Thus, what might be going on is that every user name is blacklisted.
>
> --
> André Engels, andreeng...@gmail.com
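André's diagnosis, that the ".*" blacklist entry catches every possible user name, comes down to the regex: title-blacklist entries are matched against the whole candidate title, and ".*" matches any string, including the empty one. A minimal illustration using plain Python's re module (a stand-in for MediaWiki's own matcher, not the real TitleBlacklist code):

```python
import re

# The offending blacklist pattern; blacklist entries are matched against
# the full candidate title, which re.fullmatch emulates here.
entry = re.compile(r".*")

def is_blacklisted(username):
    """True if the username would be caught by the '.*' entry."""
    return entry.fullmatch(username) is not None

# Every name matches, so account auto-creation fails for everyone.
for name in ("Robbot", "Hydriz", "Any other name"):
    assert is_blacklisted(name)
```

Hence the intended "lock the wiki during re-import" entry also blocked the automatic creation of local accounts for globally logged-in users.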
Re: [Wikitech-l] About LQT - on hold?
LiquidThreads is still undergoing a major rewrite. It will be enabled once that has been finished.

Regards,
Hydriz
http://simple.wikipedia.org/wiki/User:Hydriz

From: amir.ahar...@mail.huji.ac.il
Date: Sun, 29 May 2011 16:35:58 +0300
Subject: Re: [Wikitech-l] About LQT - on hold?

> 2011/5/29 HW waihor...@yahoo.com.hk
>> Dear all, as a member of the Chinese Wikipedia community, we have submitted a bug (https://bugzilla.wikimedia.org/show_bug.cgi?id=29114) requesting that LQT be added to the Chinese Wikipedia. However, it seems to be on hold. Due to the large amount of comment on the Chinese Wikipedia Village Pump, we need it as soon as possible. When will it be enabled? Or do we need to wait for central action?
>
> I also asked to enable LiquidThreads on the Hebrew Wikinews [1] and it was closed as RESOLVED LATER, with the explanation "LQT isn't going to be deployed anywhere new atm". I think that that's a shame, because if there are people who are willing to sacrifice their projects for testing LQT, this should be allowed. I can't think of anything terribly destructive that LQT can do to a project, even in its current buggy and rapidly changing form.
>
> [1] https://bugzilla.wikimedia.org/show_bug.cgi?id=27937
>
> --
> Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
> http://aharoni.wordpress.com
> "We're living in pieces, I want to live in peace." - T. Moore