Re: [Wikitech-l] Content handler feature merge (Wikidata branch) scheduled early next week
Hello,

I would consider https://bugzilla.wikimedia.org/show_bug.cgi?id=40653 (filed by Daniel Kinzler after one of my comments) a blocker for this merge. It would also be great to fix Nikerabbit's comments in https://gerrit.wikimedia.org/r/#/c/25736/1/includes/Content.php, lines 222 and following, before the merge.

Cheers!
Alexandre Emsenhuber (ialex)

On 3 Oct 2012, at 03:07, Rob Lanphier wrote:

Hi everyone,

We are now at around the time that we planned to merge the ContentHandler branch in. Questions:
* Daniel/others: have you submitted a merge commit for this? If not, do you need help/clarification, or do you have it handled?
* People who reviewed Denny's faux commit [1] and left comments: any blockers, or mainly stylistic stuff?

Everyone else, now is the last call before the merge.

Rob

[1] Faux commit of ContentHandler branch: https://gerrit.wikimedia.org/r/#/c/25736/

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] [MediaWiki-l] translatewiki.net ending localisation support for svn.wikimedia.org MediaWiki extensions
Just a quick note that, as announced in August (http://lists.wikimedia.org/pipermail/mediawiki-l/2012-August/039678.html), translatewiki.net yesterday dropped support for about 260 extensions/message groups. People still using code from SVN will no longer get the localisation updates they might expect. If you need or care about one of these extensions and its localisation, you should get in contact with its developer(s) (or become one) so that it's migrated to Git: https://www.mediawiki.org/wiki/Git/Conversion/Extensions_queue Most or all of them need some tender loving care anyway.

The list of removed message groups is in https://gerrit.wikimedia.org/r/#/c/26045/ and reproduced below for your convenience. Sorry for the crossposting,

Nemo

ABC, Advanced Meta, Advanced Random, Ajax Login, AJAX Poll, Ajax Query Pages, Ajax Show Editors, Amazon Plus, Api Explorer, Api SVG Proxy, Article Comments, Article To Category 2, Asksql, Autoincrement, Back and Forth, Bad Image, Batch User Rights, Block Titles, Book Information, CallCassandra, Categorize, Category Browser, Category Intersection, Category Members On Sidebar, Category Multisort, Category Multisort Chinese, Category On Upload, Category Sort Headers, Category Stepper, Category Watch, Change Author, Chemistry, Close Wikis, Collab Watchlist, Comment Spammer, Contributions Edit Count, Contributors, Contributors Add-on, Cooperation Statistics, Count Edits, Create Box, Create Redirect, Creative Commons Rdf, Cross Namespace Links, Crosswiki Block, Crowd Authentication, Data Transclusion, Date Diff, Delayed Definition, Delete Queue, Did You Mean, DPL Forum, Dublin Core Rdf, Duplicator, Edit count, Edit Messages, Edit Own, Edit Section Clearer Link, Edit Section Hilite Link, Edit Similar, Edit User, Elm Easy Ref, Email Address Image, Email Page, Email Users, Emergency DeSysop, Etherpad Lite, External Pages, Farmer, Favorites, File Attach, File Page Masking, Find Spam, Fixed Image, Flag Page, Flv Handler, Folk Tag Cloud, Form, 
Format Email, Framed Video, Freq Pattern Tag Cloud, Front Back Matter Forced Wikilinks, GeeQuBox, GeoLite, Get Family, Global User rights, Gnuplot, Google Analytics, Google Maps, Go To Category, Groups Sidebar, Hide Namespace, Honeypot Integration, HTMLets, Icon, Image Tagging, Import Free Images, Import Users, IM Status, Include WP, Index Function, Inline Categorizer, Inspect Cache, Interactive Block Message, Interface Concurrency, Interwiki List, IP Auth, JS Kit, Last User Login, Latex Doc, Link OpenID, Livelets, Local JQuery, Lockdown, Logo Functions, Lookup User, Lua, Magic No Numbered Headings, Mass Blank, Mass Edit Regex, Math Stat Functions, Media Functions, Metadata Edit, Meta Keywords, Metavid Wiki, MicroID, Mini Donation, Minimum Name Length, Mini Preview, Mirror Tools, Most Revisors, Multi Boilerplate, Multi Upload, Natural Language List, Navigation Popups, Network Auth, Newest Pages, News, News Channel, No Bogus Userpages, Notificator, Nss MySQL Auth, OpenID, Oracle Text Search, Other Sites, Package Force, Page By, Page In Cat, Page Object Model, Parser Wiki, Patroller, Pdf Book, People Categories, Pipe Escape, Piwik, Player, Plotters, POV Watch, Preloader, Preview Functions, Private Page Protection, Profile Monitor, Protect Section, Proxy Connect, PSI NoTocNum, Pure Wiki Deletion, Purge, Purge Cache, QPoll, Qr Code, Random Image, RDFIO, Record Admin, Redirect, Reflect, Regex Block, Research Tools, Resumable Upload, RPED, Search Box, See also, Semantic Data Types, Semantic Project Management, SemanticSignup, Semantic Web Browser, Sendmail To Wiki, Shared User Rights, Show Processlist, Sidebar Donate Box, Sign Document, Simple Security, Slippy Map, SNMP query, Sound Manager 2 Button, Spam Diff Tool, Spam Regex, Special 404, Special File List, Special Talk, SQL2Wiki, Stale Pages, Sternograph, Stock Charts, Storyboard, String Functions, Sudo, Suhosin, SVG Zoom, Tab0, Talk Here, Tasks, Template Info, Template Link, TidyTab, Todo, Todo Tasks, Toolserver Tools - Holeks Cite book template generator, Toolserver Tools - Kolossos Kml On Openlayers, Toolserver Tools - Soxred93, Tooltip, Transliterator, Tree And Menu, Trusted Math, TSPoll, UK Geocoding For Maps, Uniwiki - Authors, Uniwiki - Auto Create Category Pages, Uniwiki - Cat Box At Top, Uniwiki - Create Page, Uniwiki - Css Hooks, Uniwiki - Custom Toolbar, Uniwiki - Format Changes, Uniwiki - Format Search, Uniwiki - Generic Edit Page, Uniwiki - Javascript, Uniwiki - Layouts, Uniwiki - Moo Tools 12 core, Usage Statistics, User Contact Links, User Debug Info, User Images, User Page Edit Protection, User Rights Notif, VariVote, Watchers, Watch Subpages, Web Store, What Is My IP, White List Edit, Who Is Watching, Whos Online, Wiki Article Feeds, Wiki At Home, WikiBhasha, Wikidata, Wikilog, Wiki Sync, Wiki Tweet, Woopra, XMLRC, YouTube Auth Sub.
Re: [Wikitech-l] Using mediawiki from within the Social networks?
On Wed, Oct 3, 2012 at 7:43 AM, Yury Katkov katkov.ju...@gmail.com wrote: I'm not sure that '''editing''' can be made easier with the help of a social network client. Any ideas on that? Any ideas on what else can be made more engaging with the power of social networks?

Well, what if people could click on a bit of text and comment on it? They could suggest in that comment that the text be replaced.

mike

--
James Michael DuPont
Member of Free Libre Open Source Software Kosova http://flossk.org
Saving wikipedia(tm) articles from deletion http://SpeedyDeletion.wikia.com
Contributor FOSM, the CC-BY-SA map of the world http://fosm.org
Mozilla Rep https://reps.mozilla.org/u/h4ck3rm1k3
Free Software Foundation Europe Fellow http://fsfe.org/support/?h4ck3rm1k3

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Using mediawiki from within the Social networks?
hmmm, much like the blips on Google Wave?
- Yury Katkov

On Wed, Oct 3, 2012 at 12:50 PM, Mike Dupont jamesmikedup...@googlemail.com wrote: On Wed, Oct 3, 2012 at 7:43 AM, Yury Katkov katkov.ju...@gmail.com wrote: I'm not sure that '''editing''' can be made more easy with the help of social network client. Any ideas on that? any ideas on what else can be made more engaging with the power of social networks? Well what if people can click on a bit of text and comment on it, they could suggest in that comment that the text is replaced. mike -- James Michael DuPont Member of Free Libre Open Source Software Kosova http://flossk.org Saving wikipedia(tm) articles from deletion http://SpeedyDeletion.wikia.com Contributor FOSM, the CC-BY-SA map of the world http://fosm.org Mozilla Rep https://reps.mozilla.org/u/h4ck3rm1k3 Free Software Foundation Europe Fellow http://fsfe.org/support/?h4ck3rm1k3

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Using mediawiki from within the Social networks?
On Wed, Oct 3, 2012 at 1:03 PM, Yury Katkov katkov.ju...@gmail.com wrote: hmmm, much like the blips on Google Wave? like line comments in github : https://github.com/h4ck3rm1k3/wikiteam/commit/4da7f7f4a813b53be13bff7e29a1e5325bb68a30#L0R58 We need a way to track and rate comments and then we can resolve them with changes after people have stormed over them. -- James Michael DuPont Member of Free Libre Open Source Software Kosova http://flossk.org Saving wikipedia(tm) articles from deletion http://SpeedyDeletion.wikia.com Contributor FOSM, the CC-BY-SA map of the world http://fosm.org Mozilla Rep https://reps.mozilla.org/u/h4ck3rm1k3 Free Software Foundation Europe Fellow http://fsfe.org/support/?h4ck3rm1k3 ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] DEPLOYED TODAY: SPF (email spoof prevention feature) test-rollout Weds 10/3
As of ~11:15AM EDT SPF is deployed for the domain wikimedia.org. Please let me know ASAP if you discover any issues with mail sent from a @wikimedia.org address. Thanks! jg Jeff Green Operations Engineer, Special Projects Wikimedia Foundation 149 New Montgomery Street, 3rd Floor San Francisco, CA 94105 415-839-6885 x6807 jgr...@wikimedia.org P.S. Ops folks, rollback is simply a matter of reverting the wikimedia.org zone file and running authdns-update. I set the TTL to 10 min just in case. -- Forwarded message -- Date: Fri, 28 Sep 2012 11:00:08 -0700 (PDT) From: Jeff Green jgr...@wikimedia.org Reply-To: Wikimedia developers wikitech-l@lists.wikimedia.org To: wmf...@lists.wikimedia.org, wikimedi...@lists.wikimedia.org, wikitech-l@lists.wikimedia.org Subject: [Wikitech-l] SPF (email spoof prevention feature) test-rollout Weds 10/5 I'm planning to deploy Sender Policy Framework (SPF) for the wikimedia.org domain on Weds October 5. SPF is a framework for validating outgoing mail, which gives the receiving side useful information for spam filtering. The main goal is to cause spoofed @wikimedia.org mail to be correctly identified as such. It should also improve our odds of getting fundraiser mailings into inboxes rather than spam folders. The change should not be noticeable, but the most likely problem would be legitimate @wikimedia.org mail being treated as spam. If you hear of this happening please let me know. Technical details are below for anyone interested . . . Thanks, jg Jeff Green Operations Engineer, Special Projects Wikimedia Foundation 149 New Montgomery Street, 3rd Floor San Francisco, CA 94105 jgr...@wikimedia.org . . . . . . . 
SPF overview: http://en.wikipedia.org/wiki/Sender_Policy_Framework

The October 8 change will simply be a matter of adding a TXT record to the wikimedia.org DNS zone:

wikimedia.org IN TXT "v=spf1 ip4:91.198.174.0/24 ip4:208.80.152.0/22 ip6:2620:0:860::/46 include:_spf.google.com ip4:74.121.51.111 ?all"

The record is a list of subnets that we identify as senders (all WMF subnets, Google Apps, and the fundraiser mailhouse). The ?all is a neutral policy; it doesn't state either way how mail should be handled. Eventually we'll probably bump ?all to the stricter ~all, aka SoftFail, which tells the receiving side that only mail coming from the listed subnets is valid. Most ISPs will route 'other' mail to a spam folder based on SoftFail.

Please bug me with any questions/comments!

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
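To make the record's structure concrete, here is a minimal sketch of pulling it apart the way Jeff describes: the ip4:/ip6: sender subnets, the include:d domains (Google Apps here), and the trailing qualifier. The parseSpf helper is my own illustration, not anything deployed; real SPF evaluation per RFC 4408 also handles macros, redirect= and recursive include lookups, which this ignores.

```javascript
// Split an SPF TXT record into the pieces a receiver cares about.
function parseSpf( record ) {
	var tokens = record.split( /\s+/ );
	if ( tokens[0] !== 'v=spf1' ) {
		throw new Error( 'not an SPF record' );
	}
	var mechanisms = tokens.slice( 1 );
	return {
		// CIDR ranges the domain claims to send from
		networks: mechanisms
			.filter( function ( t ) { return /^ip[46]:/.test( t ); } )
			.map( function ( t ) { return t.slice( 4 ); } ),
		// other domains whose SPF records are folded in
		includes: mechanisms
			.filter( function ( t ) { return t.indexOf( 'include:' ) === 0; } )
			.map( function ( t ) { return t.slice( 8 ); } ),
		// '?all' = Neutral, '~all' = SoftFail, '-all' = Fail
		allQualifier: mechanisms
			.filter( function ( t ) { return /^[?~+-]?all$/.test( t ); } )[0] || null
	};
}
```

Run against the record above, this yields four networks, one include (_spf.google.com), and the ?all Neutral qualifier.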
[Wikitech-l] Wikimedia engineering September 2012 report
Hi, The report covering Wikimedia engineering activities in September 2012 is now available. Wiki version: https://www.mediawiki.org/wiki/Wikimedia_engineering_report/2012/September Blog version: https://blog.wikimedia.org/2012/10/03/engineering-september-2012-report/ -- Guillaume Paumier ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] DEPLOYED TODAY: SPF (email spoof prevention feature) test-rollout Weds 10/3
2012/10/3 Jeff Green jgr...@wikimedia.org: As of ~11:15AM EDT SPF is deployed for the domain wikimedia.org. Please let me know ASAP if you discover any issues with mail sent from a @wikimedia.org address.

Is allowing ALL IPs in all WMF ranges really needed by anyone? It would be much better if there were only a finite number of designated SMTP servers, with all other machines sending mail via those servers rather than directly to the public internet.

AJF/WarX

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] DEPLOYED TODAY: SPF (email spoof prevention feature) test-rollout Weds 10/3
On Wed, Oct 3, 2012 at 4:08 PM, Artur Fijałkowski wiki.w...@gmail.com wrote: 2012/10/3 Jeff Green jgr...@wikimedia.org: As of ~11:15AM EDT SPF is deployed for the domain wikimedia.org. Please let me know ASAP if you discover any issues with mail sent from a @wikimedia.org address. Is allowing ALL IP's in all WMF ranges really needed by anyone? It would be much better if there will be only finite number of designated SMTP servers and all other machines should send mail via those servers, not directly into public internet. It's closer to the status quo (and I've not heard people complain about spam from our blocks but maybe I just don't know) and therefore less work to make it happen. Being perfect can be deferred to a later date. -Jeremy ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Github replication
Hi everyone, Just letting everyone know: mediawiki/core is now replicating from gerrit to github. https://github.com/mediawiki/core Next step: extensions! -Chad ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Github replication
Awesome! I have a repo I'd love to try this with right now. I'll find you on IRC…

On Oct 3, 2012, at 12:27 PM, Chad innocentkil...@gmail.com wrote: Hi everyone, Just letting everyone know: mediawiki/core is now replicating from gerrit to github. https://github.com/mediawiki/core Next step: extensions! -Chad

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Github replication
On Oct 3, 2012, at 12:27 PM, Chad innocentkil...@gmail.com wrote: Just letting everyone know: mediawiki/core is now replicating from gerrit to github. https://github.com/mediawiki/core Next step: extensions! Yay. Finally we're allowing the world to fix our code :). Can has Github-Gerrit merge?! ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Github replication
On Wed, Oct 3, 2012 at 6:27 PM, Chad innocentkil...@gmail.com wrote: Hi everyone, Just letting everyone know: mediawiki/core is now replicating from gerrit to github. https://github.com/mediawiki/core that is great news. mike ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Github replication
On 12-10-03 09:27 AM, Chad wrote: Hi everyone, Just letting everyone know: mediawiki/core is now replicating from gerrit to github. https://github.com/mediawiki/core Next step: extensions! Hi Chad, Will all extensions be replicated? Are we also looking to replicate to, e.g., Gitorious? I'm sure there are docs for this decision, but I haven't seen them--do you have them handy? Thanks, -- Mark Holmquist Software Engineer, Wikimedia Foundation mtrac...@member.fsf.org http://marktraceur.info ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Github replication
On Wed, Oct 3, 2012 at 9:57 PM, Chad innocentkil...@gmail.com wrote: Just letting everyone know: mediawiki/core is now replicating from gerrit to github. Sweeet! Any plans for pull-request integration? -- Yuvi Panda T http://yuvi.in/blog ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Github replication
On Wed, Oct 3, 2012 at 12:36 PM, Mark Holmquist mtrac...@member.fsf.org wrote: On 12-10-03 09:27 AM, Chad wrote: Hi everyone, Just letting everyone know: mediawiki/core is now replicating from gerrit to github. https://github.com/mediawiki/core Next step: extensions! Hi Chad, Will all extensions be replicated? Yes. Are we also looking to replicate to, e.g., Gitorious? No plans yet, but a lot of the heavy lifting re: replication has been done, so this wouldn't be impossible. I'm sure there are docs for this decision, but I haven't seen them--do you have them handy? Just bugzilla requests for it [0], [1]. I can't remember when the original decision was made, but this has been a goal for some time. -Chad [0] https://bugzilla.wikimedia.org/35429 [1] https://bugzilla.wikimedia.org/35497 [2] https://bugzilla.wikimedia.org/38196 ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Github replication
On Wed, Oct 3, 2012 at 12:36 PM, Yuvi Panda yuvipa...@gmail.com wrote: On Wed, Oct 3, 2012 at 9:57 PM, Chad innocentkil...@gmail.com wrote: Just letting everyone know: mediawiki/core is now replicating from gerrit to github. Sweeet! Any plans for pull-request integration? Yes! https://bugzilla.wikimedia.org/35497 A bit harder than pushing out, but definitely on the roadmap. -Chad ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Learning Git/Gerrit? - 3 Oct 2012 17:30 UTC
Hello,

Our scheduled Git+Gerrit session starts in about 40 minutes from now. Everything will happen via SIP audioconference and SSH connection. Please make sure your SIP and SSH clients work! More information on the setup: https://www.mediawiki.org/wiki/Git/Workshop

I am already available on SIP as well as on IRC (#git-gerrit on Freenode) if you would like to test your setup. See you soon!

Marcin Cieślak (saper)

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Github replication
On 03/10/12 18:27, Chad wrote: Just letting everyone know: mediawiki/core is now replicating from gerrit to github. https://github.com/mediawiki/core Next step: extensions!

Well done! Can we please disable Pull requests until we agree on a workflow to review those or have them automatically sent to Gerrit? Thanks!

-- Antoine hashar Musso

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Github replication
On Wed, Oct 3, 2012 at 1:00 PM, Antoine Musso hashar+...@free.fr wrote: On 03/10/12 18:27, Chad wrote: Just letting everyone know: mediawiki/core is now replicating from gerrit to github. https://github.com/mediawiki/core Next step: extensions! Well done! Can we please disable Pull requests until we agree on a workflow to review those or have them automatically sent to Gerrit?

There is no way to do that that I've found.

-Chad

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Github replication
On Wed, Oct 3, 2012 at 10:10 AM, Chad innocentkil...@gmail.com wrote: On Wed, Oct 3, 2012 at 1:00 PM, Antoine Musso hashar+...@free.fr wrote: Can we please disable Pull requests until we agree on a workflow to review those or have them automatically sent to Gerrit? There is no way to do that that I've found. My recommendation would be to leave pull requests active and, when we see things come in, manually import them to gerrit and close out the pull requests. Perfect? No, but probably a better way than refusing to take them until we figure out a magic automatic gateway. :) -- brion ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Call to eliminate sajax
sajax is an ancient ajax library; it's part of our legacy code, and it still gets to sneak a license note into our README. It's probably about time that we start making sure that code is ready for the day it disappears, just as code shouldn't be relying on bits and pieces of wikibits. Currently the only parts of core that depend on sajax are the legacy mwsuggest and upload.js. mwsuggest is going to disappear when Krinkle's work making simplesearch's suggestions work universally is finished. I'm not sure what's going on with upload.js.

The real problem, however, is extensions. For some reason it appears that we STILL have extensions depending on sajax. And I'm not talking about ancient extensions on the wiki or in svn; I only did an ack through stuff that's currently in git. So I welcome anyone who is interested in going through extension code and eliminating the use of sajax in favor of jQuery.ajax and RL.

core/skins/common/ajax.js
3:window.sajax_debug_mode = false;
4:window.sajax_request_type = 'GET';
7: * if sajax_debug_mode is true, this function outputs given the message into
8: * the element with id = sajax_debug; if no such element exists in the document,
11:window.sajax_debug = function(text) {
12: if (!sajax_debug_mode) return false;
14: var e = document.getElementById( 'sajax_debug' );
18: e.className = 'sajax_debug';
19: e.id = 'sajax_debug';
41:window.sajax_init_object = function() {
42: sajax_debug( 'sajax_init_object() called..' );
61: sajax_debug( 'Could not create connection object.' );
77: *sajax_do_call( 'doFoo', [1, 2, 3], document.getElementById( 'showFoo' ) );
83:window.sajax_do_call = function(func_name, args, target) {
88: if ( sajax_request_type == 'GET' ) {
105:x = sajax_init_object();
112:x.open( sajax_request_type, uri, true );
119:if ( sajax_request_type == 'POST' ) {
130: sajax_debug( 'received (' + x.status + ' ' + x.statusText + ') ' + x.responseText );
153: alert( 'bad target for sajax_do_call: not a function or object: ' + target );
157:sajax_debug( func_name + ' uri = ' + uri + ' / post = ' + post_data );
159:sajax_debug( func_name + ' waiting..' );
169:var request = sajax_init_object();

core/skins/common/mwsuggest.js
489:var xmlhttp = sajax_init_object();

core/skins/common/upload.js
99: if ( !ajaxUploadDestCheck || !sajax_init_object() ) return;
124:if ( !ajaxUploadDestCheck || !sajax_init_object() ) return;
133:if ( !ajaxUploadDestCheck || !sajax_init_object() ) return;
141: sajax_do_call( 'SpecialUpload::ajaxGetExistsWarning', [this.nameToCheck],
287:var req = sajax_init_object();

extensions/CommunityVoice/Resources/CommunityVoice.js
100:var oldRequestType = sajax_request_type;
102:sajax_request_type = POST;
104:sajax_do_call(
114:sajax_request_type = oldRequestType;

extensions/DonationInterface/modules/validate_input.js
14: sajax_do_call( 'efPayflowGatewayCheckSession', [], checkSession );

extensions/Drafts/Drafts.js
76: var oldRequestType = sajax_request_type;
78: sajax_request_type = 'POST';
80: sajax_do_call(
98: sajax_request_type = oldRequestType;

extensions/OnlineStatus/OnlineStatus.js
35: sajax_do_call( 'OnlineStatus::Ajax', ['get'], function( x ){
71: sajax_do_call( 'OnlineStatus::Ajax', ['set', status], function( x ){

extensions/ReaderFeedback/readerfeedback.js
36: /*extern sajax_init_object, sajax_do_call */
92: sajax_do_call( ReaderFeedbackPage::AjaxReview, args, wgAjaxFeedback.processResult );

extensions/SecurePoll/resources/SecurePoll.js
95: sajax_do_call( 'wfSecurePollStrike', [ action, id, reason ], processResult );

extensions/SemanticForms/includes/SF_FormUtils.php
450:function FCK_sajax(func_name, args, target) {
451:sajax_request_type = 'POST' ;
452:sajax_do_call(func_name, args, function (x) {
716:sajax_request_type = 'POST' ;
718: sajax_do_call('wfSajaxWikiToHTML', [SRCtextarea.value], function ( result ){
736:if (!oFCKeditor.ready) return false;//sajax_do_call in action
754:sajax_request_type = 'GET' ;
755: sajax_do_call( 'wfSajaxToggleFCKeditor', ['hide'], function(){} ) ; //remember closing in session

extensions/SemanticForms/libs/SF_ajax_form_preview.js
43: var aj = sajax_init_object();
44: var aj2 = sajax_init_object();
70: // if (!oFCKeditor.ready) return false;//sajax_do_call in action - what do we do?

extensions/SemanticForms/libs/SF_autoedit.js
35: sajax_request_type = 'POST';
37: sajax_do_call( 'SFAutoeditAPI::handleAutoEdit', data, function( ajaxHeader ){

extensions/SemanticForms/libs/SF_submit.js
55: sajax_request_type = 'POST';
58: sajax_do_call( 'SFAutoeditAPI::handleAutoEdit', new
Re: [Wikitech-l] Call to eliminate sajax
On Wed, Oct 3, 2012 at 7:18 PM, Daniel Friesen dan...@nadir-seen-fire.com wrote:

core/skins/common/upload.js
99: if ( !ajaxUploadDestCheck || !sajax_init_object() ) return;
124:if ( !ajaxUploadDestCheck || !sajax_init_object() ) return;
133:if ( !ajaxUploadDestCheck || !sajax_init_object() ) return;
141:sajax_do_call( 'SpecialUpload::ajaxGetExistsWarning', [this.nameToCheck],
287:var req = sajax_init_object();

https://bugzilla.wikimedia.org/show_bug.cgi?id=31946

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Call to eliminate sajax
On Oct 3, 2012, at 7:18 PM, Daniel Friesen dan...@nadir-seen-fire.com wrote: The real problem however is extensions. For some reason it appears that we STILL have extensions depending on sajax. And I'm not talking about ancient extensions on the wiki or in svn. I only did an ack through stuff that's currently in git. So I welcome anyone who is interested in going through extension code and eliminating the use of sajax in favor of jQuery.ajax and RL.

Also note that in various cases these are not just frontend legacy problems, but backend as well. Meaning, AjaxDispatcher. Invoked through index.php?action=ajax&rs=efFooBar&rsargs[]=param&rsargs[]=param. Though blindly replacing sajax would allow us to remove it from core, it would be very much worth it to give these extensions a good look and update them in general (to make them use API modules, ResourceLoader modules, and follow current conventions for front-end code with mw and jQuery).

-- Krinkle

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
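For extension authors doing the mechanical first pass, a sketch of what replacing the sajax globals with jQuery.ajax looks like. sajaxToAjaxOptions is an illustrative helper of mine, not an existing core function, and it deliberately keeps the legacy AjaxDispatcher endpoint; the proper cleanup, as Krinkle says, is to move the server side to an API module.

```javascript
// The legacy pattern seen in the extensions listed above:
//   sajax_do_call( 'OnlineStatus::Ajax', ['get'], callback );
// goes through the window.sajax_* globals to hit the AjaxDispatcher URL.
// Build the equivalent jQuery.ajax settings object instead:
function sajaxToAjaxOptions( funcName, args, scriptPath ) {
	return {
		url: scriptPath + '/index.php',
		// 'traditional' keeps jQuery from serialising the array as rsargs[][]=...
		traditional: true,
		data: { action: 'ajax', rs: funcName, 'rsargs[]': args },
		dataType: 'text'
	};
}

// Usage in the browser, where jQuery and mw are loaded:
//   $.ajax( sajaxToAjaxOptions( 'OnlineStatus::Ajax', [ 'get' ],
//       mw.config.get( 'wgScriptPath' ) ) ).done( callback );
```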
Re: [Wikitech-l] Github replication
On Wed, Oct 3, 2012 at 1:13 PM, Brion Vibber br...@pobox.com wrote: On Wed, Oct 3, 2012 at 10:10 AM, Chad innocentkil...@gmail.com wrote: On Wed, Oct 3, 2012 at 1:00 PM, Antoine Musso hashar+...@free.fr wrote: Can we please disable Pull requests until we agree on a workflow to review those or have them automatically sent to Gerrit? There is no way to do that that I've found. My recommendation would be to leave pull requests active and, when we see things come in, manually import them to gerrit and close out the pull requests. Perfect? No, but probably a better way than refusing to take them until we figure out a magic automatic gateway. :) Yeah, that sounds sane. Anyone who wants to volunteer to keep an eye on Github and make sure patches get into Gerrit, let me know and I'll add you to the group on Github. -Chad ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Welcome Željko Filipin, QA Engineer
I'm very glad you are joining us, Željko. Welcome!

On Oct 2, 2012, at 8:47 PM, Krinkle wrote: On Oct 2, 2012, at 4:25 PM, Chris McMahon cmcma...@wikimedia.org wrote: I am pleased to announce that Željko Filipin joins WMF this week as QA Engineer. Welcome Željko! For the last 1.5 years, hashar and I have set up the current integration environment. I'm also in CET (Krinkle on Freenode). Hashar did most of the backend with PHPUnit and Jenkins; I'm occupied with browsers and unit testing therein (QUnit/TestSwarm/BrowserStack/..). Looking forward to working with you! -- Timo Krinkle Tijhof

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Github replication
On Wed, Oct 3, 2012 at 10:29 AM, Chad innocentkil...@gmail.com wrote: Yeah, that sounds sane. Anyone who wants to volunteer to keep an eye on Github and make sure patches get into Gerrit, let me know and I'll add you to the group on Github. Crap, I think I just volunteered. ;) -- brion ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Call to eliminate sajax
Informative e-mail threads are not documentation, so I added https://www.mediawiki.org/wiki/Manual:Ajax#Deprecated_functionality

There seems to be no mention of AjaxDispatcher on mediawiki.org, which I guess is good? If it's obsolete, someone needs to add a comment to includes/AjaxDispatcher.php.

-- =S Page software engineer on E3

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Call to eliminate sajax
Daniel Friesen wrote: sajax is an ancient ajax library, it's part of our legacy code. And it still gets to sneak a license note into our README. It's probably about time that we start making sure that code is ready for the day it disappears. Just as code shouldn't be relying on bits and pieces of wikibits. Currently the only parts of core that depend on sajax are the legacy mwsuggest and upload.js. mwsuggest is going to disappear when Krinkle's work making simplesearch's suggestions work universally is finished. I'm not sure what's going on with upload.js. The real problem however is extensions. For some reason it appears that we STILL have extensions depending on sajax. And I'm not talking about ancient extensions on the wiki or in svn. I only did an ack through stuff that's currently in git. Can you please file a tracking bug for removing sajax? MZMcBride ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] [Xmldatadumps-l] HTML wikipedia dumps: Could you please provide them, or make public the code for interpreting templates?
Could we have an HTML dump for X amount of money? Something like a paid feature. Include the CSS of course. Also, leave the math tags as they are, as those have to be processed by 3rd party libraries.

2012/9/17 Pablo N. Mendes pablomen...@gmail.com: I also think the HTML dumps would be super useful! Cheers Pablo

On Sep 17, 2012 8:05 PM, James L james_lea...@hotmail.com wrote: I'm all for continuing the HTML wiki dumps that were once done, *2007 was the last*? Why are these discontinued? They would be more useful than the so-called "XML". There is no complete solution to processing dumps, the XML is most certainly not XML in its lowest form, and it IS DEFINITELY a moving target! Regards,

*From:* Roberto Flores f.roberto@gmail.com *Sent:* Sunday, September 09, 2012 8:07 PM *To:* Wikimedia developers wikitech-l@lists.wikimedia.org *Cc:* Wikipedia Xmldatadumps-l xmldatadump...@lists.wikimedia.org *Subject:* Re: [Xmldatadumps-l] [Wikitech-l] HTML wikipedia dumps: Could you please provide them, or make public the code for interpreting templates?

Allow me to reply to each point: (By the way, my offline app is called WikiGear Offline:) http://itunes.apple.com/us/app/wikigear-offline/id453614487?mt=8

Templates are dumped just like all other pages are... Yes, but that's only a text description of what the template does. Code must be written to actually process them into HTML. There are tens of thousands of them, and some can't even be programmed by me (e.g., Wiktionary's conjugation templates). If they were already pre-processed into HTML inside the articles' contents, that would solve all of my problems.

what purpose would the dump serve? you dont want to keep the full dump on the device. I made an indexing program that selects only content articles (namespaces included) and compresses it all to a reasonable size (e.g. about 7 GB for the English Wikipedia)

How would this template API function? What does import mean?
By this I mean, a set of functions written in some computer language to which I could send the template within the wiki markup and receive HTML to display. Wikipedia does this whenever a page is requested, but I don't know the exact mechanism through which it's performed. Maybe you just need to make that code publicly available, and I'll try to make it work with my application somehow.

2012/9/9 Jeremy Baron jer...@tuxmachine.com: On Sun, Sep 9, 2012 at 6:34 PM, Roberto Flores f.roberto@gmail.com wrote: I have developed an offline Wikipedia, Wikibooks, Wiktionary, etc. app for the iPhone, which does a somewhat decent job at interpreting the wiki markup into HTML. However, there are too many templates for me to program (not to mention, it's a moving target). Without converting these templates, many articles are simply unreadable and useless.

Templates are dumped just like all other pages are. Have you found them in the dumps? Which dump are you looking at right now?

Could you please provide HTML dumps (I mean, with the templates pre-processed into HTML, everything else the same as now) every 3 or 4 months?

3 or 4 month frequency seems unlikely to be useful to many people. Otherwise no comment.

Or alternatively, could you make the template API available so I could import it in my program?

How would this template API function? What does import mean?

-Jeremy

___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___ Xmldatadumps-l mailing list xmldatadump...@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/xmldatadumps-l