Re: [Wikitech-l] Disable module mediawiki.searchSuggest in MW 1.20+
On 01/15/2013 03:18 AM, Robert Vogel wrote: Hi everybody! I'm writing an extension that needs to replace/disable the built-in suggestion feature of the search box. In earlier versions this could be achieved by setting $wgEnableMWSuggest (http://www.mediawiki.org/wiki/Manual:$wgEnableMWSuggest) to false. But with 1.20 this option is no longer available. In includes/OutputPage.php:2484 it seems that $wgUseAjax is the only condition (apart from user settings) that is evaluated to decide whether to add the module mediawiki.searchSuggest or not. But I don't want to disable the ajax features completely.

There's a preference, disablesuggest, that controls it. I don't know if there's an easy way for extensions to alter user preferences (probably not). But in the installation instructions, you can tell users to set it to true (true means disabled), and hide it from the preferences view. See https://www.mediawiki.org/wiki/Manual:FAQ#How_do_I_change_default_user_preferences.3F

Matt Flaschen ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
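Following Matt's pointer, a minimal LocalSettings.php sketch (assuming the stock 'disablesuggest' preference key; check the FAQ page linked above for your MediaWiki version):

```php
// Disable search suggestions for everyone by default.
// Note the inverted sense: disablesuggest = true means "suggestions off".
$wgDefaultUserOptions['disablesuggest'] = true;

// Hide the checkbox from Special:Preferences so users cannot
// turn suggestions back on ($wgHiddenPrefs exists since MW 1.16).
$wgHiddenPrefs[] = 'disablesuggest';
```

$wgDefaultUserOptions only affects users who have not saved the preference; existing users keep their stored value unless it is migrated (e.g. with maintenance/userOptions.php).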
Re: [Wikitech-l] Disambiguation features: Do they belong in core or in an extension?
Hello, My own preference would be to have this in the core, for several reasons. It seems that it makes some existing core code simpler. There's already some code dealing with disambiguation in the core (Special:Disambiguations, ...). Several external tools, including my own WPCleaner [1], deal with fixing disambiguation links on several wikis. In my opinion, it will be easier for tool developers to have one standard method for finding out whether a page is a disambiguation page or not. Currently, I'm already managing two methods depending on the wiki: based on MediaWiki:Disambiguationspage, or based on categories (for enwiki and frwiki), which is faster and requires fewer API requests. If it's in an extension, I think fewer wikis (outside Wikimedia) will use this new method than if it's in the core, because an extension requires the wiki owner to first add it, whereas only contributors are needed if it's in the core. If a wiki doesn't need disambiguation pages, there's nothing to set up specifically for not using it. It's just an unused feature, as it is currently with MediaWiki:Disambiguationspage ;) Nico [1] http://en.wikipedia.org/wiki/Wikipedia:WPCleaner On 1/16/13, Tyler Romeo tylerro...@gmail.com wrote: I agree with extension. For example, my school's IT department uses a wiki to collect information about common computer problems, and on a wiki about computer problems, none of the issues share the same name. -- Tyler Romeo Stevens Institute of Technology, Class of 2015 Major in Computer Science www.whizkidztech.com | tylerro...@gmail.com On Tue, Jan 15, 2013 at 9:38 PM, Chad innocentkil...@gmail.com wrote: On Tue, Jan 15, 2013 at 8:58 PM, Ryan Kaldari rkald...@wikimedia.org wrote: Personally, I don't mind implementing it either way, but would like to have consensus on where this code should reside. 
The code is pretty clean and lightweight, so it wouldn't increase the footprint of core MediaWiki (it would actually decrease the existing footprint slightly since it replaces more hacky existing core code). So core bloat isn't really an issue. The issue is: Where does it most make sense for disambiguation features to reside? Should disambiguation pages be supported out of the box or require an extension to fully support? I'd say extension. I can think of lots of wikis that don't use disambiguation pages. If we really want, we can stash it in the default tarball along with the other bundled extensions. -Chad
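The category-based check Nico describes can be sketched in a few lines of client code. This is a hypothetical illustration: the category names and the canned API payload below are assumptions, not anyone's real configuration; an actual tool would fetch prop=categories from the MediaWiki API.

```python
# Sketch of the category-based disambiguation test that external tools
# use today. Each wiki configures its own category names; these two are
# illustrative placeholders only.
DISAMBIG_CATEGORIES = {
    "Category:Disambiguation pages",   # enwiki-style (assumed name)
    "Category:Homonymie",              # frwiki-style (assumed name)
}

def is_disambiguation(api_page: dict) -> bool:
    """Check a page record shaped like action=query&prop=categories output."""
    cats = {c["title"] for c in api_page.get("categories", [])}
    return bool(cats & DISAMBIG_CATEGORIES)

# Canned response fragment, shaped like the API's JSON output
sample = {
    "title": "Bow",
    "categories": [{"title": "Category:Disambiguation pages"}],
}
print(is_disambiguation(sample))                                # True
print(is_disambiguation({"title": "Paris", "categories": []}))  # False
```

The MediaWiki:Disambiguationspage method needs an extra request to resolve the template list first, which is why the category route costs fewer API calls.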
[Wikitech-l] good MediaWiki\Semantic MediaWiki blogs?
Hi everyone! Can anyone recommend some blogs that describe tricks/tips/use cases of wikis, especially MW/SMW? I'm running a blog myself and need some inspiration. Very truly yours, Yury Katkov, WikiVote
Re: [Wikitech-l] The ultimate bikeshed: typos in commit messages
Japanese RPGs do something interesting. When you get a quest like "Go to the house of MrTom and pick up the toothnail", the words MrTom and toothnail are bolded. Something in the system (maybe it is done manually when the quest text is written) recognizes entities in the system (NPC characters, locations, items) and bolds them, so it is easier for the player to spot the important parts of the quest without reading all the way through it. Computer geeks tend to do this using the * character: fixed wikidata bug by reversing the *polarity*. This type of ASCII syntax is what started wikisyntax. It could be interesting (but I have no idea if it is feasible) if git automatically recognized elements in a commit text and colorized them on the terminal screen (or maybe bolded them if the screen renders using TrueType fonts). This way, if you have written wikidata many times, you will quickly spot a problem if the commit renders for you as "fixed wkidata bug by reversing the polarity" and wikidata is not bolded/colored differently. An alternative would be for a script/program to extract keywords and present them to you, so if you notice the commit lacks the label wikidata, there's something wrong. Many times people don't read what they are writing; only when they are shown back what they have written do they notice that there's something wrong in it. Many Internet forums recognize this by allowing people to edit/fix posts in the first 5 minutes. Anyway, half the battle is getting people to care about this, so they put a bit of effort into writing without many spelling errors. Perhaps it is harder if you have people from different cultures, and developers who are not native English speakers. If people don't care, they will always make this type of mistake, and if the mistakes change keywords like wikidata or function names, it may make commits harder to search for by keyword. It is true this thing is so minor it's not even worth a thread on the mailing list. I am posting this here because it is a fun problem. 
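The keyword-extraction idea could be prototyped in a few lines with difflib from the Python standard library. Everything here is an illustrative assumption (the keyword list, the 0.8 cutoff); it is a sketch of the idea, not an existing tool:

```python
import difflib

# Known project keywords a reviewer would expect to see spelled exactly.
# This list is a made-up example.
KEYWORDS = ["wikidata", "parser", "resourceloader"]

def suspect_typos(message: str, cutoff: float = 0.8) -> list[str]:
    """Flag words that are close to, but not exactly, a known keyword."""
    words = {w.strip(".,*").lower() for w in message.split()}
    flagged = []
    for w in words:
        if w in KEYWORDS:
            continue  # spelled correctly, nothing to report
        close = difflib.get_close_matches(w, KEYWORDS, n=1, cutoff=cutoff)
        if close:
            flagged.append(f"{w} (did you mean {close[0]}?)")
    return flagged

print(suspect_typos("fixed wkidata bug by reversing the polarity"))
# ['wkidata (did you mean wikidata?)']
```

A git commit-msg hook could run such a check locally and warn before the typo ever reaches Gerrit.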
--- The Swedish Chef
Re: [Wikitech-l] Tin/Interbusiness and blacklists
2013/1/12 Cristian Consonni kikkocrist...@gmail.com: Hi there, (Not sure if this is the right place, but I can't think of anywhere else.) This morning a bunch of users were automatically removed from the WikiIt-admins-l mailing list due to a blacklisting operated by Tin/Interbusiness on 208.80.154.4. At the moment it seems that people using Tin/Interbusiness services (virgilio.it / alice.it / tin.it domains) cannot receive any mail from WMF production servers. (Still not sure if this is the right place; if you know where I should send this, please tell me.)

Wikimedia Italia contacted Telecom yesterday (luckily WMI has a couple of projects in partnership with Telecom about Wikipedia) and this is what we were told:

++ Hello, Currently there's should be no blocks on the domains tin.it/alice.it=aliceposta.it che sono nel dominio di gestione Telecom Italia IT. look at the following links: http://www.spamhaus.org/pbl http://ipremoval.sms.symantec.com/lookup http://psbl.org/ http://cbl.abuseat.org/lookup.cgi After your ip has been delisted a few hours may occours in order to have our systems updated. Regards ++

Cristian
Re: [Wikitech-l] Tin/Interbusiness and blacklists
2013/1/16 Cristian Consonni kikkocrist...@gmail.com: Currently there's should be no blocks on the domains tin.it/alice.it=aliceposta.it che sono nel dominio di gestione Telecom Italia IT. (sorry, forgot to translate completely... yes, I'm a genius =P)... anyways: Currently there's should be no blocks on the domains tin.it/alice.it=aliceposta.it which are owned and managed by Telecom Italia IT Cristian
Re: [Wikitech-l] Disambiguation features: Do they belong in core or in an extension?
Hey, From a technical point of view it's nicer to have it as an extension, as it prevents feature bloat in core. OTOH, the lack of an extension distribution mechanism is an argument against it. Cheers -- Jeroen De Dauw http://www.bn2vs.com Don't panic. Don't be evil. --
Re: [Wikitech-l] The ultimate bikeshed: typos in commit messages
On Wed, Jan 16, 2013 at 7:09 AM, Tim Starling tstarl...@wikimedia.org wrote: I am also concerned about demotivating people.

The motivation factor works for both positions. I felt a little demotivated after having read all these "we don't care about typos" positions at the start of the thread, and felt really relieved to read that this is not the consensus opinion. Especially as, in September and October, when I wasn't at ease with my -1 typo reviews, I mailed four people whose changes I had reviewed that way to ask them what they preferred: a -1 review, or a new patchset fixing the typo directly (I wasn't at ease with the idea of resubmitting a patchset without the author's prior consent either). Three of them didn't bother to answer me at all (for the anecdote, the fourth preferred a -1 review). So we leave the realm of "the workflow and the UI don't allow easy correction" and enter the realm of "we don't care". On a side matter, typos could be a symptom of another issue: how important is a commit message? Should it be a formality to expedite in 30 seconds, or an informative, valuable text describing the change, crafted with care and proofread before submission or merge? What is the goal of a commit message, and its value as a changelog, communication tool and change documentation? In the end, direct commit message editing in the UI will offer an acceptable solution: corrections will be more trivial than "find my branch again, amend the commit, resubmit as a new patchset". Meanwhile, we can put up with a few last weeks of extra spelling-review work (as 0 or -1) pending the Gerrit migration. And if you don't want to fix your commits yourselves, please create a list stating so on mediawiki.org; that will be a clear message for the code reviewers: "If you see a typo, would you be so kind as to fix it yourself and submit a new patchset?". 
-- Best Regards, Sébastien Santoro aka Dereckson http://www.dereckson.be/
Re: [Wikitech-l] The ultimate bikeshed: typos in commit messages
On Wed, Jan 16, 2013 at 8:06 AM, Sébastien Santoro dereck...@espace-win.org wrote: At the end, the direct commit message edit in the UI will offer an acceptable solution: corrections will be more trivial than found again my branch, amend the commit, resubmit as a new patchset. Meanwhile, we can suffer the last weeks of extra work for review spelling (as 0 or -1) purpose pending the Gerrit migration.

And really, this is just a few weeks out. I've been testing the upgrade, and I'm confident we won't see any real problems. I'm planning to do it as soon as the eqiad migration is complete next week.

And if you don't want to fix yourselves your commit, please create a list stating so on mediawiki.org, that will be a clear message for the code reviewers: If you see a typo, would you be so kind as to fix it yourself and submit a new patchset?.

Please no. We really don't want to encourage people to be lazy because they think people will clean up after them. Really, I think the whole thread is moot with the pending upgrade. Typos should always be fixed before merging (I think we all agree?), and the new ability to fix these from the UI means we won't need to mark people as -1 to do so. -Chad
[Wikitech-l] Jenkins tests exiting with error code 139
Hello, For some weeks now, the PHPUnit tests for MediaWiki core have been randomly failing with an 'exit code 139' error. I did not bother investigating it because I had no clue what could cause it, and it happened mostly in the 1.21wmf7 branch, not master. The mighty Tim Starling found out the root cause, which is a bug in PHP. Tim applied a workaround to our PHPUnit installation to avoid the segfaulting code path. If you have any patchset blocked by the exit code 139, you can reattempt a merge of it by revoting CR+2 or sending a rebased patchset. Our bug (with MediaWiki backtraces): https://bugzilla.wikimedia.org/show_bug.cgi?id=43972 Upstream bug: https://bugs.php.net/bug.php?id=63055 cheers, -- Antoine hashar Musso
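For readers wondering where the odd number comes from: POSIX shells report a child killed by a signal as 128 plus the signal number, and SIGSEGV is signal 11, so 139 is the shell's way of saying "segfault". A quick demonstration (Python, on a POSIX system):

```python
import subprocess

# 128 + SIGSEGV(11) is the exit code a shell reports for a segfault.
SIGSEGV = 11
print(128 + SIGSEGV)  # 139

# Force a death by SIGSEGV and observe the status. Python's subprocess
# reports death-by-signal as a negative returncode instead.
proc = subprocess.run(["sh", "-c", "kill -SEGV $$"])
print(proc.returncode)  # -11
```

So "exit code 139" in a Jenkins log is not an application error code at all; it means the PHP interpreter itself crashed.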
Re: [Wikitech-l] some issues with missing Files
Hi again! To put it another way: what is the right way of exporting/importing a bunch of files from one MediaWiki to another? Cheers, Yury Katkov, WikiVote On Mon, Dec 24, 2012 at 6:38 PM, Yury Katkov katkov.ju...@gmail.com wrote: Hi everyone! I want to move my wiki from one server (A) to another (B). On server A I have a lot of files. I don't need them on server B, but I need all my wiki pages. What I've done is: I removed the 'images' directory and ran: php maintenance/rebuildall.php Unfortunately the wiki still thinks that all the files are where they are supposed to be. This is clearly not true, since the images directory is empty. Questions: 1) how do I properly delete the files permanently? 2) is there any script to make the files and their descriptions in the MediaWiki DB consistent? Cheers, - Yury Katkov, WikiVote
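One route that is often suggested uses the maintenance scripts bundled with MediaWiki core. A sketch under assumptions: the paths are placeholders, and the exact options of dumpUploads.php / importImages.php vary by version, so check php maintenance/importImages.php --help first:

```shell
# On the source wiki: dumpUploads.php prints the filesystem path of
# every file the wiki's database knows about; hand that list to tar.
cd /path/to/source-wiki
php maintenance/dumpUploads.php > /tmp/filelist.txt
tar -cf /tmp/uploads.tar -T /tmp/filelist.txt

# On the target wiki: unpack the archive and let importImages.php
# register the files in the database.
cd /path/to/target-wiki
mkdir -p /tmp/uploads && tar -xf /tmp/uploads.tar -C /tmp/uploads
php maintenance/importImages.php --search-recursively /tmp/uploads
```

Page text travels separately: Special:Export or dumpBackup.php on wiki A, then importDump.php on wiki B.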
Re: [Wikitech-l] Jenkins tests exiting with error code 139
On Wed, Jan 16, 2013 at 8:27 AM, Antoine Musso hashar+...@free.fr wrote: Hello, For some weeks now, the PHPUnit tests for MediaWiki core have been randomly failing with an 'exit code 139' error. I did not bother investigating it caused I had no clue about what would cause it and it happened mostly in the 1.21wmf7 branch, not master. The mighty Tim Starling found out the root cause which is due to a bug in PHP. Tim applied a workaround to our PHPUnit installation to avoid the segfaulting code path. If you have any patchset blocked by the exit code 139, you can reattempt a merge of it by revoting CR+2 or sending a rebased patchset. Our bug (with MediaWiki backtraces) https://bugzilla.wikimedia.org/show_bug.cgi?id=43972 Upstream bug: https://bugs.php.net/bug.php?id=63055 Three cheers for Tim and Antoine! -Chad
Re: [Wikitech-l] The ultimate bikeshed: typos in commit messages
Thanks Tim for pitching in. On 16.01.2013 07:09, Tim Starling wrote: Giving a change -1 means that you are asking the developer to take orders from you, under threat of having their work ignored forever. A -1 status can cause a change to be ignored by other reviewers, regardless of its merit. If the developer can't lower their sense of pride sufficiently to allow them to engage with nitpickers, then the change might be ignored by all concerned for many months.

That's exactly the problem. However, if you give minor negative feedback with +0, the change stays bold in your review requests list, as if you haven't reviewed it at all. I've tried giving -1 with a comment to the effect of "please merge this immediately regardless of my nitpicking above", but IIRC the comment was ignored. Yes, mentioning a typo in a +0 comment would be perfectly fine with me. I generally use +0 for nitpicks, i.e. anything that doesn't really hurt. Nitpicks with a -1 are really annoying. Anyway: editing in the UI makes the whole argument moot. -- daniel
Re: [Wikitech-l] The ultimate bikeshed: typos in commit messages
Le 16/01/13 14:06, Sébastien Santoro a écrit : Should it be a formality to expedite in 30 seconds or an informative valuable text describing the change, crafted with care and proofread before submission or merge?

To me the commit summary introduces the patch to the reviewers and should also serve as a base for later debugging. It is always good to explain there the problem you attempt to solve or the feature being implemented, then explain your design choices, and finally explain what you have tested / what people should be careful about. Of course I do not apply that principle to myself :-] Some of the repositories on which I am the sole contributor and self-reviewing have very short commit messages. But I do try to add extensive comments in the code.

A random example: we had a trivial configuration issue in our php.ini file. The apc.shm_size parameter was missing the unit suffix on the amount of memory. The patch is:

files/php/apc.ini
-apc.shm_size=240
+apc.shm_size=240M

I could have written: 'added M to apc mem size'. Instead I wrote:

php.ini apc.shm_size requires

The apc.shm_size is an integer determining the size of each memory segment in MB. When missing the M suffix, PHP sends us a warning in syslog:

php: PHP Warning: PHP Startup: apc.shm_size now uses M/G suffixes, please update your ini files in Unknown on line 0

Upstream documentation: http://php.net/manual/en/apc.configuration.php#ini.apc.shm-size

Having a change like 'fix preg call' is a -1 for me. I don't even bother reading the code in such a case. Yeah, there are probably typos above, but at least that:
- explains the change (I add M to apc.shm_size)
- exposes the issue (there is an annoying PHP warning)
- references doc for easy lookup by the reviewer

Done. The PHP code base usually has very short summaries because they simply reference the bug report. Example: Bug #62593 Added test for change

The git project usually has very nice commit messages. Have a look at http://repo.or.cz/w/git.git/commit/b72a1904aefb9e27014a11790f3af4dc90b38e8d That one optimizes a code path, and the summary shows command-line output from before and after the patch :-]

So yeah, we need extensive commit messages. If someone doesn't bother fixing a typo (which is like 1 minute of work), I will not bother doing it for them nor approving their change. My time is better invested in reviewing another patch. -- Antoine hashar Musso
Re: [Wikitech-l] Disambiguation features: Do they belong in core or in an extension?
Le 16/01/13 02:58, Ryan Kaldari wrote: Back in December, there was discussion about needing a better method of identifying disambiguation pages programmatically (bug 6754). I wrote some core code to accomplish this, but was informed that disambiguation functions should reside in extensions rather than in core, per bug 35981.

Hello Ryan, First of all, thanks a ton for implementing a feature that lets us mark disambiguation pages. IIRC we used to track them by finding article pages having a template which in turn links to another page. That is a bit crazy :-] Since you did the work to make it an extension, I would keep it an extension. The less stuff we have in core, the happier I will be! As someone else said, we can ship the extension in the MediaWiki tarballs just like we do for other important extensions :-] -- Antoine hashar Musso
[Wikitech-l] Fwd: Problem to get an article's content in MW 1.20.2
Hi Platonides 2013/1/15 Platonides platoni...@gmail.com: I tried to get the content via getArticleID() ... $titleObj = Title::newFromText( 'Existing page' ); $articleID = $titleObj->getArticleID(); Article::newFromID( $articleID )->fetchContent(); etc. ... but it returns $articleID = 0 although the page exists. With MW 1.18 this approach worked fine, but after the upgrade to MW 1.20.2 it does not any more.

It should be working, and it works for me on 1.20.2. Can you provide more details on that $title->getArticleID(); which is not working?

On http://offene-naturfuehrer.de/web/Spezial:MobileKeyV1 I want to generate a MobileKey for Lamium (Deutschland) (the page exists) and the PHP code wants to get the content of the page Unterlippe (Lamiaceae) (the page exists too), but the PHP code above stops at the step of retrieving the articleID. The strange thing is, when I want to generate a MobileKey for Lamium (Deutschland) and just print out (1) Title::newFromText( 'Unterlippe (Lamiaceae)' ) and (2) Title::newFromText( 'Lamium (Deutschland)' ) ... I get an articleID only for (2), not for (1).

Andreas

--- printout of title object for page (1) Unterlippe (Lamiaceae)

Title Object
(
    [mTextform] => Unterlippe (Lamiaceae)
    [mUrlform] => Unterlippe_(Lamiaceae)
    [mDbkeyform] => Unterlippe_(Lamiaceae)
    [mUserCaseDBKey] => Unterlippe_(Lamiaceae)
    [mNamespace] => 0
    [mInterwiki] =>
    [mFragment] =>
    [mArticleID] => 0
    [mLatestID] =>
    [mEstimateRevisions:Title:private] =>
    [mRestrictions] => Array ( )
    [mOldRestrictions] =>
    [mCascadeRestriction] =>
    [mCascadingRestrictions] =>
    [mRestrictionsExpiry] => Array ( )
    [mHasCascadingRestrictions] =>
    [mCascadeSources] =>
    [mRestrictionsLoaded] =>
    [mPrefixedText] =>
    [mTitleProtection] =>
    [mDefaultNamespace] => 0
    [mWatched] =>
    [mLength] => -1
    [mRedirect] =>
    [mNotificationTimestamp] => Array ( )
    [mHasSubpage] =>
)

--- printout of title object for page (2) Lamium (Deutschland)

Title Object
(
    [mTextform] => Lamium (Deutschland)
    [mUrlform] => Lamium_(Deutschland)
    [mDbkeyform] => Lamium_(Deutschland)
    [mUserCaseDBKey] => Lamium_(Deutschland)
    [mNamespace] => 0
    [mInterwiki] =>
    [mFragment] =>
    [mArticleID] => 36
    [mLatestID] =>
    [mEstimateRevisions:Title:private] =>
    [mRestrictions] => Array ( )
    [mOldRestrictions] =>
    [mCascadeRestriction] =>
    [mCascadingRestrictions] =>
    [mRestrictionsExpiry] => Array ( )
    [mHasCascadingRestrictions] =>
    [mCascadeSources] =>
    [mRestrictionsLoaded] =>
    [mPrefixedText] =>
    [mTitleProtection] =>
    [mDefaultNamespace] => 0
    [mWatched] =>
    [mLength] => -1
    [mRedirect] =>
    [mNotificationTimestamp] => Array ( )
    [mHasSubpage] =>
)
Re: [Wikitech-l] Problem to get an article's content in MW 1.20.2
Hi Platonides (sorry, I posted from a wrong email address; I am trying again with a proper subject line to get the thread line correct on http://news.gmane.org/gmane.science.linguistics.wikipedia.technical) 2013/1/15 Platonides platoni...@gmail.com: I tried to get the content via getArticleID() ... $titleObj = Title::newFromText( 'Existing page' ); $articleID = $titleObj->getArticleID(); Article::newFromID( $articleID )->fetchContent(); etc. ... but it returns $articleID = 0 although the page exists. With MW 1.18 this approach worked fine, but after the upgrade to MW 1.20.2 it does not any more.

It should be working, and it works for me on 1.20.2. Can you provide more details on that $title->getArticleID(); which is not working?

On http://offene-naturfuehrer.de/web/Spezial:MobileKeyV1 I want to generate a MobileKey for Lamium (Deutschland) (the page exists) and the PHP code wants to get the content of the page Unterlippe (Lamiaceae) (the page exists too), but the PHP code above stops at the step of retrieving the articleID. The strange thing is, when I want to generate a MobileKey for Lamium (Deutschland) and just print out (1) Title::newFromText( 'Unterlippe (Lamiaceae)' ) and (2) Title::newFromText( 'Lamium (Deutschland)' ) ... I get an articleID only for (2), not for (1). WikiPage::factory( $titleObj ) prints the same articleID data as Title::newFromText( ... ). 
Andreas
Re: [Wikitech-l] Disambiguation features: Do they belong in core or in an extension?
On Wed, Jan 16, 2013 at 2:11 AM, Jon Robson jdlrob...@gmail.com wrote: To me disambiguation seems like a common problem of wikis and thus should be a core feature. On a wiki about people, people share the same name. On a wiki about cities, cities share the same name. Etc. etc., you get the idea.

Agreed. Also, it only makes sense for MediaWiki to natively provide proper support for disambiguations, the same way there is support for redirects. Furthermore, I'd like to underline what Ryan said in his original message, since several people seem to be ignoring it and using code bloat as an argument for using an extension: The code is pretty clean and lightweight, so it wouldn't increase the footprint of core MediaWiki (it would actually decrease the existing footprint slightly since it replaces more hacky existing core code). So *core bloat isn't really an issue*.
Re: [Wikitech-l] WikiEditor-like functionality in the VisualEditor age
And finally, many wikis have built their own custom features: ProofreadPage on Wikisource is of particular note here, e.g. https://en.wikisource.org/w/index.php?title=Page:United_States_Statutes_at_Large_Volume_43_Part_2.djvu/15&action=edit -Chris
Re: [Wikitech-l] Welcome, Munagala Ramanath (Ram)
On 01/15/2013 02:08 PM, Rob Lanphier wrote: Please join me in welcoming Ram! Ram, welcome to Wikimedia. I look forward to working with you. Since you have experience teaching computer science, we'll have to talk about how we can be teaching and training our tech community better! -- Sumana Harihareswara Engineering Community Manager Wikimedia Foundation
Re: [Wikitech-l] Problem to get an article's content in MW 1.20.2
Hi, One of Tyler Romeo's suggested approaches is to use Article::newFromTitle( $titleObj ) ... but it needs a second argument of type IContextSource. I do not know how to get or instantiate one yet, so at the moment this does not retrieve the page content either, because of the missing IContextSource. Can somebody help? Maybe Article::newFromTitle( $titleObj, ... ) with some IContextSource would work. Do I need that IContextSource at all? I read at Manual:Article.php to use the WikiPage class instead, so maybe at this point I don't have to use IContextSource(?). WikiPage::factory( $titleObj ) prints the same Title Object structure as Title::newFromText( 'Existing page but not existing for Title object class' ); Might it have to do with a database update, or some of the maintenance scripts? Maybe the data in the database are not entirely correct after upgrading from MW 1.18 to MW 1.20.2? How do I ensure that the database has correct page data? When I run php ./update.php --conf /path/to/LocalSettings.php all is fine ... any further ideas? Thanks, Andreas
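For what it's worth, a sketch of the IContextSource route, under assumptions (untested against this particular wiki; RequestContext::getMain() has been available since MW 1.18 and returns an IContextSource, and Article::fetchContent() still exists in 1.20):

```php
$title = Title::newFromText( 'Unterlippe (Lamiaceae)' );
if ( $title !== null && $title->exists() ) {
    // RequestContext::getMain() supplies the IContextSource that
    // Article::newFromTitle() expects as its second argument.
    $article = Article::newFromTitle( $title, RequestContext::getMain() );
    $text = $article->fetchContent();
} else {
    // getArticleID() == 0 simply means no page row matched the title.
    // Before suspecting the database, check the page name for stray
    // whitespace or lookalike characters (e.g. a non-breaking space).
}
```

Since Title::exists() just checks getArticleID() != 0, the IContextSource is unlikely to be the root cause here; the lookup itself is failing for page (1).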
Re: [Wikitech-l] Disable module mediawiki.searchSuggest in MW 1.20+
On Jan 15, 2013, at 9:18 AM, Robert Vogel vo...@hallowelt.biz wrote: Hi everybody! I'm writing an extension that needs to replace/disable the built-in suggestion feature of the search box.

In case there may be a better approach to solving your problem: what is it you're looking to add to this feature? Most of this module is pretty trivial and only wraps around other features that have elaborate hooks and toggles (such as PrefixIndex and the API). -- Krinkle
Re: [Wikitech-l] Disambiguation features: Do they belong in core or in an extension?
Nicolas Vervelle nvervelle at gmail.com writes: My own preference would be to have this in the core for several reasons. [...]

Yes, the core code already handles disambiguation pages specially in some ways (Special:Disambiguations, MediaWiki:Disambiguationspage). But it treats them as exceptional cases - a bit of a hack.

Proposal: A fundamentally more robust and flexible way to handle disambiguation pages would be to move them all into their own namespace. For example, http://en.wikipedia.org/wiki/Bow and http://en.wikipedia.org/wiki/Bow_(disambiguation) could both redirect to http://en.wikipedia.org/wiki/Disambiguation:Bow This would make them much more consistent with ordinary wiki-page functions and internationalization, as well as making it easy to programmatically identify disambiguation pages without affecting the database schema. Though there would initially be some upheaval as pages were moved in bulk, the result would be more stable.

One drawback is that dab pages from all namespaces would end up in the new Disambiguation namespace. So, http://en.wikipedia.org/wiki/Portal:Football would redirect to http://en.wikipedia.org/wiki/Disambiguation:Portal:Football That is not a major problem, as it is easy to identify pages beginning with Disambiguation:Portal: etc., though it makes it harder to list mainspace dab pages (they would have to be identified by eliminating valid namespace prefixes from the pagename). A similar cross-namespace shadow hierarchy already exists at Template:Editnotices/Page/ for edit notices. After the initial bulk creation, a bot would need to check for new dab pages that needed moving into the dab namespace, and for new pages in the dab namespace that lacked a redirect in the relevant non-dab namespace. 
Alternatively, the need for separate redirect pages could be obviated if MediaWiki automatically redirected browsers when a corresponding dab-namespace page exists (but this would be a departure from the existing practice of having all redirects as editable wiki pages). Individual wikis would be free to opt out of the new approach.
[Wikitech-l] Fwd: [Wikimedia-l] [Wikimedia Announcements] Announcing the Individual Engagement Grants program
Hi, next week I will have a casual chat with Siko about the new Wikimedia Individual Engagement Grants and how MediaWiki contributors could theoretically benefit from them. If you have specific questions or feedback there is nothing stopping you from contacting her directly, but maybe it's more useful to start sharing here. This way I can go with more consolidated questions and feedback. Original Message Subject: [Wikimedia-l] [Wikimedia Announcements] Announcing the Individual Engagement Grants program Date: Tue, 15 Jan 2013 18:03:42 -0800 From: Siko Bouterse sboute...@wikimedia.org Reply-To: wikimedi...@lists.wikimedia.org To: wikimediaannounc...@lists.wikimedia.org Hi all, I'm pleased to announce the launch of a new grantmaking program at the Wikimedia Foundation: Individual Engagement Grants. These grants will support Wikimedians as individuals or small teams to complete projects that benefit the Wikimedia movement, lead to online impact, and serve the mission, community, and strategic priorities. This new program is intended to complement WMF's other grantmaking programs as well as the grants that chapters and affiliate organizations provide. The first round of proposals will be accepted from now until 15 February 2013. We're also seeking committee members to help select the first round of grantees. Please help spread the word to other lists! To get involved, share your thoughts, submit a proposal, or join the committee: https://meta.wikimedia.org/wiki/Grants:IEG For more information on all of WMF's grantmaking programs: https://meta.wikimedia.org/wiki/Grants:Start Best wishes, Siko -- Siko Bouterse Head of Individual Engagement Grants Wikimedia Foundation, Inc. sboute...@wikimedia.org Imagine a world in which every single human being can freely share in the sum of all knowledge. 
Donate https://donate.wikimedia.org or click the edit button today, and help us make it a reality! ___ Please note: all replies sent to this mailing list will be immediately directed to Wikimedia-l, the public mailing list of the Wikimedia community. For more information about Wikimedia-l: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l ___ WikimediaAnnounce-l mailing list wikimediaannounc...@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikimediaannounce-l ___ Wikimedia-l mailing list wikimedi...@lists.wikimedia.org Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Problem to get an article's content in MW 1.20.2
You could try RequestContext::getMain() to get the context source object. Alex Monk On 16/01/13 16:56, Andreas Plank wrote: Hi, One of Tyler Romeo's suggested approaches is to use Article::newFromTitle($titleObj) ... but it does need a second argument of type IContextSource. I do not know how to get it or how to instantiate it yet. At the moment it does not retrieve the page content either, because of the missing IContextSource ... ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] The ultimate bikeshed: typos in commit messages
- Original Message - From: . oscar.vi...@gmail.com It could be interesting (though I have no idea if it is feasible) if git automatically recognized elements in a commit message and colorized them on the terminal screen (or perhaps bolded them if the screen renders using TrueType fonts). This way, if you have written wikidata many times, you will quickly spot a problem if the commit renders as fixed wkidata bug by reversing the polarity and the misspelled word is not bolded/colored differently. An alternative would be for this script/program to extract keywords and present them to you, so that if you notice the commit lacks the label wikidata, there's something wrong. Talk to the people over on the derivative wikitext list, spun off to contain discussions relevant to creating a replacement MWText parser (there are, I think, 5 or 6 projects in varying degrees of activity; though the list itself is pretty quiet). Cheers, -- jra -- Jay R. Ashworth Baylink j...@baylink.com Designer The Things I Think RFC 2100 Ashworth Associates http://baylink.pitas.com 2000 Land Rover DII St Petersburg FL USA #natog +1 727 647 1274 ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
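Oscar's keyword idea can be prototyped outside git entirely. A toy sketch in Python using only difflib from the standard library; the keyword list is an assumption, and real tooling would more likely hang off a commit-msg hook:

```python
import difflib

# Assumed set of project terms a team wants spelled consistently.
KEYWORDS = {"wikidata", "gerrit", "lua"}

def suspect_typos(message):
    """Return (token, keyword) pairs where a token looks like a near-miss
    of a known keyword, e.g. 'wkidata' when 'wikidata' is expected."""
    suspects = []
    for raw in message.lower().split():
        token = raw.strip(".,;:!?\"'()")
        if not token or token in KEYWORDS:
            continue
        close = difflib.get_close_matches(token, KEYWORDS, n=1, cutoff=0.8)
        if close:
            suspects.append((token, close[0]))
    return suspects
```

This is the "extract keywords and present them to you" variant; the colorizing variant would use the same matching step and just wrap hits in terminal escape codes.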
Re: [Wikitech-l] The ultimate bikeshed: typos in commit messages
This is a ridiculous conversation and I can't believe it now spans 20+ messages. ___Don't___ -1 a patchset for a typo. The result of that is far more catastrophic. We put off volunteers and people end up wasting valuable time doing rebases due to the time taken to find another code review. Code review is hard enough as it is. Make a ___suggestion___ about fixing the typo so the person who ends up merging it can do it themselves if necessary. +1 or -1 depending on the __code quality__ alone. Can we now return our attention to engaging in more interesting and useful conversations about the future of MediaWiki such as [Wikitech-l] Disambiguation features: Do they belong in core or in an extension? On Tue, Jan 15, 2013 at 10:09 PM, Tim Starling tstarl...@wikimedia.org wrote: On 16/01/13 01:57, Daniel Kinzler wrote: On 15.01.2013 15:06, Tyler Romeo wrote: I agree with Antoine. Commit messages are part of the permanent history of this project. From now until MediaWiki doesn't exist anymore, anybody can come and look at the change history and the commit messages that go with them. Now you might ask what the possibility is of somebody ever coming across a single commit message that has a typo in it, but when you're using git-blame, git-bisect, or other similar tools, it's very possible. And then they see a typo. So what? If you look through a mailing list archive or Wikipedia edit comments, you will also see typos. I'm much more concerned about scaring away new contributors with such nitpicking. I am also concerned about demotivating people. Giving a change -1 means that you are asking the developer to take orders from you, under threat of having their work ignored forever. A -1 status can cause a change to be ignored by other reviewers, regardless of its merit. If the developer can't lower their sense of pride sufficiently to allow them to engage with nitpickers, then the change might be ignored by all concerned for many months. 
However, if you give minor negative feedback with +0, the change stays bold in your review requests list, as if you haven't reviewed it at all. I've tried giving -1 with a comment to the effect of please merge this immediately regardless of my nitpicking above, but IIRC the comment was ignored. I think people who give -1 should be aware of the potential roadblock they are creating. And I would like to see a feature in Gerrit to unbold +0 reviews. Under Subversion, my policy as a reviewer was to never ask the committer to fix a typo in a comment, since committing the fix myself was easier than telling them what to fix, and doing so avoided offence. Under Gerrit, it's more difficult to submit amendments, and I hate multi-author patchsets anyway, so negative feedback seems more attractive. Maybe the answer is better scripting for amendments and dependent commits. -- Tim Starling ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l -- Jon Robson http://jonrobson.me.uk @rakugojon ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] The ultimate bikeshed: typos in commit messages
Op 15-1-2013 12:44, Jeroen De Dauw schreef: bla IMHO all commit messages should be green. Maarten *hides* ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Željko Filipin presenting at FOSDEM
On Tue, Jan 15, 2013 at 9:19 AM, Sébastien Santoro dereck...@espace-win.org wrote: Congratulations for your talk. It's nice to see a MediaWiki involvement in this conference. Thanks, I hope to see you there. :) This is not the only MediaWiki talk, I think there will be two more[1]. Željko -- [1] http://www.mediawiki.org/wiki/Events/FOSDEM ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Željko Filipin presenting at FOSDEM
If you have any questions about presenting at FOSDEM then let us know. Many of us have presented about Wikimedia related projects at FOSDEM before. --tomasz On Wed, Jan 16, 2013 at 10:32 AM, Željko Filipin zfili...@wikimedia.org wrote: On Tue, Jan 15, 2013 at 9:19 AM, Sébastien Santoro dereck...@espace-win.org wrote: Congratulations for your talk. It's nice to see a MediaWiki involvement in this conference. Thanks, I hope to see you there. :) This is not the only MediaWiki talk, I think there will be two more[1]. Željko -- [1] http://www.mediawiki.org/wiki/Events/FOSDEM ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] The ultimate bikeshed: typos in commit messages
On Wed, Jan 16, 2013 at 7:26 PM, Jon Robson jdlrob...@gmail.com wrote: This is a ridiculous conversation and I can't believe it now spans 20+ messages. Apparently you don't care, but other people do care. Please do not disregard other people's opinions because you believe yours is correct. To keep it in the style of this thread; attitudes like these do demotivate _me_. Also, I fully agree with Maarten's last comment. Bryan ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Željko Filipin presenting at FOSDEM
On Wed, Jan 16, 2013 at 7:35 PM, Tomasz Finc tf...@wikimedia.org wrote: If you have any questions about presenting at FOSDEM then let us know. Many of us have presented about Wikimedia related projects at FOSDEM before. I gave a similar talk (about a test automation project that I did for a client before I started working for WMF) at a few conferences and meetups and wrote a blog post with screencast[1], so I pretty much know what I will be talking about, I just need to adjust it for the current project. I plan to write a blog post and record another screencast while preparing the talk. I can post the link here when it is done, so interested people can provide feedback. Željko -- [1] http://filipin.eu/test-automation-at-homeswap-com/ ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] The ultimate bikeshed: typos in commit messages
I don't mind getting dinged for typos. If I'm being sloppy it's fair to point it out. However, I think the social contract should be that after I fix the typos you requested then you owe me a real code review where you look at the merits of the code. Code review is an awesomely useful but time consuming thing to provide. Patrolling for typos is not. Put it this way, if I concede and agree that the bikeshed can be green, the people who spent three hours arguing for green should feel obligated to turn up to the working bee to help with the painting. Luke Welling On Wed, Jan 16, 2013 at 10:40 AM, Bryan Tong Minh bryan.tongm...@gmail.com wrote: On Wed, Jan 16, 2013 at 7:26 PM, Jon Robson jdlrob...@gmail.com wrote: This is a ridiculous conversation and I can't believe it now spans 20+ messages. Apparently you don't care, but other people do care. Please do not disregard other people's opinions because you believe yours is correct. To keep it in the style of this thread; attitudes like these do demotivate _me_. Also, I fully agree with Maarten's last comment. Bryan ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Fwd: [Wikimedia-l] [Wikimedia Announcements] Announcing the Individual Engagement Grants program
Interesting. Could this possibly work like GSoC but aimed at experienced devs instead of newbies? For example, if I (in theory) had an idea for a new feature or extension that could have a large impact on how Wikimedians use MediaWiki, could I potentially get a grant to work on such an idea over the summer? -bawolff On 2013-01-16 1:48 PM, Quim Gil q...@wikimedia.org wrote: Hi, next week I will have a casual chat with Siko about the new Wikimedia Individual Engagement Grants and how MediaWiki contributors could theoretically benefit from them. If you have specific questions or feedback there is nothing stopping you from contacting her directly, but maybe it's more useful to start sharing here. This way I can go with more consolidated questions and feedback. Original Message Subject: [Wikimedia-l] [Wikimedia Announcements] Announcing the Individual Engagement Grants program Date: Tue, 15 Jan 2013 18:03:42 -0800 From: Siko Bouterse sboute...@wikimedia.org Reply-To: wikimedi...@lists.wikimedia.org To: wikimediaannounc...@lists.wikimedia.org Hi all, I’m pleased to announce the launch of a new grantmaking program at the Wikimedia Foundation: Individual Engagement Grants. These grants will support Wikimedians as individuals or small teams to complete projects that benefit the Wikimedia movement, lead to online impact, and serve the mission, community, and strategic priorities. This new program is intended to complement WMF’s other grantmaking programs as well as the grants that chapters and affiliate organizations provide. The first round of proposals will be accepted from now until 15 February 2013. We’re also seeking committee members to help select the first round of grantees. Please help spread the word to other lists! 
To get involved, share your thoughts, submit a proposal, or join the committee: https://meta.wikimedia.org/wiki/Grants:IEG For more information on all of WMF’s grantmaking programs: https://meta.wikimedia.org/wiki/Grants:Start Best wishes, Siko -- Siko Bouterse Head of Individual Engagement Grants Wikimedia Foundation, Inc. sboute...@wikimedia.org Imagine a world in which every single human being can freely share in the sum of all knowledge. Donate https://donate.wikimedia.org or click the edit button today, and help us make it a reality! ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Problem to get an article's content in MW 1.20.2
2013/1/16 Krenair kren...@gmail.com: You could try RequestContext::getMain() to get the context source object. Alex Monk On 16/01/13 16:56, Andreas Plank wrote: I tried many functions in various classes: RequestContext WikiPage Title Article ... but it is always the same: I do not get the raw page content. I believe these functions should work but somehow they do not get the right incoming data. In an earlier post I printed the Title object and it is almost empty, I wonder about that too, might be related? Maybe it has nothing to do with these functions but with the database? However, browsing and editing those pages runs smoothly but my application does not get the related page data. I wonder too, because with MW 1.18 it did work fine. If I switch on $wgDebugDumpSql there are a lot of queries and also a page query SELECT page_id, page_namespace, page_title, page_restrictions, page_counter, page_is_redirect, page_is_new, page_random, page_touched, page_latest, page_len FROM `page` WHERE page_namespace = 'X' LIMIT N ... So it seems to be the right query from a quick look. Does anybody have an idea how I can trace and debug the data flow until my app asserts “the page does not exist”? Andreas ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
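A quick way to cross-check whether MediaWiki itself can see the page, independent of the PHP classes discussed above, is to ask api.php (action=query&prop=info). It may also be worth noting that the logged query filters only on page_namespace, with no page_title condition, which would be consistent with the almost-empty Title object Andreas mentions. A sketch of parsing such an API response in Python; fetching is left out, and the JSON shape assumed is the classic (pre-formatversion=2) one, where a missing page is listed with a "missing" flag:

```python
import json

def page_exists(api_response_text):
    """Parse an api.php action=query&prop=info JSON response and report
    whether every requested page exists. In the classic format, missing
    pages appear under negative keys with a "missing" member."""
    data = json.loads(api_response_text)
    pages = data.get("query", {}).get("pages", {})
    return bool(pages) and all("missing" not in p for p in pages.values())
```

If the API reports the page as existing while the extension's code path asserts it does not, the problem is in how the Title/Article objects are constructed rather than in the database.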
Re: [Wikitech-l] Generating documentation from JavaScript doc comments
On Fri, Jan 4, 2013 at 11:00 AM, Krinkle krinklem...@gmail.com wrote: (...) I've recently looked into a documentation generator for VisualEditor and though I haven't stopped looking yet, I'm currently pausing rather long at JSDuck. It is very well engineered and especially optimised for modern JavaScript (inheritance, mixins, event emitters, override/overload methods from another module, modules, etc.). It is also easy to extend when needing to implement custom @tags. I've set up a vanilla install for VisualEditor's code base here: http://integration.wmflabs.org/mwext-VisualEditor-docs/ Would it be possible/difficult to get something similar working for gadgets on WMF wikis? Helder ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] some issues with missing Files
It seems that I've found the way: Export (in the FROM wiki directory): php maintenance/dumpBackup.php --current --pagelist=tmp --uploads > ~/dmp Import (in the TO wiki directory): php maintenance/importDump.php ~/dmp --uploads Here tmp is the list of pages to export and ~/dmp is the dump file. On Wed, Jan 16, 2013 at 5:34 PM, Yury Katkov katkov.ju...@gmail.com wrote: Hi again! Put it another way: what is the right way of exporting/importing a bunch of files from one MediaWiki to another? Cheers, Yury Katkov, WikiVote On Mon, Dec 24, 2012 at 6:38 PM, Yury Katkov katkov.ju...@gmail.com wrote: Hi everyone! I want to move my wiki from one server (A) to another (B). On server A I have a lot of files. I don't need them on server B, but I need all my wikipages. What I've done is: I removed the 'images' directory and ran: php maintenance/rebuildall.php Unfortunately the wiki still thinks that all the files are at the places where they need to be. This is clearly not true since the images directory is empty. Questions: 1) how to properly delete the files permanently? 2) is there any script to make the files and their description in MediaWiki DB consistent? Cheers, - Yury Katkov, WikiVote ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
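For scripting larger batches, the pagelist file that dumpBackup.php's --pagelist option reads is just one page title per line. A small helper, assuming UTF-8 titles; the commands in the comment mirror the ones above, with the dump redirected to a file since dumpBackup.php prints the XML to standard output unless told otherwise:

```python
def write_pagelist(titles, path):
    """Write one page title per line, the format dumpBackup.php
    --pagelist expects."""
    with open(path, "w", encoding="utf-8") as f:
        for title in titles:
            f.write(title + "\n")

# On the source wiki:
#   php maintenance/dumpBackup.php --current --pagelist=tmp --uploads > ~/dmp
# On the destination wiki:
#   php maintenance/importDump.php --uploads ~/dmp
```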
Re: [Wikitech-l] Fwd: [Wikimedia-l] [Wikimedia Announcements] Announcing the Individual Engagement Grants program
On 01/16/2013 11:16 AM, bawolff wrote: Interesting. Could this possibly work like GSoC but aimed at experienced devs instead of newbies? For example, if I (in theory) had an idea for a new feature or extension that could have a large impact on how Wikimedians use MediaWiki, could I potentially get a grant to work on such an idea over the summer? Looking at https://meta.wikimedia.org/wiki/Grants:IEG#ieg-applying you can see the rules / principles of the game: scope of the projects, maximum budget, selection criteria... A software development project contributing to Wikimedia's mission and strategic priorities could theoretically fit, and in fact all of the conditions defined there could be equally applicable to software projects (feasibility, skills/experience, community involvement, sustainability after the funded period...) In your example the parallelism with GSoC would be restricted to the S, wouldn't it? :) It is more similar to contract work between a community and a freelancer. You would need to define a project and a budget. If you convince the community and you get the deal, you will have no mentor assigned and you will need to report on a regular basis to your customers. If you fail, you will fail more as a professional freelancer + recognized community member than as a GSoC student showing up for the first time. At least this is how I personally see it. I'm just learning about this program like anybody else. -- Quim Gil Technical Contributor Coordinator @ Wikimedia Foundation http://www.mediawiki.org/wiki/User:Qgil ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] The ultimate bikeshed: typos in commit messages
On 2013-01-16 3:07 PM, Luke Welling WMF lwell...@wikimedia.org wrote: I don't mind getting dinged for typos. If I'm being sloppy it's fair to point it out. However, I think the social contract should be that after I fix the typos you requested then you owe me a real code review where you look at the merits of the code. Code review is an awesomely useful but time consuming thing to provide. Patrolling for typos is not. Put it this way, if I concede and agree that the bikeshed can be green, the people who spent three hours arguing for green should feel obligated to turn up to the working bee to help with the painting. Luke Welling Well put. That sounds entirely fair to me. -bawolff ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Extension code review
Hi, I'm writing a new extension and would like to get some code reviewers (even one would be fantastic). Could anyone assist? Extension: http://www.mediawiki.org/wiki/Extension:NamespaceRelations Current pending reviews: https://gerrit.wikimedia.org/r/#/q/status:open+project:mediawiki/extensions/NamespaceRelations,n,z -- With respect, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Generating documentation from JavaScript doc comments
Would it be possible/difficult to get something similar working for gadgets on WMF wikis? Helder What would be really cool would be if the JS content handler code detected code doc comments and formatted them nicely. Something similar to how, back in the old days, people used to have things like /* == header == */ that would be picked up by MW and formatted as headers. But automatic and more complete. -bawolff ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
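The old convention bawolff mentions is easy to detect mechanically. A sketch in Python of the recognition step only (rendering would be a separate matter); the exact comment style accepted is an assumption:

```python
import re

# Matches comments like /* == Section == */ ; the number of '=' signs
# gives the heading level, mirroring wikitext headings.
HEADER_RE = re.compile(r"/\*\s*(={2,})\s*(.*?)\s*\1\s*\*/")

def extract_headers(js_source):
    """Return (level, title) pairs for old-style header comments."""
    return [(len(m.group(1)), m.group(2)) for m in HEADER_RE.finditer(js_source)]
```

A JSDoc-aware version would do the same scan for /** ... */ blocks and hand them to a real doc-comment parser instead of a regex.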
Re: [Wikitech-l] Fwd: Deployment freeze during Eqiad migration week (and deployments next week)
Thanks for the response, Rob (and I hope you're feeling better soon). I'm terrified about using new tools to deploy on Thursday, but super excited to be leaving scap behind \o/ Will there be git-deploy gurus on call or generally available during upcoming deployment windows? On Tue, Jan 15, 2013 at 4:51 PM, Rob Lanphier ro...@wikimedia.org wrote: Hi Arthur, Sorry for the delayed reply. Comments below: On Tue, Jan 15, 2013 at 12:07 PM, Arthur Richards aricha...@wikimedia.org wrote: I am scheduled to do a deployment (to all wikis) for MobileFrontend this Thursday afternoon: * I presume scap/sync-blah will still work for 1.21wmf7 - can you please confirm? After tomorrow's deployment, assuming it goes well, it's going to be all git-deploy, all the time, and scap/sync-file will go away. * Will I *have* to use git-deploy for 1.21wmf8 or will scap/sync-blah still work? You'll need to use git-deploy. And to reiterate my question from a few days ago re the deployment freeze next week: * During the 'deployment freeze', in the event that someone needs to deploy an emergency fix, what do we do and who do we need to communicate with? Use #wikimedia-operations. If no one is around, and it is an emergency, go ahead and make the fix, and note it in the logs. Will scap/deploy-whatever still be functional, or will git-deploy be the de-facto deployment method at that point? git-deploy is going to be it. Rob -- Arthur Richards Software Engineer, Mobile [[User:Awjrichards]] IRC: awjr +1-415-839-6885 x6687 ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] A testing bug management wheel
There are ongoing separate discussions about the best way to organize testing sprints and bug days. The more we talk and the more we delay the beginning of continuous activities, the more I believe the solution is common for both: Smaller activities and more frequent. Each one of them less ambitious but more precise. Not requiring by default the involvement of developer teams. Especially not requiring the involvement of WMF dev teams. Of course we want to work together with development teams! But just not wait for them. They tend to be busy, willing and at the same time unwilling (a problem we need to solve, but not necessarily before starting a routine of testing and bug management activities). If a dev team (WMF or not) wants to have dedicated testing and bug management activities, we will give them the top priority. Imagine this wheel: Week 1: manual testing (Chris) Week 2: fresh bugs (Andre) Week 3: browser testing (Željko) Week 4: rotten bugs (Valerie) All the better if there is some correlation between testing and bugs activities, but no problem if there is none. From the point of view of the week coordinators, this is what a cycle would look like: Week 1: decide the goal of the next activity. Weeks 2-3: preparing documentation, recruiting participants. Week 4: DIY activities start. Support via IRC and the mailing list. Group sprint on Wed/Thu; DIY activities continue. Week 4+1: Evaluation of results; goal of the next activity. During the group sprints there would be secondary DIY tasks for those happy to participate but not fond of the main goal of the week. If one group needs more than one activity per month, they can start overflowing the following week, resulting in simultaneous testing and bugs activities. Compared to the current situation, this wheel looks powerful and at the same time relatively easy to set up. There will be plenty of things to improve and fine-tune, but probably none of them will require stopping the wheel. What do you think? 
-- Quim Gil Technical Contributor Coordinator @ Wikimedia Foundation http://www.mediawiki.org/wiki/User:Qgil ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
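To make the rotation concrete, the mapping from calendar week to activity slot is a simple modulo. A toy sketch; the slot names are taken from the proposal, and tying slot 1 to week 1 is an assumption:

```python
# The four-week wheel from the proposal.
WHEEL = ["manual testing", "fresh bugs", "browser testing", "rotten bugs"]

def activity_for_week(week_number):
    """Map a 1-based week number onto the four-week wheel."""
    return WHEEL[(week_number - 1) % len(WHEEL)]
```

Overflow weeks, as described above, would simply mean two slots running simultaneously rather than a change to the rotation itself.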
[Wikitech-l] Scap says I'm not dead yet
Hi everyone, We attempted to deploy 1.21wmf8 using git-deploy[1], and ran into enough problems that we decided the best course of action is to use scap[2] today and for the foreseeable future. The new plan is to implement scap/sync-file/sync-dir for Eqiad as a temporary solution for next week and possibly as much as a few weeks beyond that, and then deploy a new-and-improved git-deploy when we have the problems we uncovered today sorted out. That means people deploying code *don't* have to learn a new system right away; scap/sync-file/sync-dir will remain the standard mechanism for the rest of this week and the rest of next at a minimum, and probably a while longer. Yay for the devil we know! :-) Ryan and Chris will still be doing the tutorial tomorrow (lunchtime in SF). Ryan tells me that the changes he's planning are under-the-hood, and that users of the system shouldn't notice anything substantially different in the workflow. Rob [1] http://wikitech.wikimedia.org/view/Git-deploy [2] http://wikitech.wikimedia.org/view/Scap ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] some issues with missing Files
Do you want the files or not? The first post sounds like you don't; in that case you'd need to truncate the image/oldimage/archive tables. This will remove all registration of the files. Clearing memcached (or whatever cache you use) might be needed too. You can copy the files over with copyFileBackend.php from the old backend to the new one. The src backend would be the default upload backend name (just dump $wgFileBackends in eval.php to find it) unless you configured it otherwise, and the dst backend would have to be added to $wgFileBackends and point to the new server somehow (such as via NFS or removable media). -- View this message in context: http://wikimedia.7.n6.nabble.com/some-issues-with-missing-Files-tp4992140p4993969.html Sent from the Wikipedia Developers mailing list archive at Nabble.com. ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] The ultimate bikeshed: typos in commit messages
On 17/01/13 00:14, Chad wrote: Really, I think the whole thread is moot with the pending upgrade. Typos should always be fixed before merging (I think we all agree?), and the new abilities to fix these from the UI means we won't need to mark people as -1 to do so. I didn't mention commit summaries in my post. My interest is in nitpicking in general. Jeroen calls arguments over commit summaries the /ultimate/ bikeshed, which they may or may not be; there are plenty of other examples which may compete for that title. Nitpicking is the minor end of the negative feedback spectrum. By definition, it has the smallest concrete payoff when advice is followed, in exchange for complex, context-dependent social costs. You should think carefully before you do it. -- Tim Starling ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] The ultimate bikeshed: typos in commit messages
On Wed, Jan 16, 2013 at 6:07 PM, Tim Starling tstarl...@wikimedia.org wrote: On 17/01/13 00:14, Chad wrote: Really, I think the whole thread is moot with the pending upgrade. Typos should always be fixed before merging (I think we all agree?), and the new abilities to fix these from the UI means we won't need to mark people as -1 to do so. I didn't mention commit summaries in my post. My interest is in nitpicking in general. Jeroen calls arguments over commit summaries the /ultimate/ bikeshed, which they may or may not be; there are plenty of other examples which may compete for that title. Indeed, I had missed that. Nitpicking is the minor end of the negative feedback spectrum. By definition, it has the smallest concrete payoff when advice is followed, in exchange for complex, context-dependent social costs. You should think carefully before you do it. *nod* I agree. And really, nitpicks in code can always be cleaned up later (heck, we did it for years with SVN). It's only nitpicks in commit messages that should always be fixed, since they're immutable after submission. And it's *that* that I think won't be a big deal anymore (since any drive-by contributor could fix a typo on the spot). -Chad ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] A testing bug management wheel
Some examples to illustrate. On 01/16/2013 02:25 PM, Quim Gil wrote: Smaller activities and more frequent. Each one of them less ambitious but more precise. Not requiring by default the involvement of developer teams. Especially not requiring the involvement of WMF dev teams. ... Imagine this wheel: Week 1: manual testing (Chris) If there are no priorities ripe for sprint at http://www.mediawiki.org/wiki/QA/Features_testing then an idea could be to help commits waiting (and waiting) to be reviewed in Gerrit. Collaborating with the authors, we could test those fixes and features in fresh installs at Labs and bring first-hand feedback to the related bug reports as a way to help reviewers. We could even help testing projects at https://www.mediawiki.org/wiki/Review_queue The organization of this week could be done by http://www.mediawiki.org/wiki/Groups/Proposals/Features_testing Week 2: fresh bugs (Andre) I don't think Andre will have problems finding tasks for this. But again, if the top-priority, WMF-led projects are well covered then we can help and involve others, e.g. interesting extensions. Organized by https://www.mediawiki.org/wiki/Groups/Proposals/Bug_Squad Week 3: browser testing (Željko) As long as there is a backlog at http://www.mediawiki.org/wiki/QA/Browser_testing/Test_backlog it should be easy for Željko to decide what comes next. Having the backlog empty would be a nice problem to have, but if that happens I'm sure we will find areas to fill it up. Organized by http://www.mediawiki.org/wiki/Groups/Proposals/Browser_testing Week 4: rotten bugs (Valerie) http://www.mediawiki.org/wiki/Community_metrics/December_2012#Stalled suggests that we won't have problems finding tasks any time soon... Also organized by the Bug Squad. -- Quim Gil Technical Contributor Coordinator @ Wikimedia Foundation http://www.mediawiki.org/wiki/User:Qgil ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Extension code review
On Wed, Jan 16, 2013 at 10:28 PM, Paul Selitskas p.selits...@gmail.com wrote:
> Hi, I'm writing a new extension and would like to get some code
> reviewers (even 1 would be fantastic). May I have any assistance?
>
> Extension: http://www.mediawiki.org/wiki/Extension:NamespaceRelations
> Current pending reviews:
> https://gerrit.wikimedia.org/r/#/q/status:open+project:mediawiki/extensions/NamespaceRelations,n,z

Ok, I'm reviewing your extension.

--
Best Regards,
Sébastien Santoro aka Dereckson
http://www.dereckson.be/
Re: [Wikitech-l] The ultimate bikeshed: typos in commit messages
On 2013-01-16 7:20 PM, Chad innocentkil...@gmail.com wrote:
> On Wed, Jan 16, 2013 at 6:07 PM, Tim Starling tstarl...@wikimedia.org wrote:
>> On 17/01/13 00:14, Chad wrote:
>>> Really, I think the whole thread is moot with the pending upgrade.
>>> Typos should always be fixed before merging (I think we all agree?),
>>> and the new abilities to fix these from the UI mean we won't need to
>>> mark people as -1 to do so.
>>
>> I didn't mention commit summaries in my post. My interest is in
>> nitpicking in general. Jeroen calls arguments over commit summaries
>> the /ultimate/ bikeshed, which they may or may not be; there are
>> plenty of other examples which may compete for that title.
>
> Indeed, I had missed that.
>
>> Nitpicking is the minor end of the negative feedback spectrum. By
>> definition, it has the smallest concrete payoff when advice is
>> followed, in exchange for complex, context-dependent social costs.
>> You should think carefully before you do it.
>
> *nod* I agree. And really, nitpicks in code can always be cleaned up
> later (heck, we did it for years with SVN). It's only nitpicks in
> commit messages that should always be fixed, since they're immutable
> after submission. And it's *that* that I think won't be a big deal
> anymore (since any drive-by contributor could fix a typo on the spot).

If we're talking nitpicks in general: I've seen -1 for things like
someFunc($a, $b) instead of someFunc( $a, $b ), which I agree does more
harm than good. I imagine how much someone considers spelling issues to
be a minor nitpick varies quite a lot between people.

-bawolff
Re: [Wikitech-l] The ultimate bikeshed: typos in commit messages
> If we're talking nitpicks in general. Ive seen -1 for things like
> someFunc($a, $b) instead of someFunc( $a, $b ) which I agree does more
> harm than good.

I disagree. The entire purpose of code review is to make sure the code is organized and styled properly. If code isn't written in accordance with MediaWiki style requirements, that's even more of a reason to -1 than a typo in the commit message.

*--*
*Tyler Romeo*
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com
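[The spacing rule being argued about above is MediaWiki's PHP convention of putting spaces inside function-call parentheses: someFunc( $a, $b ) rather than someFunc($a, $b). As a rough illustration only, and not the actual review tooling used by MediaWiki or Gerrit, a hypothetical one-rule checker for this convention might look like:]

```python
import re

# Hypothetical, simplified check for the "spaces inside parentheses"
# convention discussed above. Real MediaWiki lint tooling covers far more
# cases; this only flags calls whose opening parenthesis is immediately
# followed by an argument (empty calls like someFunc() are allowed).
CALL_RE = re.compile(r'\b\w+\((?!\s|\))')

def flags_call_spacing(line):
    """Return True if the line contains a call written like someFunc($a, $b)."""
    return CALL_RE.search(line) is not None

print(flags_call_spacing('someFunc($a, $b);'))    # flagged: no space after (
print(flags_call_spacing('someFunc( $a, $b );'))  # not flagged
print(flags_call_spacing('someFunc();'))          # not flagged: empty call
```

[This is only a sketch of the kind of mechanical rule under discussion; whether such checks belong in automated linting rather than in human -1 votes is exactly the point of the thread.]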