[Wikitech-l] standing up for setFunctionTagHook()
Hi all.

When working on a wiki extension I came across this thread (http://lists.wikimedia.org/pipermail/wikitech-l/2011-January/051437.html) regarding the purpose/history of setFunctionTagHook() and where/when to use it. Daniel Friesen wrote:

> setFunctionTagHook was added so that you could expand variables inside
> tags. Then setHook() got $frame added and the need for such a hook type
> disappeared. I'd like to kill it, although it has been there for some
> time (if any list reader uses it, please stand up).

I started using the hook in a Firebase extension (http://www.mediawiki.org/w/index.php?title=Extension:Firebase) in order to replace <firebaseraw> tags with responses to HTTP requests /before most other wiki text is parsed/. It works great for me because I can insert data into chunks of code, like a Google Street View widget:

{{#widget:Google Street View
|lat=<firebaseraw url=http://gamma.firebase.com/SomeAccountName/lat; />
|lng=<firebaseraw url=http://gamma.firebase.com/SomeAccountName/lon; />
|yaw=370.64659986187695
|pitch=-20
|zoom=0
}}

I'm new to extension writing, though, so maybe there are other, better ways to accomplish this? If not, consider this a vote to leave setFunctionTagHook() in!

--Benny

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
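For readers unfamiliar with the distinction being discussed: the point of setFunctionTagHook() was that template variables inside the tag body are expanded before the tag callback sees them, which is also what a setHook() handler can now do itself via the $frame it receives. A language-neutral sketch of that ordering (illustrative Python, not MediaWiki's actual PHP API; every name below is made up):

```python
import re

# Toy "template variable" expansion: replaces {{name}} with a value from a
# context dict, standing in for the parser frame's expansion step.
def expand_variables(text, frame):
    return re.sub(r"\{\{(\w+)\}\}",
                  lambda m: frame.get(m.group(1), m.group(0)), text)

# A tag hook in the style of setFunctionTagHook(): variables in the tag body
# are expanded *before* the callback runs, so the callback can use their
# values (e.g. to build an HTTP request URL, as in the Firebase extension).
def function_tag_hook(body, frame):
    expanded = expand_variables(body, frame)
    return f"<looked up: {expanded}>"

frame = {"account": "SomeAccountName"}
print(function_tag_hook("url={{account}}/lat", frame))
# -> <looked up: url=SomeAccountName/lat>
```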
Re: [Wikitech-l] Wikidata status
Hi Tim!

For some reason, your mail went under my radar until now. Sorry about that.

On 03.09.2012 03:30, Tim Starling wrote:
> I've been busy, but I can do another review of the ContentHandler
> branch this week.

That would be great, thanks!

> There's the question of what level of quality we should aim for. We'll
> probably find things that will break when a non-text content type is
> used. I'd like to see such issues solved, or at least make sure the
> ContentHandler API will support a solution without major changes, but
> my reasons are mostly aesthetic. In principle, such development work
> can be done after the merge.
>
> But it seems to me that there's no point in merging it if it only
> supports text content, since MediaWiki already supports pure text
> content well enough. If we can achieve robust support for non-text
> data types, then the motivation for merging it will be stronger.

I agree. We are using the mechanism extensively for Wikidata, which of course uses non-text content. That should serve as a pretty good test. I'm trying to fix any issues I find along the way, but of course we are not exploring every possible corner of MediaWiki.

I think we should make sure that the main functionality of MediaWiki works with the ContentHandler without a hitch, and try to have sane failure modes for stuff that is not yet (dis)covered. There's no way to be 100% sure, of course.

This week, I have only done a little maintenance on the Wikidata branch (like merging master again). I'll be looking for loose ends some more over the next couple of days, but any changes should be confined to small corners of MediaWiki. I'm using Gerrit for all changes now, so you should be able to track what I'm doing (well, last week I had to resort to a direct push when Gerrit got very confused about a merge).

Thanks again
Daniel
Re: [Wikitech-l] Wikidata status
Just an idea, but wouldn't Lua source code make a perfect alternative content type?

2012/9/7 Daniel Kinzler dan...@brightbyte.de:
> Hi Tim! For some reason, your mail went under my radar until now.
> Sorry about that.
[...]

--
Project director Wikidata
Wikimedia Deutschland e.V. | Obentrautstr. 72 | 10963 Berlin
Tel. +49-30-219 158 26-0 | http://wikimedia.de

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V. Registered in the register of associations of the Amtsgericht Berlin-Charlottenburg under number 23855 B. Recognized as charitable by the Finanzamt für Körperschaften I Berlin, tax number 27/681/51985.
Re: [Wikitech-l] Wikidata status
On 07.09.2012 12:22, Denny Vrandečić wrote:
> Just an idea, but wouldn't Lua source code make a perfect alternative
> content type?

Yes, it would, but it's also a textual content type. These are rather unproblematic. One thing that is not yet implemented is a highlighter interface for the different kinds of code used as content models.

Non-text content that MediaWiki already uses:

* gadget definitions
* sidebar definition
* license list for uploads
* book sources
* list of templates that indicate a disambiguation
* etc.

It would be nice to convert some of these to actual non-text content models, just for testing. I don't want to commit to that at the moment though; I doubt I'll find the time.

-- daniel
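To make the idea of "converting them to actual non-text content models" concrete, here is a minimal sketch in Python rather than MediaWiki's PHP ContentHandler API (the registry and all names below are hypothetical, not existing MediaWiki code): each content model gets its own serializer/deserializer instead of being stored as free-form text.

```python
import json

# Hypothetical registry mapping a content-model id to a pair of
# (serialize, deserialize) callables, standing in for per-model
# ContentHandler subclasses.
CONTENT_MODELS = {}

def register_model(model_id, serialize, deserialize):
    CONTENT_MODELS[model_id] = (serialize, deserialize)

# Example: the upload license list stored as structured data, not wikitext.
register_model(
    "license-list",
    serialize=lambda licenses: json.dumps(licenses),
    deserialize=lambda blob: json.loads(blob),
)

ser, deser = CONTENT_MODELS["license-list"]
blob = ser([{"name": "CC-BY-SA-3.0"}])
assert deser(blob) == [{"name": "CC-BY-SA-3.0"}]  # lossless round trip
```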
Re: [Wikitech-l] Wikidata status
Not really much different from the CSS/JS cases we already have. The trickier stuff will be the non-text stuff, especially the 'multipart' stuff, as the [[ContentHandler]] page calls it. It would be nice to have a test case where wikitext is part of a multipart type that also stores things like a high-level list of categories (even if it will be an empty list), just to make sure that basic functionality does not depend too heavily on pages being non-multipart.

--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]

On Fri, 07 Sep 2012 03:22:50 -0700, Denny Vrandečić denny.vrande...@wikimedia.de wrote:
> Just an idea, but wouldn't Lua source code make a perfect alternative
> content type?
[...]
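The multipart test case proposed above could look roughly like this, an illustrative Python model rather than the actual ContentHandler interface (the class and field names are invented): a content object holding wikitext plus an explicit category list that must round-trip through serialization even when the list is empty.

```python
import json
from dataclasses import dataclass, field, asdict

# Hypothetical multipart content: wikitext plus structured parts.
@dataclass
class MultipartContent:
    wikitext: str
    categories: list = field(default_factory=list)  # may legitimately be empty

    def serialize(self) -> str:
        return json.dumps(asdict(self))

    @classmethod
    def deserialize(cls, blob: str) -> "MultipartContent":
        return cls(**json.loads(blob))

# Round-trip with an empty category list, as the proposed test case requires.
page = MultipartContent(wikitext="Some [[wikitext]] body.")
assert MultipartContent.deserialize(page.serialize()) == page
```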
Re: [Wikitech-l] Gerrit server issues
2012/9/7 Chad innocentkil...@gmail.com:
> And mediawiki/extensions/TranslationNotifications is back up too,
> master was intact. The remaining ones in extensions/* and debs/* are
> under repair.

It seems the TranslationNotifications extension still has some problem. Git review is always failing with this error:

error: unpack failed: error Missing unknown 4ee9dc02a655cb376a8f20e6d5c1ee95a81b1a37

and the remote rejected the push with an unpack error.

PS: Some information about this kind of corruption is available at http://asheepapart.blogspot.com/2011/10/gerrit-code-review-unpack-error-missing.html

Thanks
Santhosh Thottingal
Re: [Wikitech-l] Gerrit server issues
On Fri, Sep 7, 2012 at 11:30 AM, Santhosh Thottingal santhosh.thottin...@gmail.com wrote:
> It seems the TranslationNotifications extension has some problem. Git
> review is always failing with this error:
>
> error: unpack failed: error Missing unknown 4ee9dc02a655cb376a8f20e6d5c1ee95a81b1a37
[...]

Ah indeed, that repo does need some work. This is the only thing I'm working on today, and I figured out how to resolve it late last night--things will be back to normal by the end of today.

-Chad
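For anyone hitting the same "Missing unknown <sha>" unpack error: the usual first step is checking whether the named object actually exists in the repository's object database and whether the repository is internally consistent. A generic diagnostic sketch with plain git commands (run here against a throwaway repo; against a real broken repo you would substitute the SHA-1 from the error message):

```shell
# Build a throwaway repo so the commands below have something to inspect.
tmp=$(mktemp -d) && cd "$tmp"
git init -q .
git -c user.email=you@example.org -c user.name=you \
    commit -q --allow-empty -m "init"
sha=$(git rev-parse HEAD)

# cat-file -e exits 0 only if the object is present in the object database;
# a missing object here is what makes the remote reject the push.
git cat-file -e "$sha" && echo "object present"

# fsck reports missing or dangling objects of the kind Gerrit chokes on.
git fsck --full
```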
[Wikitech-l] Jenkins permissions
Hello,

I changed the Jenkins permission system a few minutes ago. Previously, any authenticated user could change the whole configuration. The new scheme is:

- anonymous: can read
- authenticated with a labs account: can read and manually trigger a Gerrit change
- 'wmf' LDAP group: can do anything

There are very few people doing configuration changes, so the new scheme should not cause any harm.

cheers,

--
Antoine hashar Musso
Re: [Wikitech-l] Jenkins permissions
On 07/09/12 21:38, Antoine Musso wrote:
> There are very few people doing configuration changes, so the new
> scheme should not cause any harm.

And as Chad pointed out to me: if someone outside of the WMF needs to make config changes, we can add an exception to let you help out :-)

--
Antoine hashar Musso
[Wikitech-l] Status on Wikidata review (Sep 7)
Hi all,

here's our weekly list of Wikidata review items. We have a few new ones, some of them also rather small, next to the big gorillas. I'll say it as Rob does: if you want to discuss one of the items here, it would make sense to rename the thread in your answer. Based on experience, I accept the futility of this recommendation, though :)

* ContentHandler. Tim stated he could do another review this week; in the meantime we have tied up several loose ends. We are eagerly awaiting the review and further input. Here's the bug: https://bugzilla.wikimedia.org/show_bug.cgi?id=38622 Some discussion was here on the list, too.

* Sites. There has been quite some activity and input from the outside, and the sites code has been reworked greatly. It landed in Gerrit today and is awaiting reviews. Here's the link to the RFC: https://www.mediawiki.org/wiki/Requests_for_comment/New_sites_system and it is widely implemented.

* jQuery table sorting improvements. This improves the UI on initial display of a sorted table. Here's the patchset in Gerrit: https://gerrit.wikimedia.org/r/#/c/22562/

* userWasLastToEdit improvement. This just moves the function to a static place so it is available for an extension to call. That one looks straightforward to me. Here's the patchset in Gerrit: https://gerrit.wikimedia.org/r/#/c/22049/

* Towards nested transactions. There was quite some discussion on the mailing list, and some patchsets have already been merged. Currently still open are: https://gerrit.wikimedia.org/r/#/c/21582/ and https://gerrit.wikimedia.org/r/#/c/21584/ . Note that these patches neither implement nested transactions, nor do we require them (we worked around that); they would just make the code nicer if nested transactions existed, and they prepare for them.

* Reviewing the extension. We have a beautiful extension here, which is quite ready for deployment on the repository, and we need to organize the review of the extension as a whole. I assume we will start talking about this next week.

We would be very happy to see activity and merges on all or any of these items :)

Cheers,
Denny

--
Project director Wikidata
Wikimedia Deutschland e.V. | Obentrautstr. 72 | 10963 Berlin
Tel. +49-30-219 158 26-0 | http://wikimedia.de
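For context on what the nested-transaction patches prepare for: the standard way to emulate nested transactions on engines that allow only one open transaction is SQL savepoints. A sketch in Python with sqlite3 (illustrative only; MediaWiki's Database layer is PHP, and the class below is invented for this example):

```python
import sqlite3

# Emulated nested transactions: the outermost level is a real BEGIN/COMMIT,
# inner levels are SAVEPOINTs that can be released or rolled back on their own.
class NestedTx:
    def __init__(self, conn):
        self.conn = conn
        self.depth = 0

    def begin(self):
        if self.depth == 0:
            self.conn.execute("BEGIN")
        else:
            self.conn.execute(f"SAVEPOINT sp{self.depth}")
        self.depth += 1

    def commit(self):
        self.depth -= 1
        if self.depth == 0:
            self.conn.execute("COMMIT")
        else:
            self.conn.execute(f"RELEASE SAVEPOINT sp{self.depth}")

    def rollback(self):
        self.depth -= 1
        if self.depth == 0:
            self.conn.execute("ROLLBACK")
        else:
            self.conn.execute(f"ROLLBACK TO SAVEPOINT sp{self.depth}")
            self.conn.execute(f"RELEASE SAVEPOINT sp{self.depth}")

conn = sqlite3.connect(":memory:")
conn.isolation_level = None  # manage transactions manually
conn.execute("CREATE TABLE t (x INTEGER)")
tx = NestedTx(conn)
tx.begin()
conn.execute("INSERT INTO t VALUES (1)")
tx.begin()                   # nested: becomes a savepoint
conn.execute("INSERT INTO t VALUES (2)")
tx.rollback()                # undoes only the inner insert
tx.commit()
print(conn.execute("SELECT x FROM t").fetchall())  # -> [(1,)]
```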
Re: [Wikitech-l] Gerrit server issues
On Thu, Sep 6, 2012 at 5:11 PM, Chad innocentkil...@gmail.com wrote:
> In mediawiki/extensions/*: Comments, FacebookOpenGraph, GoogleDocs4MW,
> Nonlinear, OnlineStatusBar, Phalanx, RandomImageByCategory,
> SemanticImageInput, ShoutWikiAds, SphinxSearch, TranslationNotifications

All of these have been fixed other than Nonlinear (more heavily broken). TranslationNotifications' master is intact, but some of the changes are still in a bad state and I need to finish cleaning it up.

> In operations/*: debs/mysqlatfacebook, debs/wikimedia-lvs-realserver,
> debs/wikimedia-search-qa, debs/wikistats, software

software, wikimedia-lvs-realserver, wikimedia-search-qa and wikistats are all back and fine. mysqlatfacebook is very broken, like Nonlinear.

-Chad
Re: [Wikitech-l] Gerrit server issues
On Fri, 07 Sep 2012 08:30:47 -0700, Santhosh Thottingal santhosh.thottin...@gmail.com wrote:
> It seems the TranslationNotifications extension has some problem. Git
> review is always failing with this error:
>
> error: unpack failed: error Missing unknown 4ee9dc02a655cb376a8f20e6d5c1ee95a81b1a37
[...]

I'd like to know what kind of blobs these are that Gerrit depends on so much but git is happy to prune.

--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]