Re: [Wikitech-l] Moving MediaWikiWidgets.org to Wikimedia
Folks,

If the current implementation can be made more secure, I'm all for it - ideally more secure than the alternative native PHP extension infrastructure MW has right now. Unfortunately, this was born because writing extensions for widgets is hard, and writing them in a secure way is even harder; splitting the code into two layers maintained at different levels seemed better, especially for small independent wikis who have a hard time managing extensions.

If anyone is interested in rewriting it using a more secure framework, or maybe fixing all the widgets (providing better validators or the like), you're very welcome to do so. There were other ideas like context-based auto-escaping, a better review process, a rewrite using the new version of Smarty, installation from a centralized repo and so on, but someone else will have to continue that.

Anyway, we can kill the work of many years - that's easy; moreover, it'll die on its own without active commitment anyway. I hope a good open source community like the MW team can find a way to utilize this work. If any help is needed, I'm eager to help with the move, as I did before by maintaining the Widgets and OpenID extensions and contributing to Semantic MediaWiki/Forms/Bundle and so on - it's just time for me to step away from MW development.

Best,

Sergey

On Tue, Sep 4, 2012 at 6:06 PM, Daniel Friesen dan...@nadir-seen-fire.com wrote:

> On Tue, 04 Sep 2012 14:14:34 -0700, Jeroen De Dauw jeroended...@gmail.com wrote:
>
>> Hey,
>>
>> This is clearly not the case. Because there are XSS vectors all over these widgets. Developers who understand security do not monitor code strewn about in piles of wiki pages. They in no way have the same level of gatekeeping as extensions.
>
> So instead of writing a widget publicly visible, the random third party admin who barely knows the basics of PHP goes and writes something that quite possibly is not published anywhere and can have gaping security holes not known to them, and remaining so.
> Random third party admins running wikis so small they hack together custom code don't have people who understand security reviewing anything for vulnerabilities. Even if it's public, it's going to stay vulnerable. The only way these sites will ever have something secure is if we have a nice widget request area where third party admins can get someone to write a simple widget extension for some service they want to use.
>
>> You also mention stuff such as Html::element. Guess what - they might not know about it. I have looked at A LOT of extensions, and I can assure you that you have a rather rosy view on the subject.
>
> We just have bad documentation on the subject. A proper PHP based Widget extension would provide some APIs even nicer than our current Html. Easy to use validation. Boilerplate cleanup. And it would naturally come with good documentation that encourages people to use the high-level style of code. Well, not just encourages... I'd say it wouldn't even mention the fact that you can concatenate strings of HTML.
>
>> Cheers
>>
>> --
>> Jeroen De Dauw
>> http://www.bn2vs.com
>> Don't panic. Don't be evil.
>
> --
> ~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
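The helper-API-versus-concatenation point in the thread above can be sketched generically. This is illustrative JavaScript, not MediaWiki's actual Html class; the `escapeHtml` and `element` helpers are hypothetical stand-ins for the kind of high-level API being discussed:

```javascript
// Hypothetical helpers illustrating why an element() API beats string
// concatenation: every attribute value and text node is escaped exactly once.
function escapeHtml(value) {
  return String(value)
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;');
}

function element(tag, attribs, text) {
  const attrs = Object.entries(attribs)
    .map(([name, value]) => ` ${name}="${escapeHtml(value)}"`)
    .join('');
  return `<${tag}${attrs}>${escapeHtml(text)}</${tag}>`;
}

// Naive concatenation lets attacker-controlled input break out of the
// attribute; the helper neutralizes the same input.
const userInput = '" onmouseover="alert(1)';
const unsafe = '<a href="' + userInput + '">link</a>'; // XSS vector
const safe = element('a', { href: userInput }, 'link'); // quotes escaped
```

With a widget author only ever calling `element()`, there is no opportunity to forget an escaping step - which is the argument for an API that doesn't even document string concatenation.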
[Wikitech-l] Moving MediaWikiWidgets.org to Wikimedia
Hi everybody,

Sumana suggested I should post to this list so you guys can help me.

A few years back I saw a need for easy widget creation - there were too many extensions that did just that but were not well maintained, had a bunch of XSS holes in them, and so on. That's when I came up with the idea of the Widgets extension: http://www.mediawiki.org/wiki/Extension:Widgets

Since individual widgets are just wiki pages, I created a standalone wiki where everybody can post their widgets (in a special Widget namespace), which become available to everyone after a basic security review (it integrates with Flagged Revisions if it's installed): http://www.mediawikiwidgets.org/

There are plenty of widgets there, and quite a few people use the extension and the widgets on their wikis. That being said, I moved on to other kinds of work and would be happy to give MediaWikiWidgets.org back to the community instead of slowly killing it by inactivity. It would be great if the Wikimedia Foundation could take this project over and host it either as a standalone site or as part of mediawiki.org - I'll be happy to assist in moving the catalog and would probably still be curious enough to contribute a widget or two once in a while.

Best,

Sergey
[Wikitech-l] Patch for Widgets extension
Guys,

I've updated the Widgets extension to support the latest version of FlaggedRevisions, but I no longer have repo access since MW moved to Git. If somebody can apply this patch to the extension for me, I'll really appreciate it - here's the diff (not much work):

Index: extensions/Widgets/WidgetRenderer.php
===================================================================
--- extensions/Widgets/WidgetRenderer.php	(revision 114929)
+++ extensions/Widgets/WidgetRenderer.php	(working copy)
@@ -137,7 +137,7 @@
 	if ( $widgetTitle && $widgetTitle->exists() ) {
 		if ( $wgWidgetsUseFlaggedRevs ) {
-			$flaggedWidgetArticle = FlaggedArticle::getTitleInstance( $widgetTitle );
+			$flaggedWidgetArticle = FlaggableWikiPage::getTitleInstance( $widgetTitle );
 			$flaggedWidgetArticleRevision = $flaggedWidgetArticle->getStableRev();
 			if ( $flaggedWidgetArticleRevision ) {

Thank you,

Sergey

--
Sergey Chernyshev
http://www.sergeychernyshev.com/
http://www.meetup.com/Web-Performance-NY/
http://www.showslow.com/
[Wikitech-l] I moved support of HeaderTabs here
Due to my inability to support HeaderTabs and other extensions that were created by/for Semantic Communities LLC, I discontinued the mailing list that was used for those extensions and instructed users to ask their questions here. I hope Yaron and others can continue helping those users out here, as they already do.

Thank you,

Sergey

--
Sergey Chernyshev
http://www.sergeychernyshev.com/
http://www.meetup.com/Web-Performance-NY/
http://www.showslow.com/
Re: [Wikitech-l] Stepping down as maintainer for OpenID extension
You're very welcome! Hope the UI improvements will continue along with the backend features. ;)

Sergey

On Mon, May 16, 2011 at 5:00 PM, Casey Brown li...@caseybrown.org wrote:

> On Sun, May 15, 2011 at 8:41 PM, Sergey Chernyshev sergey.chernys...@gmail.com wrote:
>> I'd like to step down as maintainer for the OpenID extension. Please remove me as maintainer from Bugzilla.
>
> Thank you for your hard work, Sergey! :-)
>
> --
> Casey Brown
> Cbrown1023
Re: [Wikitech-l] Stepping down as maintainer for OpenID extension
Glad that we already have a person who stepped up. Congrats, Thomas!

Sergey

On Mon, May 16, 2011 at 9:05 AM, Siebrand Mazeland s.mazel...@xs4all.nl wrote:

> Great! (and I'm guilty :P)
>
> --
> Siebrand Mazeland
> M: +31 6 50 69 1239
> Skype: siebrand
>
> On 16-05-11 13:11 Thomas Gries m...@tgries.de wrote:
>> Someone already assigned me as maintainer (ok)
[Wikitech-l] Stepping down as maintainer for OpenID extension
Guys,

I'd like to step down as maintainer for the OpenID extension. Please remove me as maintainer from Bugzilla. Sorry - I don't have the time and energy to apply patches and fix bugs; I hope somebody else can pick it up.

Thank you,

Sergey

--
Sergey Chernyshev
http://www.sergeychernyshev.com/
http://www.meetup.com/Web-Performance-NY/
http://www.showslow.com/
Re: [Wikitech-l] modernizing mediawiki
I'd like to chime into the discussion and point out that there is a huge community around extensions and features that are not used by the Wikimedia Foundation - Semantic MediaWiki & co and OpenID, to name a few. These extensions are maintained by third-party developers, and many of them, including myself, don't have Wikimedia's interests as their primary goal. I run quite a few wikis based on MediaWiki, and even though I personally don't need WordPress easiness and am comfortable with creating build environments using SVN externals and the like, I'm always working toward general ease of use - the Widgets extension I wrote, my OpenID picker contributions, as well as some SMW changes I made, were always targeted at users outside of Wikimedia.

So I'd like the Wikimedia crowd to acknowledge the outside community and their needs. Don't get me wrong - you guys built a great product, and some aspects of it, like the internationalization wiki, the extensibility, or the APIs, are quite unique, but Open Source requires an open mind about things.

At the same time, I'd like to say that Domas and others are exactly right about different parties having different interests - if you need something, go ahead and build it. I spent quite a lot of time coding away at things that were needed for my business and for my personal projects, and that's fair. Nobody in the Open Source world is obligated to code for you! Not in the WordPress world either - they, for that matter, had quite lousy software for quite a while until they did more work on fixing it, and that only happened because they have a commercial enterprise with different interests than the Wikimedia Foundation.

All that being said, I think there is a great opportunity for MW to get an even larger piece of the corporate knowledge management market, and if you or somebody else wants to go there and make your money on it, go ahead - companies like Yaron's WikiWorks, for example, will be happy to work with you on it - they live and breathe MediaWiki.
Just don't expect that somebody will do work for you for free only because the Wikimedia Foundation is a not-for-profit and their projects don't charge money. We all need to eat, and software developers are expensive, especially good ones, and especially those who can build both complex and user-friendly software. Don't insult people by saying that they didn't make something you need - they already spent time that they could've spent with their families.

Thank you,

Sergey

--
Sergey Chernyshev
http://www.sergeychernyshev.com/

On Wed, Mar 3, 2010 at 5:34 PM, fl foxyloxy.wikime...@gmail.com wrote:

> On Wed, 3 Mar 2010 10:31 pm, Platonides wrote:
>> fl wrote:
>>> I would disagree. The Wikimedia software has been released under an open source license: While the WMF certainly has no obligation to improve the software, they most definitely have an obligation to release the source code to third parties.
>>
>> Wrong. They do it, and it's consistent with their mission, but they have no obligation to do that. They could even have MediaWiki be closed source software.
>
> No, they can't. As far as I am aware, MediaWiki is released under the GNU General Public License[1], which stipulates, among other things, the requirement to release a program's source code to the public and to release any derived changes under the same license[2]. If the WMF were to try and convert MediaWiki to a closed source project, they would be liable to legal actions against them.
>
> [1] http://www.mediawiki.org/wiki/Special:Version
> [2] http://www.gnu.org/licenses/old-licenses/gpl-2.0.html
>
> --
> fl
[Wikitech-l] jQuery and extensions
I remember it was mentioned that jQuery is going to be bundled with MediaWiki soon (1.16?). I wonder what would be the best approach for extensions that use jQuery and currently package it up within themselves: how do they switch to the global jQuery, and do that in a backwards-compatible way (e.g. to make sure that prior versions of MediaWiki don't get broken)? Is there a good How-To for developers on rewriting their extensions?

Thank you,

Sergey

--
Sergey Chernyshev
http://www.sergeychernyshev.com/
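One common backwards-compatible pattern for the question above is to use the global jQuery when core already ships one (and it is new enough), and fall back to the extension's bundled copy otherwise. This is a sketch, not a documented MediaWiki API; the `needsBundledJQuery` helper and the `1.3.2` version floor are illustrative assumptions:

```javascript
// Decide whether an extension must load its own bundled jQuery: reuse the
// global copy if present and at least the minimum version, otherwise fall
// back to the copy shipped with the extension.
function versionAtLeast(version, minimum) {
  const a = version.split('.').map(Number);
  const b = minimum.split('.').map(Number);
  for (let i = 0; i < Math.max(a.length, b.length); i++) {
    const x = a[i] || 0, y = b[i] || 0;
    if (x !== y) return x > y;
  }
  return true; // versions are equal
}

function needsBundledJQuery(globalJQuery, minimum) {
  return !globalJQuery || !versionAtLeast(globalJQuery.fn.jquery, minimum);
}

// Usage sketch in an extension's loader (loadScript is hypothetical):
// if (needsBundledJQuery(window.jQuery, '1.3.2')) {
//   loadScript('extensions/MyExtension/jquery.js');
// }
```

The point of the check is that on a MediaWiki new enough to bundle jQuery the extension adds nothing, while on older installs it keeps working from its own copy.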
Re: [Wikitech-l] jQuery and extensions
Thanks, Siebrand. It gives us a bit more time to write such documentation.

Sergey

On Sun, Jan 3, 2010 at 4:02 PM, Siebrand Mazeland s.mazel...@xs4all.nl wrote:

> Just a little roadmap info: mwEmbed[1], containing jQuery, will not be part of MediaWiki 1.16. It will be added in a later version. Tim Starling pulled it from trunk a few weeks ago, and Michael Dale has continued working on it in a branch. I expect it will be added again after the branching/tagging of 1.16, and be part of 1.17 after a thorough shakedown during the 1.17 alpha period.
>
> Siebrand
>
> [1] http://www.mediawiki.org/wiki/MwEmbed
>
> -----Original Message-----
> From: wikitech-l-boun...@lists.wikimedia.org [mailto:wikitech-l-boun...@lists.wikimedia.org] On Behalf Of Sergey Chernyshev
> Sent: Sunday, January 03, 2010 9:42 PM
> To: Wikimedia developers
> Subject: [Wikitech-l] jQuery and extensions
>
> I remember it was mentioned that jQuery is going to be bundled with MediaWiki soon (1.16?). I wonder what would be the best approach for extensions that use jQuery and package it up within itself to use global jQuery and do that in backwards compatible way (e.g. to make sure that prior versions of MediaWiki don't get broken). Is there a good How-To for developers to rewrite their extensions?
Re: [Wikitech-l] New York MediaWiki Users Group
Great! We'll be happy to have you!

Sergey

On Mon, Oct 5, 2009 at 2:07 PM, Brion Vibber br...@wikimedia.org wrote:

> On 10/2/09 2:07 PM, Sergey Chernyshev wrote:
>> Hello to fellow MediaWiki hackers,
>>
>> I'm writing to let you know that Yaron Koren and I finally decided to organize the New York MediaWiki Users Group: http://www.meetup.com/MediaWiki-New-York/
>>
>> Events are going to be dedicated to various uses of MediaWiki on public web sites as well as in the enterprise. If you have a conference room or some other space in New York and you're willing to host some of our events, let me know! If you're in New York or passing by at the right time, feel free to just join the group!
>
> Awesome! I'll drop a line if I'm ever in town. :)
>
> -- brion
[Wikitech-l] New York MediaWiki Users Group
Hello to fellow MediaWiki hackers,

I'm writing to let you know that Yaron Koren and I finally decided to organize the New York MediaWiki Users Group: http://www.meetup.com/MediaWiki-New-York/

Events are going to be dedicated to various uses of MediaWiki on public web sites as well as in the enterprise. If you have a conference room or some other space in New York and you're willing to host some of our events, let me know! If you're in New York or passing by at the right time, feel free to just join the group!

Thank you,

Sergey

--
Sergey Chernyshev
http://www.sergeychernyshev.com/
Re: [Wikitech-l] Proposal for editing template calls within pages
Guys,

Why are we talking XML? Yaron's Semantic Forms has survived without XML - yes, its definition language is not English, but it's good enough and works relatively well with the rest of the MW syntax. I think using XML will not necessarily do good here, as form creators would want the language to be close to the rest of the wiki. Here's an example of a form which is a mix of wikitext, HTML and regular wiki formatting: http://www.techpresentations.org/w/index.php?title=Form:Presentation&action=edit

It worked for me. Maybe this kind of definition can be merged with Template pages, but frankly SF's model works pretty well (of course, SF is also smart about data because of SMW, but it can be more manual).

Thank you,

Sergey

--
Sergey Chernyshev
http://www.sergeychernyshev.com/

On Sat, Sep 26, 2009 at 8:11 AM, Tei oscar.vi...@gmail.com wrote:

> On Sat, Sep 26, 2009 at 10:14 AM, Dmitriy Sintsov ques...@rambler.ru wrote:
>> * Tei oscar.vi...@gmail.com [Sat, 26 Sep 2009 02:40:06 +0200]:
>>> Hello. Here's a screenshot of me editing the wikipedia: http://zerror.com/unorganized/crap/nogoodenough.png
>>>
>>> All the webmasters on this mailing list will spot the problem with this text in 1 second: it is unreadable. The space between lines, the line length, the complexity of the text... it is really hard to read. A HTML textarea can serve for writing emails and simple text, but in this image it falls short. Textareas are not designed for this, or are not good enough.
>>>
>>> How can a webmaster make that text better? Well... you need to stop using the HTML textarea widget and emulate it with divs, CSS and JavaScript. You need to colorize the code. Nowadays *ALL* good code editors colorize code. If our code editor doesn't colorize the wiki syntax, or doesn't even try, our editor is bad. I could be wrong, but maybe [[links]] and {{templates}} can be detected and colorized. And since you are emulating an editor, you can add a bit of useful behaviors: make some areas read-only, so the cursor skips them. Oh..
>>> and you can make the whole thing AJAXified... so when you click [Edit section] the section becomes editable, and when you save, the edit view is sent off and replaced by the result. Why would you want people to bounce here and there to post stuff in 2009? Heh... our computers support 24M colors, and we are showing text with 2 colors? pfff
>>
>> I am very much supporting you! Both code colorizing and AJAX editing preview. And maybe link code completion - when you press [[ it will open a JS-generated dialog with a drop-down title search list. It's not that wikitext is too hard (with the huge exception of templates), but the editor is very much restricted. Though templates surely aren't nice, and it's probably better to keep them separate and XML-ize them.
>>
>> Dmitriy
>
> For templates you can use a code beautifier that unobfuscates the code. Templates can be hard to write, but there's no reason to let them be hard to read. Maybe MW already does that. Here is an example using another template language (bbcode):
>
> [uRL]lalala[/URL] = [url]lalala[/url]
> [quote=Dan]blabla bla bla[/img] = [quote= Dani ] bla bla bla [/quote]
>
> I know that this maybe is a bad idea, that it may cause other problems, and that there are one million other things that are worth our time :-I
>
> A server-side code beautifier can also help a client-side colorizer. It can massage the template code first, be smarter than the colorizers, and prevent problems before they hit the colorizer. A code beautifier can be implemented in an incremental way: the first version can just lowercase all letters. The colorizer can also be implemented in an incremental way, starting by colorizing simple stuff. If a colorizer or a beautifier becomes a problem, it can be deactivated, and things will continue smoothly.
>
> --
> ℱin del ℳensaje.
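The "incremental beautifier" idea in the quoted message - a first pass that only normalizes bbcode tag case - can be sketched in a few lines. The function name and the exact regex are mine, not an existing MediaWiki API:

```javascript
// A minimal first-pass beautifier for bbcode, per the idea above: lowercase
// tag names like [uRL]...[/URL] while preserving any tag argument's case.
function normalizeTags(text) {
  // Match [tag], [/tag] and [tag=arg]; only the tag name is lowercased.
  return text.replace(
    /\[(\/?)([a-zA-Z]+)((?:=[^\]]*)?)\]/g,
    (match, slash, name, arg) => `[${slash}${name.toLowerCase()}${arg}]`
  );
}

console.log(normalizeTags('[uRL]lalala[/URL]')); // [url]lalala[/url]
```

Because each pass is this small and independent, it can be turned off without affecting the colorizer - exactly the deactivation property the message argues for.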
Re: [Wikitech-l] Usability initiative
Are these releases in any way connected to MediaWiki releases, though? I understand that all of this gets released on Wikimedia projects, but it would be great to have the rest of the MW user base benefit from these as well (I have a personal interest here, as you can imagine ;)).

Thank you,

Sergey

--
Sergey Chernyshev
http://www.sergeychernyshev.com/

On Tue, Sep 15, 2009 at 2:15 PM, Dmitriy Sintsov ques...@rambler.ru wrote:

> * Lane, Ryan ryan.l...@ocean.navo.navy.mil [Tue, 15 Sep 2009 12:41:26 -0500]:
>> See: http://usability.wikimedia.org/wiki/Releases
>>
>> This is listed as one of the features of the Citron release.
>
> Thanks. I've figured out that it will be http://www.mediawiki.org/wiki/Extension:UsabilityInitiative then, probably moved to core later. I had just confused the codename with the release number of MediaWiki. (You know how these developers of operating systems love to give codenames to their systems - Fedora or Windows usually come with a codename.)
>
> Dmitriy
Re: [Wikitech-l] Wiki not responding to patches
There might be an aggressive cache like APC that is configured not to check for changes. In that case, you need to stop and then start your web server. But it's a rare case, I would say.

Thank you,

Sergey

--
Sergey Chernyshev
http://www.sergeychernyshev.com/

On Thu, Aug 13, 2009 at 7:32 PM, Aryeh Gregor simetrical+wikil...@gmail.com wrote:

> On Thu, Aug 13, 2009 at 3:10 PM, Taja Anand taja.w...@gmail.com wrote:
>> 3) removed all the content of index.php !! [it still runs]
>
> Then you're editing the wrong files.
Re: [Wikitech-l] Extensions in Bugzilla
You can count me in for OpenID.

Sergey

--
Sergey Chernyshev
http://www.sergeychernyshev.com/

On Fri, Jul 31, 2009 at 2:01 PM, Chad innocentkil...@gmail.com wrote:

> Hey all,
>
> I've compiled a list[1] of extensions in Bugzilla that don't have a default assignee. If you want to be (or should already be) the assignee for any of these, please let me know. I would like to really cut that list down so bugs are getting triaged to someone who cares. Right now, they're all being assigned to wikibugs-l, and we know how many bugs he resolves :p
>
> -Chad
>
> [1] http://www.mediawiki.org/wiki/User:^demon/Unloved_extensions
Re: [Wikitech-l] Extensions in Bugzilla
Actually, Maps was just developed by Jeroen De Dauw under the supervision of Yaron Koren as part of Google Summer of Code - they might be interested in maintaining it as well; maybe they just didn't get around to applying for it.

Sergey

--
Sergey Chernyshev
http://www.sergeychernyshev.com/

On Fri, Jul 31, 2009 at 5:58 PM, Chad innocentkil...@gmail.com wrote:

> On Fri, Jul 31, 2009 at 4:52 PM, Aude aude.w...@gmail.com wrote:
>> On Fri, Jul 31, 2009 at 2:01 PM, Chad innocentkil...@gmail.com wrote:
>>> Hey all, I've compiled a list[1] of extensions in Bugzilla that don't have a default assignee. If you want to be (or should already be) the assignee for any of these, please let me know. Would like to really cut that list down so bugs are getting triaged to someone who cares. Right now, they're all being assigned to wikibugs-l, and we know how many bugs he resolves :p
>>>
>>> -Chad
>>>
>>> [1] http://www.mediawiki.org/wiki/User:^demon/Unloved_extensions
>>
>> I think the best place for the maps bugs is the maps-l list (if no one objects). Or they can go to Ævar, myself, and anyone else that wants to get them.
>>
>> -Aude
>
> Added everybody who's replied to the list thus far. Maps-l isn't registered with an account on Bugzilla, so I can't assign it right now, so I just put Aude and Ævar.
>
> -Chad
Re: [Wikitech-l] Extensions in Bugzilla
I didn't mean that you should kick those guys out ;) I just tried to explain why it probably doesn't have a person assigned yet.

Sergey

On Fri, Jul 31, 2009 at 8:02 PM, Chad innocentkil...@gmail.com wrote:

> On Fri, Jul 31, 2009 at 8:00 PM, Sergey Chernyshev sergey.chernys...@gmail.com wrote:
>> Actually, Maps was just developed by Jeroen De Dauw under the supervision of Yaron Koren as part of Google Summer of Code - they might be interested in maintaining it as well; maybe they just didn't get around to applying for it.
>>
>> Sergey
>>
>> On Fri, Jul 31, 2009 at 5:58 PM, Chad innocentkil...@gmail.com wrote:
>>> On Fri, Jul 31, 2009 at 4:52 PM, Aude aude.w...@gmail.com wrote:
>>>> On Fri, Jul 31, 2009 at 2:01 PM, Chad innocentkil...@gmail.com wrote:
>>>>> Hey all, I've compiled a list[1] of extensions in Bugzilla that don't have a default assignee. If you want to be (or should already be) the assignee for any of these, please let me know. Would like to really cut that list down so bugs are getting triaged to someone who cares. Right now, they're all being assigned to wikibugs-l, and we know how many bugs he resolves :p -Chad [1] http://www.mediawiki.org/wiki/User:^demon/Unloved_extensions
>>>> I think the best place for the maps bugs is the maps-l list. (if no one objects) Or, they can go to Ævar, myself, and anyone else that wants to get them. -Aude
>>> Added everybody who's replied to the list thus far. Maps-l isn't registered with an account on Bugzilla, so I can't assign it right now, so I just put Aude and Ævar.
>>> -Chad
>
> I can always add more CCs :)
>
> -Chad
Re: [Wikitech-l] Proposal: switch to HTML 5
I'm only considering the projects I was going to work on and can't speak for all the things the MediaWiki team should have in mind - I was going to add support for RDFa (http://www.w3.org/TR/rdfa-syntax/), which is currently a W3C Recommendation, but only for XHTML, and even though the HTML profiles (or whatever they are called) are in the works, they are not ready yet. Switching to a non-Recommendation will mean that implementing RDFa in a standards-compliant form will have to be postponed for quite a while.

As for the commotion I mentioned, I believe there is at least tension between the RDFa world and the Microdata world that is being pushed along with the HTML 5 spec.

Thank you,

Sergey

--
Sergey Chernyshev
http://www.sergeychernyshev.com/

On Tue, Jul 7, 2009 at 2:46 PM, Aryeh Gregor simetrical+wikil...@gmail.com wrote:

> On Tue, Jul 7, 2009 at 2:29 PM, Sergey Chernyshev sergey.chernys...@gmail.com wrote:
>> Just my 2 cents - I don't think that switching to a new, not yet W3C Recommendation is a good idea - many extensions and features are not yet finished (e.g. RDFa support for it)
>
> Much of the spec is very stable. We would not be using any part that's likely to change - in most cases, only parts that have multiple interoperable implementations. Such parts of the spec will not change significantly; that's a basic principle of most W3C specs' development processes (and HTML 5's in particular).
>
> We use other W3C specs that nominally aren't stable, e.g., some parts of CSS. We used plenty of CSS 2.1 when that was still nominally a Working Draft. We use multi-column layout (at least in our content on enwiki) even though that's a Working Draft. Etc. Given the way the W3C works, it's not reasonable at all to require that the *whole* spec be a Candidate Recommendation or whatever.
> You can make a feature-by-feature stability assessment pretty easily in most cases: if it has multiple interoperable implementations, it's stable and can be used; if it doesn't, it's not very useful anyway, so who cares?
>
>> and considering a huge commotion in this area it might not be a very good decision.
>
> There is no more commotion. XHTML 2.0 is officially dead. The working group is disbanded. HTML 5 is the only version of HTML that is being developed. I don't think you've raised any substantive objections here. *Practically* speaking, what reason is there not to begin moving to HTML 5 now?
Re: [Wikitech-l] Proposal: switch to HTML 5
> On Wed, Jul 8, 2009 at 11:15 AM, Sergey Chernyshev sergey.chernys...@gmail.com wrote:
>> I'm only considering the projects I was going to work on and can't speak for all the things the MediaWiki team should have in mind - I was going to add support for RDFa (http://www.w3.org/TR/rdfa-syntax/), which is currently a W3C Recommendation, but only for XHTML, and even though the HTML profiles (or whatever they are called) are in the works, they are not ready yet. Switching to a non-Recommendation will mean that implementing RDFa in a standards-compliant form will have to be postponed for quite a while.
>
> I'm pretty sure this will be resolved within a matter of months, one way or another. Either Ian will cave and support RDFa, or RDFa will support HTML 5 (at least in a usable draft form) without HTML 5's explicit agreement, or microdata will gain support as wide as RDFa's. At worst, you can still use MW 1.15 while things are being worked out. Or maybe we could provide a switch to allow HTML 5 or XHTML, but I'm leery of that, since it negates most of the benefits.
>
> I admit that I don't follow RDF and semantic web stuff too closely, so I'm not very qualified to address this objection. I'm pretty sure that RDFa support is not an issue for the overwhelming majority of our users, however. On the other hand, improved video support and better form handling for a significant percentage of our users are examples of clear and concrete benefits from HTML 5.

I see your point - video is clearly more popular than RDFa, and if you're willing to go off-standard to support it, that might be a reasonable decision for a site like Wikipedia. I'm not sure what the rush is, though, and why it can't wait till the HTML 5 spec becomes a Recommendation. I'm not familiar enough with HTML 5 support in modern browsers to state that there are going to be regressions with some other things, but that might be another thing to consider, although Wikipedia might be big enough to be a driving force in such decisions.
> Is this actually a *practical* problem even for the very small number of users who want to use RDFa? I mean, will RDFa really not work with HTML 5 in practice, or will it work but just not be standardized?

Sorry, I can't give you a definitive answer - CCing the RDFa list for this. Guys, I'll be happy if you can explain where RDFa support stands here.

>> As for the commotion I mentioned, I believe there is at least tension between the RDFa world and the Microdata world that is being pushed along with the HTML 5 spec.
>
> Yes, there definitely is tension there! Just not between HTML 5 and XHTML 2 - that's over, even if a few people might not have gotten the message yet. I don't know what will happen with RDFa vs. microdata. I find it unlikely that anyone will convince Ian to include RDFa at this point with just arguments. But if it sees much wider adoption than microdata, he'd probably include it.
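For context on the RDFa-versus-microdata tension discussed above, the same machine-readable statement looks like this in each syntax. This is an illustrative markup fragment only; the choice of the FOAF vocabulary and the example values are mine:

```html
<!-- RDFa (at the time of this thread, a W3C Recommendation for XHTML only) -->
<div xmlns:foaf="http://xmlns.com/foaf/0.1/" typeof="foaf:Person">
  <span property="foaf:name">Sergey Chernyshev</span>
</div>

<!-- HTML5 microdata expressing the same statement -->
<div itemscope itemtype="http://xmlns.com/foaf/0.1/Person">
  <span itemprop="name">Sergey Chernyshev</span>
</div>
```

The data is equivalent; the dispute in the thread is over which attribute syntax the HTML 5 spec would bless.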
Re: [Wikitech-l] Proposal: switch to HTML 5
Great, looks like the HTML5 vs. XHTML fight is infecting everything. Just my 2 cents - I don't think that switching to something that is not yet a W3C Recommendation is a good idea - many extensions and features are not yet finished (e.g. RDFa support for it), and considering the huge commotion in this area it might not be a very good decision. Thank you, Sergey -- Sergey Chernyshev http://www.sergeychernyshev.com/ On Tue, Jul 7, 2009 at 9:38 AM, Aryeh Gregor simetrical+wikil...@gmail.com wrote: On Tue, Jul 7, 2009 at 2:37 AM, Remember the dot rememberthe...@gmail.com wrote: That page clearly says that there will be an XHTML 5. XHTML is not going away. By XHTML I meant the family of standards including XHTML 1.0, 1.1, 2.0, etc. XHTML 5 is identical to HTML 5 except with a different serialization. Practically speaking, however, it looks like no one will use XHTML 5 either, because it's impossible to deploy on the current web. (See below.) As far as I can tell, it was thrown in as a sop to XML fans, on the basis that it cost very little to add it to the spec (given the definition in terms of a DOM plus serializations), without any expectation that anyone will use it in practice. What's to prevent a malicious user from manually posting an invalid submission? If there are no server-side checks, will the servers crash? Obviously there will be server-side checks as well! This will just serve to inform the user immediately that they're missing a required field, without having to wait for the server or use JavaScript. Why be cruel to our bot operators? XHTML is simpler and more consistent than tag-soup HTML, and it's a lot easier to find a good XML parser than a good HTML parser. Because it will make the markup easier to read and write for humans, and smaller. Things like leaving off superfluous closing elements do not make for tag soup.
One of the great features of HTML 5 is that it very carefully defines the text/html parsing model in painstaking backward-compatible detail. For example, the description of unquoted attributes is as follows: The attribute name, followed by zero or more space characters, followed by a single U+003D EQUALS SIGN character, followed by zero or more space characters, followed by the attribute value, which, in addition to the requirements given above for attribute values, must not contain any literal space characters, any U+0022 QUOTATION MARK (") characters, U+0027 APOSTROPHE (') characters, U+003D EQUALS SIGN (=) characters, U+003C LESS-THAN SIGN (<) characters, or U+003E GREATER-THAN SIGN (>) characters, and must not be the empty string. If an attribute using the unquoted attribute syntax is to be followed by another attribute or by one of the optional U+002F SOLIDUS (/) characters allowed in step 6 of the start tag syntax above, then there must be a space character separating the two. http://dev.w3.org/html5/spec/Overview.html#attributes Given that browsers need to implement all these complicated algorithms anyway, there's no reason to prohibit the use of convenient shortcuts for authors. They're absolutely well-defined, and even if they're more complicated for machines to parse, they're easier for humans to use than the theoretically simpler XML rules. Anyway. Bots should not be scraping the site. They should be using the bot API, which is *vastly* easier to parse for useful data than any variant of HTML or XHTML. We could use this as an opportunity to push bot operators toward using the API -- screen-scraping has always been fragile and should be phased out anyway. Bot operators who screen-scrape will already break on other significant changes; how many screen-scrapers will keep working when Vector becomes the default skin?
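To make the well-defined leniency concrete, here is a minimal sketch (Python, purely illustrative - not MediaWiki or browser code) showing that an HTML-style parser handles the unquoted-attribute shorthand described above unambiguously:

```python
# Python's built-in html.parser implements a lenient, HTML-style
# tokenizer, so unquoted attribute values and valueless boolean
# attributes parse without any quoting.
from html.parser import HTMLParser

class AttrCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.attrs = {}

    def handle_starttag(self, tag, attrs):
        # attrs arrives as a list of (name, value) pairs;
        # a valueless attribute like "checked" has value None.
        self.attrs[tag] = dict(attrs)

p = AttrCollector()
p.feed('<input type=checkbox checked>')
print(p.attrs['input'])  # {'type': 'checkbox', 'checked': None}
```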
So I view the added difficulty of screen-scraping as a long-term side benefit of switching to HTML 5, like validation failures for presentational elements. It makes behavior that was already undesirable more *obviously* undesirable. Clearly we can't break all the bots, though. So try breaking XML well-formedness. If there are only a few isolated complaints, go ahead with it. If it causes large-scale breakage, revert and tell all the bot operators to switch to the API, then try again in a few months or a year. Or when we enable Vector, which will probably break all the bots anyway. So, while I see some benefit to switching to HTML 5, I'd prefer to use XHTML 5 instead. XHTML 5, by definition, must be served under an XML MIME type. Anything served as text/html is not XHTML 5, and is required to be an HTML (not XHTML) serialization. We cannot serve content under non-text/html MIME types, because that would break IE, so we can't use XHTML 5. Even if we could, it would still be a bad idea. In XHTML 5, as in all XML, well-formedness errors are fatal. And we can't ensure that well-formedness errors are impossible without
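The point about fatal well-formedness errors can be demonstrated with any XML parser; a minimal sketch (Python, illustrative): the same markup that an HTML parser shrugs off kills an XML parse outright.

```python
# An unclosed void element like <br> is fine in HTML but is a fatal
# well-formedness error in XML -- the XML parser rejects the document.
import xml.etree.ElementTree as ET
from html.parser import HTMLParser

bad = '<p>unclosed <br> tag</p>'

try:
    ET.fromstring(bad)
    xml_ok = True
except ET.ParseError:
    xml_ok = False  # the XML parser refuses the whole document

html_parser = HTMLParser()
html_parser.feed(bad)  # the HTML parser accepts it without complaint

print(xml_ok)  # False
```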
Re: [Wikitech-l] On templates and programming languages
I think you're confusing the simple logic of ParserFunctions in templates with a full scripting language like PHP. That's why I proposed looking at something simplified like Smarty or the like. Thank you, Sergey -- Sergey Chernyshev http://www.sergeychernyshev.com/ On Tue, Jun 30, 2009 at 11:46 PM, Aryeh Gregor simetrical+wikil...@gmail.com wrote: On Tue, Jun 30, 2009 at 10:45 PM, Sergey Chernyshev sergey.chernys...@gmail.com wrote: I don't know about scripting languages for templating; it might be overkill. People are using ParserFunctions as a scripting language already. That's not feasibly going to be removed at this point. So the only way to go is to replace it with a better scripting language, which is what we're talking about.
Re: [Wikitech-l] On templates and programming languages
I don't know about scripting languages for templating; it might be overkill. When I was picking the underlying language for the MediaWiki Widgets extension, I looked at popular PHP templating systems and ended up picking Smarty (http://smarty.net/) - it can be security-locked, and it has a few useful features. You can see widget code here: http://www.mediawikiwidgets.org/w/index.php?title=Widget:Google_Calendar&action=edit and a widget is called using a parser function like this: {{widget:Name|param=val|param2=val2}}. Double curlies are far from perfect, but there are not that many good alternatives - XML is probably the only good one because it's universal and used by many, many tools out there. I can't say that I'm an expert in templating languages though, especially when we're talking about power users and not developers. Thank you, Sergey -- Sergey Chernyshev http://www.sergeychernyshev.com/ On Tue, Jun 30, 2009 at 12:16 PM, Brion Vibber br...@wikimedia.org wrote: As many folks have noted, our current templating system works ok for simple things, but doesn't scale well -- even moderately complex conditionals or text-munging will quickly turn your template source into what appears to be line noise. And we all thought Perl was bad! ;) There's been talk of Lua as an embedded templating language for a while, and there's even an extension implementation. One advantage of Lua over other languages is that its implementation is optimized for use as an embedded language, and it looks kind of pretty. An _inherent_ disadvantage is that it's a fairly rarely-used language, so it still requires special learning on potential template programmers' part. An _implementation_ disadvantage is that it currently depends on an external Lua binary installation -- something that probably won't be present on third-party installs, meaning Lua templates couldn't be easily copied to non-Wikimedia wikis.
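As a rough sketch of the call syntax described above (Python, purely illustrative - the real Widgets extension does this inside MediaWiki's parser, and `parse_widget_call` is a hypothetical name):

```python
import re

def parse_widget_call(text):
    """Parse a {{widget:Name|k=v|...}} call into (name, params).

    A toy re-implementation of the {{widget:...}} syntax; not the
    extension's actual code.
    """
    m = re.fullmatch(r'\{\{\s*widget:\s*([^|}]+?)\s*(\|(.*))?\}\}',
                     text, re.S)
    if not m:
        raise ValueError('not a widget call')
    name = m.group(1)
    params = {}
    if m.group(3):
        for part in m.group(3).split('|'):
            key, _, value = part.partition('=')
            params[key.strip()] = value.strip()
    return name, params

print(parse_widget_call('{{widget: Google_Calendar|id=abc|mode=week}}'))
# ('Google_Calendar', {'id': 'abc', 'mode': 'week'})
```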
There are perhaps three primary alternative contenders that don't involve making up our own scripting language (something I'd dearly like to avoid): * PHP Advantage: Lots of webbish people have some experience with PHP or can easily find references. Advantage: we're pretty much guaranteed to have a PHP interpreter available. :) Disadvantage: PHP is difficult to lock down for secure execution. * JavaScript Advantage: Even more folks have been exposed to JavaScript programming, including Wikipedia power-users. Disadvantage: Server-side interpreter not guaranteed to be present. Like Lua, would either restrict our portability or would require an interpreter reimplementation. :P * Python Advantage: A Python interpreter will be present on most web servers, though not necessarily all. (Windows-based servers especially.) Wash: Python is probably better known than Lua, but not as well as PHP or JS. Disadvantage: Like PHP, Python is difficult to lock down securely. Any thoughts? Does anybody happen to have a PHP implementation of a Lua or JavaScript interpreter? ;) -- brion
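The "difficult to lock down" point can be illustrated with a toy whitelist interpreter (a sketch, not a production sandbox, and not any of the implementations discussed above): rather than blacklisting dangerous features, it permits only an explicit subset of the language, which is the safer default but also shows how little of a general-purpose language survives the lockdown.

```python
# Only literals and basic arithmetic are whitelisted; imports,
# attribute access, and function calls are rejected by default.
import ast

ALLOWED = (ast.Expression, ast.BinOp, ast.UnaryOp, ast.Constant,
           ast.Add, ast.Sub, ast.Mult, ast.Div, ast.USub)

def safe_eval(expr):
    tree = ast.parse(expr, mode='eval')
    for node in ast.walk(tree):
        if not isinstance(node, ALLOWED):
            raise ValueError(f'disallowed syntax: {type(node).__name__}')
    # Empty __builtins__ removes the default global namespace.
    return eval(compile(tree, '<expr>', 'eval'), {'__builtins__': {}})

print(safe_eval('2 * (3 + 4)'))   # 14
try:
    safe_eval("__import__('os')")  # a Call node: not whitelisted
except ValueError as e:
    print(e)
```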
Re: [Wikitech-l] Minify
It's probably worth mentioning that this bug is still open: https://bugzilla.wikimedia.org/show_bug.cgi?id=17577 This will save not only traffic on subsequent page views (in this case: http://www.webpagetest.org/result/090218_132826127ab7f254499631e3e688b24b/1/details/cached/ it's about 50K), but also improve performance dramatically. I wonder if anything can be done to at least make it work for local files - I have a hard time understanding the File vs. LocalFile vs. FSRepo relationships to enable this just for the local file system. It's probably also wise to figure out a way for it to be implemented on non-local repositories too so Wikimedia projects can use it, but I'm completely out of my league here ;) Thank you, Sergey -- Sergey Chernyshev http://www.sergeychernyshev.com/ On Fri, Jun 26, 2009 at 11:42 AM, Robert Rohde raro...@gmail.com wrote: I'm going to mention this here, because it might be of interest on the Wikimedia cluster (or it might not). Last night I deposited Extension:Minify, which is essentially a lightweight wrapper for the YUI CSS compressor and the JSMin JavaScript compressor. If installed, it automatically captures all content exported through action=raw and precompresses it by removing comments, formatting, and other human-readable elements. All of the helpful elements still remain on the MediaWiki: pages; they just don't get sent to users. Currently each page served to anons references 6 CSS/JS pages dynamically prepared by MediaWiki, of which 4 would be needed in the most common situation of viewing content online (i.e. assuming media=print and media=handheld are not downloaded in the typical case). These 4 pages - MediaWiki:Common.css, MediaWiki:Monobook.css, gen=css, and gen=js - comprise about 60 kB on the English Wikipedia. (I'm using enwiki as a benchmark, but Commons and dewiki also have similar numbers to those discussed below.) After gzip compression, which I assume is available on most HTTP transactions these days, they total 17039 bytes.
The comparable numbers if Minify is applied are 35 kB raw and 9980 after gzip, for a savings of 7 kB or about 40% of the total file size. Now in practical terms 7 kB could shave ~1.5s off a 36 kbps dialup connection. Or given Erik Zachte's observation that action=raw is called 500 million times per day, and assuming up to 7 kB / 4 savings per call, could shave up to 900 GB off of Wikimedia's daily traffic. (In practice, it would probably be somewhat less. 900 GB seems to be slightly under 2% of Wikimedia's total daily traffic if I am reading the charts correctly.) Anyway, that's the use case (such as it is): slightly faster initial downloads and a small but probably measurable impact on total bandwidth. The trade-off of course being that users receive CSS and JS pages from action=raw that are largely unreadable. The extension exists if Wikimedia is interested, though to be honest I primarily created it for use with my own more tightly bandwidth constrained sites. -Robert Rohde
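A naive comment-and-whitespace stripper in the spirit of what Extension:Minify does (the real extension wraps the YUI CSS compressor and JSMin; this Python sketch only approximates the idea for CSS, and the numbers it prints are for its own toy input, not the enwiki figures above):

```python
import re
import zlib

def minify_css(css):
    css = re.sub(r'/\*.*?\*/', '', css, flags=re.S)   # drop comments
    css = re.sub(r'\s+', ' ', css)                     # collapse whitespace
    css = re.sub(r'\s*([{}:;,])\s*', r'\1', css)       # tighten punctuation
    return css.strip()

css = """
/* Shared site styles */
body {
    margin: 0;
    font-family: sans-serif;
}
"""
mini = minify_css(css)
print(mini)  # body{margin:0;font-family:sans-serif;}

# The savings compound with gzip, which is the comparison made above:
print(len(zlib.compress(css.encode())), len(zlib.compress(mini.encode())))
```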
Re: [Wikitech-l] Minify
It probably depends on how getTimestamp() is implemented for non-local repos. The important thing is for it not to return new values too often, and to return the real version of the image. If this is already the case, could someone apply this patch - I don't want to be responsible for such an important change ;) Sergey On Fri, Jun 26, 2009 at 3:52 PM, Chad innocentkil...@gmail.com wrote: You're patching already-existing functionality at the File level, so it should be ok to just plop it in there. I'm not sure how this will affect the ForeignApi interface, so it'd be worth testing there too. From what I can tell at a (very) quick glance, it shouldn't adversely affect anything from a client perspective on the API, as we just rely on whatever URL was provided to us to begin with. -Chad On Fri, Jun 26, 2009 at 3:31 PM, Sergey Chernyshev sergey.chernys...@gmail.com wrote: Which of all those files should I change to apply my patch only to files in the default repository? Currently my patch is applied to File.php http://bug-attachment.wikimedia.org/attachment.cgi?id=5833 If you just point me in the right direction, I'll update the patch and upload it myself. Thank you, Sergey -- Sergey Chernyshev http://www.sergeychernyshev.com/ On Fri, Jun 26, 2009 at 3:17 PM, Chad innocentkil...@gmail.com wrote: The structure is LocalRepo extends FSRepo extends FileRepo. ForeignApiRepo extends FileRepo directly, and ForeignDbRepo extends LocalRepo.
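The caching idea behind the patch under discussion, sketched generically (Python, illustrative; `versioned_url` is a hypothetical name, not MediaWiki's API): embed the file's last-changed timestamp in its URL so the URL changes whenever the file does, which makes it safe to serve with a far-future Expires header.

```python
def versioned_url(base_url, timestamp):
    """Append a version parameter derived from the file's timestamp."""
    sep = '&' if '?' in base_url else '?'
    return f'{base_url}{sep}version={timestamp}'

print(versioned_url('/images/logo.png', 20090218132826))
# /images/logo.png?version=20090218132826
```

As long as the timestamp only changes when the file really changes (the point made above about getTimestamp()), clients can cache the old URL forever.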
-Chad On Jun 26, 2009 3:15 PM, Sergey Chernyshev sergey.chernys...@gmail.com wrote: It's probably worth mentioning that this bug is still open: https://bugzilla.wikimedia.org/show_bug.cgi?id=17577 This will save not only traffic on subsequent page views (in this case: http://www.webpagetest.org/result/090218_132826127ab7f254499631e3e688b24b/1/details/cached/ it's about 50K), but also improve performance dramatically. I wonder if anything can be done to at least make it work for local files - I have a hard time understanding the File vs. LocalFile vs. FSRepo relationships to enable this just for the local file system. It's probably also wise to figure out a way for it to be implemented on non-local repositories too so Wikimedia projects can use it, but I'm completely out of my league here ;) Thank you, Sergey -- Sergey Chernyshev http://www.sergeychernyshev.com/ On Fri, Jun 26, 2009 at 11:42 AM, Robert Rohde raro...@gmail.com wrote: I'm going to mention ...
Re: [Wikitech-l] Bugzilla components for Extensions
Thank you. I'll update the extension page. Sergey On Sat, May 23, 2009 at 8:27 PM, K. Peachey p858sn...@yahoo.com.au wrote: Submitted request into bugzilla as bug #1 (https://bugzilla.wikimedia.org/show_bug.cgi?id=1). -Peachey
[Wikitech-l] Tracking Wikipedia using ShowSlow
Guys, I've added the Wikipedia homepage and the Hillary Clinton page (re: bug #17577, https://bugzilla.wikimedia.org/show_bug.cgi?id=17577) to ShowSlow.com's tracker: http://www.showslow.com/details/?url=http%3A%2F%2Fwikipedia.org%2F http://www.showslow.com/details/?url=http%3A%2F%2Fen.wikipedia.org%2Fwiki%2FHillary_Clinton Right now ShowSlow.com's beacon tracks only YSlow and displays the YSlow grade and page size, but I'm hoping to add more measurements in the future. Let me know if this kind of information is interesting to you for your Usability initiative or general front-end optimization projects. ShowSlow is open source, so you can run your own instance if you like - I'll be happy to help set it up. Thank you, Sergey -- Sergey Chernyshev http://www.sergeychernyshev.com/
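The tracker URLs above embed the tracked page URL percent-encoded; that is plain query-string escaping, reproducible with the standard library (Python, illustrative):

```python
from urllib.parse import quote

tracked = 'http://en.wikipedia.org/wiki/Hillary_Clinton'
# safe='' forces ':' and '/' to be escaped too, as in the tracker URLs.
print('http://www.showslow.com/details/?url=' + quote(tracked, safe=''))
# http://www.showslow.com/details/?url=http%3A%2F%2Fen.wikipedia.org%2Fwiki%2FHillary_Clinton
```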
Re: [Wikitech-l] Wikipedia - openID provider?
Guys, Accepting OpenID works on generic MediaWiki - making it work for Wikimedia is not a problem, considering it's just a mapping between OpenID credentials and a MediaWiki user ID (centralized or not) - user accounts will stay the same and will keep the ability to use password auth instead of OpenID if desired. I'm trying to make OpenID registration and login for MediaWiki as easy and user-friendly as possible, and I will be happy if some of this work enables it to be implemented on Wikimedia projects. You can see my latest UI improvements (rel 0.8.4.x in SVN) implemented on my projects: http://www.mediawikiwidgets.org/Special:OpenIDLogin and http://www.techpresentations.org/Special:OpenIDLogin (without icons) The OpenID extension is currently nominated in the Usability project: http://usability.wikimedia.org/wiki/Environment_Survey/MediaWiki_Extensions/Nomination#Other If you feel that there are bugs or features that would make OpenID more viable, feel free to add them to Bugzilla: https://bugzilla.wikimedia.org/enter_bug.cgi?product=MediaWiki%20extensions&component=OpenID Thank you, Sergey -- Sergey Chernyshev http://www.sergeychernyshev.com/ On Tue, Apr 28, 2009 at 10:26 AM, Aryeh Gregor simetrical+wikil...@gmail.com wrote: On Tue, Apr 28, 2009 at 10:21 AM, Strainu strain...@gmail.com wrote: Well, in my view, the benefit will be the ability to use Wikipedia logins for linked projects. I have at least two use cases in mind: 1. A local chapter's website/blog/whatever. 2. Sensitive tools that would require login. One such example would be WikiVerifier [1], an anti-vandal tool used on the Romanian Wikipedia That actually does make sense. I retract my objection to Wikipedia as an OpenID provider, regardless of whether it's a consumer.
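The "just a mapping" point above can be sketched in a few lines (Python, purely illustrative; the class and method names are hypothetical, not the OpenID extension's API): OpenID identities map onto ordinary local accounts, so password login keeps working alongside OpenID.

```python
class UserStore:
    def __init__(self):
        self.users = {}       # user_id -> account record
        self.openid_map = {}  # OpenID URL -> user_id

    def create_user(self, user_id, password_hash=None):
        self.users[user_id] = {'password_hash': password_hash}

    def attach_openid(self, openid_url, user_id):
        self.openid_map[openid_url] = user_id

    def login_openid(self, openid_url):
        # Assumes the OpenID provider has already verified the identity;
        # we only translate it into a local account.
        return self.openid_map.get(openid_url)

store = UserStore()
store.create_user('Sergey', password_hash='...')
store.attach_openid('http://provider.example/sergey', 'Sergey')
print(store.login_openid('http://provider.example/sergey'))  # Sergey
```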
Re: [Wikitech-l] On extension SVN revisions in Special:Version
You mean, just run svnversion on each extension's folder? This might be a good idea - it'll also indicate whether we're running from a checkout. It might also be a good idea to get the branch URL (e.g. trunk vs. MW REL_X vs. extensions' REL_X, or just the branch path after http://svn.wikimedia.org/svnroot/mediawiki/) just for ease of debugging and such. Sergey -- Sergey Chernyshev http://www.sergeychernyshev.com/ On Wed, Apr 22, 2009 at 4:38 PM, Brion Vibber br...@wikimedia.org wrote: On 4/22/09 11:01 AM, Sergey Chernyshev wrote: I probably have an idea of how to implement this using a bot (or a post-commit hook if we want real-time data) and externals. Essentially the bot script should check the version of the extension folder and generate and check in an entry in another repository in a form like this: http://extensionversioningrepo/trunk/OpenID/version.php and write the Last Changed Rev from svn info http://svn.wikimedia.org/svnroot/mediawiki/trunk/extensions/OpenID/ - something like this: <?php $wgExtensionRevisions['OpenID'] = 49664; Hmm, could be maintained in-tree this way, but IMHO not worth the trouble -- it's more likely to trigger merge conflicts and other annoyances. Probably better would be: 1) Look up the actual useful revision from the working directory when we're running from a checkout 2) Also include that info in released ExtensionDistributor output so people who aren't working from checkouts will still get the useful versions, and look that file up if the SVN info isn't there. -- brion
Re: [Wikitech-l] Skin JS cleanup and jQuery
Yep, with jQuery in the core it's probably best to just bundle it. There is another issue with code loading and such - making JS libraries call a callback function when they load, with all the dependent functionality in the callback, instead of relying on the browser to block everything until the library is loaded. This is quite an advanced thing, considering that all the code would have to be converted to this model, but it will allow much better performance when implemented. It's probably a Phase 5 kind of optimization, but it can bring really good results, considering JS is the biggest blocker. More on the topic is on Steve Souders' blog: http://www.stevesouders.com/blog/2008/12/27/coupling-async-scripts/ Thank you, Sergey -- Sergey Chernyshev http://www.sergeychernyshev.com/ On Wed, Apr 22, 2009 at 12:42 PM, Brion Vibber br...@wikimedia.org wrote: On 4/22/09 9:33 AM, Sergey Chernyshev wrote: Exactly because this is the kind of requests we're going to get, I think it makes sense not to have any library bundled by default, but to have centralized handling for libraries, e.g. one extension asks for the latest jQuery and the latest YUI and MW loads them, another extension asks for jQuery only, and so on. Considering we want core code to be able to use jQuery, I think the case for bundling it is pretty strong. :) -- brion
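The callback-on-load idea, reduced to its skeleton (sketched in Python rather than the browser JavaScript it describes; `ScriptLoader` and its methods are illustrative names): code that depends on a library registers a callback instead of assuming the library is already present, and the loader fires the callbacks once the asynchronous load completes.

```python
class ScriptLoader:
    def __init__(self):
        self.loaded = set()
        self.pending = {}  # script name -> list of callbacks

    def ready(self, script, callback):
        if script in self.loaded:
            callback()  # already loaded: run immediately
        else:
            self.pending.setdefault(script, []).append(callback)

    def on_load(self, script):
        # Called when the (asynchronous) load finishes.
        self.loaded.add(script)
        for callback in self.pending.pop(script, []):
            callback()

loader = ScriptLoader()
ran = []
loader.ready('jquery', lambda: ran.append('init UI'))
loader.on_load('jquery')  # simulates the async load completing
print(ran)  # ['init UI']
```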
Re: [Wikitech-l] On extension SVN revisions in Special:Version
I probably have an idea of how to implement this using a bot (or a post-commit hook if we want real-time data) and externals. Essentially the bot script should check the version of the extension folder and generate and check in an entry in another repository in a form like this: http://extensionversioningrepo/trunk/OpenID/version.php and write the Last Changed Rev from svn info http://svn.wikimedia.org/svnroot/mediawiki/trunk/extensions/OpenID/ - something like this: <?php $wgExtensionRevisions['OpenID'] = 49664; Then we'll use the externals trick to map this into a constant location within http://svn.wikimedia.org/svnroot/mediawiki/trunk/extensions/OpenID/ - something like: svn propset svn:externals 'version http://extensionversioningrepo/trunk/OpenID/' http://svn.wikimedia.org/svnroot/mediawiki/trunk/extensions/OpenID/ Then the $wgExtensionCredits declaration in http://svn.wikimedia.org/svnroot/mediawiki/trunk/extensions/OpenID/OpenID.setup.php will have something constant like this: $wgExtensionCredits['other'][] = array( 'name' => 'OpenID', 'version' => '1.8.4.rev' . $wgExtensionRevisions['OpenID'], ... ); This way every svn checkout and svn update will have the extension version checked out from extensionversioningrepo. An alternative, simpler approach is to just store versions directly in the repository as http://svn.wikimedia.org/svnroot/mediawiki/trunk/extensions/OpenID/version.php using a post-commit hook, but that would double the revision count in the main repository - it could still be done with a bot, in which case it would only add one revision per run, checking all updated versions in. Hope this can be a solution. Sergey On Wed, Apr 22, 2009 at 12:14 PM, Brion Vibber br...@wikimedia.org wrote: On 4/22/09 5:54 AM, Chad wrote: Not sure it's worth it :-\ What's wrong with just giving version numbers that make sense, rather than relying on the revision number, which isn't indicative of anything?
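The "write the Last Changed Rev from svn info" step above, as a small parser sketch (Python, illustrative; in practice you would feed it the output of running `svn info` on the extension directory - here a captured sample string stands in for the subprocess call):

```python
def last_changed_rev(svn_info_output):
    """Extract the Last Changed Rev from `svn info` output."""
    for line in svn_info_output.splitlines():
        if line.startswith('Last Changed Rev:'):
            return int(line.split(':', 1)[1].strip())
    raise ValueError('no Last Changed Rev in svn info output')

sample = """Path: OpenID
URL: http://svn.wikimedia.org/svnroot/mediawiki/trunk/extensions/OpenID
Revision: 49700
Last Changed Rev: 49664
"""
print(last_changed_rev(sample))  # 49664
```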
It's indicative of the running version of the code, as long as it also tells you which branch to pull from. :) And of course as long as it's a relevant number, like the revision of the extension directory... -- brion
Re: [Wikitech-l] Skin JS cleanup and jQuery
No, my link is about 3 ways of loading: 1. Normal script tags (current style) 2. Asynchronous script loading (loading scripts without blocking, but without waiting for onload) 3. Lazy loading (loading scripts onload). Number 2 might be usable as well. In any case, changing all MW and extension code to work with #2 or #3 might be a hard thing. Thank you, Sergey -- Sergey Chernyshev http://www.sergeychernyshev.com/ On Wed, Apr 22, 2009 at 1:21 PM, Michael Dale md...@wikimedia.org wrote: The mv_embed.js includes a doLoad function that matches the autoLoadJS classes listed in mediaWiki php. So you can dynamically autoload arbitrary sets of classes (js-files in the mediaWiki software) in a single http request and then run something once they are loaded. It can also autoload sets of wiki-titles for user-space scripts, again in a single request, grouping, localizing, gzipping and caching all the requested wiki-title js in a single request. This is nifty cuz say your script has localized msgs. You can fill these in in user-space MediaWiki:myMsg, then put them in the header of your user-script, and then have localized msgs in user-space javascript ;) .. When I get a chance I will better document this ;) But it's basically outlined here: http://www.mediawiki.org/wiki/Extension:ScriptLoader The link you highlight appears to be about running stuff once the page is ready. jQuery includes a function $(document).ready(function(){ //code to run now that the dom-state is ready }) so your enabled gadget could use that to make sure the dom is ready before executing some functions. (Depending on the type of js functionality you're adding, it /may/ be better to load on-demand once a new interface component is invoked rather than front-load everything. Look at the add-media-wizard gadget on testing.wikipedia.org for an idea of how this works. peace, --michael Sergey Chernyshev wrote: Yep, with jQuery in the core it's probably best to just bundle it.
There is another issue with the code loading and stuff - making JS libraries call a callback function when they load and all the functionality to be there instead of relying on browser to block everything until library is loaded. This is quite advance thing considering that all the code will have to be converted to this model, but it will allow for much better performance when implemented. Still it's probably Phase 5 kind of optimization, but it can bring really good results considering JS being the biggest blocker. More on the topic is on Steve Souders' blog: http://www.stevesouders.com/blog/2008/12/27/coupling-async-scripts/ Thank you, Sergey -- Sergey Chernyshev http://www.sergeychernyshev.com/ On Wed, Apr 22, 2009 at 12:42 PM, Brion Vibber br...@wikimedia.org wrote: On 4/22/09 9:33 AM, Sergey Chernyshev wrote: Exactly because this is the kind of requests we're going to get, I think it makes sense not to have any library bundled by default, but have a centralized handling for libraries, e.g. one extension asks for latest jQuery and latest YUI and MW loads them, another extension asks for jQuery only and so on. Considering we want core code to be able to use jQuery, I think the case for bundling it is pretty strong. :) -- brion
Re: [Wikitech-l] [OpenID] OpenID MediaWiki Extension v.0.8.4.1 - Identity Providers UI
Unfortunately, it might be quite hard for MediaWiki admins to set up SSL compared to what they do to set up MediaWiki or its extensions. Looks like XRDS is the easier approach to implement. Sergey On Sun, Apr 19, 2009 at 3:46 PM, Peter Williams pwilli...@rapattoni.com wrote: This could be interesting in itself, in the uci spirit of openid. One can use Yahoo's willingness to rely without warning on an https realm as an authentication scheme. Yahoo implies that the https cert on an https realm is valid (wrt its trust list, its handling of crls and arls). A reputation service can now crawl which sites Yahoo so rates, and publish a meta-reliance signal (by updating its ocsp database, for example). Those rps doing discovery on smaller ops might configure their ssl client engines to use that ocsp source, when qualifying the original yahoo rp (now acting as an asserting or attribute authority/agent of the data owner (i.e. the user)). From: Allen Tom a...@yahoo-inc.com Sent: Sunday, April 19, 2009 12:34 PM To: Sergey Chernyshev sergey.chernys...@gmail.com Cc: Wikimedia developers wikitech-l@lists.wikimedia.org; gene...@openid.net Subject: Re: [OpenID] OpenID MediaWiki Extension v.0.8.4.1 - Identity Providers UI Hi Sergey, The Yahoo OpenID Provider will display a warning to the user if the RP's OpenID endpoints are not discoverable: Warning: This website has not confirmed its identity with Yahoo! and might be fraudulent. Do not share any personal information with this website unless you are certain it is legitimate. The best documentation for fixing this issue is here: http://blog.nerdbank.net/2008/06/why-yahoo-says-your-openid-site.html The AOL sign-in form fails if the user just clicks the Login button without entering their AOL ScreenName. You might want to disable the button until after the user types in their ScreenName. This will only be an issue until AOL upgrades their OpenID Provider from OpenID 1.1 to OpenID 2.0.
Once they have OpenID 2.0 support, you'll be able to handle AOL logins identically to Google and Yahoo. Good job! Allen Sergey Chernyshev wrote: Hi, I'm done with the initial implementation of the Identity Providers UI for the OpenID MediaWiki Extension. The extension now shows a user-friendly (although my design skills are far from perfect) form where users can pick from a list of OpenID providers (the generic OpenID URL form is still the default). You can see it in action here: http://www.mediawikiwidgets.org/Special:OpenIDLogin http://www.techpresentations.org/Special:OpenIDLogin (without icons - I'll enable them later) After some discussions and concerns here on the list, I implemented it in such a way that provider logos don't show up by default, and if you would like to show them on your site, you have to add: $wgOpenIDShowProviderIcons = true; to your LocalSettings.php Hope you like it, but I'm still open to suggestions about improving the interface so you all finally install it on your wikis ;) Thank you, Sergey -- Sergey Chernyshev http://www.sergeychernyshev.com/ ___ general mailing list gene...@openid.net http://openid.net/mailman/listinfo/general
[Wikitech-l] OpenID MediaWiki Extension v.0.8.4.1 - Identity Providers UI
Hi, I'm done with the initial implementation of the Identity Providers UI for the OpenID MediaWiki Extension. The extension now shows a user-friendly (although my design skills are far from perfect) form where users can pick from a list of OpenID providers (the generic OpenID URL form is still the default). You can see it in action here: http://www.mediawikiwidgets.org/Special:OpenIDLogin http://www.techpresentations.org/Special:OpenIDLogin (without icons - I'll enable them later) After some discussions and concerns here on the list, I implemented it in such a way that provider logos don't show up by default, and if you would like to show them on your site, you have to add: $wgOpenIDShowProviderIcons = true; to your LocalSettings.php Hope you like it, but I'm still open to suggestions about improving the interface so you all finally install it on your wikis ;) Thank you, Sergey -- Sergey Chernyshev http://www.sergeychernyshev.com/
Re: [Wikitech-l] OpenID MediaWiki Extension v.0.8.4.1 - Identity Providers UI
On Sat, Apr 18, 2009 at 8:18 AM, Marco Schuster ma...@harddisk.is-a-geek.org wrote: On Sat, Apr 18, 2009 at 9:00 AM, Sergey Chernyshev sergey.chernys...@gmail.com wrote: Hope you like it, but I'm still open to suggestions about improving the interface so you all finally install it on your wikis ;) There's a double escape on the confirmation page which redirects to the OID provider (\continue\)... unfortunately it redirected to myopenid too fast to copy-and-paste the page. This was in the i18n file for some reason - I changed it. And, I can't choose the case spelling of my nick (it's harddisk on OID; normally it should be HardDisk), but I think this is an OpenID-related problem - anyway, it'd be cool if you could make an additional field for the user to input the desired username. Where is this happening? Is it related to this bug? https://bugzilla.wikimedia.org/show_bug.cgi?id=17654 It's very possible that your provider returns a lowercase nickname and the MediaWiki user is created automatically. Besides that, it's ENORMOUSLY cool. It was actually there all the time; all I did was add a better UI. There are more things to change to improve how the OpenID integration works - I already entered a few feature bugs: https://bugzilla.wikimedia.org/buglist.cgi?query_format=advancedproduct=MediaWiki+extensionscomponent=OpenIDbug_status=NEWbug_status=ASSIGNEDbug_status=REOPENED Marco -- VMSoft GbR Nabburger Str. 15 81737 München Geschäftsführer: Marco Schuster, Volker Hemmert http://vmsoft-gbr.de Thank you, Sergey -- Sergey Chernyshev http://www.sergeychernyshev.com/
Re: [Wikitech-l] Large nested templates (example: NYRepresentatives)
Domas, In this particular case, the template will just contain an SMW query to get all representatives. I run TechPresentations.org on MW + SMW and this helps me keep it in much better shape, providing relevant information on all the pages - you can see the example of the Topic template: http://www.techpresentations.org/w/index.php?title=Template:Topicaction=edit This one is used on all topic pages: http://www.techpresentations.org/Category:Topic And it includes various information about the topic aggregated from all over the wiki - a good example is the page about the Web: http://www.techpresentations.org/Web And here's what I have on the Web topic page: http://www.techpresentations.org/w/index.php?title=Webaction=edit BTW, it also provides feeds of information based on identified concepts. I wrapped the actual feeds in FeedBurner so it's not obvious that they are generated automatically, but they are: http://feeds.techpresentations.org/TechPresUpcoming http://feeds.techpresentations.org/TechPresOpenCallsForPapers http://feeds.techpresentations.org/TechPresMaterialsAvailable The list of concepts is here: http://www.techpresentations.org/w/index.php?title=Special%3AAllPagesfrom=to=namespace=108 I wouldn't be able to maintain the resource myself using just MediaWiki - Semantic MediaWiki makes it viable. Hope this explains some of the benefits. I'll be happy to give more use cases. Thank you, Sergey -- Sergey Chernyshev http://www.sergeychernyshev.com/ On Tue, Apr 14, 2009 at 3:47 AM, Domas Mituzas midom.li...@gmail.com wrote: On Apr 13, 2009, at 4:02 AM, Brian wrote: This is one example of a problem that Semantic MediaWiki solves. Can you show an example solution, instead of an assertion? How does it do that kind of presentation?
BR, -- Domas Mituzas -- http://dammit.lt/ -- [[user:midom]]
Re: [Wikitech-l] Large nested templates (example: NYRepresentatives)
Absolutely agree - there is no way to make data useful without getting it from somewhere - SMW helps to get more accurate data. Sergey On Tue, Apr 14, 2009 at 10:59 AM, Gregory Maxwell gmaxw...@gmail.com wrote: On Tue, Apr 14, 2009 at 10:52 AM, Sergey Chernyshev sergey.chernys...@gmail.com wrote: Domas, In this particular case, template will just contain an SMW query to get all representatives. [snip] How does this avoid merely shifting the load from the parser (on the plentiful application servers) to the database? Not that more intelligence isn't good— From a content-maintenance perspective something query based is doubtlessly better than some static serialized lump, but the complaint here was performance as far as I can tell, and I think that's a more complicated question.
Re: [Wikitech-l] BSD License question
On Tue, Feb 24, 2009 at 8:26 PM, Brion Vibber br...@wikimedia.org wrote: [skip-skip] IANAL My impression is that this is just as legal as referring to a company by name -- it's not an infringement to use someone's trademark to *refer to them*, whereas it is to *use the mark or something overly similar to create confusion and imply you are associated with the mark holder*. However, I don't know just how true this is going to be of logos. :) /IANAL -- brion Brion, I spoke to David Recordon at the OpenID Foundation and he says that at this point they don't have a legal basis for licensing these images, and that he's working on a license for the OpenID logo itself. David also promised to connect me with people at Google who might help with licensing their logo. IANAL Looking at the current trend of many companies using such images, I would say it can be considered safe to use them in MediaWiki until the licensing gets into better shape. /IANAL I think I'll create a switch that controls the display of images vs. just text labels, so extension users can decide for themselves whether they want to use icons until better licensing is in place. Let me know if you feel this is a reasonable solution for the time being. Thank you, Sergey -- Sergey Chernyshev http://www.sergeychernyshev.com/
Re: [Wikitech-l] On extension SVN revisions in Special:Version
For extensions I write, I don't update the version number every time I make a change, because it's too much work always editing (even if automatically) and checking in the main file. I do have this process for releases, though, where I use a Makefile to create a tag and packages: http://svn.wikimedia.org/svnroot/mediawiki/trunk/extensions/Widgets/Makefile I know it's far from perfect, but better than not having any, because otherwise I'll either forget to create a tag (or will not do it properly) or will not maintain downloadable versions (this will hopefully be unnecessary if ExtensionDistributor supports extension tags).

There is probably a solution that can be done within the code - something that makes all included PHP files override the version if theirs is later than the main file's - something like this in the main file:

  $myExtensionRevision = '$LastChangedRevision$';

  function updateRevision($file_revision) {
      global $myExtensionRevision;
      if ($file_revision > $myExtensionRevision) {
          $myExtensionRevision = $file_revision;
      }
  }

  include_once('subfile1.php');
  include_once('subfile2.php');

  $wgExtensionCredits['parserhook'][] = array(
      ...
      'version' => '0.2.' . $myExtensionRevision,
      ...
  );

and then in subfile1.php and subfile2.php:

  updateRevision('$LastChangedRevision$');

Still, it's only a convention and not a universal solution; moreover, it'll only work if the revision updated at least one PHP file. Also, if an extension doesn't include all its files or uses AutoLoader, it will not be reliable. Thank you, Sergey -- Sergey Chernyshev http://www.sergeychernyshev.com/ On Fri, Mar 27, 2009 at 5:59 AM, Gerard Meijssen gerard.meijs...@gmail.com wrote: Hoi, The Cite extension is actually two distinct extensions in the same directory. In order to get the issue of supporting extensions under control, best practices have to be defined and implemented.
Not having such a solution is not really an option, because it breaks the ability to reliably support MediaWiki for stable versions and the ability to reliably use the same MediaWiki environment as the WMF projects. Thanks, Gerard 2009/3/27 Daniel Kinzler dan...@brightbyte.de I'm guessing this may be because the new file was added after r37404, but the file registering the extension (and providing the revision number) wasn't changed at that time, which means the most recent revision of *that file* is still r37404. Special:Version doesn't really report the most recent revision of the extension as a whole, but that of the setup file (IIRC). Indeed. It would be much nicer if it could report the *directory's* revision number. Does SVN have a keyword substitution for this? Does SVN allow scripts/plugins for this type of thing? That would be quite useful. -- daniel
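The "each file reports its own revision, extension version = the maximum" convention sketched in PHP above can be illustrated in a few lines. This is a hypothetical sketch, not code from any extension; the file names and revision numbers are invented, standing in for SVN's expanded $LastChangedRevision$ keywords:

```python
def parse_revision(keyword: str) -> int:
    """Extract the number from an expanded '$LastChangedRevision: N $' keyword."""
    digits = "".join(ch for ch in keyword if ch.isdigit())
    return int(digits) if digits else 0

# Simulated per-file expanded keywords (invented values for illustration).
file_keywords = {
    "MyExtension.php": "$LastChangedRevision: 37404 $",
    "subfile1.php": "$LastChangedRevision: 38120 $",
    "subfile2.php": "$LastChangedRevision: 37990 $",
}

# The extension-wide revision is the maximum across all included files,
# so a change to any one file bumps the reported version.
extension_revision = max(parse_revision(k) for k in file_keywords.values())
version = "0.2." + str(extension_revision)
print(version)  # -> 0.2.38120
```

As the thread notes, this still only works if a commit touched at least one PHP file that participates in the scheme.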
Re: [Wikitech-l] On extension SVN revisions in Special:Version
Don't know if it'll solve the issue at hand, but we also have extension tags for a brave few: http://svn.wikimedia.org/svnroot/mediawiki/tags/extensions/ It definitely helped me manage my installations (rolling back releases and so on). One thing I thought might be helpful is for ExtensionDistributor to use those release tags. Thank you, Sergey -- Sergey Chernyshev http://www.sergeychernyshev.com/ On Thu, Mar 26, 2009 at 3:53 PM, Roan Kattouw roan.katt...@home.nl wrote: Brion Vibber schreef: On 3/26/09 11:20 AM, Chad wrote: A standardized version file could go a long way to solving this issue. Could maybe make them auto-generated with an on-commit hook? Hmmm, possible. That'd be preferable over not providing any version information at all, which is what we're doing now. Roan Kattouw (Catrope)
Re: [Wikitech-l] Stock chart Extension
You don't have to commit your extension to MW repository - there is no problem if you host it externally. I used Google Code hosting for a long while. Sergey On Thu, Feb 26, 2009 at 4:58 PM, Roger Fong ro...@wikinvest.com wrote: Hi, I'd like to submit an extension to the Mediawiki codebase. I think it's a cool extension -- it allows Mediawiki users to embed financial stock charts on articles. We've secured redistribution rights for the financial data, which is provided by Thomson / Reuters and Xignite, with a 15-20 minute delay. The extension is licensed under the GPL. Once I've committed it, I'll add documentation to the Mediawiki.org site. I've tested the extension against the version of mediawiki we're using 1.9.3, and a new install off trunk, 1.15alpha. I'm following the process listed here: http://www.mediawiki.org/wiki/Commit_access Can I get commit permissions? username: rfong ssh key: ssh-rsa B3NzaC1yc2EBIwAAAQEAz8CFXbEoey6c+Wh2a2+zAqZF2mMoKPeBbN7km2ae03RWPdkbZYhynDbA1X+IgGN5DtKFTocuozritEFCx762xGs7pfg9vIHwNi0pvD6WyLR+GYXh8vXRyUSGVmOsaIqiDH2xUF5dk62aRtFE+aglB+wjGhi41ldBIgOhjMMZIs0tIxvmSjXc5806gZh4Rd6s2t/VeGjyGBHus6DPp2bL4+tSm7xfmSp/T5r7n6gpkFH7XSpSInx6/myR3KS0ikeSPjhJ3pa2wG1EHS3NwvHSy43jsu/RsZsjsgsulyP5lJNI+dS0YApBO9uVEegVVJV8a+k7EUXczOfytIrPoOYVEw== ro...@wikinvest.com Thanks! Roger ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l -- Sergey Chernyshev http://www.sergeychernyshev.com/ ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Front-end performance optimization
On Tue, Feb 24, 2009 at 7:31 PM, Aryeh Gregor simetrical+wikil...@gmail.com wrote: On Tue, Feb 24, 2009 at 7:17 PM, Sergey Chernyshev sergey.chernys...@gmail.com wrote: How do we go about doing this? Can it be tied into the Usability project ( http://usability.wikimedia.org/)? It doesn't seem usability-related. Actually it is very related to usability - performance is a very important factor in usability, and that is why Google, Yahoo and Amazon did research on how it affects usage and figured out a few very interesting numbers that for them convert to hard cash. I'm preparing a presentation at the New York Web Standards Meetup about web performance - it's not ready yet, but I came across a good presentation by Nicole Sullivan from Yahoo! targeted at designers and UI experts - http://www.techpresentations.org/Design_Fast_Websites I understand why http://usability.wikimedia.org/ doesn't have front-end performance as one of its goals, but I think it should at least be mentioned to all the people working on the redesign of such a major site. If you have commit access, you could just start committing code (probably hidden behind disabled-by-default config variables to start with until it's tested and complete enough). If you don't have commit access, you could ask for it. Please don't give me this attitude - "you need it, you do it". I have commit access and am going to work on this, but my motivation with my small projects is nothing compared to Wikipedia's, and my setups are much smaller and more controllable, so I think I'll leave pitching for resources for this idea to those who need it - but I'll be happy to talk about it in person to Brion or anyone else who wants to do something about it. BTW, are there any Wikipedia-related events in SF during the week of the Web 2.0 conference (March 30 - April 3)? I'll be in SF for the conference and will be happy to come by.
Thank you, Sergey

"If you meant 'could Wikimedia resources be allocated to this?', then Brion is the one to talk to."

-- Sergey Chernyshev http://www.sergeychernyshev.com/
[Wikitech-l] BSD License question
I've made some customizations to the OpenID selector code ( http://code.google.com/p/openid-selector/) and combined it with the MediaWiki OpenID extension; you can see the result here: http://www.sharingbuttons.org/Special:OpenIDLogin I want to check it back into the repository, but it uses the New BSD License and I wonder if it's OK to do so. Otherwise I'll write one from scratch and GPL it. Sergey -- Sergey Chernyshev http://www.sergeychernyshev.com/
Re: [Wikitech-l] BSD License question
Great - thanks for the license clarification; I wasn't too excited about re-implementing the selector anyway. Good point about the logos - so what do we do with this? How do we make sure all those logos (including OpenID's, BTW) are properly licensed? I don't think the original developer thought about that either when he licensed it under the BSD license. Sergey On Tue, Feb 24, 2009 at 3:01 PM, Aryeh Gregor simetrical+wikil...@gmail.com wrote: On Tue, Feb 24, 2009 at 2:28 PM, Sergey Chernyshev sergey.chernys...@gmail.com wrote: I've made some customizations to OpenID selector code ( http://code.google.com/p/openid-selector/) and combined it with MediaWiki OpenID extension, you can see the result here: http://www.sharingbuttons.org/Special:OpenIDLogin I want to check it back into the repository, but it uses New BSD License and I wonder if it's OK to do so. Otherwise I'll write one from scratch and GPL it. The three-clause BSD license is universally considered a free software license and is certainly acceptable for checking into our repository. Moreover, it's GPL-compatible. The license permits you to take any BSD-licensed software that you possess and relicense it as GPL (or under any other compatible license, such as totally proprietary (plus liability/attribution requirements for redistributors)). You certainly wouldn't need to rewrite anything. However, it seems to include a number of trademarked, copyrighted logos. In other words, it's not really BSD-licensed. I don't know if the logos should be in the repo. Even if we're not going to worry about copyright on logos (à la Firefox), I'd think that the current extension might be a trademark violation, in that users might reasonably think your site is part of or endorsed by Google/AOL/etc. IANAL, of course.
-- Sergey Chernyshev http://www.sergeychernyshev.com/
Re: [Wikitech-l] OpenID and Provider logos
Sorry, CC-ed wrong Wikimedia list originally. Please use these lists for discussion: gene...@openid.net, le...@openid.net, Wikimedia developers wikitech-l@lists.wikimedia.org, Thank you, Sergey -- Forwarded message -- From: Sergey Chernyshev sergey.chernys...@gmail.com Date: Tue, Feb 24, 2009 at 5:12 PM Subject: OpenID and Provider logos To: gene...@openid.net, le...@openid.net, mediawik...@lists.wikimedia.org Hi, I'm working on updating MediaWiki OpenID extension ( http://www.mediawiki.org/wiki/Extension:OpenID) originally developed by Evan Prodromou to add some features including selector UI that will allow users to pick specific Provider to simplify identity URL entering. You can see current testing site here: http://www.sharingbuttons.org/Special:OpenIDLogin Current testing site is using code from modified http://code.google.com/p/openid-selector/ project which is supposedly licensed under BSD license, but this license is incorrect as code includes company logos required to display good UI (see original email to Wikitech list). I wonder if OpenID Foundation can release their logos and work with Providers on releasing appropriate logos under licenses that can be used in open software. Thank you, Sergey -- Sergey Chernyshev http://www.sergeychernyshev.com/ On Tue, Feb 24, 2009 at 3:01 PM, Aryeh Gregor @gmail.com wrote: On Tue, Feb 24, 2009 at 2:28 PM, Sergey Chernyshev sergey.chernys...@gmail.com wrote: I've made some customizations to OpenID selector code ( http://code.google.com/p/openid-selector/) and combined it with MediaWiki OpenID extension, you can see the result here: http://www.sharingbuttons.org/Special:OpenIDLogin Iwant to check it in back into the repository, but it uses New BSD License and I wonder if it's OK to do so. Otherwise I'll write one from scratch and GPL it. The three-clause BSD license is universally considered a free software license and is certainly acceptable for checking into our repository. Moreover, it's GPL-compatible. 
[rest of Aryeh's reply, quoted earlier in the thread, snipped] -- Sergey Chernyshev http://www.sergeychernyshev.com/
Re: [Wikitech-l] BSD License question
I sent an email to the OpenID general and legal lists, CC-ing this list - not sure how we should go about it, but I'll definitely delay the UI release until I get some initial understanding of a potential solution (it's sort of weird to have a good UI without images). Sergey On Tue, Feb 24, 2009 at 5:39 PM, Gerard Meijssen gerard.meijs...@gmail.com wrote: Hoi, Be glad that the original developer chose the BSD license. From a perspective of being able to cooperate widely, the BSD license is vastly superior to the GPL. It is for this reason that I urge you to develop first the BSD software and back-port to a GPL'd version. Then again, as long as you solely work on the code, you as the copyright holder are entitled to do this anyway. When a second person shares code it starts to become problematic. The notion that trademarked logos are problematic is true for a specific strict understanding of the GPL license as is prevalent among people who adhere to the Debian way of thinking. It is definitely not universally shared, and it is a travesty that brought us Iceweasel. Thanks, GerardM 2009/2/24 Sergey Chernyshev sergey.chernys...@gmail.com Great - thanks for the license clarification, I don't think I was too excited to re-implement selector. Good point about logos - so what do we do with this? How do we make sure all those logos (including OpenID, BTW) are properly licensed? I don't think original developer thought about that either when he licensed it under BSD license.
Sergey On Tue, Feb 24, 2009 at 3:01 PM, Aryeh Gregor simetrical+wikil...@gmail.com wrote: On Tue, Feb 24, 2009 at 2:28 PM, Sergey Chernyshev sergey.chernys...@gmail.com wrote: [original question and Aryeh's full reply, quoted earlier in the thread, snipped]
-- Sergey Chernyshev http://www.sergeychernyshev.com/
[Wikitech-l] Front-end performance optimization
As mentioned already, I'm not sure if localization is the best candidate for being held in JavaScript, but the other things mentioned - e.g. a single request for minified and infinitely cached JS - are what I'm looking at for the overall MW infrastructure. So far it's a big performance problem for MW - the waterfall diagrams I posted for the infinite image cache also show the main issue that I've been trying to attack for a while and might need more help with: too many JS and CSS requests are made to the server by MediaWiki: http://performance.webpagetest.org:8080/result/090218_132826127ab7f254499631e3e688b24b/1/details/ - only the 18th request is the first image to be loaded, and as you can see, JavaScript loads are blocking, meaning no parallel loading is happening. I think it's worth investing resources into creating some process for better handling of this. Right now it's possible to cut this down by configuring MediaWiki not to use user scripts and stylesheets and manually combining the JS and CSS files for the skin, the Ajax framework and everything needed by extensions - I did quite a lot of this for specific installations, but it seems to need a more systematic approach. Good news is that MW already has some wrappers for style and script insertion that extensions use to refer to external files. It's a little bit less fortunate with the script loading sequence (e.g. it's ideal to load scripts only when the rest of the page is loaded), but that might be a much bigger challenge. It's also worth mentioning that reducing the amount of PHP that handles JavaScript and CSS is a good idea, as serving static resources is much easier than starting up a full-blown PHP engine, even with opcode and variable caches. I think there is a way to reduce the start-render delay as well as overall loading time, plus very likely save some traffic, by attacking the front end, and I will be happy to participate more in this. How do we go about doing this?
Can it be tied into Usability project ( http://usability.wikimedia.org/)? Thank you, Sergey -- Sergey Chernyshev http://www.sergeychernyshev.com/ On Fri, Feb 20, 2009 at 8:07 PM, Gregory Maxwell gmaxw...@gmail.com wrote: On Fri, Feb 20, 2009 at 5:51 PM, Brion Vibber br...@wikimedia.org wrote: [snip] On the other hand we don't want to delay those interactions; it's probably cheaper to load 15 messages in one chunk after showing the wizard rather than waiting until each tab click to load them 5 at a time. But that can be up to the individual component how to arrange its loads... Right. It's important to keep in mind that in most cases the user is *latency bound*. That is to say that the RTT between them and the datacenter is the primary determining factor in the load time, not how much data is sent. Latency determines the connection time, it also influences how quickly rwin can grow and get you out of slow-start. When you send more at once you'll also be sending more of it with a larger rwin. So in terms of user experience you'll usually improve results by sending more data if doing so is able to save you a second request. Even ignoring the users experience— connections aren't free. There is byte-overhead in establishing a connection. Byte-overhead in lost compression by working with smaller objects. Byte-overhead in having more partially filled IP packets. CPU overhead from processing more connections, etc. Obviously there is a line to be drawn— You wouldn't improve performance by sending the whole of Wikipedia on the first request. But you will most likely not be conserving *anything* by avoiding sending another kilobyte of compressed user interface text for an application a user has already invoked, even if only a few percent use the additional messages. 
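The approach Sergey advocates above - one combined, minified, long-cacheable script instead of many blocking requests - can be sketched as a build step. This is a hypothetical illustration only: the crude "minifier" (dropping blank lines and // comments) stands in for a real one, and the source files and their contents are invented:

```python
import hashlib

def combine_and_minify(sources: dict) -> tuple:
    """Concatenate JS sources, crudely 'minify' them, and derive a
    content-hashed filename so the result can be cached indefinitely."""
    combined = "\n".join(sources[name] for name in sorted(sources))
    # Stand-in for a real minifier: drop blank lines and // comments.
    minified = "\n".join(
        line for line in combined.splitlines()
        if line.strip() and not line.strip().startswith("//")
    )
    digest = hashlib.md5(minified.encode("utf-8")).hexdigest()[:8]
    return minified, "combined-%s.js" % digest

# Invented example sources (names chosen for illustration only).
sources = {
    "wikibits.js": "// legacy helpers\nfunction addOnloadHook(f) { hooks.push(f); }",
    "ajax.js": "\nfunction sajax_do_call(fn) { /* ... */ }",
}
minified, filename = combine_and_minify(sources)
print(filename)
```

Because the filename changes whenever the combined content changes, the response can be served with a far-future expiry and the browser makes a single request instead of one per file.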
[Wikitech-l] Unique Image URLs
Hi, I'm working on optimizing MediaWiki for front-end performance and came across a problem that seems to be an issue even on Wikipedia - all images get re-requested from the server on subsequent page views even though none of them changed, meaning images that are cached in the browser are requested again. Even though all those requests return 304s and none of the images are actually transmitted, making all those requests still delays rendering of the page. I wrote a patch and created a bug for this in Bugzilla: https://bugzilla.wikimedia.org/show_bug.cgi?id=17577 Let me know what you think. Thank you, Sergey -- Sergey Chernyshev http://www.sergeychernyshev.com/
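The usual way to make image URLs "unique" so browsers never need to revalidate is to embed a content hash in the URL; unchanged images keep the same URL (and stay cached with no 304 round-trips), while a re-upload produces a new URL automatically. This is a hypothetical sketch of that idea - the helper and URL scheme are invented, not the actual patch attached to bug 17577:

```python
import hashlib

def unique_image_url(base_url: str, name: str, content: bytes) -> str:
    """Append a short content hash so the URL changes whenever the file does.
    Such URLs can be served with a far-future Expires header."""
    digest = hashlib.sha1(content).hexdigest()[:10]
    return "%s/%s?%s" % (base_url, name, digest)

# Invented example: the same image before and after a re-upload.
old = unique_image_url("http://upload.example.org", "Logo.png", b"v1 bytes")
new = unique_image_url("http://upload.example.org", "Logo.png", b"v2 bytes")
print(old != new)  # -> True: the changed content busts the cache by itself
```

The key property is that the mapping is deterministic: rendering the same page twice yields the same URLs, so conditional requests (and their 304 responses) are avoided entirely for unchanged images.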