Re: [Wikitech-l] ExternalAuth and AuthPlugin relationships
If there are multiple identification sources, what about uniqueness of usernames? I.e. who is User1 if there exist different people User1@OpenID and User1@RADIUS? The first who registers on the wiki? Or is it assumed that all User1s are the same person?

And if there is a rewrite of the auth system, I just want to point out that besides authentication methods like OpenID, OAuth, and the local DB, there are also professional authentication backends like Shibboleth, RADIUS, CAS, and Kerberos that should be taken into account for enterprise wikis (it should be generic enough for these types of authentication).

Sébastien

Le Fri, 12 Oct 2012 04:35:07 +0200, Tyler Romeo tylerro...@gmail.com a écrit:

I don't think it's possible, or even preferable, to do a rework. AuthPlugin is fundamentally flawed in its design and ExternalAuth is lacking in a number of major features. What we need is a full-fledged authnz system. Attached is a basic outline I've been developing recently. The idea is a very rough draft, but it would allow:
- Multiple authentication sources working in tandem
- A separation of policy and implementation
- A separation of authentication and authorization
- A separation of MediaWiki logic and framework logic
- An arbitrary list of user properties, so that frameworks can store more than just email and real name if necessary
- An arbitrary authentication data array, so frameworks are not required to stick to username/password
- Permission-based blocking and role-based permissions

This could be used in combination with the FormSpecialPage-based Special:Userlogin and Special:ChangePassword that are currently in Gerrit to allow more comprehensive authnz frameworks.

--
Tyler Romeo
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com

On Thu, Oct 11, 2012 at 7:48 PM, Ryan Lane rlan...@gmail.com wrote:

On Thu, Oct 11, 2012 at 4:33 PM, Daniel Friesen dan...@nadir-seen-fire.com wrote:

I was thinking about this recently too.
Though I started thinking from the login form perspective. Things we should have:
- Good built-in support for both single-authentication (everyone is in the user database, or everyone is in LDAP, etc.) and multi-authentication (some users are local, some are OAuth, others may be LDAP), and also the possibility of multiple auth types for one user.
- A real abstract login form that lets extensions and auth systems simply add fields to the login/creation form without having to re-implement it, and without breaking other similar extensions. Perhaps also some meta information from auth plugins that lets us say on the login form that a wiki is using LDAP or something.
- Explicit support for auth systems using something other than the username.
- Real support for auth systems involving a 3rd party, i.e. involving redirects such as OAuth, OpenID, and simple 3rd-party login where the login link directs you to the login page of some forum, you get sent back, and somehow the extension knows what the session is.
- Login form support for multiple authentication systems on the same wiki, incl. support for OAuth- and OpenID-like logins. That last one was the tricky one to figure out.

Whatever is done, can it please be done as a refactor, rather than a rewrite?

- Ryan

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
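[Editor's note: the multi-authentication requirement discussed in this thread (some users local, some OAuth, some LDAP, with multiple auth types per user) can be sketched as a chain of providers tried in order. This is an illustrative Python sketch rather than MediaWiki's PHP, and all class and field names here are hypothetical; a real design would sit behind the login-form and hook machinery being discussed.]

```python
class AuthProvider:
    """One authentication source (local DB, LDAP, a redirect-based service, ...)."""

    def authenticate(self, auth_data):
        """Return a username on success, or None to fall through to the next provider."""
        raise NotImplementedError


class LocalDbProvider(AuthProvider):
    """Classic username/password check against a local user table."""

    def __init__(self, users):
        self.users = users  # {username: password}

    def authenticate(self, auth_data):
        name = auth_data.get("username")
        if name in self.users and self.users[name] == auth_data.get("password"):
            return name
        return None


class TokenProvider(AuthProvider):
    """Stand-in for a redirect-based system (OAuth/OpenID-like): it consumes an
    opaque token instead of a username/password pair, which is why the
    'arbitrary authentication data' dict matters."""

    def __init__(self, tokens):
        self.tokens = tokens  # {token: username}

    def authenticate(self, auth_data):
        return self.tokens.get(auth_data.get("token"))


def authenticate(providers, auth_data):
    """Try each configured provider in order ('sufficient' semantics, PAM-style)."""
    for provider in providers:
        user = provider.authenticate(auth_data)
        if user is not None:
            return user
    return None
```

Usage: a wiki configured with `[LocalDbProvider(...), TokenProvider(...)]` accepts either kind of credential dict, so no provider needs to know about the others.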
Re: [Wikitech-l] Make Extensions Aware of the ContentHandler
Dmitriy Sintsov. On 11 October 2012 at 16:13:02, Jeroen De Dauw (jeroended...@gmail.com) wrote:

Hey,

"Also, it would be great if WMF selected some old version of MediaWiki as LTS (Long-Term Support) so extensions would be required to work with it. Currently that could be 1.17, the first version which got ResourceLoader."

My observation is that most WMF-maintained extensions are not maintained with much regard to backwards compatibility, but with a lot of effort being put into replacing deprecated code usage quickly. They thus tend to have compatibility broken early after MediaWiki releases. I'd be semi-surprised to find a single one that is still compatible with 1.17 at this point.

What is so major that it prevents new extensions from running on 1.17? Page actions in separate classes? The Router? Are these major obstacles?

Dmitriy
Re: [Wikitech-l] 1.20 blocker bugs and code merges
I've been pretty busy with other work all week and, as a result, I haven't had time to look at the 1.20 release. On September 30 (2 weeks ago) I sent out a message with what I saw as the blocker bugs at that point. Only one of those bugs does not yet have any sort of resolution:

* https://bugzilla.wikimedia.org/35894 -- Reports of secret key generation hanging on Windows

At this point, I think we should add a known issues section to the release notes and plan for a 1.20.1 release to address this, so that I can put together an RC1 tarball tomorrow for you guys to test.

That said, some bugs have been added to the 1.20.0 milestone, but unless there is a major objection, I'd like to target those to the 1.20.1 release so that we don't hold up the 1.20.0 release any longer. These bugs could be listed in the known issues section of the release notes with a statement that we're planning a 1.20.1 release.

And, speaking of plans, I will have time to write up my ideas for a release schedule tomorrow. This will include plans for future releases as well as support for 1.19 and 1.18.

Mark.
[Wikitech-l] Fwd: Gerrit Metrics
Interesting email coming through from Chad about a tool being developed to retrieve metrics from Gerrit. I wonder how this effort compares to gerrit-stats. Overlapping? Complementary? I will look further, but if the Gerrit experts in the house can have a look, all the better.

-- Quim

-- Forwarded message --
From: Lundh, Gustaf gustaf.lu...@sonymobile.com
Date: Fri, Oct 12, 2012 at 7:30 AM
Subject: RE: Gerrit Metrics
To: Remy Bohmer li...@bohmer.net, Janne Hellsten jjhel...@gmail.com
Cc: Stephen Roberts stephen.rober...@gmail.com, Repo and Gerrit Discussion repo-disc...@googlegroups.com, jose.lob...@isis-papyrus.com

For some of my Gerrit metrics I'm using the Gerrit-event module in Gerrit-trigger [1] to collect events through the stream-events SSH interface. I collect the interesting data and feed it to a Graphite instance. This way I can plot real-time data in terms of changes per hour (or minute) and also the rate of comments/merges/etc. One of my colleagues has also created a Python interface for listening to and parsing the events. Not sure if it has been open-sourced yet.

[1] https://github.com/jenkinsci/gerrit-trigger-plugin

Best regards
Gustaf

-----Original Message-----
From: repo-disc...@googlegroups.com [mailto:repo-disc...@googlegroups.com] On Behalf Of Remy Bohmer
Sent: 29 December 2011 22:58
To: Janne Hellsten
Cc: Stephen Roberts; Repo and Gerrit Discussion; jose.lob...@isis-papyrus.com
Subject: Re: Gerrit Metrics

Hi,

2011/12/29 Janne Hellsten jjhel...@gmail.com:

I've written Haskell code to talk to Gerrit via SSH and parse the JSON responses into Haskell data structures. This is pretty handy for further data mining in Haskell. Ping me if you're interested; I can make the code available if someone finds it useful.

Here is the ping ;-) I find it useful. Where can I find the code?
You can find the GerritJson module here: https://github.com/nurpax/gerrit-json

I only recently went back to Haskell, so there are probably many ways in which the code can be improved.

Thanks for sharing! I will look into it in detail next year/week ;-)

Kind regards,
Remy

--
To unsubscribe, email repo-discuss+unsubscr...@googlegroups.com
More info at http://groups.google.com/group/repo-discuss?hl=en
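[Editor's note: Gustaf's setup above consumes newline-delimited JSON from Gerrit's `stream-events` SSH command and feeds counts to Graphite. A minimal Python sketch of the consuming side; in a real deployment the lines would come from `ssh -p 29418 user@gerrit gerrit stream-events`, and the Graphite-feeding step is omitted here.]

```python
import json
from collections import Counter


def tally_events(lines):
    """Count Gerrit stream-events by their 'type' field.

    `lines` is any iterable of newline-delimited JSON strings, as emitted by
    `gerrit stream-events`; malformed lines are skipped rather than aborting
    mid-stream, since a long-lived consumer must survive partial reads.
    """
    counts = Counter()
    for line in lines:
        try:
            event = json.loads(line)
        except ValueError:
            continue  # skip anything that isn't a complete JSON object
        counts[event.get("type", "unknown")] += 1
    return counts
```

From these per-type counts, deriving "changes per hour" or "rate of comments/merges" is just a matter of tallying within time buckets before shipping the numbers to Graphite.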
[Wikitech-l] Wikidata review mail
Hi all, here is our weekly report on changesets to core relevant for Wikidata development.

* Great news! The Wikidata branch got merged, possibly the biggest single changeset to MediaWiki. Thanks to everyone for their input; I am afraid to try to list them all because I would fail. Special thanks to Tim for accompanying the development of the branch for the last seven months (!), and congratulations to Duesentrieb for creating it. We had cake. http://instagram.com/p/QmtQaqhE1N Thanks to Siebrand for pushing the button.

The ContentHandler branch created a number of follow-up items, many of which are already merged. Here are a few open ones:
* Fix declaration of content_model and content_format fields: https://gerrit.wikimedia.org/r/#/c/27394/
* Support plain text content: https://gerrit.wikimedia.org/r/#/c/27399/ (two +1s already)
* Silence deprecation warnings raised by ContentHandler (there are too many right now; this has already been discussed on this list): https://gerrit.wikimedia.org/r/#/c/27537/

Besides the ContentHandler, the ORMTable changeset for accessing foreign wikis also got merged. Yay to Chad! https://gerrit.wikimedia.org/r/#/c/25264/

Also, the Sites management got merged after a review by Asher for its DB impact and by Chad for the code. You are awesome! Thanks to Jeroen for working on this: https://gerrit.wikimedia.org/r/#/c/23528/

We have one more changeset open, and I'd love to see that one closed too:
* Sorting in jQuery tables: https://gerrit.wikimedia.org/r/#/c/22562/

That would be awesome to merge. It got several +1s over its lifetime, it has been around for more than a month, and it has been improved continually. If you have comments, please write them into the code; Henning will take care of them.

Thank you so much for your awesome help everyone!

Cheers,
Denny

--
Project director Wikidata
Wikimedia Deutschland e.V. | Obentrautstr. 72 | 10963 Berlin
Tel.
+49-30-219 158 26-0 | http://wikimedia.de

Wikimedia Deutschland - Society for the Promotion of Free Knowledge e.V. Registered in the register of associations of the Amtsgericht Berlin-Charlottenburg under number 23855 B. Recognized as charitable by the Finanzamt für Körperschaften I Berlin, tax number 27/681/51985.
Re: [Wikitech-l] 1.20 blocker bugs and code merges
Hi Mark,

Comments inline:

On Fri, Oct 12, 2012 at 7:16 AM, Mark A. Hershberger m...@everybody.org wrote:

At this point, I think we should add a known issues section to the release notes and plan to have a 1.20.1 release with this so that I can put together an RC1 tarball tomorrow for you guys to test. That said, some bugs have been added to the 1.20.0 milestone, but unless there is a major objection, I'd like to target those to the 1.20.1 release so that we don't hold up the 1.20.0 release any longer. These bugs could be listed in the known issues section of the release notes with a statement that we're planning a 1.20.1 release.

Yup, that sounds reasonable.

And, speaking of plans, I will have time to write up my ideas for a release schedule tomorrow. This will mean plans for future releases as well as support for 1.19 and 1.18.

Great, thanks for doing this!

Rob
Re: [Wikitech-l] ExternalAuth and AuthPlugin relationships
I like most of the requirements that Tyler and Daniel have listed. There are a couple of things that I didn't clearly see accounted for, so I wanted to make sure whatever you come up with accounts for these:

- Extensions / tools need to hook into all the identity pieces, so a tool like AbuseFilter can block users logging in with an external system, and ConfirmEdit can inject captchas if someone is trying to brute-force a login.
- Whatever we do will need to keep most of the existing interfaces. We can't merge any updates that would break CentralAuth, LDAP, OATHAuth, or the upcoming OAuth, without a clear upgrade plan and time to put those pieces in place without downtime.
- Single- and multi-factor authentication support.
- Identity merging (if someone authenticates with OpenID, then later creates a wiki account, make sure we can merge the concept of that user).

On Thu, Oct 11, 2012 at 7:35 PM, Tyler Romeo tylerro...@gmail.com wrote:

I don't think it's possible, or even preferable, to do a rework. AuthPlugin is fundamentally flawed in its design and ExternalAuth is lacking in a number of major features. What we need is a full-fledged authnz system. Attached is a basic outline I've been developing recently. The idea is a very rough draft, but it would allow:
- Multiple authentication sources working in tandem
- A separation of policy and implementation
- A separation of authentication and authorization
- A separation of MediaWiki logic and framework logic
- An arbitrary list of user properties, so that frameworks can store more than just email and real name if necessary
- An arbitrary authentication data array, so frameworks are not required to stick to username/password
- Permission-based blocking and role-based permissions

This could be used in combination with the FormSpecialPage-based Special:Userlogin and Special:ChangePassword that are currently in Gerrit to allow more comprehensive authnz frameworks.
--
Tyler Romeo
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com

On Thu, Oct 11, 2012 at 7:48 PM, Ryan Lane rlan...@gmail.com wrote:

On Thu, Oct 11, 2012 at 4:33 PM, Daniel Friesen dan...@nadir-seen-fire.com wrote:

I was thinking about this recently too. Though I started thinking from the login form perspective. Things we should have:
- Good built-in support for both single-authentication (everyone is in the user database, or everyone is in LDAP, etc.) and multi-authentication (some users are local, some are OAuth, others may be LDAP), and also the possibility of multiple auth types for one user.
- A real abstract login form that lets extensions and auth systems simply add fields to the login/creation form without having to re-implement it, and without breaking other similar extensions. Perhaps also some meta information from auth plugins that lets us say on the login form that a wiki is using LDAP or something.
- Explicit support for auth systems using something other than the username.
- Real support for auth systems involving a 3rd party, i.e. involving redirects such as OAuth, OpenID, and simple 3rd-party login where the login link directs you to the login page of some forum, you get sent back, and somehow the extension knows what the session is.
- Login form support for multiple authentication systems on the same wiki, incl. support for OAuth- and OpenID-like logins. That last one was the tricky one to figure out.

Whatever is done, can it please be done as a refactor, rather than a rewrite?

- Ryan
Re: [Wikitech-l] ExternalAuth and AuthPlugin relationships
On Fri, Oct 12, 2012 at 2:14 AM, Seb35 seb35wikipe...@gmail.com wrote:

If there are multiple identification sources, what about uniqueness of usernames? I.e. who is User1 if there exist different people User1@OpenID and User1@RADIUS? The first who registers on the wiki? Or is it assumed that all User1s are the same person?

Some kind of pipelining system, or PAM-like system, would allow users to specify which service is used for identity, authentication, and authorization. That said, systems like this are pretty complicated for end-users to configure. Most auth extensions are already difficult to configure, and very few people need this level of flexibility. I think this could be accomplished by hooks easily enough. I have 3 authentication plugins working in unison on labsconsole.wikimedia.org (LdapAuthentication, OATHAuth, and OpenStackManager) plus ConfirmEdit (which requires a captcha for account creation). I'm using hooks to handle all of this. I could add on Kerberos, OpenID, or some other form of auto-authentication if I liked without much issue.

The current AuthPlugin system works for the most part. It just needs to be cleaned up and refactored. Its major issue is that core's authn/z system is really, really shitty and isn't properly maintained. If there's a rewrite it will very likely die like ExternalAuth. I have no plans on rewriting any of my authentication extensions from scratch, and I've written (or fixed) the majority of the auth extensions actually used.

And if there is a rewrite of the auth system, I just want to point out that besides authentication methods like OpenID, OAuth, and the local DB, there are also professional authentication backends like Shibboleth, RADIUS, CAS, and Kerberos that should be taken into account for enterprise wikis (it should be generic enough for these types of authentication).

The current system can handle all of these already.

- Ryan
Re: [Wikitech-l] customizable IRC feed for bugzilla
This sounds pretty cool and very handy. How often does the bot poll the RSS feeds? Is this configurable? I tried to answer my own questions by looking at the code, but all I could find was a link to the SVN repo on the bot's wiki page (http://meta.wikimedia.org/wiki/Wm-bot), and there have been no commits there since June. Is the code being maintained somewhere else now?

Thanks!
Arthur

On Thu, Oct 11, 2012 at 7:04 PM, MZMcBride z...@mzmcbride.com wrote:

Petr Bena wrote:

That means anyone should be able to create a custom IRC feed for Bugzilla and use it in any Wikimedia-related IRC channel you want (for example, right now we have a Bugzilla feed in #wikimedia-labs that reports only labs-related bugs). You can generate an RSS feed in Bugzilla just by creating a new search; then you can click the Feed link at the bottom of each search results page.

This sounds neat. :-) I didn't realize Bugzilla had RSS feed support. Currently wikibugs (the Bugzilla-to-IRC bot in #mediawiki) parses mailing list messages (sent to wikibugs-l). If there's a machine-readable format that can be used, that would be awfully nice.

MZMcBride

--
Arthur Richards
Software Engineer, Mobile
[[User:Awjrichards]]
IRC: awjr
+1-415-839-6885 x6687
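[Editor's note: MZMcBride asks for a machine-readable format; Bugzilla's saved-search feeds are syndication feeds, so the polling side of a bot reduces to fetching and parsing one. A small Python sketch assuming an Atom feed (adjust the tag names if your Bugzilla emits RSS instead); the HTTP fetch and IRC output are left out, and the sample bug in the test is illustrative.]

```python
import xml.etree.ElementTree as ET

# Atom's XML namespace; ElementTree needs the fully qualified tag names.
ATOM = "{http://www.w3.org/2005/Atom}"


def feed_entries(xml_text):
    """Extract (title, link) pairs from an Atom feed document.

    Each pair is one bug from the saved search; a bot would diff these
    against what it last announced and emit only the new ones to IRC.
    """
    root = ET.fromstring(xml_text)
    entries = []
    for entry in root.findall(ATOM + "entry"):
        title = entry.findtext(ATOM + "title", default="")
        link = entry.find(ATOM + "link")
        href = link.get("href") if link is not None else ""
        entries.append((title, href))
    return entries
```

Since the feed URL encodes the whole search, per-channel customization (e.g. labs-only bugs) is just a different URL, with no bot-side filtering logic needed.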
[Wikitech-l] Minor Gerrit upgrade deployed today
Hi everyone,

You might've seen Gerrit go down for about 5 minutes a little while ago. We deployed a custom 2.4.2 build that includes a patch from OpenStack [0]. This was needed to support Jenkins + Zuul.

Where's 2.5? We're still testing the release and hope to have an upgrade date in place sometime soon. Right now there's a pretty major regression in LDAP support that needs fixing before the production instance can be upgraded.

Happy coding,

-Chad

[0] https://gerrit-review.googlesource.com/#/c/37930/
[Wikitech-l] Fwd: Please notice and report glitches - changes coming
Sorry for the spam, but the ContentHandler changes especially may affect you. If you have any time this weekend or next week to do some testing, we'd appreciate it. Thanks.

-Sumana

Original Message
Subject: Please notice and report big glitches - changes coming
Date: Fri, 12 Oct 2012 17:14:05 -0400
From: Sumana Harihareswara suma...@wikimedia.org
Organization: Wikimedia Foundation
To: Coordination of technology deployments across languages/projects wikitech-ambassad...@lists.wikimedia.org

On Monday we start deploying a new version of MediaWiki, 1.21wmf2, to the sites, starting with mediawiki.org and 2 test wikis (https://www.mediawiki.org/wiki/MediaWiki_1.21/Roadmap). 1.21wmf2 will have 3 big new things in it, and we need your help testing on the beta test site http://deployment.wikimedia.beta.wmflabs.org/wiki/Main_Page now to see if there are any really critical bugs.

1) The new ContentHandler (https://www.mediawiki.org/wiki/ContentHandler) might affect handling of CSS and JavaScript pages, import/export (including PDF export), and API stuff, especially when rendering and editing. I'd suggest we also look out for issues in template rendering, images and media handling, localisation, and mobile device access. (merged on Oct 9)

2) High-resolution image support. This work in progress will try to give higher-res images to high-density screens that can support them, like new Retina displays. More info at https://gerrit.wikimedia.org/r/#/c/24115/. One of the bigger risks of the high-res stuff is load-based, since we may see substantial new load on our image scalers. So *all* image scaling might be impacted. (merged on Oct 11)

3) Sites is a new backend to represent and store information about sites and site-specific configuration. This code is meant to replace the current interwiki code, but does not do so just yet. Still, keep an eye out for site-specific configuration or interwiki issues.
Right now the version of MediaWiki on the beta cluster dates from 9 Oct and thus has ContentHandler but not the high-res image support or Sites. So please test on the beta sites now and look out for these issues on your sites in the weeks ahead. https://www.mediawiki.org/wiki/Category:MediaWiki_test_plans has some ideas on how to find errors.

Thanks! With your help we can find bugs early and get them fixed before they affect lots of readers and editors.

--
Sumana Harihareswara
Engineering Community Manager
Wikimedia Foundation
Re: [Wikitech-l] Prefs removal
On Mon, Oct 8, 2012 at 11:12 PM, Erik Moeller e...@wikimedia.org wrote:

I've pinged analytics to see if they can get us a better prefs report.

Dario Taraborelli has kindly made a detailed dataset of preferences available. He can chime in with details if needed. The dataset can be found here: http://thedatahub.org/dataset/wikipedia-user-preferences

These datasets capture the following:
- user_properties set to '' from the default: active_prefs_0
- user_properties set to 1 from the default: active_prefs_1 (NOTE: this can be very misleading for non-boolean prefs)
- user_properties set to anything != '' from the default: active_prefs_all

This is based on a dataset of _active_ users, which is included. Unchanged prefs aren't included, specific non-boolean settings aren't included, and prefs with fewer than 5 users aren't included. This is en.wp; other languages to follow.

Note there's all kinds of funkiness with how prefs may be serialized to the DB: pref names and defaults may have changed, prefs may have been removed, and some stuff is serialized when it doesn't need to be (suggesting that prefs have been changed, when in fact the user has just performed a pref save). But for a lot of the more obscure prefs this should be good initial guidance.

Will post some initial observations to https://www.mediawiki.org/wiki/Requests_for_comment/Core_user_preferences

--
Erik Möller
VP of Engineering and Product Development, Wikimedia Foundation

Support Free Knowledge: https://wikimediafoundation.org/wiki/Donate
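[Editor's note: the three dataset cuts described above can be restated as a classification over changed (property, value) rows. This is a hypothetical Python sketch of just the bucketing logic, assuming the per-pref defaults are known; the real datasets were of course produced from the user_properties table, not from this code, and the sample prefs in the test are illustrative.]

```python
def classify_prefs(rows, defaults):
    """Bucket user_properties rows the way the three datasets are cut:
    set to '' (active_prefs_0), set to 1 (active_prefs_1),
    and set to anything non-empty (active_prefs_all)."""
    buckets = {"active_prefs_0": [], "active_prefs_1": [], "active_prefs_all": []}
    for prop, value in rows:
        if value == defaults.get(prop):
            continue  # unchanged from the default: excluded from all datasets
        if value == "":
            buckets["active_prefs_0"].append(prop)
        else:
            buckets["active_prefs_all"].append(prop)
            if value == "1":
                # Note the caveat above: '1' is only meaningful as "enabled"
                # for boolean prefs, hence the warning about non-boolean ones.
                buckets["active_prefs_1"].append(prop)
    return buckets
```

The serialization funkiness Erik mentions (a stored row whose value equals the current default) is exactly what the first `continue` masks, which is why a row's mere presence in the table is not evidence the user changed anything.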
Re: [Wikitech-l] Fwd: SMWCon Fall 2012 Program
Hi Scott!

We will do our best to organize the live stream, but at the moment there is no guarantee. In any case, we will record video at the conference and make it available for everyone afterwards.

- Yury Katkov

On Wed, Oct 10, 2012 at 10:48 PM, Scott MacLeod helia...@gmail.com wrote:

Markus, thanks for this, and great! Will any of this be streamed in real time? Cheers, Scott

On Wed, Oct 10, 2012 at 11:41 AM, Markus Krötzsch mar...@semantic-mediawiki.org wrote:

[Forwarded from semediawiki-user, apologies for cross-posting]

(This is about the Semantic MediaWiki User Conference Fall 2012, see http://semantic-mediawiki.org/wiki/SMWCon_Fall_2012)

Dear all,

the program for the upcoming SMWCon in Cologne is becoming more and more stable [1]. Most talks should be at their almost-final location now. There are quite a few highlights that are worth mentioning:

* We have two exciting keynote talks by Denny Vrandecic (Wikimedia Germany e.V.) and Peter Haase (fluidOps): Denny will introduce Wikidata, the next big thing for Wikipedia, and the underlying software Wikibase. The co-operation of SMW and Wikidata will be an important topic of this SMWCon. Peter will introduce the Information Workbench, a semantic knowledge management solution by fluidOps. For the first time, SMWCon will include a number of talks on related systems that are not SMW. Other highlights in this category include OntoWiki, BlueSpice, SlideWiki, and the Drupal-based Planetary System. I am sure that it will be insightful and inspiring to exchange experiences with these projects.

* We'll have a number of practical experience talks. I am particularly looking forward to the insights of Wikia Inc., presented by Krzysztof Krzyżaniak (eloy).

* Joel Natividad will join us live from NY to report about his award-winning sites and new smart-city projects.
* And of course there will be plenty of updates on SMW and its old and new extensions, including a number of presentations about using SMW in completely new ways.

The tutorial day will leave more space for discussions and practical work, especially to discuss problems and ideas with the developers (as usual, we will have a very high concentration of those). As a special non-semantic tutorial, Yury Katkov will share his first-hand experience fighting spam on semanticweb.org and semantic-mediawiki.org, a topic that concerns many public SMW sites.

Cheers,
Markus

[1] http://semantic-mediawiki.org/wiki/SMWCon_Fall_2012/Agenda

--
http://scottmacleod.com/worlduniversityandschool.htm
Re: [Wikitech-l] ExternalAuth and AuthPlugin relationships
The problem is that the current system cannot handle multiple authentication or authorization sources at once. Furthermore, the system for granting and blocking permissions is ridiculously crude. The new system I proposed would allow more flexibility and security for authentication and authorization, something that MediaWiki needs. If implemented, we can wrap AuthPlugin in the new system and then deprecate it.

@Seb35 - What I'm thinking is that, like the current ExternalAuth, an external user will be linked to local users. The only time there will be a conflict is if somehow a user is able to successfully authenticate to two different services separately and each has a different linked local account, which would be rare because it implies the user has and used authentication info for two separate accounts simultaneously. The only problem would be figuring out how to link already-established local accounts to other external services as they are added, e.g., a user who registered using OpenID and wants to add their LDAP account.

@Chris - Agreed on all points. In the idea I sent out, there is no such thing as a password. There is just an array of authentication data, which is raw data captured from the authentication point. This is one advantage over AuthPlugin, which requires a username/password scheme. And I believe, if we were to do this, we could have an AuthPluginProvider, which would wrap around $wgAuth for backwards compatibility.

--
Tyler Romeo
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com

On Fri, Oct 12, 2012 at 2:09 PM, Ryan Lane rlan...@gmail.com wrote:

On Fri, Oct 12, 2012 at 2:14 AM, Seb35 seb35wikipe...@gmail.com wrote:

If there are multiple identification sources, what about uniqueness of usernames? I.e. who is User1 if there exist different people User1@OpenID and User1@RADIUS? The first who registers on the wiki? Or is it assumed that all User1s are the same person?
Some kind of pipelining system, or PAM-like system, would allow users to specify which service is used for identity, authentication, and authorization. That said, systems like this are pretty complicated for end-users to configure. Most auth extensions are already difficult to configure, and very few people need this level of flexibility. I think this could be accomplished by hooks easily enough. I have 3 authentication plugins working in unison on labsconsole.wikimedia.org (LdapAuthentication, OATHAuth, and OpenStackManager) plus ConfirmEdit (which requires a captcha for account creation). I'm using hooks to handle all of this. I could add on Kerberos, OpenID, or some other form of auto-authentication if I liked without much issue.

The current AuthPlugin system works for the most part. It just needs to be cleaned up and refactored. Its major issue is that core's authn/z system is really, really shitty and isn't properly maintained. If there's a rewrite it will very likely die like ExternalAuth. I have no plans on rewriting any of my authentication extensions from scratch, and I've written (or fixed) the majority of the auth extensions actually used.

And if there is a rewrite of the auth system, I just want to point out that besides authentication methods like OpenID, OAuth, and the local DB, there are also professional authentication backends like Shibboleth, RADIUS, CAS, and Kerberos that should be taken into account for enterprise wikis (it should be generic enough for these types of authentication).

The current system can handle all of these already.

- Ryan
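[Editor's note: Tyler's answer to Seb35 in this thread (external identities linked to local accounts, with conflicting links being the rare case to refuse) can be sketched as a small mapping with an explicit refusal on conflicts. Illustrative Python only; the store and method names are hypothetical, not from any proposal in the thread.]

```python
class IdentityStore:
    """Map (provider, external_id) pairs onto local accounts, so a user who
    registered via one service (say OpenID) can later attach another (say LDAP)
    to the same local account."""

    def __init__(self):
        self.links = {}  # (provider, external_id) -> local username

    def link(self, provider, external_id, local_user):
        """Attach an external identity to a local account."""
        key = (provider, external_id)
        existing = self.links.get(key)
        if existing is not None and existing != local_user:
            # The conflict case from the thread: this external identity is
            # already attached to a different local account. Refuse; merging
            # two accounts should be a separate, explicit operation.
            raise ValueError("identity already linked to %s" % existing)
        self.links[key] = local_user

    def resolve(self, provider, external_id):
        """Return the local account for an external identity, or None."""
        return self.links.get((provider, external_id))
```

Keying on (provider, external_id) rather than the bare name is what answers Seb35's question: User1@OpenID and User1@RADIUS are distinct keys and only coincide in one local account if someone deliberately links both.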