Re: [Wikitech-l] Watchlistr.com, an outside site that asks for Wikimedia passwords

2009-07-22 Thread Michael Rosenthal
The toolserver rules forbid that:
https://wiki.toolserver.org/view/Rules (#8)

However, there is gWatch, which works without authentication:
http://toolserver.org/~luxo/gwatch/login.php



On Wed, Jul 22, 2009 at 9:59 PM, David Gerard dger...@gmail.com wrote:
 2009/7/22 Sage Ross ragesoss+wikipe...@gmail.com:

 http://www.watchlistr.com/ is a site that creates aggregate watchlists
 across multiple projects. See
 http://en.wikipedia.org/w/index.php?title=Wikipedia:Bounty_board#Transwiki_watchlist_tool
 The user who made it has very little editing history, and the site
 aggregates watchlists across multiple projects, but requires inputting
 your Wikimedia password into the watchlistr.com site.  I have no
 specific reason to think it's a scam, but if I were trying to phish
 passwords I would do something like this.


 Would something on the toolserver be safe enough in these terms?


 - d.


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] MW QA

2009-07-16 Thread Michael Rosenthal
Please note that there are some parser tests which should pass in
theory but have never passed in any version (the behaviour they
describe was never actually implemented in the software).
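For reference, the parser tests live in maintenance/parserTests.txt as
plain-text cases, and a case that is known not to pass can carry a
"disabled" option so the default run skips it. A rough sketch of the
format from memory (the exact keywords may differ slightly between versions):

  !! test
  Example: behaviour that was specified but never implemented
  !! options
  disabled
  !! input
  Some ''wikitext'' input
  !! result
  <p>The HTML output the test expects
  </p>
  !! end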

On Thu, Jul 16, 2009 at 5:55 PM, dan nessett dness...@yahoo.com wrote:

 I have never been a QA engineer. However, it doesn't require great experience 
 to see that the MW software development process is broken. I provide the 
 following comments not in a destructive spirit. The success of the MW 
 software is obvious. However, in my view, unless the development process 
 introduces some QA procedures, the code will eventually collapse and its 
 reputation will degrade.

 My interest in MW (the software, not the organization) is driven by a desire 
 to provide an enhancement in the form of an extension. So, I began by 
 building a small development environment on my machine (a work in progress). 
 Having developed software for other organizations, I intuitively sought out 
 what I needed in terms of testing in order to provide a good quality 
 extension. This meant I needed to develop unit tests for my extension and 
 also to perform regression testing on the main code base after installing it. 
 Hence some of my previous questions to this email list.

 It soon became apparent that the MW development process has few or no 
 testing requirements. Sure, there are the parser tests, but I couldn't find any 
 requirement that developers run them before submitting patches.
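 (For anyone unfamiliar with them: the parser tests ship with MediaWiki and, in 
 the 1.14/1.15 era, are run from the wiki's root directory with 

     php maintenance/parserTests.php 

 which prints a pass/fail summary. The exact options vary between versions, so 
 check the script's built-in help or its source.)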

 Out of curiosity, I decided to download 1.16a (r52088), use the LocalSettings 
 file from my local installation (1.14) and run some parser tests. This is not 
 a scientific experiment, since the only justification for using these 
 extensions in the tests is that I had them installed in my personal wiki. However, 
 there is at least one thing to learn from them. The results are:

 MediaWiki r52088 Parser Tests

 Extensions : 1) Nuke, 2) Renameuser, 3) Cite, 4) ParserFunctions, 5) CSS 
 Style Sheets, 6) ExpandTemplates, 7) Gadgets, 8) Dynamic Page List, 9) 
 Labeled Section Transclusion. The last extension has 3 require_once files: a) 
 lst.php, b) lsth.php, and c) compat.php.

 Test    Extensions                      Parser test failures

 1       1,2,3,4,5,6,7,8,9               19
 2       1                               14
 3       2                               14
 4       3                               14
 5       4                               14
 6       5                               14
 7       6                               14
 8       7                               14
 9       8                               14
 10      9 (abc)                         19
 11      9 (a)                           18
 12      9 (ab)                          19
 13      1,2,3,4,6,7                     14

 Note that the extension that introduces all of the unexpected parser test 
 failures is Labeled Section Transclusion. According to its documentation, it 
 is installed on *.wikisource.org, test.wikipedia.org, and en.wiktionary.org.

 I am new to this development community, but my guess is that, since there are no 
 testing requirements for extensions, its author did not run parser tests 
 before publishing it. (I don't mean to slander him, and I am open to the 
 correction that it ran without unexpected errors on the MW version he tested 
 against.)

 This rather long preamble leads me to the point of this email. The MW 
 software development process needs at least some rudimentary QA procedures. 
 Here are some thoughts on this. I offer these to initiate debate on this 
 issue, not as hard positions.

 * Before a developer commits a patch to the code base, he must run parser 
 tests against the change. The patch should not be committed if it increases 
 the number of parser test failures. He should document the results in the 
 Bugzilla bug report.

 * If a developer commits a patch without running parser tests, or commits a 
 patch that increases the number of parser test failures, he should be warned. 
 If he does this again within some time interval (say, 6 months), his 
 commit privileges are revoked for some period of time (say, 6 months). The 
 second time he becomes a candidate for commit privilege revocation, they will 
 be revoked permanently.

 * An extension developer also should run parser tests against a MW version 
 with the extension installed. The results of this should be provided in the 
 extension documentation. An extension should not be added to the extension 
 matrix unless it provides this information.

 * A test harness that performs regression tests (currently only parser tests) 
 against every trunk version committed in the last 24 hours should be run 
 nightly. The installed extensions should be those used on the WMF machines. 
 The results should be published on some page on the MediaWiki site. If any 
 version increases the number of parser test failures, the procedure described 
 above for developers is initiated.

 * A group of developers should have the 

Re: [Wikitech-l] Google web bugs in Mediawiki js from admins - technical workarounds?

2009-06-04 Thread Michael Rosenthal
I suggest keeping the bug on Wikimedia's servers and using a tool which
relies on SQL databases. These could be shared with the toolserver,
where the official version of the analysis tool runs and users are
able to run their own queries (so picking a tool with a good
database structure would be nice). With that, toolserver users
could set up their own tools on top of that data.
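A very rough sketch of what the logging half of such a tool could look like;
everything here (file name, table, columns) is made up for illustration, and a
real deployment would have to satisfy the privacy policy, e.g. by never storing
raw IP addresses:

  <?php
  // hit.php: hypothetical endpoint that the self-hosted JS tracker requests,
  // e.g. as a 1x1 image. Not an existing Wikimedia or toolserver tool.
  $db = new PDO( 'mysql:host=localhost;dbname=usage_stats', 'stats_user', 'secret' );

  $project = isset( $_GET['project'] ) ? substr( $_GET['project'], 0, 32 ) : '';
  $title   = isset( $_GET['title'] )   ? substr( $_GET['title'], 0, 255 ) : '';

  $stmt = $db->prepare(
      'INSERT INTO page_hits (project, page_title, hit_day, ip_hash)
       VALUES (:project, :title, :day, :ip_hash)'
  );
  $stmt->execute( array(
      ':project' => $project,
      ':title'   => $title,
      ':day'     => gmdate( 'Y-m-d' ),
      // Store only a salted hash of the IP so shared copies never expose raw addresses.
      ':ip_hash' => sha1( 'some-secret-salt' . $_SERVER['REMOTE_ADDR'] ),
  ) );

  // Return a transparent 1x1 GIF so the endpoint can be embedded as an image.
  header( 'Content-Type: image/gif' );
  echo base64_decode( 'R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7' );

A table like page_hits could then be replicated to the toolserver, where users
write their own queries against it without ever handling the raw request logs.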

On Thu, Jun 4, 2009 at 4:34 PM, David Gerard dger...@gmail.com wrote:
 2009/6/4 Daniel Kinzler dan...@brightbyte.de:
 David Gerard schrieb:

 Keeping well-meaning admins from putting Google web bugs in the
 JavaScript is a game of whack-a-mole.
 Are there any technical workarounds feasible? If not blocking the

 Perhaps the solution would be to simply set up our own JS-based usage tracker?
 There are a few options available
 (http://en.wikipedia.org/wiki/List_of_web_analytics_software), and for
 starters, the backend could run on the toolserver.
 Note that anything processing IP addresses will need special approval on the 
 TS.


 If putting that on the toolserver passes privacy policy muster, that'd
 be an excellent solution. Then external site loading can be blocked.

 (And if the toolservers won't melt in the process.)


 - d.



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Rollback feature

2009-05-07 Thread Michael Rosenthal
Yes, that is the intended behavior.
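To illustrate with a made-up history (newest first): r105 by UserA, r104 by
UserA, r103 by UserB. Rollback does not just undo r105; it removes every
consecutive top edit by UserA and restores r103, the last revision by a
different user. That matches what the linked history shows, where two or three
versions were reverted at once.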

On Thu, May 7, 2009 at 9:56 PM, Strainu strain...@gmail.com wrote:
 Hi,

 Is the rollback feature available to sysops MEANT to remove all the
 consecutive contributions of a certain user? Shouldn't it erase just the
 latest change?

 Here is an example:
 http://ro.wikipedia.org/w/index.php?title=Facultatea_de_Mecanic%C4%83_a_Universit%C4%83%C5%A3ii_%E2%80%9EPolitehnica%E2%80%9D_din_Timi%C5%9Foara&action=history
 (the changes marked with Revenire la ultima modificare... are made using this
 feature). You can see that it reverted 2 or 3 versions back.

 Thanks,
  Strainu


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Mobile gateway testing

2009-02-23 Thread Michael Rosenthal
The page history is completely missing, as is a link to the GFDL
copy; the mobile mirror is therefore not GFDL-compliant.

On Fri, Feb 20, 2009 at 12:07 AM, Chad innocentkil...@gmail.com wrote:
 I think for anything other than casual browsing, using
 native clients would be better. Perfect use of the API.
 I've been trying to get one for Android into decent
 shape.
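 (For example, a read-only client can fetch a page's wikitext through api.php 
 with a request along the lines of 

     http://en.wikipedia.org/w/api.php?action=query&prop=revisions&rvprop=content&titles=Main_Page&format=json 

 and render it locally; this is only an illustration of the kind of API call 
 meant here, not a description of any particular client.)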

 -Chad

 On Feb 19, 2009 6:03 PM, Brion Vibber br...@wikimedia.org wrote:

 On 2/19/09 4:14 AM, Angela wrote:  On Thu, Feb 19, 2009 at 9:42 PM, David
 Gerard dger...@gmail.com...
 That depends on how many people have been announcing new mobile
 platforms. :)

 In this work we're mainly targeting the popular has-a-good-browser
 smartphones. These will be able to do decent editing at least for little
 bits here and there, and creating a decent interface for doing that --
 commenting, typo-fixing, slapping up pictures, maybe taking notes for
 later serious editing -- would be great.

 The first attack surface is a decent read-and-search UI, of course,
 since that's going to be the biggest use.

 -- brion



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Enabling Flash in Wiki via extension

2009-02-14 Thread Michael Rosenthal
# You may want to write your own tag extension[1] or use an existing
one (which may not do exactly what you want)[2]
# Clicking on the edit tab and changing action=edit to action=raw
(in the URL bar) gives you a URL which spits out the raw wikitext[3]

[1] http://www.mediawiki.org/wiki/Manual:Tag_extensions
[2] http://www.mediawiki.org/wiki/Extension:Secure_HTML
[3] http://www.mediawiki.org/wiki/Manual:Page_action#Raw
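A minimal sketch of option 1, assuming a 2009-era MediaWiki. The tag name
flashembed and both function names are invented for illustration, and a real
extension must sanitise its parameters carefully, since whatever the hook
returns is emitted as raw HTML:

  <?php
  # FlashEmbed.php: include this from LocalSettings.php with require_once.
  $wgExtensionFunctions[] = 'wfFlashEmbedSetup';

  function wfFlashEmbedSetup() {
      global $wgParser;
      # Register <flashembed url="..."/> so the parser hands its content to our
      # callback instead of treating it as wikitext.
      $wgParser->setHook( 'flashembed', 'wfFlashEmbedRender' );
  }

  function wfFlashEmbedRender( $input, $args, $parser ) {
      # The string returned here goes into the page output as-is, which is how a
      # tag extension gets MediaWiki to emit HTML it would otherwise escape.
      $url = isset( $args['url'] ) ? htmlspecialchars( $args['url'] ) : '';
      if ( $url === '' ) {
          return '<strong class="error">flashembed: no url given</strong>';
      }
      return '<object type="application/x-shockwave-flash" data="' . $url . '"' .
          ' width="425" height="350"><param name="movie" value="' . $url .
          '" /></object>';
  }

For the second question, the action=raw trick in point 2 means no extra script
is needed: the Flash app can request index.php?title=Whatever&action=raw and
parse the bullet list itself.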

On Sat, Feb 14, 2009 at 2:41 AM, David Di Biase dave.dibi...@gmail.com wrote:
 Hi there,

 I'm building a somewhat unique Wiki that will involve me writing an
 extension that allows us to embed Flash content (i.e. streaming video). We're
 doing research within the communication/media industry and need this
 ability. There are two specific issues I need to solve:

   - First, and simplest, I'm having trouble writing out Flash object
    references, i.e. object classid=. as HTML. I'm wondering how I can get
    Wiki to ignore the fact that my parser function extension is giving it actual
    HTML to render and not just Wiki format?
   - The second issue I'm having is that we would like to format a specific
   data structure within a Wiki article and have the Flash application retrieve
   information from the article and parse it. The data is presented in the
    other article as a simple list, i.e.:

   * Studio Name 1, 1980-1990
   * Studio Name 2, 1980-1990
   * Studio Name 3, 1980-1990
   * Studio Name 4, 1980-1990
   * Studio Name 5, 1980-1990
   * Studio Name 6, 1980-1990
   * Studio Name 7, 1980-1990

   I'm wondering how I can extract that particular page for parsing by my
    Flash. My thought was simply to create a script within my extensions folder
   that spits out just the raw data from the Wiki page. So in essence when the
   page is loaded the Flash app would make a call to
   /extensions/myext/getarticledata.php?title=whatever here

 I have been poking around the MediaWiki articles on how to accomplish these
 types of things, but I can't seem to find a solid solution. I'm a beginner
 Wiki developer so some initial guidance would be super appreciated :-)

 Cheers and thanks!

 Dave


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Article blaming

2009-01-23 Thread Michael Rosenthal
If you mean something like these, here are some:

http://de.wikipedia.org/wiki/Benutzer:Jah/Rhic

http://de.wikipedia.org/wiki/Benutzer:APPER/WikiHistory

On Fri, Jan 23, 2009 at 8:19 PM, Platonides platoni...@gmail.com wrote:
 With all the discussion on foundation-l about contributors and
 attribution, I have noted that while there are two different
 implementations for blaming MediaWiki articles, neither of them seems to be
 publicly available.
 There are some example results, but not the tools themselves.

 The implementations I am aware of are:
 *Roman Nosov's (svn user roman) blamemap extension (2006-2007), which was
 available at
 http://217.147.83.36:9001/wiki/Freebsd?trackchanges=blamemap&oldid=1524

 *Greg Hewgill's wikiblame (2008)
 http://hewgill.com/journal/entries/461-wikipedia-blame

 Is the code available and have I missed it? Do we have any other
 implementation?







___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l