Re: [Wikitech-l] $wgRestrictionLevels
On 24.11.2008 19:27, Roan Kattouw <[EMAIL PROTECTED]> wrote:

> K. Peachey wrote:
>> I created a new usergroup in my MediaWiki install called Staff, and I
>> want to set up a restriction level so I can set page edit
>> restrictions/protection. But the manual page about $wgRestrictionLevels
>> <http://www.mediawiki.org/wiki/Manual:$wgRestrictionLevels> doesn't
>> show much information about it, so I was wondering how I would go about
>> setting it up?
>
> $wgRestrictionLevels[] = 'staff';
>
> in LocalSettings.php

Remember that values in $wgRestrictionLevels are rights and *not* groups (except for 'sysop', which is kept for backward compatibility). To use this, you need to add something like this in LocalSettings.php:

$wgGroupPermissions['staff']['staff'] = true;

Alexandre Emsenhuber (ialex)

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
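Putting the two pieces of advice above together, a minimal LocalSettings.php sketch might look like this (the extra 'protect' grant is an assumption, not part of the thread):

```php
<?php
// LocalSettings.php (excerpt) — sketch combining the advice above.
// Add a new protection level named 'staff'; entries here are *rights*.
$wgRestrictionLevels[] = 'staff';

// Grant the 'staff' right to members of the 'staff' user group, so they
// can edit pages protected at the 'staff' level.
$wgGroupPermissions['staff']['staff'] = true;

// Illustrative extra: let staff members also apply and remove protection.
$wgGroupPermissions['staff']['protect'] = true;
```

With this in place, a sysop (or anyone with 'protect') can choose "staff" as a protection level on Special:Protect-style forms, and only users holding the 'staff' right can edit such pages.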
Re: [Wikitech-l] Serving as xhtml+xml
On 19.3.2009 4:46, lee worden wrote:

> Attached is a patch for the skins directory that allows changing the
> Content-type dynamically. After applying this patch, if any code sets the
> global $wgServeAsXHTML to true, the page will be output with the xhtml+xml
> content type. This seems to work fine with the existing MW XHTML pages.

Can't you just set

$wgMimeType = 'application/xhtml+xml';

in LocalSettings.php to serve pages with the application/xhtml+xml content type?

Alexandre Emsenhuber (ialex)
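The dynamic behaviour the patch aims for could also be approximated in configuration rather than in the skin layer; a hedged sketch (not the attached patch) that only switches the MIME type when the client advertises support for it:

```php
<?php
// LocalSettings.php (excerpt) — illustrative sketch, not the patch from the
// thread: serve application/xhtml+xml only to browsers whose Accept header
// claims to support it, and fall back to text/html otherwise.
if ( isset( $_SERVER['HTTP_ACCEPT'] )
	&& strpos( $_SERVER['HTTP_ACCEPT'], 'application/xhtml+xml' ) !== false
) {
	$wgMimeType = 'application/xhtml+xml';
}
```

The caveat either way: browsers with no XHTML support treat an unconditional application/xhtml+xml response as a download, which is why the Accept check matters.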
Re: [Wikitech-l] New preferences system
On 24 Apr 09 at 16:15, John Doe wrote:

> thanks, I take this as the first step in creating global preferences?

Global preferences were added with this rewrite. There is just a checkbox saying "use these preferences on all projects" at the bottom of Special:Preferences.

Alexandre Emsenhuber (ialex)
Re: [Wikitech-l] $this->html('bodytext')
This is OutputPage::$mBodytext from OutputPage.php ($out is generally $wgOut in this context).

cheers,
Alexandre Emsenhuber (ialex)

On 29 May 09 at 11:32, magggus wrote:

> Hi @all,
>
> Could someone please tell me how the value of 'bodytext' in MonoBook.php
> page content is set?
> It seems to reference the variable $mBodytext in SkinTemplate.php
> ($tpl->setRef( 'bodytext', $out->mBodytext );)
>
> But I can't figure out where $mBodytext is filled.
>
> greets & thanks
> magggus
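To make the flow concrete, here is a simplified sketch (not the actual MediaWiki source) of how the body HTML travels from the parser to the skin; the method names follow the MediaWiki 1.x API:

```php
<?php
// Sketch of the 'bodytext' pipeline described above (MediaWiki 1.x era).

// 1. During page rendering, parsed HTML is accumulated on the OutputPage
//    object; addHTML() appends to OutputPage::$mBodytext.
$wgOut->addHTML( $parserOutput->getText() );

// 2. SkinTemplate then hands the accumulated HTML to the template engine
//    (this is the line quoted in the question):
$tpl->setRef( 'bodytext', $out->mBodytext );

// 3. Finally, MonoBook.php prints it inside the page layout:
$this->html( 'bodytext' );
```

So $mBodytext is "filled" piecemeal by every addHTML()/addWikiText() call made while the page is being prepared, not in any single place.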
Re: [Wikitech-l] Is there any evidence that the tests in phase3/maintenance/Tests ever worked? (errata)
The code that starts the tests is in PHPUnit (which is required to run these tests) and is triggered by the line "require( 'PHPUnit/TextUI/Command.php' );" in run-tests.php.

Cheers!
Alexandre Emsenhuber

On 3 Aug 09 at 20:06, dan nessett wrote:

> That is, the tests in ../phase3/tests.
>
> --- On Mon, 8/3/09, dan nessett wrote:
>
>> From: dan nessett
>> Subject: [Wikitech-l] Is there any evidence that the tests in
>> phase3/maintenance/Tests ever worked?
>> To: wikitech-l@lists.wikimedia.org
>> Date: Monday, August 3, 2009, 11:02 AM
>>
>> I am working on the tests in ../phase3/maintenance/tests. I have found
>> 2 problems (one of which may be only locally relevant). When I follow
>> the logic initiated in run-tests.php (which, according to the target in
>> the Makefile, seems to be the initiator of the tests), there are a lot
>> of includes, but nothing in them appears to lead to anything that might
>> start the tests.
>>
>> I am beginning to suspect that these tests never worked. However, I am
>> open to correction. Has anyone ever run these tests successfully?
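The mechanism is easy to miss because nothing in MediaWiki's own includes calls the tests; a sketch of what happens, assuming the PEAR-style PHPUnit 3.x layout of the time:

```php
<?php
// Sketch of run-tests.php's trigger mechanism (PHPUnit 3.x, PEAR layout).
// Command.php uses the old PEAR "main method" pattern: unless the including
// script defines PHPUnit_MAIN_METHOD itself, merely requiring the file runs
// PHPUnit_TextUI_Command::main() at file scope — which starts the test run.
$_SERVER['argv'] = array( 'phpunit', 'MediaWikiTest' ); // arguments for the run
require( 'PHPUnit/TextUI/Command.php' ); // side effect: executes the tests
```

This is why following the include chain "leads nowhere": the start of the test run is a side effect of the require, not an explicit call in MediaWiki code.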
Re: [Wikitech-l] MW test infrastructure architecture
On 11 Aug 09 at 22:03, Chad wrote:

> To be perfectly honest, I'm of the opinion that tests/ and t/
> should be scrapped and it should all be done over, properly.
>
> What we need is an easy and straightforward way to write
> test cases, so people are encouraged to write them. Right
> now, nobody understands wtf is going on in tests/ and t/, so
> they get ignored and the /vast/ majority of the code isn't tested.
>
> What we need is something similar to parser tests, where it's
> absurdly easy to pop new tests in with little to no coding
> required at all. Also, extensions having the ability to inject
> their own tests into the framework is a must IMHO.
>
> -Chad

+1. We could maybe write our own test system based on the new Maintenance class, since we already have some test scripts in /maintenance/ (cdb-test.php, fuzz-tester.php, parserTests.php, preprocessorFuzzTest.php and syntaxChecker.php). Porting tests such as the parser tests to PHPUnit is a pain, since it has no native way to write a test suite with an unknown number of tests to run.

Alexandre Emsenhuber (ialex)
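For context on the "unknown number of tests" complaint: the closest PHPUnit idiom is a data provider that enumerates cases at runtime; a hedged sketch (class, method and directory names are illustrative, not from MediaWiki):

```php
<?php
// Sketch: feeding a variable number of cases to PHPUnit via a data
// provider (PHPUnit 3.x style). Each file found becomes one test case.
class ParserFileTest extends PHPUnit_Framework_TestCase {

	public static function provideCases() {
		$cases = array();
		foreach ( glob( dirname( __FILE__ ) . '/cases/*.txt' ) as $file ) {
			$cases[] = array( $file );
		}
		return $cases;
	}

	/**
	 * @dataProvider provideCases
	 */
	public function testCase( $file ) {
		// A real test would parse the file and compare input to expected
		// output; asserting existence keeps the sketch self-contained.
		$this->assertFileExists( $file );
	}
}
```

The friction is that the provider must be enumerable before the run starts, which fits a directory of files but maps awkwardly onto parserTests.txt, where one file holds many tests discovered while streaming it.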
Re: [Wikitech-l] MW test infrastructure architecture
On 11 Aug 09 at 22:51, dan nessett wrote:

> --- On Tue, 8/11/09, Alexandre Emsenhuber wrote:
>
>> +1. We could maybe write our own test system based on the
>> new Maintenance class, since we already have some test scripts in
>> /maintenance/ (cdb-test.php, fuzz-tester.php, parserTests.php,
>> preprocessorFuzzTest.php and syntaxChecker.php). Porting tests such
>> as the parser tests to PHPUnit is a pain, since it has no native way
>> to write a test suite with an unknown number of tests to run.
>
> Rewriting parserTests as PHP unit tests would be a horrible waste of
> time. parserTests works and it provides a reasonable service. One
> problem, however, is how do we fix the parser? It seems to be a
> pretty complex code system (when I ran a MacGyver test on
> parserTests, 141 files were accessed, most of which are associated
> with the parser). I have been thinking about this, but those
> thoughts are not yet sufficiently clear to make public.
>
> On the other hand, taking the parserTests route and doing all of our
> own test infrastructure would also be a good deal of work. There are
> tools out there (phpunit and prove) that are useful. In my view,
> creating a test infrastructure from scratch would unnecessarily
> waste time and resources.
>
> Dan

My idea is to move the "backend" of ParserTest (parserTests.txt file processing, result reporting, ...) and the TestRecorder stuff to something like a MediaWikiTests class that extends Maintenance, move the rest into a file in /maintenance/tests/ (to be created), and re-use the backend for files in the same format, but whose test input could be raw PHP code (a bit like PHP core's tests), with a new config variable like $wgParserTestFiles but for this kind of test. This mostly concerns the actual tests in /tests/ and /t/inc/.
We could also port cdb-test.php, fuzz-tester.php, preprocessorFuzzTest.php and syntaxChecker.php to this new system and then create a script in /maintenance/ that runs all the tests in /maintenance/tests/. This would also allow uploading all the tests to CodeReview, not only the parser tests. A benefit is that we can get rid of /tests/ and /t/.

Alexandre Emsenhuber (ialex)
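The runner proposed above could be sketched as a Maintenance subclass; everything here is hypothetical (class name, file layout and the MediaWikiTests name are taken from the proposal, the rest is illustrative):

```php
<?php
// Hypothetical sketch of the proposed runner: a Maintenance-based script in
// /maintenance/ that executes every test file under /maintenance/tests/.
require_once( dirname( __FILE__ ) . '/Maintenance.php' );

class MediaWikiTests extends Maintenance {

	public function __construct() {
		parent::__construct();
		$this->mDescription = "Run all test files in maintenance/tests/";
	}

	public function execute() {
		foreach ( glob( dirname( __FILE__ ) . '/tests/*.php' ) as $file ) {
			$this->output( "Running $file\n" );
			// Each test file would report its results through a shared
			// TestRecorder backend, as the parser tests already do.
			require( $file );
		}
	}
}

$maintClass = 'MediaWikiTests';
require_once( DO_MAINTENANCE );
```

The appeal of this shape is that option parsing, output and setup come for free from Maintenance, and extensions could register extra test files the same way they register parser test files today.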