Re: [Wikitech-l] Developer Meet-Up in Berlin, April 14-16

2010-02-06 Thread Roan Kattouw
2010/2/5 Trevor Parscal tpars...@wikimedia.org:
 Do we know the location of the conference yet? I'm sure most of us want to
 book hotels as early as possible, and as close to the venue as possible.

WMDE let people book their lodging through them last year, and Daniel
seems to imply this'll be done again this year, so I guess folks can
always go that route.

Roan Kattouw (Catrope)

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] New phpunit tests eat ~1GB of memory

2010-02-06 Thread Tei
off-topic-ish

There's also a function to explicitly call the collector (sorry, I forget the name).

It seems PHP only flags things for collecting (when you unset($stuff)),
but never really collects them. The documentation says the collector
will run when there's no work to do, but that seems to be a very
rare event (maybe it's never triggered).
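
The function in question is presumably gc_collect_cycles(), added in
PHP 5.3 alongside gc_enable(). A minimal sketch of forcing a collection,
just to illustrate:

    <?php
    gc_enable();                    // turn the cycle collector on (PHP 5.3+)

    $a = new stdClass();
    $b = new stdClass();
    $a->other = $b;                 // build a reference cycle
    $b->other = $a;
    unset( $a, $b );                // refcounts never drop to zero here

    $freed = gc_collect_cycles();   // run the collector explicitly
    echo "Collected $freed cycles\n";

Without the explicit call, the collector normally only kicks in once the
internal root buffer fills up (10,000 possible roots by default), which
would explain the "very rare event" behaviour.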


On 6 February 2010 03:37, Jared Williams jared.willia...@ntlworld.com wrote:

 A guess would be to try PHP 5.3, and enable the garbage collector.

 http://www.php.net/manual/en/function.gc-enable.php

 Jared

 -----Original Message-----
 From: wikitech-l-boun...@lists.wikimedia.org
 [mailto:wikitech-l-boun...@lists.wikimedia.org] On Behalf Of
 Ævar Arnfjörð Bjarmason
 Sent: 06 February 2010 01:05
 To: wikitech-l@lists.wikimedia.org
 Cc: mediawiki-...@lists.wikimedia.org
 Subject: [Wikitech-l] New phpunit tests eat ~1GB of memory

 Since the tests were ported from t/ to phpunit's
 phase3/maintenance/tests/ in r61938 and other commits, running
 the tests on my machine takes up to 1GB of memory and grows
 as it runs more tests. It seems that phpunit uses the same
 instance of the php interpreter for running all the tests.

 Is there some way around this? Perhaps phpunit.xml could be
 tweaked so that it runs a new php for each test?

 Furthermore when I run `make test' I get this:

     Time: 03:35, Memory: 1849.25Mb

     There were 2 failures:

     1) LanguageConverterTest::testGetPreferredVariantUserOption
     Failed asserting that two strings are equal.
     --- Expected
     +++ Actual
     @@ @@
     -tg-latn
     +tg


 /home/avar/src/mw/trunk/phase3/maintenance/tests/LanguageConverterTest.php:82

     2) Warning
     No tests found in class ParserUnitTest.

     FAILURES!
     Tests: 686, Assertions: 3431, Failures: 2, Incomplete: 34

 But when I run phpunit manually on the test then all tests pass:

     $ phpunit LanguageConverterTest.php
     PHPUnit 3.4.5 by Sebastian Bergmann.

     .

     Time: 23 seconds, Memory: 23.75Mb

     OK (9 tests, 34 assertions)

 Also after I get "Tests: 686, Assertions: 3431, Failures: 2,
 Incomplete: 34" in the first output phpunit doesn't exit and
 continues hogging my memory. Why is it still running? It has
 already run all the tests.

 On Wed, Feb 3, 2010 at 17:35,  ia...@svn.wikimedia.org wrote:
  http://www.mediawiki.org/wiki/Special:Code/MediaWiki/61938
 
  Revision: 61938
  Author:   ialex
  Date:     2010-02-03 17:35:59 +0000 (Wed, 03 Feb 2010)
 
  Log Message:
  -----------
  * Port tests from t/inc/
  * Added new tests to XmlTest
 
  Added Paths:
  -----------
     trunk/phase3/tests/LicensesTest.php
     trunk/phase3/tests/SanitizerTest.php
     trunk/phase3/tests/TimeAdjustTest.php
     trunk/phase3/tests/TitleTest.php
     trunk/phase3/tests/XmlTest.php
 
  Added: trunk/phase3/tests/LicensesTest.php
  ===================================================================
  --- trunk/phase3/tests/LicensesTest.php                 (rev 0)
  +++ trunk/phase3/tests/LicensesTest.php 2010-02-03 17:35:59 UTC (rev 61938)
  @@ -0,0 +1,17 @@
  +<?php
  +
  +/**
  + * @group Broken
  + */
  +class LicensesTest extends PHPUnit_Framework_TestCase {
  +
  +       function testLicenses() {
  +               $str = "
  +* Free licenses:
  +** GFLD|Debian disagrees
  +";
  +
  +               $lc = new Licenses( $str );
  +               $this->assertTrue( is_a( $lc, 'Licenses' ), 'Correct class' );
  +       }
  +}
  \ No newline at end of file
 
 
  Property changes on: trunk/phase3/tests/LicensesTest.php
  ___________________________________________________________________
  Added: svn:eol-style
    + native
 
  Added: trunk/phase3/tests/SanitizerTest.php
  ===================================================================
  --- trunk/phase3/tests/SanitizerTest.php                        (rev 0)
  +++ trunk/phase3/tests/SanitizerTest.php        2010-02-03 17:35:59 UTC (rev 61938)
  @@ -0,0 +1,71 @@
  +<?php
  +
  +global $IP;
  +require_once( "$IP/includes/Sanitizer.php" );
  +
  +class SanitizerTest extends PHPUnit_Framework_TestCase {
  +
  +       function testDecodeNamedEntities() {
  +               $this->assertEquals(
  +                       "\xc3\xa9cole",
  +                       Sanitizer::decodeCharReferences( '&eacute;cole' ),
  +                       'decode named entities'
  +               );
  +       }
  +
  +       function testDecodeNumericEntities() {
  +               $this->assertEquals(
  +                       "\xc4\x88io bonas dans l'\xc3\xa9cole!",
  +                       Sanitizer::decodeCharReferences( "&#x108;io bonas dans l'&#233;cole!" ),
  +                       'decode numeric entities'
  +               );
  +       }
  +
  +       function testDecodeMixedEntities() {
  +               $this->assertEquals(
  +                       "\xc4\x88io bonas dans l'\xc3\xa9cole!",
  +                       Sanitizer::decodeCharReferences( "&#x108;io bonas dans l'&eacute;cole!" ),

Re: [Wikitech-l] Theora video in IE? Use Silverlight!

2010-02-06 Thread Liangent
On 2/6/10, David Gerard dger...@gmail.com wrote:

 The thirty-second startup time of Java for Cortado makes it unusable,
 in my experience. Here's to Firefox 3.5.


Firefox has supported native video playback since 3.5, so there's no need
to use Cortado there.



Re: [Wikitech-l] New phpunit tests eat ~1GB of memory

2010-02-06 Thread Ævar Arnfjörð Bjarmason
On Sat, Feb 6, 2010 at 01:04, Ævar Arnfjörð Bjarmason ava...@gmail.com wrote:
 [original report snipped]

I've worked around this by adding a 'make tap' target which runs the
phpunit tests individually with Test::Harness. I made it the default
target due to the problems with running all the tests at once with
phpunit:

http://www.mediawiki.org/wiki/Special:Code/MediaWiki/62071
http://www.mediawiki.org/wiki/Special:Code/MediaWiki/62072
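
(Another possibility, assuming the installed PHPUnit is new enough to
have the --process-isolation switch, would be to let phpunit itself
start a fresh interpreter per test:

    $ phpunit --process-isolation LanguageConverterTest.php

but I haven't checked how well that behaves with our suite.)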

Does something run these tests or the parsertests automatically? It
would be really neat to test all svn revisions of MediaWiki and report
the results on Special:Code. I think I read somewhere that something
runs the parsertests automatically.


Re: [Wikitech-l] New phpunit tests eat ~1GB of memory

2010-02-06 Thread Chad
On Sat, Feb 6, 2010 at 11:24 AM, Ævar Arnfjörð Bjarmason
ava...@gmail.com wrote:
 [earlier messages snipped]
 Does something run these tests or the parsertests automatically? It
 would be really neat to test all svn revisions of MediaWiki and report
 the results on Special:Code. I think I read somewhere that something
 runs the parsertests automatically.


It's supposed to be running the parser tests and uploading them on
commit, but that's been broken for a little while now. If we've got a
nice standard output from the tests (I think the XML is pretty well
suited for this), we should be able to upload that result to Code Review.

-Chad


Re: [Wikitech-l] New phpunit tests eat ~1GB of memory

2010-02-06 Thread Ævar Arnfjörð Bjarmason
On Sat, Feb 6, 2010 at 16:27, Chad innocentkil...@gmail.com wrote:
 [earlier thread snipped]

 It's supposed to be running the parser tests and uploading them on
 commit, but that's been broken for a little while now.

What system is this that's running automatic tests on commits? I was
investigating setting up a buildbot (http://buildbot.net/) which could
have multiple test clients and report results to IRC and as XML, which
Special:Code could then use.

What does the now-broken Special:Code test system use?

 If we've got a
 nice standard output from the tests (I think the XML is pretty well
 suited for this), we should be able to upload that result to Code Review.

$ prove -j 10 -e 'phpunit --tap' -Q *Test*.php
All tests successful.
Files=20, Tests=692, 31 wallclock secs ( 0.34 usr  0.21 sys +
18443939634.30 cusr 2803481.20 csys = 18446743116.05 CPU)
Result: PASS

You can get pretty HTML like this:

$ prove --formatter TAP::Formatter::HTML -j 10 -e 'phpunit --tap' \
    -Q *Test*.php > ~/www/mw-tap-out.html

Which gives you something like this:

http://v.nix.is/~avar/mw-tap-out.html

That can be parsed with any XML parser; it just has to look for
<td class="results"> and <div id="summary" class="passed"> or
<div id="summary" class="failed">.
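
For example (a rough sketch, assuming the report keeps exactly that markup):

    <?php
    // Check a TAP::Formatter::HTML report for overall pass/fail.
    $doc = new DOMDocument();
    // The report may not be well-formed XML, so parse it leniently as HTML.
    @$doc->loadHTMLFile( 'mw-tap-out.html' );

    $xpath = new DOMXPath( $doc );
    $summary = $xpath->query( '//div[@id="summary"]' )->item( 0 );
    if ( $summary === null ) {
        die( "No summary element found\n" );
    }
    echo $summary->getAttribute( 'class' ) === 'passed' ? "PASS\n" : "FAIL\n";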


Re: [Wikitech-l] Version control

2010-02-06 Thread Daniel Friesen
I believe we've had two discussions in the past on switching to git.

I talked to Tim about various other advantages of git, like the lack of
the autoprops annoyance, and corrected the notion that there isn't a
Windows client; his response was "maybe in a year or two".

Generally the limitation is that we're currently abusing svn's ability
to check out specific directories rather than an entire repo; that makes
it easy to check out all the extensions, or individual ones, without any
trouble.
We've had ideas like using git submodules to mark stable versions of
extensions so extension repos can be flexibly checked out.

Oh, something interesting: with a bit of trickery I recently managed to
splice the entire history of one git repo into a branch of another git
repo, creating a git repo that has two separate initial commits in two
separate branches. And from the looks of it, it's perfectly possible to
fetch history from the original repo into the proper branch. So it
should be possible to create a script that fetches history updates for
every extension at once by embedding them all into separate branches of
a single git repo, and then locally (with no network latency) pulling
the history from those branches into the real repos.
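
Roughly like this (repository names here are made up):

    $ cd all-extensions.git
    $ git fetch /path/to/ExtensionFoo.git master:ExtensionFoo  # foreign history becomes a local branch
    $ git fetch /path/to/ExtensionBar.git master:ExtensionBar
    $ cd ../ExtensionFoo
    $ git pull ../all-extensions.git ExtensionFoo              # purely local, no network latency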

~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]

Max Semenik wrote:
 Since there are some talks about migration from SVN anyway[1], I
 decided to unshelf my essay on this matter.[2] It discusses possible
 alternatives to Subversion and is in no way complete, everyone is
 invited to participate in drafting and discussing. Some sections need
 input of those who have practical experience with those systems, as
 for example I've never used Bazaar.

 --
 [1] http://www.mediawiki.org/w/index.php?curid=44222&diff=301482&oldid=301369
 [2] http://www.mediawiki.org/wiki/Source_control_considerations

   


-- 
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]



Re: [Wikitech-l] Version control

2010-02-06 Thread Ævar Arnfjörð Bjarmason
On Sat, Feb 6, 2010 at 21:13, Daniel Friesen li...@nadir-seen-fire.com wrote:
 [quoted message snipped]

It's interesting that the #1 con against Git in that document is "Lots
of annoying Git/Linux fanboys".

I guess this is as good a time as any to plug the git-svn notes I
scribbled down yesterday: http://www.mediawiki.org/wiki/Git

In order to convert to Git it would help to collect a list of things
that should be split into separate repositories:

 * /USERINFO, /civicrm and /wikimedia-web
 * Everything in /trunk/*
 * Additionally /trunk/extensions/* and maybe some /trunk/tools/*

That should yield around 500 repositories. That might sound crazy, but
best practice for any distributed version control system is that
repositories should be split at boundaries that code doesn't cross, and
when's the last time /trunk/FOO shared code with /trunk/extensions/BAR,
for instance?

And if someone really wants to check out all 430 extensions, that's
easy enough with an extensions project with 430 submodules, but the
most common case should be someone checking out MediaWiki.git + 3-5
extensions.
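
For a user that might look something like this (URLs hypothetical):

    $ git clone git://git.example.org/MediaWiki.git
    $ cd MediaWiki/extensions
    $ git clone git://git.example.org/extensions/ParserFunctions.git
    $ git clone git://git.example.org/extensions/Cite.git

And the all-extensions case via the submodule meta-project:

    $ git clone git://git.example.org/all-extensions.git extensions
    $ cd extensions
    $ git submodule update --init    # fetches every extension repo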

I'm doing some experiments with splitting up MediaWiki's Git mirror[1]
using git-filter-branch[2]. It takes a *long* time with this huge
repository, but a working conversion is the fastest way to get people
on board.
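
The split itself is essentially just, per extension, in a throwaway
clone of the mirror (taking Cite as an example):

    $ git filter-branch --subdirectory-filter trunk/extensions/Cite -- --all

which rewrites every branch so trunk/extensions/Cite becomes the new
repository root, keeping only the commits that touched it.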

Of course this is a great chance to clean up some aspects of the
repository, such as:

 * Rewrite the commits to give everyone real names / emails, like Tim
Starling / tstarl...@wikimedia.org instead of tstarling. This can be
done automatically by parsing the USERINFO files and adding to them
where appropriate (see the sketch after this list).
 * Combine users like magnusmanske and magnus_manske into one
 * Rename/drop branches/tags if someone wants that
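
The author rewriting would be git-filter-branch again, something like
this (the mapping shown is a stand-in for whatever the USERINFO parsing
produces; the address is a placeholder):

    $ git filter-branch --env-filter '
        if test "$GIT_AUTHOR_NAME" = "tstarling"; then
            export GIT_AUTHOR_NAME="Tim Starling"
            export GIT_AUTHOR_EMAIL="tstarling@example.org"
        fi
    ' -- --all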

1. http://gitorious.org/mediawiki-svn-mirror
2. http://stackoverflow.com/questions/359424/detach-subdirectory-into-separate-git-repository
