On Fri, 30 Dec 2011 20:11:30 +, Dan Nessett wrote:
> I have poked around a bit (using Google), but have not found
> instructions for setting up the MW regression test framework (e.g.,
> CruiseControl or Jenkins or whatever is now being used + PHPUnit tests +
> Selenium tests
My configuration:
OS: CentOS 5.7
MW revision: 108821
PHP: 5.3.3
PHPUnit: 3.6.7
DB: Postgres 8.3.9
--
-- Dan Nessett
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
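For context, a minimal sketch of how a run of the MediaWiki PHPUnit suite was launched in that era. The checkout path is an assumption; the command is only composed and printed here, not executed:

```shell
# Sketch: the PHPUnit entry point lived under tests/phpunit/ in the
# MediaWiki layout of this period. Adjust MW_ROOT to your checkout.
MW_ROOT=/var/www/mediawiki
PHPUNIT_CMD="php $MW_ROOT/tests/phpunit/phpunit.php --group Database"
echo "$PHPUNIT_CMD"
```

The `--group Database` filter is one commonly cited invocation; running without it executes the whole suite.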
On Thu, 12 Jan 2012 17:59:09 +, Dan Nessett wrote:
> On Thu, 12 Jan 2012 16:17:00 +0100, Antoine Musso wrote:
>
>> Hello,
>>
>> I have added a new continuous integration job to check our postgres
>> support.
>> This is exactly the same job as MediaWiki-p
Any idea why the local run has
different results than the automated run?
I have attached the most recent run output to bug 33663.
--
-- Dan Nessett
https://integration.mediawiki.org/ci/job/MediaWiki-postgres-phpunit/
>
> As of now, there are two tests failing.
Excellent. Thank you for this.
--
-- Dan Nessett
On Wed, 11 Jan 2012 11:17:33 +0100, Antoine Musso wrote:
> Le Wed, 11 Jan 2012 02:31:47 +0100, Dan Nessett a
> écrit:
>> Sure, I can post the results, but I don't think I should just dump them
>> into this list (there are over 700 lines of output). How would you li
On Tue, 10 Jan 2012 23:53:25 +0100, Platonides wrote:
> On 10/01/12 19:52, Dan Nessett wrote:
>> I gave up on Ubuntu 8.04 and moved to Centos 5.7. After getting make
>> safe to work, I get 27 failures and 14 incomplete tests. This is for
>> revision 108474. Is there any w
On Mon, 09 Jan 2012 09:26:24 +0100, Krinkle wrote:
> On Fri, Jan 6, 2012 at 8:06 PM, OQ wrote:
>
>> On Fri, Jan 6, 2012 at 12:56 PM, Dan Nessett
>> wrote:
>> > On Thu, 05 Jan 2012 14:03:14 -0600, OQ wrote:
>> >> uninstall the pear version and d
On Thu, 05 Jan 2012 14:03:14 -0600, OQ wrote:
> On Thu, Jan 5, 2012 at 1:34 PM, Dan Nessett wrote:
>> So, I upgraded PHPUnit (using pear upgrade phpunit/PHPUnit). This
>> installed 3.4.15 (not 3.5).
>>
>> I am running on Ubuntu 8.04. Anyone have an idea how to g
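A hedged sketch of the PEAR-era fix that was usually suggested for a stale distro PHPUnit: drop the old package and install from the upstream channel. The channel name is as documented for PHPUnit 3.x; the commands are only collected and printed here, not executed:

```shell
# Sketch only (assumes PEAR is installed); these steps replaced a
# packaged PHPUnit 3.4 with a newer release from the project's channel.
PEAR_STEPS='pear uninstall phpunit/PHPUnit
pear channel-discover pear.phpunit.de
pear install phpunit/PHPUnit'
printf '%s\n' "$PEAR_STEPS"
```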
On Fri, 30 Dec 2011 21:29:53 +0100, Roan Kattouw wrote:
> On Fri, Dec 30, 2011 at 9:11 PM, Dan Nessett wrote:
>> I have poked around a bit (using Google), but have not found
>> instructions for setting up the MW regression test framework (e.g.,
>> CruiseControl or Jenki
patches to Bugzilla). Do such instructions exist and if so,
would someone provide a pointer to them?
Thanks,
--
-- Dan Nessett
On Wed, 07 Dec 2011 12:54:22 +1100, Tim Starling wrote:
> On 07/12/11 12:34, Dan Nessett wrote:
>> On Wed, 07 Dec 2011 12:15:41 +1100, Tim Starling wrote:
>>> How many servers do you have?
>>
>> 3. It would help to get it down to 2.
>>
>> I assume m
On Wed, 07 Dec 2011 12:15:41 +1100, Tim Starling wrote:
> On 07/12/11 09:50, Dan Nessett wrote:
>> OK. Call it something else. The motivation for my question is getting
>> server costs under control. Moving as much processing as possible
>> client side is one way to achieve
way to convert the wiki data
from the old format (i.e., mediawiki markup) to the new.
Dan Nessett
From: Neil Kandalgaonkar
To: Wikimedia developers
Cc: Dan Nessett
Sent: Tuesday, December 6, 2011 4:18 PM
Subject: Re: [Wikitech-l] Mediawiki 2.0
On 12/6/
On Wed, 07 Dec 2011 09:26:50 +1100, Tim Starling wrote:
> On 07/12/11 08:55, Dan Nessett wrote:
>> This is a (admittedly long and elaborate) question, not a proposal. I
>> ask it in order to learn whether anyone has given it or something like
>> it some thought.
>>
On Wed, 07 Dec 2011 07:59:26 +1000, K. Peachey wrote:
> http://www.mediawiki.org/wiki/Project:2.0
Thanks. I have moved my comments to that page's discussion.
--
-- Dan Nessett
ontinually break this support.
--
-- Dan Nessett
are other constraints that I must observe. I have read
the material at http://www.mediawiki.org/wiki/Localisation, but did not
find the above constraints mentioned there. Is there another place where
they are specified?
--
-- Dan Nessett
On Fri, 04 Feb 2011 15:01:20 +0100, Krinkle wrote:
> Op 3 feb 2011, om 22:42 heeft Dan Nessett het volgende geschreven:
>
>> On Thu, 03 Feb 2011 21:19:58 +, Dan Nessett wrote:
>>
>>> Our site has 4 skins that display the logo - 3 standard and 1 site-
>>> s
On Thu, 03 Feb 2011 13:52:30 -0800, Brion Vibber wrote:
> On Thu, Feb 3, 2011 at 1:19 PM, Dan Nessett wrote:
>
>> Our site has 4 skins that display the logo - 3 standard and 1 site-
>> specific. The site-specific skin uses rounded edges for the individual
>> page area fr
On Thu, 03 Feb 2011 21:19:58 +, Dan Nessett wrote:
> Our site has 4 skins that display the logo - 3 standard and 1 site-
> specific. The site-specific skin uses rounded edges for the individual
> page area frames, while the standard skins use square edges. This means
> a logo
scheme or a different
font).
My question is: has this issue been addressed before? If so, and there is
a good solution, I would appreciate hearing of it.
Regards,
--
-- Dan Nessett
On Thu, 23 Sep 2010 20:13:23 -0700, Brion Vibber wrote:
> On Thu, Sep 23, 2010 at 7:19 PM, Dan Nessett wrote:
>
>> I appreciate your recent help, so I am going to ignore the tone of your
>> last message and focus on issues. While a test run can set up, use and
>> t
On Thu, 23 Sep 2010 18:10:37 -0700, Brion Vibber wrote:
> On Thu, Sep 23, 2010 at 4:03 PM, Dan Nessett wrote:
>
>> Thinking about this a bit, we seem to have come full circle. If we use
>> a URL per regression test run, then we need to multiplex wiki
>> resources. When
On Thu, 23 Sep 2010 15:50:48 -0700, Brion Vibber wrote:
> On Thu, Sep 23, 2010 at 2:54 PM, Dan Nessett wrote:
>
>> On Thu, 23 Sep 2010 14:41:32 -0700, Brion Vibber wrote:
>> > There's no need to have a fixed set of URLs; just as with Wikimedia's
>> > publ
On Thu, 23 Sep 2010 14:41:32 -0700, Brion Vibber wrote:
> On Thu, Sep 23, 2010 at 2:31 PM, Dan Nessett wrote:
>
>> Not sure I get this. Here is what I understand would happen when a
>> developer checks in a revision:
>>
>> + A script runs that manages the vario
enforce access control on the URLs.
+ Once you have an idle URL, you can initialize the wiki per your
previous comments, including loading the revision into the directory
associated with the URL.
How does this fit into the idea of using a wiki per regression test or
regression test run?
--
-- Dan Ne
On Thu, 23 Sep 2010 10:29:58 -0700, Brion Vibber wrote:
> On Thu, Sep 23, 2010 at 9:46 AM, Dan Nessett wrote:
>
>> I am very much in favor of keeping it simple. I think the issue is
>> whether we will support more than one regression test (or individual
>> test associate
ients as you like.
>
> For small test subsets that are being used during testing the equation
> still doesn't change much: reset the wiki to known state, run the tests.
> Keep it simple!
>
> -- brion
>
>
> On Thursday, September 23, 2010, Dan Nessett wrote:
>>
On Wed, 22 Sep 2010 12:30:35 -0700, Brion Vibber wrote:
> On Wed, Sep 22, 2010 at 11:09 AM, Dan Nessett
> wrote:
>
>> Some have mentioned the possibility of using the wiki family logic to
>> help achieve these objectives. Do you have any thoughts on this? If you
>> t
sts will be freely
> runnable on existing instances that are used for development and
> testing, but if you want to work with a blank slate wiki exposed to web
> clients, keep things simple and just make a dedicated instance.
>
> -- brion
>
>
> On Wed, Sep 22, 2010 a
reclaim the temporary resources identified in the entry before its state
is lost. This is a problem.
--
-- Dan Nessett
On Wed, 22 Sep 2010 18:57:12 +0200, Bryan Tong Minh wrote:
> On Wed, Sep 22, 2010 at 6:47 PM, Dan Nessett wrote:
>>
>> I think the object cache and memcached are alternative ways of storing
>> persistent data. (I also am not an expert in this, so I could be
>>
On Wed, 22 Sep 2010 19:35:31 +0200, Roan Kattouw wrote:
> 2010/9/22 Dan Nessett :
>> How does memcached fit into this? When I looked at BagOStuff, I didn't
>> find a MemcacheBagOStuff class. Is it defined elsewhere?
>>
> Either memcached.php, MemCached.php or M
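For reference, the stock LocalSettings.php switches for a memcached-backed object cache look roughly like this (standard MediaWiki settings; the server address is illustrative):

```php
// Use memcached as the main object cache; one or more host:port entries.
$wgMainCacheType      = CACHE_MEMCACHED;
$wgMemCachedServers   = array( '127.0.0.1:11211' );
$wgMemCachedPersistent = false;
```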
static
identifier. The requirements for parserTests and selenium tests are
significantly different. While we may be able to learn from the
parserTests code, we would have to change the parserTest code
significantly in order to use it. I think it would actually be less work
to start from scratch
On Mon, 20 Sep 2010 22:32:24 +0200, Platonides wrote:
> Dan Nessett wrote:
>> On Sun, 19 Sep 2010 23:42:08 +0200, Platonides wrote:
>>> You load originaldb.objectcache, retrieve the specific configuration,
>>> and switch into it.
>>> For supporting many sumy
On Sun, 19 Sep 2010 23:42:08 +0200, Platonides wrote:
> Dan Nessett wrote:
>> Platonides wrote:
>>> Dan Nessett wrote:
>>>>> What about memcached?
>>>>> (that would be a key based on the original db name)
>>>>
>>>> The st
On Sun, 19 Sep 2010 02:47:00 +0200, Platonides wrote:
> Dan Nessett wrote:
>>> What about memcached?
>>> (that would be a key based on the original db name)
>>
>> The storage has to be persistent to accommodate wiki crashes (e.g.,
>> httpd crash, serv
On Fri, 17 Sep 2010 19:13:33 +, Dan Nessett wrote:
> On Fri, 17 Sep 2010 18:40:53 +0000, Dan Nessett wrote:
>
>> I have been tasked to evaluate whether we can use the parserTests db
>> code for the selenium framework. I just looked it over and have serious
>> reservat
, server OS crash, power outage). It might be possible to use
memcachedb, but as far as I am aware that requires installing Berkeley
DB, which complicates deployment.
Why not employ the already installed DB software used by the wiki? That
prov
On Fri, 10 Sep 2010 23:11:27 +, Dan Nessett wrote:
> We are currently attempting to refactor some specific modifications to
> the standard MW code we use (1.13.2) into an extension so we can upgrade
> to a more recent maintained version. One modification we have keeps a
>
ia) using the same
code. I am not familiar with these mechanisms, so this approach requires
help from someone who is.
--
-- Dan Nessett
On Fri, 17 Sep 2010 21:05:12 +0200, Platonides wrote:
> Dan Nessett wrote:
>> Given this background, consider the following (and feel free to comment
>> on it):
>>
>> parserTests temporary table code:
>>
>> A fixed set of tables are specified in the code.
On Fri, 17 Sep 2010 18:40:53 +, Dan Nessett wrote:
> I have been tasked to evaluate whether we can use the parserTests db
> code for the selenium framework. I just looked it over and have serious
> reservations. I would appreciate any comments on the following analysis.
>
>
for a test of some other functionality to figure out which tables
to drop.
For these reasons, I don't think we can reuse the parserTests code.
However, I am open to arguments to the contrary.
--
-- Dan Nessett
On Fri, 10 Sep 2010 23:11:27 +, Dan Nessett wrote:
> We are currently attempting to refactor some specific modifications to
> the standard MW code we use (1.13.2) into an extension so we can upgrade
> to a more recent maintained version. One modification we have keeps a
>
of the top-level directory named 'selenium'. Is that an
official convention?
--
-- Dan Nessett
Perhaps it belongs in the core.
--
-- Dan Nessett
others are
doing.
Regards,
Dan
--- On Tue, 9/7/10, Mark A. Hershberger wrote:
> From: Mark A. Hershberger
> Subject: Re: Selenium Framework - test run configuration data
> To: "Wikimedia developers"
> Cc: "Dan Nessett"
> Date: Tuesday, September 7, 20
On Mon, 06 Sep 2010 23:15:06 -0400, Mark A. Hershberger wrote:
> Dan Nessett writes:
>
>> Last Friday, mah ripped out the globals and put the configuration
>> information into the execute method of RunSeleniumTests.php with the
>> comment "@todo Add an alternative
now requires reworking because the way configuration data is referenced
has changed. We need a decision on which of the two approaches to use.
--
-- Dan Nessett
On Sat, 07 Aug 2010 23:30:16 -0400, Mark A. Hershberger wrote:
> Dan Nessett writes:
>
>> I don't think walking through all the extensions looking for test
>> subdirectories and then running all tests therein is a good idea.
>> First, in a large installation wit
Establish the convention that it is pushed in the extension require()
file as well. Then all extensions with test suites would automatically
load them. To tailor this, the entries in $wgSeleniumTestSuites could be
removed in LocalSettings.
--
-- Dan Nessett
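The registration convention described above might look like this in practice. $wgSeleniumTestSuites is the variable from the proposal; the suite class name is invented for illustration:

```php
// In the extension's require() file: register the extension's suite so it
// loads automatically. (Hypothetical class name.)
$wgSeleniumTestSuites[] = 'PagedTiffHandlerSeleniumTestSuite';

// In LocalSettings.php: tailor the run by removing an unwanted suite.
$wgSeleniumTestSuites = array_diff(
	$wgSeleniumTestSuites,
	array( 'PagedTiffHandlerSeleniumTestSuite' )
);
```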
much like $wgGroupPermissions). We could use a global variable
$wgSelenium and move all selenium framework values into it. For example:
$wgSelenium['wiki']['host'] = 'localhost';
$wgSelenium['wiki']['wikiurl'] = false;
$wgSe
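One possible completion of the grouped-configuration sketch above. The 'host' and 'wikiurl' keys come from the original message; everything else is an assumption, not an existing API:

```php
// Grouped Selenium framework settings, in the style of $wgGroupPermissions.
$wgSelenium['wiki']['host']        = 'localhost';
$wgSelenium['wiki']['wikiurl']     = false;
$wgSelenium['server']['port']      = 4444; // assumed Selenium RC default
$wgSelenium['browsers']['firefox'] = '*firefox /usr/bin/firefox';
```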
to copy and paste the php
> tests in Selenium IDE to see what would happen and if I could run from
> there.
>
> Michelle Knight
>
>
>
> Message: 6
> Date: Thu, 27 May 2010 17:
icket.
Regards,
--
-- Dan Nessett
the error messages.
>
> Michelle Knight
>
>
>
> Message: 5
> Date: Tue, 18 May 2010 17:44:03 + (UTC)
> From: Dan Nessett
> Subject: Re: [Wikitech-l] Selenium testing framework
> To: wikitech-l@lists.wikimedia.org
since you are the
original architect of the framework, it is probably best for you to
comment on them first and perhaps suggest what you consider to be the
best approach to their resolution.
--
-- Dan Nessett
On Mon, 17 May 2010 20:16:35 +, Dan Nessett wrote:
> On Mon, 17 May 2010 19:11:21 +0000, Dan Nessett wrote:
>
>> During the meeting last Friday, someone (I'm sorry, I don't remember who)
>> mentioned he had created a test that runs with the currently checked in
>>
I can use and I will update
you on the status of my attempts to get PagedTiffHandler_tests.php to
work. As a teaser, it appears there is a problem with the sequence of
processing vis-a-vis LocalSettings and LocalSeleniumSettings.
Cheers,
Dan
--
-- Dan Nessett
> The solution for me was to point Selenium to a Firefox 3.5.
>
> Cheers,
> Markus
My OS is Ubuntu 8.04. The version of Firefox is 3.0.19. Since Ubuntu
automatically updates versions of its software, I assume this is the most
up-to-date.
Is there a list of browser versions compatible w
to test file uploads
$wgSeleniumTestsBrowsers['firefox'] = '*firefox /usr/bin/firefox';
$wgSeleniumTestsBrowsers['ff-chrome'] = '*chrome /usr/bin/firefox';
// Actually, use this browser
$wgSeleniumTestsUseBrowser = 'ff-chrome';
Regards,
--
-- Dan Nessett
_
$wgTiffMaxEmbedFileResolution = 2560; // max. resolution 1600 x 1600 pixels
// Maximum size of metadata
$wgTiffMaxMetaSize = 67108864; // 64 MB
// TTL of cache entries for errors
$wgTiffErrorCacheTTL = 84600;
Is there some way to use the wiki to look for the file property that is
causing t
happens on line
32 of PagedTiffHandler_tests.php on the statement:
if ($source != 'filetoc') $this->allChecksOk = false;
I'm not an image expert, so I don't know why this is happening.
Regards,
Dan
--
-- Dan Nessett
On Mon, 17 May 2010 19:11:21 +, Dan Nessett wrote:
> During the meeting last Friday, someone (I'm sorry, I don't remember who)
> mentioned he had created a test that runs with the currently checked in
> selenium code. Is that test code available somewhere (it doesn't ap
During the meeting last Friday, someone (I'm sorry, I don't remember who)
mentioned he had created a test that runs with the currently checked in
selenium code. Is that test code available somewhere (it doesn't appear
to be in the current revision)?
--
-- D
One of the URLs supplied by Ryan during the recent phone conference doesn't
work. Specifically: http://grid.tesla.usability.wikimedia.org:. I get the
error: HTTP ERROR: 404 NOT_FOUND RequestURI=/
--- On Mon, 8/24/09, Alex wrote:
> I don't
> believe anyone
> except you has actually proposed restructuring the
> extensions directory.
Perhaps not. But, I don't see why that is relevant. I am making arguments why
the extensions directory should be restructured. I may convince no one, but I
d
--- On Mon, 8/24/09, Chad wrote:
> Why skip trying to find the location?
> If MW_INSTALL_PATH
> is already missing, what have we got to lose from trying
> to guess the location? The vast majority of people don't
> screw with the default structure, so it should be just
> fine.
That's a reasonable
--- On Sun, 8/23/09, Aryeh Gregor wrote:
> If they can run commands on the command line, then they can
> use
> environment variables. If they can't, then your
> suggestion doesn't
> help.
>
> > If there are administrators who can execute command
> lines, but cannot set environmental variables (
--- On Sun, 8/23/09, dan nessett wrote:
In my last email, I quoted Andrew Garret:
> $ MW_INSTALL_PATH=/var/wiki/mediawiki php/maintenance/update.php
This was incorrect. I fumbled some of the editing in my reply. What he proposed
was:
> $ MW_INSTALL_PATH=/var/wiki/mediawiki php maint
--- On Sun, 8/23/09, Andrew Garrett wrote:
> $ MW_INSTALL_PATH=/var/wiki/mediawiki php/maintenance/update.php
I don't understand the point you are making. If an MW administrator can set
environmental variables, then, of course, what you suggests works. However,
Brion mentions in his Tues, Aug
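For concreteness, Garrett's one-liner relies on per-command environment assignment, which any POSIX shell supports. A minimal runnable sketch (the path is his example; the real maintenance invocation is shown as a comment):

```shell
# Set the variable for this shell and export it; no profile changes needed.
MW_INSTALL_PATH=/var/wiki/mediawiki
export MW_INSTALL_PATH
# The actual invocation from the thread would then be:
# MW_INSTALL_PATH=/var/wiki/mediawiki php maintenance/update.php
echo "$MW_INSTALL_PATH"
```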
--- On Sat, 8/22/09, Platonides wrote:
> How's that better than MW_CONFIG_PATH environment
> variable?
My understanding is that the administrators of certain installations cannot set
environmental variables (I am taking this on faith, because that seems like a
very very restricted site). What
--- On Fri, 8/21/09, Andrew Garrett wrote:
> Yes, this is where we started because this is the status
> quo. What I
> was describing is how it's done now.
Is maintaining the status quo really desirable? Look at the extensions
directory. It currently has ~400 extension sub-directories. If you
--- On Thu, 8/20/09, Andrew Garrett wrote:
> As the title implies, it is a performance limit report. You
> can remove
> it by changing the parser options passed to the parser.
> Look at the
> ParserOptions and Parser classes.
Thanks. It appears dumpHTML has no command option to turn off this
I am looking into the feasibility of writing a comprehensive parser regression
test (CPRT). Before writing code, I thought I would try to get some idea of how
well such a tool would perform and what gotchas might pop up. An easy first
step is to run dump_HTML and capture some data and statistics
--- On Mon, 8/17/09, Brion Vibber wrote:
> Really though, this thread has gotten extremely unfocused;
> it's not
> clear what's being proposed to begin with and we've
> wandered off to a
> lot of confusion.
I'll take partial responsibility for the confusion. Like I said recently, I
think it
--- On Fri, 8/14/09, Tim Starling wrote:
> And please, spare us from your rant about how terrible this
> is. It's
> not PHP's fault that you don't know anything about it.
I'm sorry my questions make you angry. I don't recall ranting about PHP.
Actually, I kind of like it. Lack of thread safety
--- On Fri, 8/14/09, Dmitriy Sintsov wrote:
> I remember some time ago I was strongly discouraged to
> compile and run
> PHP threaded MPM for apache because some functions or
> libraries of PHP
> itself were not thread safe.
While my machine was compiling AMP components, I thought about this a
--- On Fri, 8/14/09, Dmitriy Sintsov wrote:
> I remember some time ago I was strongly discouraged to
> compile and run
> PHP threaded MPM for apache because some functions or
> libraries of PHP
> itself were not thread safe.
OK, this and Chad's comment suggests the option is multi-process/IPC.
One of the first problems to solve in developing the proposed CPRT is how to
call a function with the same name in two different MW distributions. I can
think of 3 ways: 1) use the Namespace facility of PHP 5.3, 2) use threads, or
3) use separate process and IPC. Since MAMP supports none of thes
--- On Wed, 8/12/09, Tim Landscheidt wrote:
> I think though that more people
> would read and embrace your
> thoughts if you would find
> a more concise way to put
> them across :-).
Mea Culpa. I'll shut up for a while.
--- On Wed, 8/12/09, Brion Vibber wrote:
> Your setup is incorrect: the extensions folder *always*
> goes inside the
> MediaWiki root dir. Always.
>
Sorry, my inexperience with Subversion led me in the wrong direction. I didn't
realize I could check out phase3 then point Subversion to the ext
Chad wrote:
> DumpHTML will not be moved back to maintenance in the repo, it was
> already removed from maintenance and made into an extension. Issues
> with it as an extension should be fixed, but it should not be encouraged
> to go back into core.
What I meant was I can move the code in DumpH
--- On Wed, 8/12/09, Roan Kattouw wrote:
> I read this paragraph first, then read the paragraph above
> and
> couldn't help saying "WHAT?!?". Using a huge set of pages
> is a poor
> replacement for decent tests.
I am not proposing that the CPRT be a substitute for "decent tests." We still
need
So. I checked out a copy of phase3 and extensions to start working on
investigating the feasibility of a comprehensive parser regression test. After
getting the working copy downloaded, I do what I usually do - blow away the
extensions directory stub that comes with phase3 and soft link the down
--- On Wed, 8/12/09, dan nessett wrote:
> "If you ran this test on, for example, Wikipedia,
Of course, what I meant is run the test on the Wikipedia database, not on the
live system.
Dan
I am investigating how to write a comprehensive parser regression test. What I
mean by this is something you wouldn't normally run frequently, but rather
something that we could use to get past the "known to fail" tests now disabled.
The problem is no one understands the parser well enough to ha
--- On Wed, 8/12/09, Roan Kattouw wrote:
> On shared hosting, both are impossible. MediaWiki currently
> works with
> minimal write access requirements (only the config/
> directory for the
> installer and the images/ directory if you want uploads),
> and we'd
> like to keep it that way for peopl
--- On Wed, 8/12/09, Brion Vibber wrote:
>
> The suggestions were for explicit manual configuration, not
>
> autodiscovery. Autodiscovery means *not* having to set
> anything. :)
I am insane to keep this going, but the proposal I made did not require doing
anything manually (other than runnin
--- On Wed, 8/12/09, Brion Vibber wrote:
> We *already* automatically discover the MW root directory.
Yes, you're right. I should have said automatically discover the MW root
directory without using file position dependent code.
Dan
I have been playing around with phpunit, in particular its facility for
generating tests from existing PHP code. You do this by processing a suitably
annotated (using /* @assert ... */ comment lines) version of the file with
phpunit --skeleton. Unfortunately, the --skeleton option assumes the fi
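A sketch of the workflow under discussion: annotate a class with `/* @assert ... */` comments, then ask PHPUnit 3.x to generate a test skeleton from it. The class name Calculator is invented, and the command is only composed and printed here, not executed:

```shell
# Sketch: generate CalculatorTest from an @assert-annotated Calculator class.
SKELETON_CMD="phpunit --skeleton Calculator"
printf '%s\n' "$SKELETON_CMD"
```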
--- On Wed, 8/12/09, Chad wrote:
>
> Tests should run in a vanilla install, with minimal
> dependency on
> external stuff. PHPUnit
> (or whatever framework we use) would be considered an
> acceptable dependency for
> test suites. If PHPUnit isn't available (ie: already
> installed and in
> the i
I'm starting a new thread because I noticed my news reader has glued together
messages with the title "A potential land mine" and "MW test infrastructure
architecture," which may confuse someone coming into the discussion late. Also,
the previous thread has branched into several topics and I wan
--- On Tue, 8/11/09, Chad wrote:
> > Neither of these need to be tested directly. If
> AutoLoader breaks,
> > then some other class won't load, and the tests for
> that class will
> > fail. If wfRunHooks() fails, then some hook won't
> work, and any test
> > of that hook will fail.
> >
I will
--- On Tue, 8/11/09, Alexandre Emsenhuber wrote:
> My idea is the move the "backend" of ParserTest
> (parserTests.txt file
> processing, result reporting, ...) and the TestRecorder
> stuff to
> something like a MediaWikiTests class that extends
> Maintenance and
> move the rest in a file in