On Fri, 30 Dec 2011 20:11:30 +0000, Dan Nessett wrote:
I have poked around a bit (using Google), but have not found
instructions for setting up the MW regression test framework (e.g.,
CruiseControl or Jenkins or whatever is now being used + PHPUnit tests +
Selenium tests) on a local machine
PHP: 5.3.3
PHPUnit: 3.6.7
DB: Postgres 8.3.9
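For anyone reproducing this locally: with LocalSettings.php already pointed at the Postgres database, the 1.18-era invocation was a wrapper under tests/phpunit. A minimal sketch (wrapper name and group filter assumed from that era's tree):

$ cd tests/phpunit
$ php phpunit.php --group Database

The --group Database filter restricts the run to tests that touch the DB backend; drop it to run the full suite.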
--
-- Dan Nessett
/MediaWiki-postgres-phpunit/
As of now, there are two tests failing.
Excellent. Thank you for this.
--
-- Dan Nessett
Is that the automated run?
I have attached the most recent run output to bug 33663.
--
-- Dan Nessett
On Thu, 12 Jan 2012 17:59:09 +0000, Dan Nessett wrote:
On Thu, 12 Jan 2012 16:17:00 +0100, Antoine Musso wrote:
Hello,
I have added a new continuous integration job to check our postgres
support.
This is exactly the same job as MediaWiki-phpunit; only the database
backend changes.
On Wed, 11 Jan 2012 11:17:33 +0100, Antoine Musso wrote:
On Wed, 11 Jan 2012 02:31:47 +0100, Dan Nessett dness...@yahoo.com wrote:
Sure, I can post the results, but I don't think I should just dump them
into this list (there are over 700 lines of output). How would you like
me to go about
On Fri, 30 Dec 2011 20:11:30 +0000, Dan Nessett wrote:
I have poked around a bit (using Google), but have not found
instructions for setting up the MW regression test framework (e.g.,
CruiseControl or Jenkins or whatever is now being used + PHPUnit tests +
Selenium tests) on a local machine
On Tue, 10 Jan 2012 23:53:25 +0100, Platonides wrote:
On 10/01/12 19:52, Dan Nessett wrote:
I gave up on Ubuntu 8.04 and moved to Centos 5.7. After getting make
safe to work, I get 27 failures and 14 incomplete tests. This is for
revision 108474. Is there any way to know if this is expected
On Mon, 09 Jan 2012 09:26:24 +0100, Krinkle wrote:
On Fri, Jan 6, 2012 at 8:06 PM, OQ overlo...@gmail.com wrote:
On Fri, Jan 6, 2012 at 12:56 PM, Dan Nessett dness...@yahoo.com
wrote:
On Thu, 05 Jan 2012 14:03:14 -0600, OQ wrote:
uninstall the pear version and do a make install
On Thu, 05 Jan 2012 14:03:14 -0600, OQ wrote:
On Thu, Jan 5, 2012 at 1:34 PM, Dan Nessett dness...@yahoo.com wrote:
So, I upgraded PHPUnit (using pear upgrade phpunit/PHPUnit). This
installed 3.4.15 (not 3.5).
I am running on Ubuntu 8.04. Anyone have an idea how to get PHPUnit 3.5
installed
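For the pear route, the channel setup PHPUnit 3.5 needed at the time was roughly this (channel names taken from the PHPUnit 3.x-era docs; they may no longer resolve):

$ pear channel-discover pear.phpunit.de
$ pear channel-discover components.ez.no
$ pear channel-discover pear.symfony-project.com
$ pear install --alldeps phpunit/PHPUnit

If pear still resolves phpunit to 3.4.x after that, running pear clear-cache first may help.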
On Fri, 30 Dec 2011 21:29:53 +0100, Roan Kattouw wrote:
On Fri, Dec 30, 2011 at 9:11 PM, Dan Nessett dness...@yahoo.com wrote:
I have poked around a bit (using Google), but have not found
instructions for setting up the MW regression test framework (e.g.,
CruiseControl or Jenkins or whatever
patches to Bugzilla). Do such instructions exist and if so,
would someone provide a pointer to them?
Thanks,
--
-- Dan Nessett
On Wed, 07 Dec 2011 12:54:22 +1100, Tim Starling wrote:
On 07/12/11 12:34, Dan Nessett wrote:
On Wed, 07 Dec 2011 12:15:41 +1100, Tim Starling wrote:
How many servers do you have?
3. It would help to get it down to 2.
I assume my comments apply to many other small wikis that use MW
break this support.
--
-- Dan Nessett
On Wed, 07 Dec 2011 07:59:26 +1000, K. Peachey wrote:
http://www.mediawiki.org/wiki/Project:2.0
Thanks. I have moved my comments to that page's discussion.
--
-- Dan Nessett
On Wed, 07 Dec 2011 09:26:50 +1100, Tim Starling wrote:
On 07/12/11 08:55, Dan Nessett wrote:
This is an (admittedly long and elaborate) question, not a proposal. I
ask it in order to learn whether anyone has given it or something like
it some thought.
Has anyone thought of creating MW 2.0
the wiki data
from the old format (i.e., mediawiki markup) to the new.
Dan Nessett
From: Neil Kandalgaonkar ne...@wikimedia.org
To: Wikimedia developers wikitech-l@lists.wikimedia.org
Cc: Dan Nessett dness...@yahoo.com
Sent: Tuesday, December 6, 2011 4:18 PM
On Wed, 07 Dec 2011 12:15:41 +1100, Tim Starling wrote:
On 07/12/11 09:50, Dan Nessett wrote:
OK. Call it something else. The motivation for my question is getting
server costs under control. Moving as much processing as possible
client side is one way to achieve this. Cleaning up
that I must observe. I have read
the material at http://www.mediawiki.org/wiki/Localisation, but did not
find the above constraints mentioned there. Is there another place where
they are specified?
--
-- Dan Nessett
On Fri, 04 Feb 2011 15:01:20 +0100, Krinkle wrote:
On 3 Feb 2011, at 22:42, Dan Nessett wrote the following:
On Thu, 03 Feb 2011 21:19:58 +0000, Dan Nessett wrote:
Our site has 4 skins that display the logo - 3 standard and 1 site-
specific. The site-specific skin uses rounded
scheme or a different
font).
My question is: has this issue been addressed before? If so, and there is
a good solution, I would appreciate hearing of it.
Regards,
--
-- Dan Nessett
On Thu, 03 Feb 2011 21:19:58 +0000, Dan Nessett wrote:
Our site has 4 skins that display the logo - 3 standard and 1 site-
specific. The site-specific skin uses rounded edges for the individual
page area frames, while the standard skins use square edges. This means
a logo with square edges
On Thu, 03 Feb 2011 13:52:30 -0800, Brion Vibber wrote:
On Thu, Feb 3, 2011 at 1:19 PM, Dan Nessett dness...@yahoo.com wrote:
Our site has 4 skins that display the logo - 3 standard and 1 site-
specific. The site-specific skin uses rounded edges for the individual
page area frames, while
On Wed, 22 Sep 2010 12:30:35 -0700, Brion Vibber wrote:
On Wed, Sep 22, 2010 at 11:09 AM, Dan Nessett dness...@yahoo.com
wrote:
Some have mentioned the possibility of using the wiki family logic to
help achieve these objectives. Do you have any thoughts on this? If you
think it is a good
For small test subsets that are being used during testing the equation
still doesn't change much: reset the wiki to known state, run the tests.
Keep it simple!
-- brion
On Thursday, September 23, 2010, Dan Nessett dness...@yahoo.com wrote:
On Wed, 22 Sep 2010 12:30:35 -0700, Brion Vibber
On Thu, 23 Sep 2010 10:29:58 -0700, Brion Vibber wrote:
On Thu, Sep 23, 2010 at 9:46 AM, Dan Nessett dness...@yahoo.com wrote:
I am very much in favor of keeping it simple. I think the issue is
whether we will support more than one regression test (or individual
test associated
previous comments, including loading the revision into the directory
associated with the URL.
How does this fit into the idea of using a wiki per regression test or
regression test run?
--
-- Dan Nessett
On Thu, 23 Sep 2010 14:41:32 -0700, Brion Vibber wrote:
On Thu, Sep 23, 2010 at 2:31 PM, Dan Nessett dness...@yahoo.com wrote:
Not sure I get this. Here is what I understand would happen when a
developer checks in a revision:
+ A script runs that manages the various regression tests run
On Thu, 23 Sep 2010 15:50:48 -0700, Brion Vibber wrote:
On Thu, Sep 23, 2010 at 2:54 PM, Dan Nessett dness...@yahoo.com wrote:
On Thu, 23 Sep 2010 14:41:32 -0700, Brion Vibber wrote:
There's no need to have a fixed set of URLs; just as with Wikimedia's
public-hosted sites you can add
On Thu, 23 Sep 2010 20:13:23 -0700, Brion Vibber wrote:
On Thu, Sep 23, 2010 at 7:19 PM, Dan Nessett dness...@yahoo.com wrote:
I appreciate your recent help, so I am going to ignore the tone of your
last message and focus on issues. While a test run can set up, use and
then delete
families that we could use without much modification.
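For concreteness, the usual wiki-family trick is one shared LocalSettings.php that picks the database from the request host; a minimal sketch, with a hypothetical hostname scheme:

# in the shared LocalSettings.php
$host = isset( $_SERVER['SERVER_NAME'] ) ? $_SERVER['SERVER_NAME'] : 'localhost';
$wgDBname = 'testwiki_' . strtok( $host, '.' );  # run1234.tests.example.org -> testwiki_run1234
$wgSitename = "Test wiki ($wgDBname)";

Each regression run then only needs a hosts entry and a freshly created database; no per-wiki settings file.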
--
-- Dan Nessett
On Wed, 22 Sep 2010 18:57:12 +0200, Bryan Tong Minh wrote:
On Wed, Sep 22, 2010 at 6:47 PM, Dan Nessett dness...@yahoo.com wrote:
I think the object cache and memcached are alternative ways of storing
persistent data. (I also am not an expert in this, so I could be
wrong). My understanding
the temporary resources identified in the entry before its state
is lost. This is a problem.
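One way to make that state survive a crash is to write it through the DB-backed cache instead of keeping it in memory; a sketch against the 1.16-era BagOStuff API, with a hypothetical key layout:

$cache = wfGetCache( CACHE_DB );  # SqlBagOStuff: rows in the objectcache table
$cache->set( wfMemcKey( 'testrun', $runId ), $tempResources, 86400 );
# later, a cleanup script can recover the list even after a crash:
$leftovers = $cache->get( wfMemcKey( 'testrun', $runId ) );

Because the entry lives in the database, an httpd or OS crash does not lose the list of resources to delete.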
--
-- Dan Nessett
things simple and just make a dedicated instance.
-- brion
On Wed, Sep 22, 2010 at 9:47 AM, Dan Nessett dness...@yahoo.com wrote:
On Wed, 22 Sep 2010 15:49:40 +0200, Markus Glaser wrote:
Hi,
here are my thoughts about phpunit and selenium testing.
The wiki under test is set up
On Mon, 20 Sep 2010 22:32:24 +0200, Platonides wrote:
Dan Nessett wrote:
On Sun, 19 Sep 2010 23:42:08 +0200, Platonides wrote:
You load originaldb.objectcache, retrieve the specific configuration,
and switch into it.
For supporting many simultaneous configurations, the keyname could
have
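Reading such an entry back is then a plain objectcache lookup; a sketch of the keyname idea, with hypothetical names, assuming the configuration was stored with serialize():

$dbr = wfGetDB( DB_SLAVE );
$row = $dbr->selectRow( 'objectcache', 'value',
    array( 'keyname' => "testcfg:$runId" ) );
if ( $row ) {
    $config = unserialize( $row->value );  # the stored configuration
}

With the run id in the keyname, any number of simultaneous configurations can coexist in the one table.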
On Fri, 17 Sep 2010 19:13:33 +0000, Dan Nessett wrote:
On Fri, 17 Sep 2010 18:40:53 +0000, Dan Nessett wrote:
I have been tasked to evaluate whether we can use the parserTests db
code for the selenium framework. I just looked it over and have serious
reservations. I would appreciate any
On Sun, 19 Sep 2010 02:47:00 +0200, Platonides wrote:
Dan Nessett wrote:
What about memcached?
(that would be a key based on the original db name)
The storage has to be persistent to accommodate wiki crashes (e.g.,
httpd crash, server OS crash, power outage). It might be possible to
use
On Sun, 19 Sep 2010 23:42:08 +0200, Platonides wrote:
Dan Nessett wrote:
Platonides wrote:
Dan Nessett wrote:
What about memcached?
(that would be a key based on the original db name)
The storage has to be persistent to accommodate wiki crashes (e.g.,
httpd crash, server OS crash, power
that requires installing Berkeley
DB, which complicated deployment.
Why not employ the already installed DB software used by the wiki? That
provides persistent storage and requires no additional software.
--
-- Dan Nessett
for a test of some other functionality to figure out which tables
to drop.
For these reasons, I don't think we can reuse the parserTests code.
However, I am open to arguments to the contrary.
--
-- Dan Nessett
On Fri, 17 Sep 2010 18:40:53 +0000, Dan Nessett wrote:
I have been tasked to evaluate whether we can use the parserTests db
code for the selenium framework. I just looked it over and have serious
reservations. I would appreciate any comments on the following analysis.
The environment
On Fri, 17 Sep 2010 21:05:12 +0200, Platonides wrote:
Dan Nessett wrote:
Given this background, consider the following (and feel free to comment
on it):
parserTests temporary table code:
A fixed set of tables is specified in the code. parserTests creates
temporary tables with the same
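In outline, the mechanism being described is roughly this (a simplified sketch in MySQL syntax, where a temporary table sharing the base table's name shadows it; the real code also has Postgres-specific paths):

$db = wfGetDB( DB_MASTER );
$tables = array( 'page', 'revision', 'text', 'user', 'image' );  # the fixed list (abridged)
foreach ( $tables as $tbl ) {
    # shadow each listed table with an empty temporary copy
    $db->query( "CREATE TEMPORARY TABLE $tbl LIKE $tbl" );
}

The fixed $tables list is the heart of the reservation above: any test needing a table outside the list falls through to the real one.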
not familiar with these mechanisms, so this approach requires
help from someone who is.
--
-- Dan Nessett
On Fri, 10 Sep 2010 23:11:27 +0000, Dan Nessett wrote:
We are currently attempting to refactor some specific modifications to
the standard MW code we use (1.13.2) into an extension so we can upgrade
to a more recent maintained version. One modification we have keeps a
flag in the revisions
of the top-level directory named 'selenium'. Is that an
official convention?
--
-- Dan Nessett
it belongs in the
core.
--
-- Dan Nessett
On Mon, 06 Sep 2010 23:15:06 -0400, Mark A. Hershberger wrote:
Dan Nessett dness...@yahoo.com writes:
Last Friday, mah ripped out the globals and put the configuration
information into the execute method of RunSeleniumTests.php with the
comment @todo Add an alternative where settings
others are
doing.
Regards,
Dan
--- On Tue, 9/7/10, Mark A. Hershberger m...@everybody.org wrote:
From: Mark A. Hershberger m...@everybody.org
Subject: Re: Selenium Framework - test run configuration data
To: Wikimedia developers wikitech-l@lists.wikimedia.org
Cc: Dan Nessett dness...@yahoo.com
253050833
Managing Directors:
Anja Ebersbach, Markus Glaser,
Dr. Richard Heigl, Radovan Kubani
-----Original Message-----
From: wikitech-l-boun...@lists.wikimedia.org
[mailto:wikitech-l-boun...@lists.wikimedia.org] On behalf of dan
nessett Sent: Tuesday, 7 September 2010 17:20
On Sat, 07 Aug 2010 23:30:16 -0400, Mark A. Hershberger wrote:
Dan Nessett dness...@yahoo.com writes:
I don't think walking through all the extensions looking for test
subdirectories and then running all tests therein is a good idea.
First, in a large installation with many extensions
--
-- Dan Nessett
that it is pushed in the extension require()
file as well. Then all extensions with test suites would automatically
load them. To tailor this, the entries in $wgSeleniumTestSuites could be
removed in LocalSettings.
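Concretely, with a hypothetical suite class name:

# in the extension's setup file, next to its other registrations:
$wgSeleniumTestSuites[] = 'PagedTiffHandlerSeleniumTestSuite';

# a site that wants to opt out can prune the entry in
# LocalSettings.php after the require() of the extension file:
$wgSeleniumTestSuites = array_diff( $wgSeleniumTestSuites,
    array( 'PagedTiffHandlerSeleniumTestSuite' ) );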
--
-- Dan Nessett
Regards,
--
-- Dan Nessett
Message: 5
Date: Tue, 18 May 2010 17:44:03 +0000 (UTC)
From: Dan Nessett dness...@yahoo.com
Subject: Re: [Wikitech-l] Selenium testing framework
To: wikitech-l@lists.wikimedia.org
Message-ID: hsujl3$v7...@dough.gmane.org
Content-Type: text/plain; charset=UTF-8
On Tue, 18 May 2010 19:27:38 +0200
to their resolution.
--
-- Dan Nessett
On Mon, 17 May 2010 20:16:35 +0000, Dan Nessett wrote:
On Mon, 17 May 2010 19:11:21 +0000, Dan Nessett wrote:
During the meeting last Friday, someone (I'm sorry, I don't remember who)
mentioned he had created a test that runs with the currently checked in
selenium code. Is that test code
$wgSeleniumTestsBrowsers['ff-chrome'] = '*chrome /usr/bin/firefox';
// Actually, use this browser
$wgSeleniumTestsUseBrowser = 'ff-chrome';
Regards,
--
-- Dan Nessett
was to point Selenium to a Firefox 3.5.
Cheers,
Markus
My OS is Ubuntu 8.04. The version of Firefox is 3.0.19. Since Ubuntu
automatically updates versions of its software, I assume this is the most
up-to-date.
Is there a list of browser versions compatible with selenium?
--
-- Dan Nessett
on the status of my attempts to get PagedTiffHandler_tests.php to
work. As a teaser, it appears there is a problem with the sequence of
processing vis-a-vis LocalSettings and LocalSeleniumSettings
Cheers,
Dan
--
-- Dan Nessett
During the meeting last Friday, someone (I'm sorry, I don't remember who)
mentioned he had created a test that runs with the currently checked in
selenium code. Is that test code available somewhere (it doesn't appear
to be in the current revision)?
--
-- Dan Nessett
On Mon, 17 May 2010 19:11:21 +0000, Dan Nessett wrote:
During the meeting last Friday, someone (I'm sorry, I don't remember who)
mentioned he had created a test that runs with the currently checked in
selenium code. Is that test code available somewhere (it doesn't appear
to be in the current
!= 'filetoc') $this->allChecksOk = false;
I'm not an image expert, so I don't know why this is happening.
Regards,
Dan
--
-- Dan Nessett
for Errors
$wgTiffErrorCacheTTL = 84600;
Is there some way to use the wiki to look for the file property that is
causing the problem?
Regards,
Dan
--
-- Dan Nessett
One of the URLs supplied by Ryan during the recent phone conference
doesn't work. Specifically:
http://grid.tesla.usability.wikimedia.org:. I get the error:
HTTP ERROR: 404 NOT_FOUND RequestURI=/
--- On Sun, 8/23/09, Aryeh Gregor simetrical+wikil...@gmail.com wrote:
If they can run commands on the command line, then they can use
environment variables. If they can't, then your suggestion doesn't
help.
If there are administrators who can execute command
lines, but cannot set
--- On Mon, 8/24/09, Chad innocentkil...@gmail.com wrote:
Why skip trying to find the location? If MW_INSTALL_PATH is already
missing, what have we got to lose from trying to guess the location?
The vast majority of people don't screw with the default structure,
so it should be just fine.
--- On Mon, 8/24/09, Alex mrzmanw...@gmail.com wrote:
I don't believe anyone except you has actually proposed
restructuring the extensions directory.
Perhaps not. But, I don't see why that is relevant. I am making arguments why
the extensions directory should be restructured. I may convince
--- On Sun, 8/23/09, Andrew Garrett agarr...@wikimedia.org wrote:
$ MW_INSTALL_PATH=/var/wiki/mediawiki php/maintenance/update.php
I don't understand the point you are making. If an MW administrator can set
environment variables, then, of course, what you suggest works. However,
Brion
--- On Sun, 8/23/09, dan nessett dness...@yahoo.com wrote:
In my last email, I quoted Andrew Garret:
$ MW_INSTALL_PATH=/var/wiki/mediawiki php/maintenance/update.php
This was incorrect. I fumbled some of the editing in my reply. What he proposed
was:
$ MW_INSTALL_PATH=/var/wiki/mediawiki
I am looking into the feasibility of writing a comprehensive parser regression
test (CPRT). Before writing code, I thought I would try to get some idea of how
well such a tool would perform and what gotchas might pop up. An easy first
step is to run dump_HTML and capture some data and
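The capture step itself is mechanical; a sketch, with hypothetical destination paths (the script's location varies by era, having later moved into the DumpHTML extension, as noted elsewhere in this thread):

$ php maintenance/dumpHTML.php -d /tmp/html-baseline   (old revision)
$ php maintenance/dumpHTML.php -d /tmp/html-candidate  (new revision)
$ diff -r /tmp/html-baseline /tmp/html-candidate | head

A nonempty diff flags pages whose rendering changed between the two revisions.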
--- On Thu, 8/20/09, Andrew Garrett agarr...@wikimedia.org wrote:
As the title implies, it is a performance limit report. You can
remove it by changing the parser options passed to the parser. Look
at the ParserOptions and Parser classes.
Thanks. It appears dumpHTML has no command
--- On Fri, 8/14/09, Tim Starling tstarl...@wikimedia.org wrote:
And please, spare us from your rant about how terrible this is. It's
not PHP's fault that you don't know anything about it.
I'm sorry my questions make you angry. I don't recall ranting about PHP.
Actually, I kind of like it.
One of the first problems to solve in developing the proposed CPRT is how to
call a function with the same name in two different MW distributions. I can
think of 3 ways: 1) use the Namespace facility of PHP 5.3, 2) use threads, or
3) use separate processes and IPC. Since MAMP supports none of
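A sketch of option 3, the separate-process route; the renderPage.php helper is hypothetical, and each tree would need some such entry point that parses a page and prints the HTML:

$title  = 'Main_Page';
$oldOut = shell_exec( 'php /path/to/mw-old/renderPage.php ' . escapeshellarg( $title ) );
$newOut = shell_exec( 'php /path/to/mw-new/renderPage.php ' . escapeshellarg( $title ) );
if ( $oldOut !== $newOut ) {
    echo "Rendering differs for $title\n";
}

Since the two parsers never share a process, their identically named functions and globals cannot collide.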
--- On Fri, 8/14/09, Dmitriy Sintsov ques...@rambler.ru wrote:
I remember some time ago I was strongly discouraged from compiling
and running PHP under Apache's threaded MPM, because some functions
or libraries of PHP itself were not thread safe.
OK, this and Chad's comment suggests the option is
I'm starting a new thread because I noticed my news reader has glued together
messages with the titles "A potential land mine" and "MW test infrastructure
architecture", which may confuse someone coming into the discussion late. Also,
the previous thread has branched into several topics and I want
--- On Wed, 8/12/09, Chad innocentkil...@gmail.com wrote:
Tests should run in a vanilla install, with minimal dependency on
external stuff. PHPUnit (or whatever framework we use) would be
considered an acceptable dependency for test suites. If PHPUnit
isn't available (ie: already
I have been playing around with phpunit, in particular its facility for
generating tests from existing PHP code. You do this by processing a suitably
annotated (using /* @assert ... */ comment lines) version of the file with
phpunit --skeleton. Unfortunately, the --skeleton option assumes the
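For reference, the shape the generator consumes looks like this (class and values illustrative; in the 3.4 line the option was spelled --skeleton-test, if I recall correctly):

class MathUtil {
    /**
     * @assert (2, 3) == 5
     * @assert (0, 0) == 0
     */
    public function add( $a, $b ) {
        return $a + $b;
    }
}

$ phpunit --skeleton-test MathUtil

Each @assert line becomes one generated test method in the emitted MathUtilTest class.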
--- On Wed, 8/12/09, Brion Vibber br...@wikimedia.org wrote:
The suggestions were for explicit manual configuration, not
autodiscovery. Autodiscovery means *not* having to set anything. :)
I am insane to keep this going, but the proposal I made did not require doing
anything manually
--- On Wed, 8/12/09, Roan Kattouw roan.katt...@gmail.com wrote:
On shared hosting, both are impossible. MediaWiki currently works
with minimal write access requirements (only the config/ directory
for the installer and the images/ directory if you want uploads),
and we'd like to keep it
I am investigating how to write a comprehensive parser regression test. What I
mean by this is something you wouldn't normally run frequently, but rather
something that we could use to get past the known-to-fail tests now disabled.
The problem is no one understands the parser well enough to
So. I checked out a copy of phase3 and extensions to start working on
investigating the feasibility of a comprehensive parser regression test. After
getting the working copy downloaded, I do what I usually do - blow away the
extensions directory stub that comes with phase3 and soft link the
--- On Wed, 8/12/09, Roan Kattouw roan.katt...@gmail.com wrote:
I read this paragraph first, then read the paragraph above and
couldn't help saying "WHAT?!?". Using a huge set of pages is a poor
replacement for decent tests.
I am not proposing that the CPRT be a substitute for decent tests.
Chad innocentkil...@gmail.com wrote:
DumpHTML will not be moved back to maintenance in the repo; it was
already removed from maintenance and made into an extension. Issues
with it as an extension should be fixed, but it should not be encouraged
to go back into core.
What I meant was I can
--- On Mon, 8/10/09, Tim Starling tstarl...@wikimedia.org wrote:
No, the reason is because LocalSettings.php is in the directory
pointed to by $IP, so you have to work out what $IP is before you
can include it.
Web entry points need to locate WebStart.php, and command
line scripts
need
--- On Tue, 8/11/09, Chad innocentkil...@gmail.com wrote:
The problem with putting it in a single function is you still have
to find where that function is to begin with (I'd assume either
GlobalFunctions or install-utils would define this). At which point
you're back to the original
Brion Vibber br...@wikimedia.org wrote:
Unless there's some reason to do otherwise, I'd recommend dropping the
$IP from the autogen'd LocalSettings.php and pulling in
DefaultSettings.php from the level above. (Keeping in mind that we
should retain compat with existing LocalSettings.php
--- On Tue, 8/11/09, Aryeh Gregor simetrical+wikil...@gmail.com wrote:
Then you're doing almost exactly the same thing we're doing now,
except with MWInit.php instead of LocalSettings.php. $IP is normally
set in LocalSettings.php for most page views. Some places still must
figure it out
--- On Tue, 8/11/09, Brion Vibber br...@wikimedia.org wrote:
I'm not sure there's a compelling reason to even have $IP set in
LocalSettings.php anymore; the base include path should probably be
autodetected in all cases, which is already being done in
WebStart.php and commandLine.inc,
--- On Tue, 8/11/09, Brion Vibber br...@wikimedia.org wrote:
These scripts should simply be updated to initialize the framework
properly instead of trying to half-ass it and load individual
classes.
I agree, which is why I am trying to figure out how to consolidate the tests in
/tests/ and
--- On Tue, 8/11/09, lee worden won...@riseup.net wrote:
Placing it in the include path could make it hard to run
more than one version of the MW code on the same server,
since both would probably find the same file and one of them
would likely end up using the other one's $IP.
That is a
--- On Tue, 8/11/09, Trevor Parscal tpars...@wikimedia.org wrote:
Not to worry. I've given up on this issue, at least for the moment.
Dan
What seems to be being discussed here are particular offensive areas
of MediaWiki - however, if you really get to know MediaWiki you will
likely find
--- On Tue, 8/11/09, Chad innocentkil...@gmail.com wrote:
To be perfectly honest, I'm of the opinion that tests/ and t/ should
be scrapped and it should all be done over, properly. What we need
is an easy and straightforward way to write test cases, so people
are encouraged to write them.
--- On Tue, 8/11/09, Alexandre Emsenhuber alex.emsenhu...@bluewin.ch wrote:
+1, we could maybe write our own test system that can be based on the
new Maintenance class, since we already have some test scripts in
/maintenance/ (cdb-test.php, fuzz-tester.php, parserTests.php,
Brion Vibber br...@wikimedia.org wrote:
Starting about a week ago, parser test results are now included in code
review listings for development trunk:
http://www.mediawiki.org/w/index.php?title=Special:Code/MediaWiki/path&path=%2Ftrunk%2Fphase3
Regressions are now quickly noted and
--- On Tue, 8/11/09, Robert Leverington rob...@rhl.me.uk wrote:
Please can you properly break your lines in e-mail though, to 73(?)
characters per line - it should be possible to specify this in your
client.
I'm using the web interface provided by yahoo. If you can
point me in the right
--- On Tue, 8/11/09, Alexandre Emsenhuber alex.emsenhu...@bluewin.ch wrote:
My idea is to move the backend of ParserTest (parserTests.txt file
processing, result reporting, ...) and the TestRecorder stuff to
something like a MediaWikiTests class that extends Maintenance and
move the
--- On Tue, 8/11/09, Chad innocentkil...@gmail.com wrote:
Neither of these need to be tested directly. If AutoLoader breaks,
then some other class won't load, and the tests for that class will
fail. If wfRunHooks() fails, then some hook won't work, and any test
of that hook will
For various reasons I have noticed that several files independently compute the
value of $IP. For example, maintenance/commandLine.inc and includes/WebStart.php
both calculate its value. One would expect this value to be computed in one
place only and used globally. The logical place is
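For what it's worth, the computation those files each repeat is essentially the following (simplified from the command-line bootstrap):

$IP = getenv( 'MW_INSTALL_PATH' );
if ( $IP === false ) {
    # fall back to a path computed relative to the including script
    $IP = dirname( __FILE__ ) . '/..';
}

Hoisting those few lines into one shared file is exactly the consolidation being asked about.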