[Wikitech-l] Invoking maintenance scripts return nothing at all

2010-11-16 Thread Andrew Dunbar
I wish to do some MediaWiki hacking which uses the codebase,
specifically the parser, but not the database or web server.
I'm running on Windows XP on an offline machine with PHP installed but
no MySQL or web server.
I've unarchived the source and grabbed a copy of somebody's
LocalSettings.php but not attempted to install MediaWiki beyond
this.

Obviously I don't expect to be able to do much, but when I try to run
any of the maintenance scripts I get no output whatsoever, not even
errors.

I was hoping to let the error messages guide me as to what is
essential, what needs to be stubbed, wrapped etc.

Am I missing something obvious or do these scripts return no errors by design?

Andrew Dunbar (hippietrail)

-- 
http://wiktionarydev.leuksman.com http://linguaphile.sf.net

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Invoking maintenance scripts return nothing at all

2010-11-16 Thread Dmitriy Sintsov
* Andrew Dunbar <hippytr...@gmail.com> [Tue, 16 Nov 2010 23:01:33
+1100]:
> I wish to do some MediaWiki hacking which uses the codebase,
> specifically the parser, but not the database or web server.
> I'm running on Windows XP on an offline machine with PHP installed but
> no MySQL or web server.
> I've unarchived the source and grabbed a copy of somebody's
> LocalSettings.php but not attempted to install MediaWiki beyond
> this.
>
> Obviously I don't expect to be able to do much, but when I try to run
> any of the maintenance scripts I get no output whatsoever, not even
> errors.
>
> I was hoping to let the error messages guide me as to what is
> essential, what needs to be stubbed, wrapped etc.
>
> Am I missing something obvious or do these scripts return no errors by
> design?
>
> Andrew Dunbar (hippietrail)

In the web environment, error messages may expose vulnerabilities to a
potential attacker. The errors might be written to PHP's error log,
whose location is set by the

error_log=path

directive in php.ini. You can find the actual location of php.ini by
executing

php --ini

Also look at the whole "Error handling and logging" section.

Does PHP work at all? Do you get configuration output from

php -r "phpinfo();"

when issued from cmd.exe?

Does

php dumpBackup.php --help

issued from the /maintenance directory, produce the command-line
help?
Dmitriy



Re: [Wikitech-l] Job queue on Wiki Farms

2010-11-16 Thread Platonides
Roan Kattouw wrote:
 This whole thing is in maintenance/jobs-loop.sh

It's not in maintenance but in tools/jobs-loop
http://svn.wikimedia.org/viewvc/mediawiki/trunk/tools/jobs-loop/

The nextJobDB.php in that dir is completely outdated; the right one is
in the maintenance folder.




Re: [Wikitech-l] Job queue on Wiki Farms

2010-11-16 Thread Daniel Friesen
On 10-11-16 08:51 AM, Roan Kattouw wrote:
> 2010/11/15 Daniel Friesen <li...@nadir-seen-fire.com>:
>
>> There was a thought about the job queue that popped into my mind today.
>>
>> From what I understand, for a Wiki Farm, in order to use runJobs.php
>> instead of using the in-request queue (which on high-traffic sites is
>> less desirable) the Wiki Farm has to run runJobs.php periodically for
>> each and every wiki on the farm.
>> So, for example, if a Wiki Farm has 10,000 wikis it's hosting, say the
>> Wiki Host really wants to ensure that the queue is run at least hourly
>> to keep the data on the wiki reasonably up to date, the wiki farm
>> essentially needs to call runJobs.php 10,000 times an hour (ie: one time
>> for each individual wiki), regardless of whether a wiki has jobs or
>> not. Either that or poll each database beforehand, which in itself is
>> 10,000 database calls an hour plus the runJobs execution, which still
>> isn't that desirable.
>
> Have you considered the fact that the WMF cluster is in this exact
> situation? ;)
>
> However, we don't call runJobs.php for all wikis periodically.
> Instead, we call nextJobDB.php which generates a list of wikis that
> have pending jobs (by connecting to all of their DBs), caches it in
> memcached (caching was broken until a few minutes ago, oops) and
> outputs a random DB name. We then run runJobs.php on that random DB
> name. This whole thing is in maintenance/jobs-loop.sh
>
> Roan Kattouw (Catrope)

Ok, then...
How many databases are in the cluster being served by nextJobDB?
How long does it take to connect to all the databases and figure out
which ones have pending jobs?

~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]





Re: [Wikitech-l] recursiveTagParse makes data vanish

2010-11-16 Thread Roan Kattouw
2010/11/16 Cindy Cicalese <cical...@mitre.org>:
> $wgUseAjax = true;
> $wgAjaxExportList[] = 'testQueryPopulateDiv';

This AJAX framework is obsolete. You should use the bot API for AJAX
instead. Documentation is at http://www.mediawiki.org/wiki/API .
There's a section on how to create your own modules from an extension
too.

If all you need to do is parse some wikitext without otherwise needing
to do things in PHP (i.e. if you can generate the wikitext to parse on
the JS side), you could use the existing action=parse module to parse
it.

Roan Kattouw (Catrope)



Re: [Wikitech-l] recursiveTagParse makes data vanish

2010-11-16 Thread Platonides
Cindy Cicalese wrote:
> The first issue is that I need access to the parser in the Ajax callback
> so I can call recursiveTagParse on the results of the query. I tried using
> $wgParser, but it does not appear to be in a good state in the callback.
> I had a series of errors where the parser was calling functions on
> non-objects. I played around with it a bit and was able to get past all
> of these errors with this rather ugly code:
>
> global $wgParser;
> $wgParser->mOptions = new ParserOptions;
> $wgParser->initialiseVariables();
> $wgParser->clearState();
> $wgParser->setTitle(new Title($title));

recursiveTagParse() is not meant to be called except from within the
Parser itself; here you are initialising the parser by hand.

Try using $wgParser->parse() instead.
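A minimal sketch of that suggestion, assuming a MediaWiki 1.16-era environment where $wgParser, Title and ParserOptions are available; the function name and its arguments are hypothetical, not Cindy's actual callback:

```php
<?php
// Hypothetical AJAX callback body: let $wgParser->parse() set up its own
// state instead of poking mOptions/clearState() by hand.
function testQueryPopulateDiv( $titleText, $wikitext ) {
    global $wgParser;
    $title = Title::newFromText( $titleText ); // safer than `new Title(...)`
    $options = new ParserOptions();
    $output = $wgParser->parse( $wikitext, $title, $options );
    return $output->getText(); // rendered HTML for the div
}
```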





Re: [Wikitech-l] Job queue on Wiki Farms

2010-11-16 Thread Roan Kattouw
2010/11/16 Daniel Friesen <li...@nadir-seen-fire.com>:
> Ok, then...
> How many databases are in the cluster being served by nextJobDB?
> How long does it take to connect to all the databases and figure out
> which ones have pending jobs?

I don't know how long it takes exactly. I do know that we're caching
the list for 5 minutes (this caching was broken between Sep 09 and
today, causing the script to regenerate the list each time).

The list is generated by connecting to each cluster and running one
large query that covers all the databases in that cluster. We have
about 815 databases spread over 6 clusters (three clusters of one, one
of three, one of ~20 and one with the other ~790), so we only need to
connect to 6 DB servers and run one query on each.
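For readers following along, the loop described above might look roughly like this; it is an illustrative sketch only, not the real tools/jobs-loop code, and the exact runJobs.php flags are an assumption:

```php
<?php
// Illustrative sketch of the jobs-loop idea (NOT the actual WMF script):
// ask nextJobDB.php for a random wiki with pending jobs, then drain it.
while ( true ) {
    $db = trim( shell_exec( 'php nextJobDB.php' ) ); // uses the memcached list
    if ( $db === '' ) {
        sleep( 30 ); // nothing pending anywhere; back off
        continue;
    }
    passthru( 'php runJobs.php --wiki=' . escapeshellarg( $db ) . ' --maxjobs 300' );
}
```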

Roan Kattouw (Catrope)



Re: [Wikitech-l] Invoking maintenance scripts return nothing at all

2010-11-16 Thread Andrew Dunbar
On 17 November 2010 02:37, Dmitriy Sintsov <ques...@rambler.ru> wrote:
> * Andrew Dunbar <hippytr...@gmail.com> [Tue, 16 Nov 2010 23:01:33
> +1100]:
>> I wish to do some MediaWiki hacking which uses the codebase,
>> specifically the parser, but not the database or web server.
>> I'm running on Windows XP on an offline machine with PHP installed but
>> no MySQL or web server.
>> I've unarchived the source and grabbed a copy of somebody's
>> LocalSettings.php but not attempted to install MediaWiki beyond
>> this.
>>
>> Obviously I don't expect to be able to do much, but when I try to run
>> any of the maintenance scripts I get no output whatsoever, not even
>> errors.
>>
>> I was hoping to let the error messages guide me as to what is
>> essential, what needs to be stubbed, wrapped etc.
>>
>> Am I missing something obvious or do these scripts return no errors by
>> design?
>>
>> Andrew Dunbar (hippietrail)
>
> In the web environment, error messages may expose vulnerabilities to a
> potential attacker. The errors might be written to PHP's error log,
> whose location is set by the
>
> error_log=path
>
> directive in php.ini. You can find the actual location of php.ini by
> executing
>
> php --ini
>
> Also look at the whole "Error handling and logging" section.
>
> Does PHP work at all? Do you get configuration output from
>
> php -r "phpinfo();"
>
> when issued from cmd.exe?
>
> Does
>
> php dumpBackup.php --help
>
> issued from the /maintenance directory, produce the command-line
> help?
> Dmitriy

Thanks Dmitriy. PHP does work, and the --help options always work. It
turned out the LocalSettings.php somebody on #mediawiki pointed me to
require_once()'d several extensions I didn't have, and require_once()
seems to fail silently. I'll try to acquaint myself better with the
Error handling and logging section as you suggest.
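One way to surface this class of problem is to wrap extension includes in a small guard that reports missing files explicitly. A minimal standalone sketch (the helper name and the path are hypothetical, not part of MediaWiki):

```php
<?php
// Hypothetical guard for LocalSettings.php: report a missing extension
// file instead of letting the include fail with all output hidden.
function loadExtensionFile( $path ) {
    if ( !is_readable( $path ) ) {
        return "MISSING: $path";
    }
    require_once $path;
    return "loaded: $path";
}

echo loadExtensionFile( '/extensions/NoSuchExtension/NoSuchExtension.php' ), "\n";
// prints: MISSING: /extensions/NoSuchExtension/NoSuchExtension.php
```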

Is there an official blank or example LocalSettings.php file somewhere
that would be better for people like me to use, to avoid such
problems? Rolling my own from scratch doesn't seem ideal either.

Andrew Dunbar (hippietrail)





-- 
http://wiktionarydev.leuksman.com http://linguaphile.sf.net



Re: [Wikitech-l] Invoking maintenance scripts return nothing at all

2010-11-16 Thread Dmitriy Sintsov
* Andrew Dunbar <hippytr...@gmail.com> [Wed, 17 Nov 2010 13:32:25
+1100]:

> Thanks Dmitriy. PHP does work, and the --help options always work. It
> turned out the LocalSettings.php somebody on #mediawiki pointed me to
> require_once()'d several extensions I didn't have, and require_once()
> seems to fail silently. I'll try to acquaint myself better with the
> Error handling and logging section as you suggest.

require_once() should produce an error on non-existent files, and such
reports should not be suppressed. For development there should be

error_reporting = E_ALL | E_STRICT

in php.ini.
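Concretely, a development-oriented php.ini fragment along those lines might look like this (the log path is illustrative):

```ini
; Development settings: report everything and show errors on the CLI.
error_reporting = E_ALL | E_STRICT
display_errors = On
display_startup_errors = On
log_errors = On
error_log = "C:\php\php-errors.log"  ; illustrative path
```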

> Is there an official blank or example LocalSettings.php file somewhere
> that would be better for people like me to use, to avoid such
> problems? Rolling my own from scratch doesn't seem ideal either.

Usually it's produced by the web installer.
http://svn.wikimedia.org/viewvc/mediawiki/trunk/phase3/config/Installer.php?view=markup

The new installer is large and I haven't studied it thoroughly. I more
often upgrade from old versions than install from scratch.

However, when upgrading from an old version, you have to add new options
manually by consulting the HISTORY file.
http://svn.wikimedia.org/viewvc/mediawiki/trunk/phase3/HISTORY?view=markup

I wish the installer were able to parse the original settings and warn
about anything outdated. However, that might make it really
over-complicated.
Dmitriy



Re: [Wikitech-l] recursiveTagParse makes data vanish

2010-11-16 Thread Dmitriy Sintsov
* Roan Kattouw <roan.katt...@gmail.com> [Tue, 16 Nov 2010 18:10:51
+0100]:
> 2010/11/16 Cindy Cicalese <cical...@mitre.org>:
>> $wgUseAjax = true;
>> $wgAjaxExportList[] = 'testQueryPopulateDiv';
>
> This AJAX framework is obsolete. You should use the bot API for AJAX
> instead. Documentation is at http://www.mediawiki.org/wiki/API .
> There's a section on how to create your own modules from an extension
> too.

What if my ajax-callable PHP function is required only by my
extension's client-side scripts and is meaningless to bots (on-page
interactivity)? Why should everything be an API? AJAX is about more
than bots.
Dmitriy



Re: [Wikitech-l] recursiveTagParse makes data vanish

2010-11-16 Thread Robert Leverington
On 2010-11-17, Dmitriy Sintsov wrote:
> What if my ajax-callable PHP function is required only by my
> extension's client-side scripts and is meaningless to bots (on-page
> interactivity)? Why should everything be an API? AJAX is about more
> than bots.

Because it provides a consistent, clean framework for making and
handling requests with the potential to reduce duplication in a lot of
cases.  The API is not just for bots.

Adding an API module is fairly trivial and is the correct way to provide
AJAX interactivity.
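For context, a skeletal API module of the kind meant here, in 1.16-era style; the class name, module name and parameter are hypothetical:

```php
<?php
// Hypothetical minimal API module, registered in an extension via
// $wgAPIModules['example'] = 'ApiExample';
class ApiExample extends ApiBase {
    public function execute() {
        $params = $this->extractRequestParams();
        // Echo the input back under this module's name in the result.
        $this->getResult()->addValue( null, $this->getModuleName(),
            array( 'echo' => $params['text'] ) );
    }
    public function getAllowedParams() {
        return array( 'text' => array( ApiBase::PARAM_TYPE => 'string' ) );
    }
}
```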

Robert



Re: [Wikitech-l] recursiveTagParse makes data vanish

2010-11-16 Thread Dmitriy Sintsov
* Robert Leverington <rob...@rhl.me.uk> [Wed, 17 Nov 2010 07:38:23
+]:
> Because it provides a consistent, clean framework for making and
> handling requests with the potential to reduce duplication in a lot of
> cases. The API is not just for bots.

The API is supposed to be useful to other (remote) clients as well. Some
of my ajax calls are useful only locally and only to my own extension.
Why should I expose these openly?

> Adding an API module is fairly trivial and is the correct way to
> provide AJAX interactivity.

In some cases it's an unneeded complication, where you have to build
tables of parameter types, parameter descriptions and so on, and also
expose all of that functionality in the api.php help.

I use both the API and $wgAjaxExportList[], for different purposes. I
may completely switch to the API, but that doesn't look nice to me.
Anyway, you decide.
Dmitriy
