Re: [SMW-devel] Namespace Clutter

2011-04-15 Thread Robert Murphy
OK, two namespaces: fundamentals and derived things.  Types are not defined
in terms of anything else; Properties, Forms, Filters and Concepts all
have definitions written on them.

On Fri, Apr 15, 2011 at 4:02 PM, Jeroen De Dauw jeroended...@gmail.com wrote:

 Hey,

 You cannot put all the stuff in one namespace; how would you then know what
 it actually is? For example, properties show a list of pages on which they
 are used (plus the value they have there). This information would need to be
 represented in another way if it were all in a single namespace. The current
 approach makes it very obvious what the thing is: a property, a type, or
 something else. It also avoids naming conflicts. I honestly don't see why
 you'd put everything in a single namespace; it seems it'd only cause issues
 and not provide any benefits.

 Cheers

 --
 Jeroen De Dauw
 http://www.bn2vs.com
 Don't panic. Don't be evil.
 --






Re: [SMW-devel] [Semediawiki-user] Upgrade Fail

2010-08-16 Thread Robert Murphy
Thanks, Markus, that helped.  I think it must have been the "Start updating
data" on Special:SMWAdmin that I ran a while back.  Something got lost in there
when I had incompatible versions of SMW and SMW+.  I appreciate your help.


On Mon, Aug 16, 2010 at 7:08 AM, Markus Krötzsch 
mar...@semantic-mediawiki.org wrote:

 On 16/08/2010 06:34, Robert Murphy wrote:

 Dear Sirs and Madams,

 I posted on this before, but the situation is worse than I feared.  I
 upgraded to SMW 1.15.1.1 to get the n-ary functionality I so desperately
 need.  But now properties on pages are disappearing, and runJobs.php (my
 bread and butter) doesn't work.  Well, it does about 4 pages and then says:

 Fatal error: Call to a member function getNamespace() on a non-object in
 /../www/mediawiki/includes/JobQueue.php on line 277

 I have disabled EVERY other extension and this still happens!  Any help
 would be greatly appreciated.


 It seems that some problematic jobs are in your job queue. A brute-force
 way to kill these is to empty the "job" table in your MediaWiki
 installation. This can be done either by using some graphical UI or by
 executing the MySQL query

 TRUNCATE TABLE job;

 after logging in to your server and selecting the appropriate database.
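
 Before truncating, it can help to see what is actually queued. A minimal
 sketch, assuming MediaWiki's standard job table layout and no table prefix
 (with a prefix it would be, e.g., rw_job):

 -- list pending jobs so the malformed title can be spotted
 SELECT job_id, job_cmd, job_namespace, job_title
 FROM job
 ORDER BY job_id
 LIMIT 50;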

 I recall such issues in rare cases some time back, and I thought the SMW
 jobs had been guarded against this potential problem since then. Maybe there
 is another place where it happens. In general, the problem occurs when the
 job queue contains a job for a page name that is not a valid title. When
 this page name is used, an attempt is made to create a page object, but no
 such object is returned. The software then seems to access this supposed
 object without checking its validity, leading to the error you see.
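
 To illustrate the failure mode, a minimal sketch (hypothetical; not the
 actual JobQueue.php code) of the kind of guard that appears to be missing:
 an invalid page name makes the Title factory return null, and calling
 getNamespace() on that null is exactly the fatal error reported above.

 foreach ( $jobRows as $row ) {
     // makeTitleSafe() returns null for invalid page names
     $title = Title::makeTitleSafe( $row->job_namespace, $row->job_title );
     if ( $title === null ) {
         // skip (or purge) the malformed job instead of fataling on it
         wfDebug( "Skipping job with invalid title: {$row->job_title}\n" );
         continue;
     }
     $ns = $title->getNamespace(); // safe: $title is a real Title object
 }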

 For debugging this, it would be useful if you could post some more output,
 if available. Especially the name of the job and, if possible, of the page
 it acts on.

 Regards,

 Markus


 Sadly,

 Robert Murphy






Re: [SMW-devel] SMW scalability

2010-02-13 Thread Robert Murphy
I installed APC and it made some difference in speed.  I turned on slow-query
logging for MySQL, and that log file is filling up fast!  Some examples:

# u...@host: reformedword[reformedword] @ localhost []
# Query_time: 9  Lock_time: 2  Rows_sent: 5  Rows_examined: 5
SELECT /* SMW::deleteSubject::Nary Aquatiki */  smw_id  FROM `rw_smw_ids`
WHERE smw_title='' AND smw_namespace='712136' AND smw_iw=':smw';
# u...@host: reformedword[reformedword] @ localhost []
# Query_time: 46  Lock_time: 0  Rows_sent: 0  Rows_examined: 0
DELETE /* SMW::deleteSubject::Atts2 Aquatiki */ FROM `rw_smw_atts2` WHERE
s_id = '230051';
# Time: 100213  3:02:55
# u...@host: reformedword[reformedword] @ localhost []
# Query_time: 55  Lock_time: 1  Rows_sent: 0  Rows_examined: 0
INSERT /* SMW::updateRel2Data 216.129.119.43 */  INTO `rw_smw_rels2`
(s_id,p_id,o_id) VALUES
('202900','4987','202901'),('202900','5064','5065'),('202900','5463','202899'),
('202900','135','289060'),('289060','225879','304'),('202900','135','289061'),
('289061','225879','118784'),('289061','225881','331'),('202900','135','289062'),
('289062','225879','119703'),('289062','225881','386'),('202900','135','289065'),
('289065','225879','277'),('202900','135','289066'),('289066','225879','207525'),
('289066','225881','385'),('202900','135','289067'),('289067','225879','119048'),
('289067','225881','225124'),('202900','135','601368'),('601368','225879','584'),
('601368','225881','387'),('202900','1011','1417340'),('1417340','225879','1425'),
('202900','1011','1417342'),('1417342','225879','1446'),('1417342','225881','1174937'),
('202900','1011','1663477'),('1663477','225879','1182228'),('1663477','225881','209798'),
('202900','1464','1663478'),('1663478','225879','926'),('1663478','225881','375867'),
('202900','1464','1663479'),('1663479','225879','1490'),('202900','1464','1663480'),
('1663480','225879','223216'),('1663480','225881','687547'),('202900','1464','1663481'),
('1663481','225879','845'),('202900','1464','2224870'),('2224870','225879','1651'),
('202900','6840','4980'),('202900','6906','5727'),('202900','223343','6503');
# Time: 100213  3:03:12
# u...@host: reformedword[reformedword] @ localhost []
# Query_time: 36  Lock_time: 19  Rows_sent: 1  Rows_examined: 1948
SELECT /* SMW::getQueryResult 216.129.119.43 */  COUNT(DISTINCT t0.smw_id) AS count
FROM `rw_smw_ids` AS t0
INNER JOIN `rw_smw_rels2` AS t2 ON t0.smw_id=t2.s_id
INNER JOIN `rw_smw_rels2` AS t5 ON t2.o_id=t5.s_id
INNER JOIN `rw_smw_inst2` AS t7 ON t2.s_id=t7.s_id
WHERE t2.p_id='1464' AND t5.p_id='225879' AND t5.o_id='177656' AND t7.o_id='3304'
LIMIT 10001;
# Time: 100213  3:03:30





 Regarding PHP, I don't think that a memory limit of more than 50MB, or
 maximally 100MB, can be recommended for any public site. Whatever dies beyond
 this point cannot be saved. On the other hand, PHP out-of-memory issues are
 hard to track, since their cause is often not the function that allocates the
 final byte that uses up all memory. You have seen this in your logs.

 One general thing that should be done on larger sites (actually on all
 sites!) is bytecode caching, see [1]. This significantly reduces the impact
 that large PHP files as such have on your memory requirements.
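
 As a concrete starting point, a sketch of typical php.ini directives for
 APC from that era (the values are illustrative assumptions, not settings
 recommended in this thread):

 ; php.ini -- enable APC bytecode caching
 extension = apc.so
 apc.enabled = 1
 apc.shm_size = 64    ; shared memory in MB (older APC expects a bare integer)
 apc.stat = 1         ; re-check file mtimes; set to 0 only for frozen code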

 Out-of-memory issues usually result in blank pages that can only be edited by
 changing the URL manually to use the edit action. Finding these pages is
 crucial to tracking down the problem. In the context of SMW, I have seen
 memory issues when inline queries return a long list of results, each of
 which contains a lot of values. This problem is worse when using templates
 for formatting, but it occurs with tables too. In my tests I have tracked
 this problem down to MediaWiki: manually writing a page with the contents
 produced by the large inline query also used up all memory, even without SMW
 being involved. If this is the case on your wiki, then my only advice is to
 change the SMW settings to restrict the size of query outputs so that pages
 cannot become so large. If this is not the problem you have, then it is
 important to find out which pages cause the issues in your wiki. Note that
 problems caused by MediaWiki jobs could also appear on random pages, since
 they do not depend on the page contents.
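
 The relevant knobs live in LocalSettings.php; a sketch using the SMW 1.x
 setting names (the values are illustrative, not recommendations from this
 thread):

 $smwgQMaxInlineLimit = 100;  // max results of an inline #ask (default 500)
 $smwgQMaxLimit       = 5000; // absolute cap for any query (default 10000)
 $smwgQMaxSize        = 12;   // max number of conditions in a query
 $smwgQMaxDepth       = 4;    // max property-chain nesting depth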


 Regarding MySQL, you should activate and check MySQL's slow query logging.
 It will create log files that show you which queries took particularly long.
 This can often be used to track down problematic queries and to do something
 to prevent them.
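
 For reference, a sketch of the MySQL 5.0-era my.cnf lines that turn this
 on (the threshold is an illustrative assumption):

 [mysqld]
 log-slow-queries = /var/log/mysql/mysql-slow.log
 long_query_time  = 2          # seconds; slower queries get logged
 log-queries-not-using-indexes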


 If you experience general site overload in a burst-like fashion, then it
 might be that some over-zealous crawler is visiting your site, possibly
 triggering complicated activities. Check your Apache logs to see if you have
 high loads from certain robots or suspicious user agents, especially on
 special pages like Ask. Update your robots.txt to stop crawlers from
 browsing all results of an inline query (crawlers have been observed to do
 this).

 -- Markus
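
A sketch of the robots.txt restriction suggested above (the paths assume a
default MediaWiki URL layout; adjust them to your rewrite rules):

User-agent: *
Disallow: /index.php?title=Special:Ask
Disallow: /index.php?title=Special%3AAsk
Disallow: /wiki/Special:Ask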


[SMW-devel] SMW scalability

2010-02-12 Thread Robert Murphy
Coders,

I am not a coder.  I'm not even any good at server maintenance.  But SMW is
taking my site down several times a day now.  My wiki is either the biggest
or nearly the biggest SMW wiki (according to
http://www.semantic-mediawiki.org/wiki/Sites_using_Semantic_MediaWiki), with
250,000 pages.  My site runs out of memory and chokes all the time.  I
looked in /var/log/messages and it is full of things like

httpd: PHP Fatal error:  Out of memory (allocated 10747904) (tried to
allocate 4864 bytes) in
/home/reformedword/public_html/includes/AutoLoader.php on line 582

but the PHP file in question is different every time.  I'm getting one of
these errors every half hour or more.
Before you say, "Up your PHP memory," know that I did!  I went up from 64MB
to 128MB to 256MB.  Same story.  So I switched to babysitting "top -cd2".
When I change a page without semantic data, HTTPD and MYSQLD requests come,
linger, and go.  But when I change a page with Semantic::Values, the HTTPD
and MYSQLD processes take a VERY long time to die, and sometimes never do.
Eventually the site runs out of memory.

Like I said, php.ini has a 128MB memory limit and a 60-second timeout for
MySQL.  Apache has a 60-second timeout too.  Any help?

-Robert
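
For clarity, the php.ini directives being described (the directive names are
standard PHP; which MySQL timeout is meant is not stated, so
mysql.connect_timeout is shown as one plausible reading):

memory_limit = 128M           ; raised in steps to 256M with no improvement
max_execution_time = 60       ; the 60-second script timeout
mysql.connect_timeout = 60    ; assumption: the MySQL timeout referred to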


Re: [SMW-devel] SMW scalability

2010-02-12 Thread Robert Murphy
It's commented out, so the system is going with what's in php.ini, right?

On Fri, Feb 12, 2010 at 4:55 AM, Marco Mauritczat ma...@fzi.de wrote:

 Hi Robert,
 I am also still new to SMW, but did you also adjust your settings in
 MediaWiki's LocalSettings.php? Try to set the line
 ini_set( 'memory_limit', '32M' );
 to some higher value.

 Greetings
 Marco

 -Original Message-
 From: Robert Murphy [mailto:mrandmrsmur...@gmail.com]
 Sent: Friday, February 12, 2010 11:39 AM
 To: Semantic MediaWiki Developers List
 Subject: [SMW-devel] SMW scalability

 [...]




Re: [SMW-devel] SMW scalability

2010-02-12 Thread Robert Murphy
Does
/etc/init.d/httpd restart
do enough?  That's what I did.

On Fri, Feb 12, 2010 at 6:38 AM, Ryan Lane rlan...@gmail.com wrote:

 Robert,

 Did you restart the web server after upping the memory? Those settings
 won't take effect otherwise.

 V/r,

 Ryan Lane

 On Fri, Feb 12, 2010 at 8:24 AM, Robert Murphy mrandmrsmur...@gmail.com
 wrote:
  [...]
 



[SMW-devel] Preg_match errors

2009-09-24 Thread Robert Murphy
My site is really screeching to a halt these days.  I took a look at
/var/log/messages and it's full of errors all the time.

httpd: PHP Warning:  preg_split(): Compilation failed:
nothing to repeat at offset 8 in
/home/public_html/extensions/SemanticMediaWiki/includes/SMW_QueryProcessor.php
on line 805

is very common and

httpd: PHP Notice:  Undefined variable: info_id in
/home/public_html/extensions/SemanticForms/includes/SF_FormInputs.inc on
line 589

occurs sometimes.
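
For context, "nothing to repeat" is what PCRE reports when a pattern contains
a bare quantifier, which typically happens when dynamic content is
interpolated into a pattern unescaped. A minimal sketch of the error class
(the pattern is hypothetical, not the actual SMW code):

$userInput = '*wildcard*';                      // e.g. a stray query string
@preg_split( '/' . $userInput . '/', 'text' );  // Warning: nothing to repeat

// escaping the dynamic part with preg_quote() avoids the compilation failure
$safe  = preg_quote( $userInput, '/' );
$parts = preg_split( '/' . $safe . '/', 'before *wildcard* after' );
print_r( $parts );  // prints the two pieces around the match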

My site is mixed English, Ancient Greek, Ancient Hebrew and Latin, so
there's a lot of Unicode.  I haven't gone with the new MySQL charset

# Experimental charset support for MySQL 4.1/5.0.
$wgDBmysql5 = false;

Is this my fault or SMW's?

-Robert


Re: [SMW-devel] Preg_match errors

2009-09-24 Thread Robert Murphy
I should've said which versions I'm using:

MediaWiki (http://www.mediawiki.org/): 1.15.1 (r54337,
http://svn.wikimedia.org/viewvc/mediawiki/trunk/phase3/?pathrev=54337)
PHP (http://www.php.net/): 5.2.9 (apache2handler)
MySQL (http://www.mysql.com/): 5.0.45
Semantic Forms: 1.6
Semantic Internal Objects: 0.1
SMW: 1.5e-SVN
SRF: 1.4.4

-Robert

On Thu, Sep 24, 2009 at 4:48 PM, Yaron Koren yaro...@gmail.com wrote:

 That PHP notice from Semantic Forms is from an older version, by the way;
 it's since been fixed.

 -Yaron

 On Thu, Sep 24, 2009 at 7:08 PM, Robert Murphy mrandmrsmur...@gmail.com wrote:

 [...]





[SMW-devel] PHP Fatal error

2009-05-07 Thread Robert Murphy
Dear SMW Coding Dudes,
Has anyone else seen this error when you try to run runJobs.php?

PHP Fatal error:  Call to undefined function wfProfileIn() in
/home/imurphy/***.com/extensions/SemanticMediaWiki/includes/SMW_GlobalFunctions.php
on line 412

I run SMW 1.5e and MW 1.16a.

I wouldn't bug you, except runJobs is kind of crucial to what we do.

-Robert Murphy
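
One defensive guard for this class of error, sketched as an illustration
(hypothetical; not the actual SMW fix, and it treats the symptom rather than
the underlying load-order or version mismatch between SMW 1.5e and MW 1.16a):

if ( function_exists( 'wfProfileIn' ) ) {
    // only profile when MediaWiki's profiling functions are loaded
    wfProfileIn( __METHOD__ );
}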


[SMW-devel] Circos - Table Data as a Circle

2009-04-21 Thread Robert Murphy
I guess I've become a troll, just subscribing to these lists to see if
anyone will ever implement what I need n-ary data to do.  I don't really
know how to program anything, but perhaps this comment will be welcome
anyway.
There is an amazing new tool for visualizing tabular data called Circos:
http://mkweb.bcgsc.ca/circos/?Visualizing_Tabular_Data

For a shorter write-up that demonstrates what a cool tool this would be
for SMW people, check out this blog post:
http://flowingdata.com/2009/04/21/visual-representation-of-tabular-information-how-to-fix-the-uncommunicative-table/

It's got lots of pretty colors and does a good job of selling this new
technique.

Robert


Re: [SMW-devel] 1.4 and N-ary

2008-10-07 Thread Robert Murphy
Since there seems to be a general fielding of ideas for the push towards
1.4, let me once again reiterate my deep concerns about the way SMW handles
n-ary data.  I am not a programmer, so I'm sure I'm asking a lot without
knowing what it costs, but I think I speak for a growing number of SMW users
and their desire for this functionality.

I think the way n-ary data is handled in #asks needs to be radically
rethought.  Take for example the S-MW.org example of Einstein and his jobs.
[[Property:Had job]] is something like [[Property:Job]] (Has type::Page),
[[Property:Employer]] (Page), [[Property:Start date]] (Date) and
[[Property:End date]] (Date).  The natural way to display this data
for multiple people is a table.  The assumption would be that you'd have
sortable table headings of Person, Job, Employer, Start and End.  Einstein
would have multiple rows for his multiple jobs.  Other people would populate
the table the same way.  This way, you could sort for people who worked for
[[Deutsche Bank]] or were [[Patent Clerks]].  But this kind of layout is
impossible without MASSIVE programming, MANY other extensions and some
on-the-fly JavaScripting.
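
To make the gap concrete, a sketch in SMW 1.x #ask syntax (the property names
come from the Einstein example above; the markup is illustrative):

{{#ask: [[Had job::+]]
 | ?Had job
 | format=table
}}

This yields one row per person, with every (Job, Employer, Start date, End
date) record flattened into a single comma-separated cell, instead of the
sortable one-row-per-job table described above.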

On my wiki, I have an enormous amount of data that is simple and
two-dimensional: a word and its form.  But if I want to list all the forms a
word is used in across all pages, I have to either hand-write it or write
regex loops to parse the n-ary data, using vast amounts of server resources.

I hope those of you who code will consider revisiting n-ary in queries and
make it more functional.  If I win the Lotto, the second thing I would do
would be to pay a developer to work on this!
Sincerely,

Robert Murphy


[SMW-devel] unicode error

2008-09-18 Thread Robert Murphy
I'd just noticed that my job queue was running down pretty slowly, when I
decided to SSH into my server and run 'nice php runJobs.php' to speed things
up.  I discovered that, ever since I updated a template concerned solely with
my Greek-language pages, every update has been generating an error:

2008-09-18 12:16:05  494100  refreshLinks Greek:ΑΙΩΝ
PHP Warning:  preg_split(): Compilation failed: nothing to repeat at offset
8 in /home/imurphy/reformedword.imurphy.com/extensions/SemanticMediaWiki/includes/SMW_QueryProcessor.php
on line 754

Every single update is generating this error!  Any ideas as to why the regex
is freaking out over Unicode?

-Robert


Re: [SMW-devel] [Semediawiki-user] refreshData script kills itself over a too-long regex

2008-07-12 Thread Robert Murphy
I too see the long preg_match regex error, even when just running
runJobs.php to clear the work queue.  Perhaps it has something to do with
Unicode pages with lots of semantic data (the only data connection I can see
between Temlakos' wiki and mine).  Please let me know how I can help us all
track down the source of this glitch.

-Robert

On Fri, Jul 11, 2008 at 8:07 PM, Temlakos [EMAIL PROTECTED] wrote:

 Markus Krötzsch wrote:
  On Friday, 11 July 2008, Temlakos wrote:
 
  Everyone:
 
  Several weeks ago, I finally figured out how to install SMW's
  maintenance scripts as symlinks in my server's wiki maintenance
  subdirectory so that I could run them.
 
  But when I ran SMW_refreshData.php, I got multiple warnings saying that
  a call to preg_match failed on an overly long regular expression. The
  implicated file was my custom historical date file. And after multiple
  such warnings, the execution of the file finally ended with one word:
  Killed.
 
  Markus, I believe you have a copy of the historical-date file
  (SMW_DV_HxDate.php). The longest regular expression (regex) in it is
  $screenpat, and my file calls preg_match with that string in order to
  screen out date texts that are not in a form that the script would
  recognize. I do that to ensure that any annotated date that passed that
  test would be sure to represent a valid date, so long as month names
  were spelled correctly, etc.
 
  But if a long regex is creating a problem, then I must solve it today,
  before I update my wiki. Otherwise, SMW_refreshData.php will kill itself
  again, and it will leave the job unfinished.
 
  How long can a regex be and not cause a problem with the execution of
  SMW_refreshData?
 
  The regex strings in the file are $screenpat and $format1, $format2,
  $format3, $format4, and $format5.
 
  These strings have 219, 83, 89, 84, 85, and 55 characters, respectively.
 
  Any assistance would be appreciated. Furthermore, if anyone else hopes
  to use the Historical Date script, then I can't have it creating a
  problem every time someone wants to run SMW_refreshData.php.
 
 
  I never encountered a similar problem. We also have long regexps in SMW,
  and the lengths you gave do not sound impressively long to me either. Are
  all the regexps static, without any variables of possibly unexpected
  content? Can a web search help you with your warning/error messages?

  Of course, "Killed" sounds like an emergency brake due to the shortage of
  some resource (such as memory). Does the problem occur when you start on
  that very page (using -v and then -s id as options for refreshData)?
 
  SMW_refreshData.php as such does not do many things that would be different
  from normal page editing, though it calls functions in a slightly different
  program context (I just fixed some bugs in SemanticCalendar, which relied on
  the global $wgTitle, which is not ensured to contain anything during parsing
  in general and during refreshData in particular). Besides these things, it
  would normally use the same code as when writing a page. Of course, the php
  command on a server may behave differently from the PHP module in Apache
  (and the admissible length of regexps appears to be rather specific to PHP).
 
  Anyway, you can use the option -v to see which page id causes the problem,
  and then use the option -s id+1 to continue after that page. This way, you
  skip one page but can still refresh the rest. Use the MediaWiki API or the
  database to find out which page causes the problem, and check whether it
  works normally when read/edited on the web.
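
  As a concrete sketch of that workflow (run from the SMW maintenance
  directory; the page id is hypothetical):

  php SMW_refreshData.php -v           # verbose: prints each page id processed
  php SMW_refreshData.php -v -s 12346  # resume just past the page that crashed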
 
  Markus
 
 
  P.S. It seems that this is a discussion for the developers' list ...
 
 Duly noted. I will now publish this to the development list as well,
 though the other users might want to see my answers.

 I have no insight into the problem of long regex strings. My regexes are
 static, first of all. The warning messages created such confusion and
 went by so fast that I didn't have a chance to see where the kill
 occurred before it happened. This might or might not be significant: I
 did not at first run refreshData restricted to refreshing type and
 property pages only. Instead I ran it on the entire database, using the
 -v option. That's why I saw all those warnings.

 Terry A. Hurlbut

 PS: Thank you for detailing the -s option. I did not at first see it in
 the commented documentation in the file.

 TAH







[SMW-devel] N-ary

2008-05-15 Thread Robert Murphy
Dear Developers,

I run a wiki with a vast amount of N-ary data.  I have yet to hear
from anyone who has successfully figured out how to query N-ary data.
Some of you may have seen my many questions on the user mailing list,
but no one has ever said anything other than "I hear someone somewhere
at some time solved that with templates."  I have not been able to solve
it with templates, and no Semantic MediaWiki site I've found has either.
I'm afraid no one has seriously considered the possibility that N-ary
data is inaccessible from within MediaWiki.

Basically, if you pass a query result to templates and one of the
parameters is N-ary, then a comma-separated list of all fields is
passed, and there seems to be no way to handle that data from within a
template, even if there is only one instance of the n-ary value.  If I
were just within Perl or PHP, splitting the passed N-ary data on a
regex would be a piece of cake.  However, programming-language features
such as split and arrays are not available within a template.  I
think this problem warrants serious developer attention.
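
For contrast, the split being described is one line outside wikitext. A
minimal sketch with a hypothetical value string (SMW 1.x n-ary records
separate their fields with semicolons):

$record = 'Patent clerk;Swiss Patent Office;1902;1909';  // one n-ary record
list( $job, $employer, $start, $end ) = explode( ';', $record );
echo $employer;  // prints: Swiss Patent Office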

Thank you,

Robert
