Just as a reference, I have a wiki (MW 1.13.2, SMW 1.3, PHP 5.2.3,
MySQL 5.0.45) with ~210,000 pages and ~5.2 million property values
(over 51 defined properties). PHP memory is set to 128M, and I believe
it is using APC.  No problems in terms of memory limits being reached
-- how many properties/page and queries/page do you have? What types
of queries?
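
If you want to double-check what that PHP is actually running with, a quick
throwaway script like the following (hypothetical, not part of MediaWiki --
load it once in a browser and then delete it) shows whether APC is loaded
and what memory limit the web server really uses:

  <?php
  // check-env.php: confirm the bytecode cache and the effective memory limit.
  var_dump(extension_loaded('apc'));  // true if the APC extension is loaded
  var_dump(ini_get('apc.enabled'));   // "1" if the opcode cache is switched on
  var_dump(ini_get('memory_limit'));  // e.g. "128M"; CLI and Apache can differ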

As always, Markus' suggestions are right on.

-Tom

On Fri, Feb 12, 2010 at 10:18 AM, Markus Krötzsch
<mar...@semantic-mediawiki.org> wrote:
> On Friday, 12 February 2010, Robert Murphy wrote:
>> Coders,
>>
>> I am not a coder.  I'm not even any good at server maintenance.  But SMW is
>> taking my site down several times a day now.  My wiki is either the biggest,
>> or nearly the biggest, SMW wiki (according to
>> http://www.semantic-mediawiki.org/wiki/Sites_using_Semantic_MediaWiki) with
>> 250,000 pages.  My site runs out of memory and chokes all the time.  I
>> looked in /var/log/messages and it is full of things like
>>
>> httpd: PHP Fatal error:  Out of memory (allocated 10747904) (tried to
>> allocate 4864 bytes) in
>> /home/reformedword/public_html/includes/AutoLoader.php on line 582
>>
>> but the PHP file in question is different every time.  I'm getting one of
>> these errors every half hour, or more often.
>> Before you say, "Up your PHP memory", know that I did!  I went up from 64MB
>> to 128MB to 256MB.  Same story.  So I switched to babysitting "top -cd2".
>> When I change a page without semantic data, the httpd and mysqld processes
>> come, linger, and go.  But when I change a page with Semantic::Values, the
>> httpd and mysqld processes take a VERY long time to die, and sometimes they
>> never do.  Eventually the site runs out of memory.
>>
>> Like I said, php.ini has a 128MB memory limit and a 60-second timeout for
>> MySQL; Apache has a 60-second timeout too.  Any help?
>
> Great, finally someone has a performance-related request (I sometimes feel
> that I am the only one who is concerned about performance).
>
> Regarding PHP, I don't think a memory limit of more than 50MB, or at most
> 100MB, can be recommended for any public site. Whatever dies beyond that
> point cannot be saved anyway. On the other hand, PHP out-of-memory issues
> are hard to track down, since the function that happens to allocate the
> final byte is usually not the one that actually used up the memory. You
> have seen this in your logs: the reported file is different every time.
>
> One general thing that should be done on larger sites (actually on all
> sites!) is bytecode caching, see [1]. This significantly reduces the memory
> overhead that merely loading large PHP files adds to every request.
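>
> Just as a sketch (the exact extension file name and sizing depend on how
> APC was installed; treat the numbers as placeholders), enabling it comes
> down to a couple of lines in php.ini plus an Apache restart:
>
>   ; php.ini -- hypothetical values
>   extension = apc.so       ; load the APC bytecode cache
>   apc.enabled = 1
>   apc.shm_size = 64        ; shared memory for cached bytecode, in MB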
>
> Out-of-memory issues usually result in blank pages that can only be edited
> by manually changing the URL to use the edit action. Finding these pages is
> crucial for tracking down the problem. In the context of SMW, I have seen
> memory issues when inline queries return a long list of results, each of
> which contains many values. The problem is worse when templates are used
> for formatting, but it also occurs with tables. In my tests I have traced
> this down to MediaWiki itself: manually saving a page with the same
> contents that the large inline query produces also used up all memory,
> even without SMW being involved. If this is what happens on your wiki, then
> my only advice is to change the SMW settings so that query outputs are
> restricted in size and pages cannot become that large (see the sketch
> below). If this is not your problem, then it is important to find out which
> pages cause the issues on your wiki. Note that problems caused by MediaWiki
> jobs can also appear on random pages, since they do not depend on the
> contents of the page being viewed.
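>
> To make that concrete, the knobs I mean are the query limits in
> LocalSettings.php. The exact variable names and defaults can differ between
> SMW releases, so check SMW_Settings.php for your version and treat these
> numbers as placeholders rather than recommendations:
>
>   # LocalSettings.php, after SMW is enabled -- illustrative values only
>   $smwgQMaxInlineLimit = 100;  # cap on results a single inline query may print
>   $smwgQDefaultLimit   = 50;   # results shown when a query specifies no limit
>   $smwgQMaxSize        = 12;   # maximum number of conditions in one query
>   $smwgQMaxDepth       = 4;    # maximum nesting depth of query conditions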
>
>
> Regarding MySQL, you should enable and check MySQL's slow query log. It
> writes a log file showing which queries took particularly long, which can
> often be used to track down problematic queries and to do something to
> prevent them.
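>
> As a sketch for MySQL 5.0, where the slow log cannot be switched on at
> runtime (and where the option names differ from later MySQL versions), this
> would go into my.cnf, followed by a mysqld restart:
>
>   [mysqld]
>   log-slow-queries = /var/log/mysql/mysql-slow.log
>   long_query_time  = 2
>   # log-queries-not-using-indexes   # optional, can be very noisy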
>
>
> If you experience general site overload in bursts, it might be that some
> over-zealous crawler is visiting your site, possibly triggering expensive
> operations. Check your Apache logs to see whether you get high request
> volumes from certain robots or suspicious user agents, especially on
> special pages like Special:Ask. Update your robots.txt to disallow crawlers
> from paging through all results of an inline query (crawlers have been
> observed to do exactly that).
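>
> What the robots.txt entries look like depends on your URL layout. As a
> rough sketch for a wiki that serves pages under /wiki/ and actions/queries
> under /index.php (adjust to your own configuration):
>
>   User-agent: *
>   Disallow: /index.php         # edit/history/query URLs, incl. paged Ask results
>   Disallow: /wiki/Special:Ask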
>
> -- Markus
>
> [1] http://www.mediawiki.org/wiki/User:Robchurch/Performance_tuning
>
>
>
> --
> Markus Krötzsch  <mar...@semantic-mediawiki.org>
> * Personal page: http://korrekt.org
> * Semantic MediaWiki: http://semantic-mediawiki.org
> * Semantic Web textbook: http://semantic-web-book.org
> --
>
