>> Can anybody think of some set of wiki operations that would overload
    >> 4GB of memory?

    Thomas> I think going beyond 4GB is maybe not that easy (except if you
    Thomas> have many persistent worker processes running for many requests
    Thomas> for wikis having lots of pages - moin tries to cache some
    Thomas> information about pages in memory).

If you look at the Python wiki as a prototype for what you want to do, you
might try:

    * Create lots of pages (a couple thousand at least)
    * Enable the antispam stuff and create a LocalBadContent page which
      contains a lot of regular expressions (100 or more)
    * Perform many simultaneous checkins which must be checked against all
      those REs ...
    * ... while an ill-behaved web crawler hammers away at the wiki

:-)
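For the antispam part of that recipe, a rough sketch of the load involved: every save is matched against every LocalBadContent regex, so cost grows with pages x patterns x concurrent saves. This is only an illustration, not MoinMoin's actual code - the `spamword` patterns and page generator here are made up for the example.

```python
import random
import re
import string


def make_page(n_words=200):
    # Hypothetical page body: n_words random lowercase "words".
    return " ".join(
        "".join(random.choices(string.ascii_lowercase, k=6))
        for _ in range(n_words)
    )


def make_badcontent_patterns(n=100):
    # Stand-in for a LocalBadContent page with n regular expressions.
    return [re.compile(rf"spamword{i}\b") for i in range(n)]


def check_page(text, patterns):
    # Like an antispam check on save: run every pattern over the text.
    return [p.pattern for p in patterns if p.search(text)]


if __name__ == "__main__":
    # Scale pages into the thousands (and run several of these loops in
    # parallel processes) to approximate the scenario described above.
    pages = [make_page() for _ in range(50)]
    patterns = make_badcontent_patterns(100)
    hits = sum(len(check_page(body, patterns)) for body in pages)
    print(f"checked {len(pages)} pages against {len(patterns)} patterns, "
          f"{hits} hits")
```

Running many such checks concurrently, while a crawler keeps every page's cache warm, is what pushes both CPU and resident memory up.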

-- 
Skip Montanaro - [EMAIL PROTECTED] - http://www.webfast.com/~skip/

_______________________________________________
Moin-user mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/moin-user