Great work indeed!

Concerning view performance, don't forget to also analyse the full 
execution time in the browser (not only the server processing time).
My guess is we lose a lot of time because of big CSS and JS files, 
most of them being unused 99% of the time.

Ludovic

Sergiu Dumitriu wrote:
> Hi,
>
> I spent most of today profiling XWiki, and here is a summary of my findings.
>
> First, I tested the following three situations:
> (1) Creating many documents with 500 versions each, using XWiki 1.1
> (2) Updating those documents to XWiki 1.2
> (3) Creating many documents with 500 versions each, using XWiki 1.2
>
> (1) spent most of the time in Hibernate and RCS. And by most, I mean 
> almost all of it. In RCS, each new version required parsing the whole 
> history, adding the new version, and reserializing the history. The RCS 
> implementation we're using (org.suigeneris.jrcs) seems very inefficient, 
> with many calls ending in charAt, substring and split. And since each 
> update requires sending all the history back, the rest of the time was 
> spent in Hibernate, sending large strings to MySQL. Creating one 
> document with all its 500 versions takes several minutes.
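The cost of that reparse-append-reserialize cycle can be sketched with a toy model (the class and method names below are made up for illustration; this is not the jrcs API). Counting characters touched shows why per-save work grows quadratically with the number of versions, while an append-only archive stays linear:

```java
// Toy cost model, not real jrcs code: compares full-reparse archiving
// (XWiki 1.1 style) with append-only archiving (XWiki 1.2 style).
public class ArchiveCostSketch {

    // Old scheme: every save parses the whole serialized archive,
    // adds one version, then reserializes everything.
    static long charsTouchedFullReparse(int versions, int versionSize) {
        long total = 0;
        long archiveSize = 0;
        for (int v = 0; v < versions; v++) {
            total += archiveSize;          // parse the existing history
            archiveSize += versionSize;    // add the new version
            total += archiveSize;          // reserialize everything
        }
        return total;
    }

    // New scheme: only the new version is written out.
    static long charsTouchedAppendOnly(int versions, int versionSize) {
        return (long) versions * versionSize;
    }

    public static void main(String[] args) {
        // 500 versions of 200 characters each, as in the test setup.
        System.out.println(charsTouchedFullReparse(500, 200)); // 50000000
        System.out.println(charsTouchedAppendOnly(500, 200));  // 100000
    }
}
```

A 500x difference in work for the same 500 saves, which matches the "several minutes vs. seconds" observation.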
>
> (2) spent most of the time doing one thing: parsing the old RCS 
> archive. Still, a lot less time was spent in Hibernate, about 2% of the 
> total execution time, as opposed to 80% in (1). Updating one 
> 500-version archive took 6 seconds, updating a 300-version document 
> took 2 seconds, and a one-version document updated almost instantly.
>
> (3) spends far less time in RCS, as expected (1% of the total running 
> time), so the new archive mechanism is a major performance boost. 
> Instead, most of the time is spent in saveBacklinks, which goes to 
> Radeox. This is even more serious given that the document content was 
> very small (200 random characters). I'd say a major flaw is that the 
> backlinks gathering process runs the Radeox engine with all the filters 
> and macros enabled, while all we need is the links filter. Here, 
> creating a document takes a few seconds (2-3).
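To make the "links filter only" idea concrete, here is a minimal sketch of gathering link targets with a single regex pass instead of a full render. The `[label>Space.Page]` syntax matches XWiki 1.x links, but the class and pattern are illustrative assumptions, not the actual Radeox LinkFilter:

```java
// Hypothetical sketch: extract wiki link targets without running the
// full Radeox engine. Not the real filter implementation.
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class LinkScanSketch {
    // Anything between square brackets is a candidate link.
    private static final Pattern WIKI_LINK = Pattern.compile("\\[([^\\]]+)\\]");

    static List<String> extractLinks(String content) {
        List<String> links = new ArrayList<>();
        Matcher m = WIKI_LINK.matcher(content);
        while (m.find()) {
            String body = m.group(1);
            // In [label>Space.Page] form, the target follows the '>'.
            int gt = body.indexOf('>');
            links.add((gt >= 0 ? body.substring(gt + 1) : body).trim());
        }
        return links;
    }

    public static void main(String[] args) {
        System.out.println(
            extractLinks("See [Main.WebHome] and [label>Sandbox.Test]."));
        // [Main.WebHome, Sandbox.Test]
    }
}
```

A single linear scan like this touches each character once, whereas the full engine applies every filter and expands every macro just to throw the rendered output away.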
>
> Of the (little) time that remains, a big part is consumed by document 
> cloning; see http://jira.xwiki.org/jira/browse/XWIKI-1950 for this. 
> This is true for both XWiki 1.1 and 1.2.
>
> Regarding database performance, besides the fact that Hibernate is 
> slow, too much time is spent in saveOrUpdate. It seems that instead of 
> checking the document object to see whether it needs to be saved or 
> updated, Hibernate retrieves the document from the database and 
> compares the retrieved version with the provided object to decide 
> whether it is a new object that needs to be inserted or an existing one 
> that needs updating.
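The difference can be shown with a toy store (this is not Hibernate, just a simulation of the behavior described above): saveOrUpdate pays an extra lookup round trip per call, while a caller that already knows whether the document is new can skip it entirely:

```java
// Toy simulation, not real Hibernate: contrasts saveOrUpdate, which must
// look the object up to decide insert vs. update, with an explicit
// choice driven by a flag the caller already tracks.
import java.util.HashMap;
import java.util.Map;

public class SaveOrUpdateSketch {
    static int dbLookups = 0;
    static Map<Long, String> table = new HashMap<>();

    // Mimics saveOrUpdate: a round trip to the database just to decide.
    static void saveOrUpdate(long id, String doc) {
        dbLookups++;        // SELECT to check whether the row exists
        table.put(id, doc); // then INSERT or UPDATE
    }

    // Caller already knows isNew (e.g. a flag set when the document is
    // loaded vs. freshly created): no extra SELECT needed.
    static void saveExplicit(long id, String doc, boolean isNew) {
        table.put(id, doc); // straight INSERT or UPDATE
    }

    public static void main(String[] args) {
        for (long i = 0; i < 100; i++) saveOrUpdate(i, "v1");
        System.out.println(dbLookups);  // 100 extra round trips
        for (long i = 0; i < 100; i++) saveExplicit(i, "v2", false);
        System.out.println(dbLookups);  // still 100: none were added
    }
}
```

In real Hibernate terms this would mean calling save() or update() directly based on a "new document" flag instead of always going through saveOrUpdate.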
>
>
> In total, for (3): 90% of the time is spent in saveDocument, of which 
> 38% in saveLinks (most of it in Radeox), 22% in hibernate.saveOrUpdate, 
> 16% in hibernate.endTransaction (the actual saving of the document to 
> the database) and 13% in updateArchive.
>
> Thus, gathering backlinks with only the links filter enabled should cut 
> about 35% of the running time, and improving the calls to saveOrUpdate 
> should give another 10-20%.
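A quick sanity check of those projections: eliminating a fraction f of the running time yields a speedup of 1 / (1 - f) (Amdahl's law). This back-of-the-envelope calculation is mine, not part of the measurements above:

```java
// Rough speedup estimate from the profiled fractions.
public class SpeedupSketch {
    // Eliminating fraction f of the total time speeds the whole run up
    // by 1 / (1 - f).
    static double speedup(double fractionEliminated) {
        return 1.0 / (1.0 - fractionEliminated);
    }

    public static void main(String[] args) {
        // ~35% from the backlinks fix, ~15% more from saveOrUpdate.
        System.out.printf("backlinks fix alone: %.2fx%n", speedup(0.35)); // 1.54x
        System.out.printf("both fixes:          %.2fx%n", speedup(0.50)); // 2.00x
    }
}
```

So the two fixes together would roughly halve the save time, under the assumption that the eliminated fractions do not overlap.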
>
>
>
> Remarks:
> - This shows only the performance when saving documents, so I will need 
> another day to test the performance on view. I'd guess that most of 
> that time will be spent in Radeox, parsing the wiki content.
> - As is well known, observing a process modifies it. All the numbers 
> would probably be (a little) different under real usage, without a 
> profiler sniffing everything.
>
>
> Sergiu
> _______________________________________________
> devs mailing list
> [email protected]
> http://lists.xwiki.org/mailman/listinfo/devs
>
>   


-- 
Ludovic Dubost
Blog: http://blog.ludovic.org/
XWiki: http://www.xwiki.com
Skype: ldubost GTalk: ldubost
