> However, the $EnableDrafts capability was always disabling the HTML cache,
> as well as any recipes that happen to call the CondAuth() function.
> I've now fixed this in pmwiki-2.2.0-beta50, just released.
> For long pagelists, it's pretty dramatic.
That was it! Indeed, the improvement is really dramatic. Thanks a lot, Patrick!

One more question: the BreakPageList recipe (http://www.pmwiki.org/wiki/Cookbook/BreakPageList) creates URLs like Group1/Name1?p=10. As far as I can see, the page Group1/Name1 is cached, but going from one "subpage" to another is as slow as without caching. When not using the recipe markup, the huge pagelist is shown all at once, but very fast because it was cached. Would it be technically possible to cache every "subset" of pages shown by the recipe individually?

Thanks again for this fantastic improvement. Combined with the "import" function, it makes it very easy to emulate the behaviour of an external database, and will save a lot of time for the many sites that used an external database only because there was no other efficient way to store some data.

For example, I had a MySQL table with 2,500 rows that was populated from an application which can only export XML in UTF-8. This required a program to parse the XML file and generate SQL INSERT statements, taking into account all the i18n problems (accents, etc.), then truncating the table, reloading, and so on. Now, instead of generating SQL INSERT statements, I can generate a wiki page for each row, containing one PageTextVariable per field (or write it to the import directory), and use all the pagelist features to display what I need with the proper template. With HTML caching enabled, only the very first request is quite slow; the following ones are faster than the queries made on the original SQL server. ...and it is a lot more stable and easier to back up...

_______________________________________________
pmwiki-users mailing list
pmwiki-users@pmichaud.com
http://www.pmichaud.com/mailman/listinfo/pmwiki-users
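For what it's worth, the XML-to-wiki-pages step described above can be sketched roughly as follows. This is only a minimal illustration, not the actual converter: the XML layout (`<row>` elements with one child per field), the target group name, the record naming scheme, and the output directory are all hypothetical, and the pages are written as plain text files containing one `(:field: value:)` PageTextVariable definition per line, on the assumption that they are then fed to PmWiki's import mechanism.

```python
# Hypothetical sketch: turn an XML export into one wiki page per record,
# with each field stored as a PmWiki PageTextVariable definition.
import xml.etree.ElementTree as ET
from pathlib import Path

def rows_to_pages(xml_text, group="Data", outdir="import"):
    """Write one UTF-8 page file per <row>; return the page names created."""
    Path(outdir).mkdir(exist_ok=True)
    root = ET.fromstring(xml_text)
    pages = []
    for i, row in enumerate(root.findall("row"), start=1):
        # One (:field: value:) PTV per child element of the row.
        lines = [f"(:{child.tag}: {child.text or ''}:)" for child in row]
        name = f"{group}.Record{i:04d}"          # e.g. Data.Record0001
        Path(outdir, name).write_text("\n".join(lines) + "\n", encoding="utf-8")
        pages.append(name)
    return pages

# Tiny UTF-8 sample standing in for the real export.
sample = """<export>
  <row><title>Café</title><qty>3</qty></row>
  <row><title>Thé</title><qty>5</qty></row>
</export>"""

print(rows_to_pages(sample))
```

Because the files are written as UTF-8 throughout, the accent problems mentioned above never arise: the bytes from the XML export go straight into the page text without any re-encoding.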