Does Cocoon provide a mechanism by which all pages on the site can be cached
(perhaps via a crawler)? I'm aware of the command-line interface (and had
trouble getting the crawler to get past the first page, but that's another
story). Ultimately, I would like to use Cocoon as a servlet but have as many
pages cached as possible at the "click of a button", as opposed to waiting
for each page to be requested. I suppose this could be done externally (with
my own crawler) but I was wondering if Cocoon had some built-in mechanism
for doing this.
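For what it's worth, the external approach I mention could be as simple as a recursive fetch against the running servlet so that every page gets requested (and cached) once. A minimal sketch, assuming wget is available; the host, port, and mount point are placeholders, and the echo makes it a dry run:

```shell
# Sketch of an external cache warm-up: recursively request every page so
# the servlet fills its cache. Host/port/mount point are placeholders.
#   -r             recurse over links
#   -l inf         no depth limit
#   --delete-after discard the downloaded files; only the requests matter
warmup() {
  echo wget -r -l inf --delete-after "$1"   # drop "echo" to actually run it
}
warmup "http://localhost:8080/cocoon/"
```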

Also, I am building a site that serves three versions of each page (Flash,
non-Flash, etc.) and uses a cookie to record each user's preference. All of my
cookie logic lives in sitemap.xmap, so I am already committed to
using Cocoon as a servlet. Are there caching issues with such an approach?
If performance ultimately becomes a problem, I suppose I could statically
generate most of the pages and just use readers for each version of each
page, but that wouldn't be ideal, as certain portions of the site are indeed
dynamic.
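Roughly, the sort of sitemap selection I mean looks like this (a simplified sketch; the "cookie" selector type, component names, and stylesheet paths are placeholders, not necessarily what Cocoon ships):

```xml
<!-- Simplified sketch of cookie-based version selection; the "cookie"
     selector type and all names/paths here are placeholders. -->
<map:match pattern="**">
  <map:generate src="content/{1}.xml"/>
  <map:select type="cookie">
    <map:parameter name="cookie-name" value="site-version"/>
    <map:when test="flash">
      <map:transform src="stylesheets/flash.xsl"/>
    </map:when>
    <map:when test="text">
      <map:transform src="stylesheets/text.xsl"/>
    </map:when>
    <map:otherwise>
      <map:transform src="stylesheets/default.xsl"/>
    </map:otherwise>
  </map:select>
  <map:serialize type="html"/>
</map:match>
```

The open question is whether responses chosen this way can still be cached per variant, or whether the selection step forces every request through the full pipeline.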

Finally, if anyone has any words of wisdom with respect to using Cocoon for
serving multiple versions of a page (from the same URL), I'd be happy to
hear them.

Thanks,
Evan
