On Thu, Feb 06, 2003 at 06:07:02PM +0100, Andrea Censi wrote:
> Which is the best tool to create an offline copy of a live Cocoon site?
>
> It should spider through the pages, follow every link and gather both html and
> images/css/pdf. Then it should rearrange internal links from absolute to
> relative ("http://site/page" -> page.html, "/" -> "index.html").
> The result is to be loaded on a low-spec [ = no cocoon :( ] webserver.
>
> I don't consider batch use from the command line to be a viable alternative,
> because:
> - There are different views of single xml files.
Don't the different views have different URLs? If so, and if you link to
those different URLs, the crawler will write a separate file for each.
> - I don't want to explicitly change the internal URL format used by the site
> (/ ... /page/ with a final slash)
I think the crawler converts links that point to a directory, e.g. 'foo/'
to 'foo/index.html'.
> - (not sure) Would it work with dynamic SVG->gif?
Yes. I'd say give it a try; it works fine for rendering Forrest sites.
Alternatively, you could try spidering tools like 'wget'.
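For example, something along these lines should produce a self-contained
mirror (a rough sketch, untested against a Cocoon site; check the options
against your wget version, and substitute your real URL for 'http://site/'):

  wget --mirror --page-requisites --convert-links --html-extension \
       http://site/

--mirror recurses through the site following links, --page-requisites also
fetches the images and CSS each page needs, --convert-links rewrites internal
links to relative ones so the copy works offline, and --html-extension adds
an .html suffix to pages whose URLs don't already end in .html. URLs ending
in a slash (like your '/page/') are saved as 'index.html' inside that
directory.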
--Jeff