Ack! Fingers slipped! Here's what I meant to write:

> I did something like this once, to aggregate a set of HTML 
> pages. Each page had a [next] link at the bottom, 
> and my pipeline basically followed these links. I think you 
> want something similar. I no longer have the code, but from 
> memory, something like this:
> 

<map:match pattern="page/*">
        <map:generate type="html" src="http://www.foo.com/?page={1}"/>
        <map:transform src="check-if-found-or-keep-looking.xsl">
                <map:parameter name="current-page" value="{1}"/>
        </map:transform>
        <map:transform type="xinclude"/>
        <map:serialize type="xml"/>
</map:match>

Call the pipeline first with "page/1". 

The stylesheet check-if-found-or-keep-looking.xsl should check whether the record is in 
the page: if so, it should return the contents of the page unchanged (i.e. an identity 
transform). If not, it should return only an XInclude element, so the entire output 
document would be just this single element:

<xi:include 
        xmlns:xi="http://www.w3.org/2001/XInclude" 
        href="cocoon:/page/{$current-page+1}"/>

Then the xinclude transformer handles the rest. If the record was found then the 
xinclude transformer does nothing, and the page is serialized. If the record was not 
found then the xinclude transformer will make the recursive sitemap call.
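
For concreteness, here is a minimal sketch of what check-if-found-or-keep-looking.xsl 
could look like. The record-id parameter and the @id test are illustrative assumptions 
(the original pipeline only passes current-page and doesn't say how a record is 
identified), so adapt the test to however the record actually shows up in the page:

<?xml version="1.0"?>
<xsl:stylesheet version="1.0"
        xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
        xmlns:xi="http://www.w3.org/2001/XInclude">

        <!-- page number passed in from the sitemap -->
        <xsl:param name="current-page"/>
        <!-- hypothetical: whatever identifies the record being searched for -->
        <xsl:param name="record-id"/>

        <xsl:template match="/">
                <xsl:choose>
                        <!-- hypothetical test: adjust to match your page structure -->
                        <xsl:when test="//*[@id = $record-id]">
                                <!-- found: pass the whole page through unchanged -->
                                <xsl:copy-of select="."/>
                        </xsl:when>
                        <xsl:otherwise>
                                <!-- not found: emit only an xi:include pointing at the next page -->
                                <xi:include href="cocoon:/page/{$current-page + 1}"/>
                        </xsl:otherwise>
                </xsl:choose>
        </xsl:template>

</xsl:stylesheet>

Note that this sketch expects a second map:parameter (record-id) from the sitemap, which 
is an assumption on my part; you could equally hard-code the test in the stylesheet.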

Cheers

Con
