On Thu, 2006-03-09 at 12:52 -0500, Justin Hannus wrote:
> I can't seem to get an aggregate pipeline to respect or handle errors
> correctly when an exception occurs in one of the aggregate parts. It
> almost seems as if it's impossible. This is what's happening...
> 
> When an exception occurs in a pipeline which is called as one of the 
> <map:aggregate> part elements, neither the erroring pipeline nor its 
> parent pipelines handle the error correctly. In fact, the 
> <map:handle-errors> is completely ignored. What happens is that sitemap 
> processing continues even after the exception occurs, and eventually the 
> main entry matching pipeline's <handle-errors> is invoked. This is bad 
> because when using a cocoon:// request as your <map:part> src, any 
> pipelines under that 

What do you mean by 'pipelines' here? The other map:parts?
Transformers after the map:aggregate?

> cocoon:// request will continue processing as if no error has occurred, 
> but... you will still see the Cocoon error page as if processing had 
> actually stopped!
> 
> I have a pipeline which aggregates several other pipelines and then 
> writes the generated content to disk. If there is an exception in one of 
> the aggregate parts, I need the sitemap processing to stop and handle the 
> exception appropriately. Instead, the generated content, which has 
> errored and is therefore invalid, is still written to disk. If I remove 
> the aggregate and just use a regular generator, the handle-errors is 
> respected correctly and processing stops before writing the content.
> 
> Am I misusing the <map:aggregate> elements? Or is this the expected 
> behaviour?
> 

I have no experience with map:aggregate, but from a quick look at the
code, it doesn't catch any exceptions. What it does always do (even when
an exception occurs) is close the root tag and send the endDocument
event.

Now I'm just going to guess wildly (since you didn't mention the details),
but if after the map:aggregate you have an XSLT transformer and you write
the content using the source writing transformer, I can imagine the file
indeed still gets written. This is because Xalan can cope with the invalid
input, and the endDocument event will cause it to perform the transform
and thus cause the source writing transformer to do its job.
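
Something along these lines is what I mean (the pattern, part URIs and
stylesheet name are made up, and I'm assuming the source writing
transformer is declared under the name "write-source"):

<map:match pattern="write-report">
  <map:aggregate element="report">
    <!-- each part is a complete pipeline of its own; an exception in
         one of them does not stop the surrounding pipeline -->
    <map:part src="cocoon://report/part1"/>
    <map:part src="cocoon://report/part2"/>
  </map:aggregate>
  <!-- Xalan transforms the (possibly truncated) aggregate anyway -->
  <map:transform src="report.xsl"/>
  <!-- writes whatever source:write fragments the stylesheet produced -->
  <map:transform type="write-source"/>
  <map:serialize type="xml"/>
</map:match>

In such a setup the file on disk gets written as a side effect of the SAX
stream reaching the source writing transformer, regardless of what went
wrong earlier in the aggregation.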

While the close-root-tag-and-send-endDocument-event behaviour of the
aggregate is debatable, it is the nature of a SAX pipeline that everything
in the pipeline starts executing together. Therefore, things which have
side effects and for which error recovery is important should not be done
in a streaming pipeline (which is why the source writing transformer is
considered an evil one -- don't use it).

An alternative approach is to use flowscript and its processPipelineTo
function: there you can wrap the call in a try-catch block and remove the
file (if needed) when an error occurs.

-- 
Bruno Dumon                             http://outerthought.org/
Outerthought - Open Source, Java & XML Competence Support Center

