[ 
https://issues.apache.org/jira/browse/HIVE-251?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ashish Thusoo updated HIVE-251:
-------------------------------

    Attachment: patch-251.txt

Fixed the issue.

The problem was that Operator.close was catching and ignoring HiveExceptions.

Also, FileSinkOperator throws some HiveExceptions that do need to be ignored, 
e.g. while copying the transient _tmp files; these originate from copyUtils in 
Hadoop.
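
A minimal Java sketch of the anti-pattern described above (hypothetical code, 
not the actual Operator implementation): a close() that catches and drops the 
exception reports success even when a child stage failed, whereas letting the 
exception propagate surfaces the failure to the caller.

```java
// Hypothetical illustration of the bug pattern, not actual Hive code.
public class SwallowDemo {
    static boolean childFailed = false;

    // Buggy pattern: close() swallows the exception, so the job looks successful.
    static void closeSwallowing() {
        try {
            if (childFailed) {
                throw new RuntimeException("transform script exited non-zero");
            }
        } catch (RuntimeException e) {
            // Error is silently dropped; the caller never learns of the failure.
        }
    }

    // Fixed pattern: the exception propagates, so the task is marked failed.
    static void closePropagating() {
        if (childFailed) {
            throw new RuntimeException("transform script exited non-zero");
        }
    }

    public static void main(String[] args) {
        childFailed = true;
        closeSwallowing(); // completes without error despite the failure
        boolean propagated = false;
        try {
            closePropagating();
        } catch (RuntimeException e) {
            propagated = true; // failure now reaches the caller
        }
        System.out.println("propagated=" + propagated);
    }
}
```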

Further, the input20.q test was actually failing, but no failure was reported 
because this bug masked it. We cannot support Unix pipes yet, so as a 
workaround I have moved them into a Unix script.


> Failures in Transform don't stop the job
> ----------------------------------------
>
>                 Key: HIVE-251
>                 URL: https://issues.apache.org/jira/browse/HIVE-251
>             Project: Hadoop Hive
>          Issue Type: Bug
>          Components: Serializers/Deserializers
>            Reporter: S. Alex Smith
>            Assignee: Ashish Thusoo
>            Priority: Blocker
>         Attachments: patch-251.txt
>
>
> If the program executed via a SELECT TRANSFORM() USING 'foo' exits with a 
> non-zero exit status, Hive proceeds as if nothing bad happened.  The main way 
> the user learns that something went wrong is by checking the logs (probably 
> because he got no output).  This is doubly bad if the program fails only part 
> of the time (say, on certain inputs), since the job will still produce output 
> and the problem will likely go undetected.

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.