[ https://issues.apache.org/jira/browse/SPARK-7736?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14601923#comment-14601923 ]

Shay Rojansky commented on SPARK-7736:
--------------------------------------

The problem is simply with the YARN status for the application. If a Spark 
application throws an exception after having instantiated the SparkContext, the 
application terminates, but YARN still lists the job as SUCCEEDED. This makes 
it hard for users to see what happened to their jobs in the YARN UI.

Let me know if this is still unclear.
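To make the failure mode concrete, here is a minimal, Spark-free sketch of what the report describes. It simulates the yarn-cluster setup: the Python driver runs as a child process, and the final YARN status should be derived from that process's exit code. The bug amounts to reporting SUCCEEDED regardless of that exit code. The script below is a hypothetical illustration (no real Spark or YARN APIs are used; the `driver_script` stands in for user code that raises after SparkContext creation):

```python
import subprocess
import sys
import textwrap

# Stand-in for the user's PySpark driver: it "starts up" successfully
# (think: SparkContext is instantiated), then raises, so the Python
# process exits with a nonzero status code.
driver_script = textwrap.dedent("""
    print("context started")  # stands in for SparkContext() succeeding
    raise RuntimeError("user code failed after context creation")
""")

# ApplicationMaster side: launch the Python driver as a subprocess and
# derive the final application status from its exit code. Ignoring this
# exit code and always reporting SUCCEEDED is the behavior described above.
proc = subprocess.run(
    [sys.executable, "-c", driver_script],
    capture_output=True,
    text=True,
)

final_status = "SUCCEEDED" if proc.returncode == 0 else "FAILED"
print(final_status)  # prints "FAILED": the exception gave a nonzero exit
```

The point of the sketch: the information needed to mark the job FAILED (the driver's nonzero exit code) is already available; the status reported to YARN just has to be tied to it.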

> Exception not failing Python applications (in yarn cluster mode)
> ----------------------------------------------------------------
>
>                 Key: SPARK-7736
>                 URL: https://issues.apache.org/jira/browse/SPARK-7736
>             Project: Spark
>          Issue Type: Bug
>          Components: YARN
>         Environment: Spark 1.3.1, Yarn 2.7.0, Ubuntu 14.04
>            Reporter: Shay Rojansky
>
> It seems that exceptions thrown in Python Spark apps after the SparkContext 
> is instantiated don't cause the application to fail, at least on YARN: the 
> application is marked as SUCCEEDED.
> Note that an exception raised before the SparkContext is instantiated 
> correctly places the application in the FAILED state.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
