[ 
https://issues.apache.org/jira/browse/SPARK-26954?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

deshanxiao updated SPARK-26954:
-------------------------------
    Description: 
YARN re-attempts a failed application depending on YarnRMClient#unregister. However,
some of these attempts are useless, as in the following example:

{code:java}
sc.parallelize(Seq(1,2,3)).map(_ => throw new 
RuntimeException("exception")).collect()
{code}

Re-attempting also looks unreasonable when, for example, a FileNotFoundException is
thrown by user code.

Re-attempting is reasonable for environment errors, such as a dead node. So it would
be better not to re-attempt when the failure comes from a user exception.
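
A minimal sketch of that idea (hypothetical code, not Spark's actual ApplicationMaster
logic: RMClient and runUserClass are made-up stand-ins, FinalApplicationStatus is the
standard YARN enum, and the unregister signature only mirrors the
YarnRMClient#unregister call mentioned above): when a user exception escapes the user
class, the AM unregisters with FAILED so YARN finishes the application instead of
scheduling another attempt, while environment failures that kill the AM never reach
unregister and are still re-attempted.

{code:scala}
// Illustrative sketch only: RMClient and runUserClass are hypothetical
// stand-ins, not Spark's real ApplicationMaster code.
import org.apache.hadoop.yarn.api.records.FinalApplicationStatus

object AttemptSketch {

  // Minimal stand-in for the unregister call on
  // org.apache.spark.deploy.yarn.YarnRMClient.
  trait RMClient {
    def unregister(status: FinalApplicationStatus, diagnostics: String): Unit
  }

  def runUserClass(rmClient: RMClient, userMain: () => Unit): Unit = {
    try {
      userMain()
      rmClient.unregister(FinalApplicationStatus.SUCCEEDED, "")
    } catch {
      case e: Exception =>
        // The user code failed deterministically: unregister with FAILED so
        // the RM finishes the application instead of starting a new attempt.
        rmClient.unregister(FinalApplicationStatus.FAILED,
          s"User class threw exception: ${e.getMessage}")
        throw e
    }
    // If the AM process itself dies (dead node, OOM kill, ...), unregister is
    // never reached and YARN re-attempts the application, which is the
    // behaviour to keep for environment errors.
  }
}
{code}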


  was:
YARN re-attempts a failed application depending on YarnRMClient#unregister. However, some
attempts are useless:

{code:java}
sc.parallelize(Seq(1,2,3)).map(_ => throw new 
RuntimeException("exception")).collect()
{code}

Re-attempting is reasonable for environment errors, such as a dead node. So, it will
be better to at



> Do not re-attempt when user code throws an exception
> ----------------------------------------------------
>
>                 Key: SPARK-26954
>                 URL: https://issues.apache.org/jira/browse/SPARK-26954
>             Project: Spark
>          Issue Type: Improvement
>          Components: YARN
>    Affects Versions: 2.3.3, 2.4.0
>            Reporter: deshanxiao
>            Priority: Critical
>
> YARN re-attempts a failed application depending on YarnRMClient#unregister. However,
> some of these attempts are useless, as in the following example:
> {code:java}
> sc.parallelize(Seq(1,2,3)).map(_ => throw new 
> RuntimeException("exception")).collect()
> {code}
> Re-attempting also looks unreasonable when, for example, a FileNotFoundException is
> thrown by user code.
> Re-attempting is reasonable for environment errors, such as a dead node. So it would
> be better not to re-attempt when the failure comes from a user exception.


