[ https://issues.apache.org/jira/browse/SPARK-8642?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Marcelo Vanzin resolved SPARK-8642.
-----------------------------------
    Resolution: Won't Fix

Even though a better error here would be nice, I'll close this because there's 
no good way to detect the problem with the current YARN API. As silly as it may 
be, "0.0.0.0:8032" is a valid address: it would work if the RM were running on 
the same host as the Spark app, so it sort of makes sense as a default. (I'd 
prefer a null default and an error when the address isn't set, but that ship 
sailed long ago.)
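For context, that default comes from the yarn.resourcemanager.address property; 
a working deployment would normally override it in yarn-site.xml along these 
lines (the hostname below is illustrative, not a real deployment):

    <!-- yarn-site.xml: point clients at the real ResourceManager.
         rm-host.example.com is a placeholder. -->
    <property>
      <name>yarn.resourcemanager.address</name>
      <value>rm-host.example.com:8032</value>
    </property>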

It also doesn't look like you can configure the retry policy used by the YARN 
client...
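
For reference, the retry loop itself is driven by Hadoop-side settings 
(yarn.resourcemanager.connect.max-wait.ms, 
yarn.resourcemanager.connect.retry-interval.ms, and 
ipc.client.connect.max.retries, assuming Hadoop 2.x defaults). Something like 
the following would shrink the retry window, but the catch is that these live 
in the same yarn-site.xml that's missing in this scenario (values are 
illustrative):

    <!-- yarn-site.xml: shorten the RM connection retry window.
         Defaults are 900000 ms (max wait) and 30000 ms (interval). -->
    <property>
      <name>yarn.resourcemanager.connect.max-wait.ms</name>
      <value>60000</value>
    </property>
    <property>
      <name>yarn.resourcemanager.connect.retry-interval.ms</name>
      <value>10000</value>
    </property>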

So yeah, it sucks, but there's not much that Spark can do here without making 
assumptions about how configuration should be deployed.
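
On the user side, the practical fix is simply to make the client configuration 
visible before submitting, along these lines (paths are illustrative):

    # Point the Spark client at the directory containing yarn-site.xml,
    # then submit as usual.
    export HADOOP_CONF_DIR=/etc/hadoop/conf
    ./bin/spark-submit --master yarn-client \
        --class org.apache.spark.examples.SparkPi \
        lib/spark-examples*.jar 10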

> Ungraceful failure when yarn client is not configured.
> ------------------------------------------------------
>
>                 Key: SPARK-8642
>                 URL: https://issues.apache.org/jira/browse/SPARK-8642
>             Project: Spark
>          Issue Type: Bug
>          Components: YARN
>    Affects Versions: 1.3.0, 1.3.1
>            Reporter: Juliet Hougland
>            Priority: Minor
>         Attachments: yarnretries.log
>
>
> When HADOOP_CONF_DIR is not configured (i.e. yarn-site.xml is not available), 
> the YARN client will still try to submit an application, but no connection to 
> the resource manager can be established. The client tries to connect 10 times 
> (the configured max retries) and then repeats that cycle 30 more times. It 
> takes about 5 minutes before an error is recorded for Spark context 
> initialization, caused by a connect exception. I would expect Spark context 
> initialization to fail as soon as the first 10 tries fail; at least that is 
> what the logs suggest. An earlier failure would be preferable.
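
For illustration only, the kind of early check the report asks for would look 
roughly like the sketch below (hypothetical, not actual Spark code). Its flaw, 
per the resolution above, is that an explicit 0.0.0.0:8032 is valid when the RM 
runs on the same host, so the check can misfire:

    // Hypothetical fail-fast sketch: refuse to submit when the RM address
    // is still the built-in default, i.e. nothing was ever configured.
    // YARN can't distinguish "unset" from an explicit 0.0.0.0:8032,
    // which is why this issue was closed as Won't Fix.
    import org.apache.hadoop.yarn.conf.YarnConfiguration

    def checkRmConfigured(conf: YarnConfiguration): Unit = {
      val rmAddress = conf.get(YarnConfiguration.RM_ADDRESS,
        YarnConfiguration.DEFAULT_RM_ADDRESS)  // "0.0.0.0:8032"
      if (rmAddress == YarnConfiguration.DEFAULT_RM_ADDRESS) {
        throw new IllegalStateException(
          s"yarn.resourcemanager.address is unset (got $rmAddress); " +
            "is HADOOP_CONF_DIR pointing at a valid client config?")
      }
    }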


