I bet you are running on YARN in cluster mode.

If you are running on YARN in client mode,
.set("spark.yarn.maxAppAttempts", "1") works as you expect,
because YARN doesn't start your app on the cluster until you create the
SparkContext.
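
For client mode, something along these lines (just a sketch; the app name is a
placeholder) does what you want, as long as the property is set before the
SparkContext is created:

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setAppName("MyApp")                    // placeholder app name
      .set("spark.yarn.maxAppAttempts", "1")  // must be set before the context exists
    val sc = new SparkContext(conf)           // the YARN app is submitted here, with maxAppAttempts=1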

But if you are running on YARN in cluster mode, the driver program runs on a
cluster node.
So your app is already running on the cluster when you call .set().
To make it work in cluster mode, the property must be set on the spark-submit
command line via
"--conf spark.yarn.maxAppAttempts=1"
or --driver-java-options "-Dspark.yarn.maxAppAttempts=1"
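
For example, a full submit command would look roughly like this (the jar name
and main class are placeholders):

    spark-submit \
      --master yarn-cluster \
      --conf spark.yarn.maxAppAttempts=1 \
      --class com.example.MyApp \
      my-app.jar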


A note should be added to running-on-yarn.html in the "Important notes" section
that says that in cluster mode you need to set spark.yarn.* properties from the
spark-submit command line.

Cheers,

Doug




> On May 7, 2015, at 2:34 AM, ÐΞ€ρ@Ҝ (๏̯͡๏) <deepuj...@gmail.com> wrote:
> 
> How can I stop Spark from triggering a second attempt in case the first one
> fails?
> I do not want to wait for the second attempt to fail again, so that I can
> debug faster.
> 
> .set("spark.yarn.maxAppAttempts", "0") OR .set("spark.yarn.maxAppAttempts", 
> "1")
> 
> is not helping.
> 
> 
> -- 
> Deepak
> 

