Hi, I have a Spark job which runs fine locally with less data, but when I
schedule it on YARN I keep getting the following error; one by one all the
executors get removed from the UI and my job fails:

15/07/30 10:18:13 ERROR cluster.YarnScheduler: Lost executor 8 on myhost1.com: remote Rpc client disassociated
15/07/30 10:18:13 ERROR cluster.YarnScheduler: Lost executor 6 on myhost2.com: remote Rpc client disassociated
I use the following command to submit the Spark job in yarn-client mode:

./spark-submit --class com.xyz.MySpark \
  --conf "spark.executor.extraJavaOptions=-XX:MaxPermSize=512M" \
  --driver-java-options -XX:MaxPermSize=512m \
  --driver-memory 3g \
  --master yarn-client \
  --executor-memory 2G \
  --executor-cores 8 \
  --num-executors 12 \
  /home/myuser/myspark-1.0.jar

I don't know what the problem is — please guide me. I am new to Spark. Thanks
in advance.



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/How-to-control-Spark-Executors-from-getting-Lost-when-using-YARN-client-mode-tp24084.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
