Do you have YARN_CONF_DIR set in your environment to point Spark to where your
YARN configs are?
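On a Cloudera install the Hadoop client configs usually live under /etc/hadoop/conf — that path is an assumption, so substitute wherever your cluster's yarn-site.xml actually is. Something like this before running spark-submit:

```shell
# Point Spark at the directory containing yarn-site.xml and core-site.xml.
# /etc/hadoop/conf is the typical CDH location -- adjust for your cluster.
export YARN_CONF_DIR=/etc/hadoop/conf
export HADOOP_CONF_DIR=/etc/hadoop/conf
```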

Greg

From: Raghuveer Chanda <raghuveer.cha...@gmail.com>
Date: Wednesday, September 24, 2014 12:25 PM
To: "u...@spark.incubator.apache.org" <u...@spark.incubator.apache.org>
Subject: Spark with YARN

Hi,

I'm new to Spark and am facing a problem running a job on a cluster using YARN.

Initially I ran jobs with the Spark master set to --master spark://dml2:7077, and
they ran fine on 3 workers.

Now I am shifting to YARN, so I installed YARN via Cloudera on a 3-node cluster
and changed the master to yarn-cluster, but it is not working. I attached
screenshots of the UI, which is not progressing and just hanging.

Output on the terminal:

The report below keeps repeating. The command I submitted was:

./spark-submit --class "class-name" --master yarn-cluster --num-executors 3 
--executor-cores 3  jar-with-dependencies.jar


Do I need to configure YARN further, or why is it not getting all the workers?
Please help.


14/09/24 22:44:21 INFO yarn.Client: Application report from ASM:
application identifier: application_1411578463780_0001
appId: 1
clientToAMToken: null
appDiagnostics:
appMasterHost: dml3
appQueue: root.chanda
appMasterRpcPort: 0
appStartTime: 1411578513545
yarnAppState: RUNNING
distributedFinalState: UNDEFINED
appTrackingUrl: http://dml2:8088/proxy/application_1411578463780_0001/
appUser: chanda
14/09/24 22:44:22 INFO yarn.Client: Application report from ASM:
application identifier: application_1411578463780_0001
appId: 1
clientToAMToken: null
appDiagnostics:
appMasterHost: dml3
appQueue: root.chanda
appMasterRpcPort: 0
appStartTime: 1411578513545
yarnAppState: RUNNING
distributedFinalState: UNDEFINED
appTrackingUrl: http://dml2:8088/proxy/application_1411578463780_0001/
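In case YARN simply cannot grant the containers I asked for, this is a variant I could try with an explicit executor memory cap (the 1g value is a guess on my part, not something taken from my config):

```shell
# Same submission as above, but with --executor-memory set explicitly so the
# requested containers fit inside YARN's per-node memory limit (1g is a
# placeholder -- tune to the cluster's yarn.nodemanager.resource.memory-mb).
./spark-submit \
  --class "class-name" \
  --master yarn-cluster \
  --num-executors 3 \
  --executor-cores 3 \
  --executor-memory 1g \
  jar-with-dependencies.jar
```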




--
Regards,
Raghuveer Chanda
4th year Undergraduate Student
Computer Science and Engineering
IIT Kharagpur
