Hi all,

We're running Spark 1.0 on CDH 5.1.2, with Spark in YARN-client mode.

We're seeing that one of our nodes is never assigned any tasks, and no
resources (RAM, CPU) are being used on it.  In the CM UI this worker
node is in good health and the Spark Worker process is running, along with
the yarn-NODEMANAGER and hdfs-DATANODE roles.
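
In case anyone wants to reproduce the check from the driver side: you can
list which hosts currently have executors registered.  A minimal sketch,
assuming a spark-shell attached to the same cluster where sc is the live
SparkContext (getExecutorMemoryStatus is in the Spark 1.0 API):

    // List the hosts that currently have executors registered with the
    // driver.  Keys are "host:port" of each executor's block manager;
    // the driver itself also shows up in this map.
    sc.getExecutorMemoryStatus.keys.foreach(println)

The quiet node never shows up in output like this, which is what makes us
think no executor was ever started there.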

We've tried restarting the Spark Worker process while the application is
running, but there are still no tasks assigned to that node.

Any hints or thoughts on this?  We can wait until the current job finishes
and restart Spark, YARN, etc., but I wonder whether there is a way to make
the currently running job recognize the worker and begin assigning tasks to
it.
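
Since executors in yarn-client mode run in YARN containers under the
NodeManager rather than under the standalone Worker daemon, one check worth
running is to ask the ResourceManager directly how many containers it has
placed on each node.  A rough sketch using the YARN client API (Hadoop 2.3
on CDH 5.1.2; assumes the cluster's yarn-site.xml is on the classpath):

    import org.apache.hadoop.yarn.api.records.NodeState
    import org.apache.hadoop.yarn.client.api.YarnClient
    import org.apache.hadoop.yarn.conf.YarnConfiguration
    import scala.collection.JavaConverters._

    val yarn = YarnClient.createYarnClient()
    yarn.init(new YarnConfiguration())  // reads yarn-site.xml from the classpath
    yarn.start()
    // Print each RUNNING node with its current container count; a node the
    // RM reports as RUNNING but holding 0 containers matches our symptom.
    yarn.getNodeReports(NodeState.RUNNING).asScala.foreach { n =>
      println(s"${n.getNodeId}  state=${n.getNodeState}  containers=${n.getNumContainers}")
    }
    yarn.stop()

If the node shows up as RUNNING with zero containers, the question becomes
why the RM never granted this application a container there.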

Thanks!
- Jon


