Hey Jean,
Thanks for the quick response. I am using Spark 1.4.1 pre-built with Hadoop
2.6.
Yes, the YARN cluster has multiple running worker nodes.
It would be a great help if you could tell me how to look for the executor logs.
Regards,
Sushrut Ikhar
about.me/sushrutikhar
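A minimal sketch of one common way to reach executor logs, assuming YARN log aggregation is enabled (`yarn.log-aggregation-enable=true`) and the application has finished: the `yarn logs` CLI fetches the aggregated container logs, which include executor stdout/stderr. The application id below is a placeholder; use the one printed by spark-submit or shown in the YARN ResourceManager UI.

```shell
# Placeholder application id; substitute the real one from spark-submit output
# or the YARN ResourceManager web UI.
APP_ID=application_1444280000000_0001

# Fetch the aggregated container logs (executor stdout/stderr) if the
# yarn CLI is available on this machine.
if command -v yarn >/dev/null 2>&1; then
  yarn logs -applicationId "$APP_ID"
fi
```

For a still-running application, the same logs are usually reachable per-container from the NodeManager web UI instead.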
Hi,
I am new to Spark and I have been trying to run Spark in yarn-client mode.
I get this error in the YARN logs:
Error: Could not find or load main class
org.apache.spark.executor.CoarseGrainedExecutorBackend
Also, I keep getting these warnings:
WARN YarnScheduler: Initial job has not accepted
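That warning generally means no executors have registered with the driver yet, often because YARN cannot satisfy the requested resources. A hedged sketch of a deliberately small, explicit resource request for yarn-client mode (the jar name and all values are placeholders, not the poster's actual command):

```shell
# Sketch only: small explicit resource requests so YARN can schedule
# executors even on a modest cluster. Adjust values and jar to your setup.
spark-submit \
  --master yarn-client \
  --num-executors 2 \
  --executor-memory 1g \
  --executor-cores 1 \
  your_app.jar
```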
Hi Sushrut,
Which packaging of Spark do you use? The spark-hadoop-x build?
Do you have a working YARN cluster (with at least one worker)?
Regards
JB
On 10/08/2015 07:23 AM, Sushrut Ikhar wrote: