Hi guys, thanks for your valuable inputs. I have tried a few alternatives as suggested, but they all lead me to the same result: unable to start the Spark context.
@Dhiraj Peechara: I am able to start my SparkContext in stand-alone mode by simply issuing *$spark-shell* from the terminal, which makes me believe that HADOOP_CONF_DIR is set correctly. Just to confirm, I have double-checked it, and the variable correctly points to the installed path. I am attaching the contents of my spark-env.sh file; let me know if you think something needs to be modified to get it all right. spark-env.txt <http://apache-spark-user-list.1001560.n3.nabble.com/file/n26709/spark-env.txt>

@jasmine: I did try including the <ActiveNamenode Hostname:port> in the spark-assembly.jar path. It did not solve the problem, but it does give a different error now. I have also tried setting the SPARK_JAR variable in spark-env.sh, but with no success. I also tried using the command below:

*spark-shell --master yarn-client --conf spark.yarn.jar=hdfs://ptfhadoop01v:8020/user/spark/share/lib/spark-assembly.jar*

Issuing this command gives me the following error message: Spark-Error.txt <http://apache-spark-user-list.1001560.n3.nabble.com/file/n26709/Spark-Error.txt>

I have not set up anything in my *spark-defaults.conf* file; I am not sure whether that is mandatory to make it all work. I can confirm that my YARN daemons (ResourceManager and NodeManager) are running in the cluster. I am also attaching a copy of my *yarn-site.xml* just to make sure it is all correct and not missing any required property. yarn-site.txt <http://apache-spark-user-list.1001560.n3.nabble.com/file/n26709/yarn-site.txt>

I hope I can get past this soon. Thanks again, guys, for your quick thoughts on this issue.

Regards,
Ashesh

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Running-Spark-on-Yarn-Client-Cluster-mode-tp26691p26709.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
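P.S. For completeness: if it turns out that spark-defaults.conf is needed after all, I assume the entries would look something like the following, so that the --master and --conf flags would not have to be passed on every invocation. This is only my guess at a minimal sketch; the host, port, and HDFS path are simply copied from my command above and may differ on other setups:

    spark.master       yarn-client
    spark.yarn.jar     hdfs://ptfhadoop01v:8020/user/spark/share/lib/spark-assembly.jar

My understanding is that the assembly jar must already exist at that HDFS location for spark.yarn.jar to take effect, e.g. uploaded once with *hdfs dfs -put*; please correct me if I have the mechanism wrong.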