When I launch with yarn-client it also gives me the error below:

bin/spark-sql --master yarn-client

15/08/25 13:53:20 ERROR YarnClientSchedulerBackend: Yarn application has already exited with state FINISHED!
Exception in thread "Yarn application state monitor" org.apache.spark.SparkException: Error asking standalone scheduler to shut down executors
	at org.apache.spark.scheduler.cluster.CoarseGrainedSchedulerBackend.stopExecutors(CoarseGrainedSchedulerBackend.scala:261)
	at org.apache.spark.scheduler.cluster.CoarseGrainedSchedulerBackend.stop(CoarseGrainedSchedulerBackend.scala:266)
	at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.stop(YarnClientSchedulerBackend.scala:158)
	at org.apache.spark.scheduler.TaskSchedulerImpl.stop(TaskSchedulerImpl.scala:416)
	at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1411)
	at org.apache.spark.SparkContext.stop(SparkContext.scala:1644)
	at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend$$anon$1.run(YarnClientSchedulerBackend.scala:139)
Caused by: java.lang.InterruptedException
	at java.util.concurrent.locks.AbstractQueuedSynchronizer.tryAcquireSharedNanos(AbstractQueuedSynchronizer.java:1326)
	at scala.concurrent.impl.Promise$DefaultPromise.tryAwait(Promise.scala:208)
	at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:218)
	at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
	at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
	at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
	at scala.concurrent.Await$.result(package.scala:107)
	at org.apache.spark.rpc.RpcEndpointRef.askWithRetry(RpcEndpointRef.scala:102)
	at org.apache.spark.rpc.RpcEndpointRef.askWithRetry(RpcEndpointRef.scala:78)
	at org.apache.spark.scheduler.cluster.CoarseGrainedSchedulerBackend.stopExecutors(CoarseGrainedSchedulerBackend.scala:257)
On 25 August 2015 at 14:26, Yanbo Liang <yblia...@gmail.com> wrote:

> spark-shell and spark-sql cannot be deployed in "yarn-cluster" mode,
> because the spark-shell or spark-sql script needs to run on your local
> machine rather than in a container on the YARN cluster.
>
> 2015-08-25 16:19 GMT+08:00 Jeetendra Gangele <gangele...@gmail.com>:
>
>> Hi all, I am trying to launch the Spark SQL shell with --master yarn-cluster
>> and it gives the error below.
>> Why is this not supported?
>>
>> bin/spark-sql --master yarn-cluster
>> Error: Cluster deploy mode is not applicable to Spark SQL shell.
>> Run with --help for usage help or --verbose for debug output
>>
>> Regards
>> Jeetendra
>>
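
To illustrate the distinction Yanbo describes: the interactive shells always keep the driver on the machine you launch from, so only client deploy mode applies to them, while a packaged application submitted with spark-submit can have its driver run inside a YARN container. A minimal sketch of the two invocations follows; the class name and jar path are placeholders, not something from this thread:

    # Interactive shell: the driver runs on the local machine, executors on YARN.
    bin/spark-sql --master yarn-client

    # Packaged application: the driver itself can run in a YARN container,
    # so cluster deploy mode is allowed here.
    bin/spark-submit --master yarn-cluster --class com.example.MyApp /path/to/my-app.jar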