It seems your Spark-on-YARN application is not able to get its
application master:

org.apache.spark.SparkException: Yarn application has already ended!
It might have been killed or unable to launch application master.
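One thing that stands out in your log is the warning that neither
spark.yarn.jars nor spark.yarn.archive is set, so the whole SPARK_HOME gets
uploaded on every submit. That is usually only a warning, but it is worth
ruling out. A minimal sketch of setting spark.yarn.jars (the HDFS path here
is just an example, adjust it to your cluster):

    # upload the Spark jars to HDFS once (example path, not prescriptive)
    hdfs dfs -mkdir -p /spark/jars
    hdfs dfs -put $SPARK_HOME/jars/* /spark/jars/

    # then point spark-shell at them
    spark-shell --master yarn --deploy-mode client \
      --conf spark.yarn.jars=hdfs:///spark/jars/*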


Check the YARN logs once.
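For example (replace <application_id> with the ID YARN prints when
spark-shell starts):

    # list recent applications to find the ID
    yarn application -list -appStates ALL

    # dump the logs, including the AM container log
    yarn logs -applicationId <application_id>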

Thanks,
Sathish-


On Fri, Jun 8, 2018 at 2:22 PM, Jeff Zhang <zjf...@gmail.com> wrote:

>
> Check the yarn AM log for details.
>
>
>
> Aakash Basu <aakash.spark....@gmail.com> wrote on Friday, June 8, 2018 at 4:36 PM:
>
>> Hi,
>>
>> Getting this error when trying to run Spark Shell using YARN -
>>
>> Command: *spark-shell --master yarn --deploy-mode client*
>>
>> 2018-06-08 13:39:09 WARN  Client:66 - Neither spark.yarn.jars nor 
>> spark.yarn.archive is set, falling back to uploading libraries under 
>> SPARK_HOME.
>> 2018-06-08 13:39:25 ERROR SparkContext:91 - Error initializing SparkContext.
>> org.apache.spark.SparkException: Yarn application has already ended! It 
>> might have been killed or unable to launch application master.
>>
>>
>> The last half of stack-trace -
>>
>> 2018-06-08 13:56:11 WARN  YarnSchedulerBackend$YarnSchedulerEndpoint:66 - 
>> Attempted to request executors before the AM has registered!
>> 2018-06-08 13:56:11 WARN  MetricsSystem:66 - Stopping a MetricsSystem that 
>> is not running
>> org.apache.spark.SparkException: Yarn application has already ended! It 
>> might have been killed or unable to launch application master.
>>   at 
>> org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:89)
>>   at 
>> org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:63)
>>   at 
>> org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:164)
>>   at org.apache.spark.SparkContext.<init>(SparkContext.scala:500)
>>   at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2486)
>>   at 
>> org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:930)
>>   at 
>> org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:921)
>>   at scala.Option.getOrElse(Option.scala:121)
>>   at 
>> org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:921)
>>   at org.apache.spark.repl.Main$.createSparkSession(Main.scala:103)
>>   ... 55 elided
>> <console>:14: error: not found: value spark
>>        import spark.implicits._
>>               ^
>> <console>:14: error: not found: value spark
>>        import spark.sql
>>
>>
>> Tried putting *spark-yarn_2.11-2.3.0.jar* in Hadoop YARN, but it is still not
>> working. Anything else to do?
>>
>> Thanks,
>> Aakash.
>>
>
