Hi

I am trying to launch a Spark application on a CM (Cloudera Manager) cluster, and I get the following error:

Exception in thread "main" org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:113)
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:59)
    at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:141)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:379)

What is the remedy for this type of problem?
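I assume the next step is to look at the YARN side, since the client-side stack trace only says the application master failed to come up. Something like the following should fetch the AM logs (the application id below is a placeholder, not one from my cluster):

```shell
# List recently failed/killed YARN applications to find the id of this run
yarn application -list -appStates FAILED,KILLED

# Dump the aggregated logs for that application, including the application
# master's stderr, which normally contains the underlying launch error
yarn logs -applicationId application_1234567890123_0001
```

Is that the right place to look, or is there something on the Cloudera Manager side I should check first?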

-- 

*Clint McNeil*

BI & Data Science Engineer | Impact Radius

202 Suntyger, 313 Durban Road, Bellville, 7530

o: +2721 914-1764 | m: +2782 4796 309 | cl...@impactradius.com

*Learn more  – Watch our 2 minute overview
<http://www.impactradius.com/?src=slsap>*

www.impactradius.com | Twitter <http://twitter.com/impactradius> | Facebook
<https://www.facebook.com/pages/Impact-Radius/153376411365183> | LinkedIn
<http://www.linkedin.com/company/impact-radius-inc.> | YouTube
<https://www.youtube.com/user/ImpactRadius>

Maximizing Return on Ad Spend
