Hi, I have a Spark application where I keep a queue of 12 Spark jobs running in
parallel. One job is now almost complete: only a single task is still pending,
and because of that last task the whole job keeps waiting, as I can see in the
UI. Please see the attached snaps. How can I stop the job from waiting on that
last task? It never moves into the SUCCEEDED state, stays RUNNING forever, and
blocks the other jobs from running. Please guide.
<http://apache-spark-user-list.1001560.n3.nabble.com/file/n25555/IMG_20151203_193927269.jpg>
 
<http://apache-spark-user-list.1001560.n3.nabble.com/file/n25555/IMG_20151203_193659953.jpg>
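For context, my submission logic looks roughly like the sketch below. This is a
simplified, illustrative version, not my exact code: the thread-pool size
matches my 12 jobs, but the job bodies here are just placeholders.

import java.util.concurrent.Executors
import scala.concurrent.{Await, ExecutionContext, Future}
import scala.concurrent.duration.Duration
import org.apache.spark.{SparkConf, SparkContext}

object ParallelJobs {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("parallel-jobs")
    val sc = new SparkContext(conf)

    // One thread per concurrent job; each action submitted from its own
    // thread shows up as a separate job in the Spark scheduler.
    implicit val ec: ExecutionContext =
      ExecutionContext.fromExecutor(Executors.newFixedThreadPool(12))

    val jobs = (1 to 12).map { i =>
      Future {
        // Placeholder work only -- the real jobs read and transform data.
        sc.parallelize(1 to 1000000).map(_ * i).sum()
      }
    }

    // Block until every job finishes; a single straggler task keeps its
    // job (and therefore this wait) from ever completing.
    jobs.foreach(f => Await.result(f, Duration.Inf))
    sc.stop()
  }
}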
 



