Hello All,

I have several Spark Streaming applications running in standalone mode on
Spark 1.5, with dynamic resource allocation enabled.  I typically have about
12 Spark Streaming jobs running concurrently.  Occasionally more than half of
them fail with "Stage cancelled because SparkContext was shut down."  Each job
restarts automatically since it runs in supervised mode.  Attached is a
screenshot of one of the failed jobs.  Does anyone have any insight into what
is going on?
<http://apache-spark-user-list.1001560.n3.nabble.com/file/n24885/Screen_Shot_2015-09-29_at_8.png>
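For reference, each job builds its StreamingContext roughly like the sketch
below (the app name, input source, and batch interval are placeholders, not
our exact values), and is submitted with --supervise:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Dynamic allocation requires the external shuffle service on the workers.
val conf = new SparkConf()
  .setAppName("streaming-job-placeholder")           // placeholder name
  .set("spark.dynamicAllocation.enabled", "true")
  .set("spark.shuffle.service.enabled", "true")

val ssc = new StreamingContext(conf, Seconds(10))    // batch interval is illustrative

val lines = ssc.socketTextStream("localhost", 9999)  // placeholder source
lines.print()                                        // real jobs do more work here

ssc.start()
ssc.awaitTermination()

// Submitted in cluster mode with --supervise so the driver restarts on failure:
//   spark-submit --master spark://<master-host>:7077 \
//     --deploy-mode cluster --supervise ...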
 



