[ https://issues.apache.org/jira/browse/SPARK-15479?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15326589#comment-15326589 ]

Elizabeth Keddy commented on SPARK-15479:
-----------------------------------------

Thanks [~srowen].  I have that code in place already - a shutdown hook that 
calls stop() with the stopGracefully flag set to true.
The problem is that I don't know which external command will invoke it when the 
application runs in a YARN cluster; "yarn application -kill" doesn't appear to 
do it.
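
For reference, the hook I have is roughly along these lines (a minimal sketch only; 
the app name, batch interval, and placeholder stream setup are illustrative, not my 
actual job):

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object GracefulStopExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("graceful-stop-example")
    val ssc = new StreamingContext(conf, Seconds(1))

    // ... define DStreams and output operations here ...

    // Shutdown hook: ask Spark Streaming to let in-flight batches finish
    // before tearing down the streaming and Spark contexts.
    sys.addShutdownHook {
      ssc.stop(stopSparkContext = true, stopGracefully = true)
    }

    ssc.start()
    ssc.awaitTermination()
  }
}

If I understand correctly, setting spark.streaming.stopGracefullyOnShutdown=true is 
supposed to register an equivalent hook automatically, but either way the open 
question is which external command actually triggers those hooks on the driver in 
yarn-cluster mode.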

> Spark job doesn't shut down gracefully in YARN mode.
> ----------------------------------------------------
>
>                 Key: SPARK-15479
>                 URL: https://issues.apache.org/jira/browse/SPARK-15479
>             Project: Spark
>          Issue Type: Bug
>          Components: Streaming
>    Affects Versions: 1.5.1
>            Reporter: Rakesh
>         Attachments: driver.rtf, executor.rtf
>
>
> The issue I am having is similar to the one mentioned here:
> http://stackoverflow.com/questions/36911442/how-to-stop-gracefully-a-spark-streaming-application-on-yarn
> I am creating an RDD from the sequence 1 to 300 and creating a streaming 
> DStream out of it:
> val rdd = ssc.sparkContext.parallelize(1 to 300)
> val dstream = new ConstantInputDStream(ssc, rdd)
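> // Each element is logged and then the task sleeps 50 ms, so a batch takes a while to finish.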
> dstream.foreachRDD{ rdd =>
>   rdd.foreach{ x =>
>     log(x)
>     Thread.sleep(50)
>   }
> }
> When I kill this job, I expect elements 1 to 300 to be logged before shutting 
> down. That is indeed the case when I run it locally: it waits for the batch to 
> finish before shutting down.
> But when I launch the job on the cluster in "yarn-cluster" mode, it shuts down 
> abruptly.
> The executor prints the following log:
> ERROR executor.CoarseGrainedExecutorBackend: 
> Driver xx.xx.xx.xxx:yyyyy disassociated! Shutting down.
> and then it shuts down. It is not a graceful shutdown.


