[ https://issues.apache.org/jira/browse/SPARK-33699?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17253305#comment-17253305 ]

Xiaoming W commented on SPARK-33699:
------------------------------------

I found that the 'kill' link on the Web UI eventually invokes the 
StandaloneSchedulerBackend.dead() method, which in turn calls 
sc.stopInNewThread(). So it only stops the SparkContext in a new daemon 
thread; it does not interrupt code in the driver class that runs outside of 
the SparkContext.
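
A minimal sketch of why this leaves the driver alive (plain JVM threading, not the real Spark internals; FakeContext is a hypothetical stand-in): stopping something from a daemon thread only flips a flag, it does not interrupt the main thread, so the driver keeps running until its own code finishes.

{code:java}
object DaemonStopSketch {
  // Hypothetical stand-in for SparkContext: stop() only flips a flag and
  // does not interrupt any other thread.
  class FakeContext {
    @volatile var stopped = false
    def stop(): Unit = { stopped = true; println("context stopped") }
  }

  def main(args: Array[String]): Unit = {
    val ctx = new FakeContext

    // Roughly what a "stop in a new thread" amounts to: the stop happens
    // on a separate daemon thread.
    val stopper = new Thread(new Runnable {
      override def run(): Unit = ctx.stop()
    })
    stopper.setDaemon(true)
    stopper.start()

    // Driver-side work that never checks the context: it keeps running,
    // and keeps the JVM alive, even after the context has been stopped.
    for (i <- 1 to 5) {
      Thread.sleep(1000)
      println(s"main still running, context stopped = ${ctx.stopped}")
    }
  }
}
{code}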

> Spark web UI kills the application but the process still exists in the background
> ---------------------------------------------------------------------------------
>
>                 Key: SPARK-33699
>                 URL: https://issues.apache.org/jira/browse/SPARK-33699
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core, Web UI
>    Affects Versions: 2.4.3
>         Environment: spark-2.4.3
> CentOS 7.0
> JDK 1.8.0_201
>            Reporter: Xiaoming W
>            Priority: Major
>
> When I kill an application on the Web UI (submitted in standalone-client 
> mode), it appears to be killed; but when I use the 'jps' command I can still 
> see the application process running in the background. This is my demo code 
> to reproduce the problem.
> {code:java}
> import org.apache.spark.sql.SparkSession
>
> val sparkSession = SparkSession.builder().getOrCreate()
> val rdd = sparkSession.sparkContext.parallelize(List(1, 2, 3, 4)).repartition(4)
>
> // section 1:
> // do something for a long time inside the tasks (runs on the executors)
> val rdd2 = rdd.map(x => {
>   for (i <- 1 to 300) {
>     // busy loop just to burn time
>     for (j <- 1 to 999999999) {
>     }
>     if (i % 10 == 0) {
>       println(i + " rdd map process running!")
>     }
>   }
>   x * 2
> })
> rdd2.take(10).foreach(println(_))
>
> // section 2:
> // do something for a long time in the driver (outside of any Spark job)
> for (i <- 1 to 500) {
>   for (j <- 1 to 999999999) {
>   }
>   if (i % 10 == 0) {
>     println(i + " main process running!")
>   }
> }
> {code}
> And,
>  # If I kill the application on the web UI while section 1 (the rdd.map) is 
> running, it is stopped cleanly;
>  # If I kill the application on the web UI while section 2 is doing work in 
> the driver, the process is still running in the background.
> So, is this a bug in Spark, and how can it be solved? (A workaround sketch 
> follows below.)
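> As a possible workaround (a sketch only, assuming SparkContext.isStopped is 
> available in this version, which it appears to be): have the driver-side loop 
> check whether the context has been stopped and exit explicitly, so the kill 
> from the web UI also ends the driver process.
> {code:java}
> // Hypothetical rewrite of "section 2": stop the driver-side work once the
> // SparkContext has been stopped (for example by the kill link on the web UI).
> val sc = sparkSession.sparkContext
> var i = 1
> while (i <= 500 && !sc.isStopped) {
>   for (j <- 1 to 999999999) {
>   }
>   if (i % 10 == 0) {
>     println(i + " main process running!")
>   }
>   i += 1
> }
> if (sc.isStopped) {
>   // The context was stopped from outside; end the driver JVM explicitly.
>   sys.exit(0)
> }
> {code}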
>  



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
