Github user HyukjinKwon commented on a diff in the pull request:

    https://github.com/apache/spark/pull/15618#discussion_r86767539
  
    --- Diff: streaming/src/test/scala/org/apache/spark/streaming/MapWithStateSuite.scala ---
    @@ -38,19 +38,16 @@ class MapWithStateSuite extends SparkFunSuite
       protected val batchDuration = Seconds(1)
     
       before {
    -    StreamingContext.getActive().foreach { _.stop(stopSparkContext = false) }
    -    checkpointDir = Utils.createTempDir("checkpoint")
    +    StreamingContext.getActive().foreach(_.stop(stopSparkContext = false))
       }
     
       after {
    -    StreamingContext.getActive().foreach { _.stop(stopSparkContext = false) }
    -    if (checkpointDir != null) {
    -      Utils.deleteRecursively(checkpointDir)
    -    }
    +    StreamingContext.getActive().foreach(_.stop(stopSparkContext = false))
       }
     
       override def beforeAll(): Unit = {
         super.beforeAll()
    +    checkpointDir = Utils.createTempDir("checkpoint")
    --- End diff ---
    
    Actually, the original code seems fine here. I took a look and ran several tests, and I think I now understand why @taoli91 tried to fix the code this way.
    
    It seems the state in `ReceiverTracker` somehow went wrong on Windows, leaving `checkpointDir` unclosed. The original test apparently failed[1] because of that issue in `ReceiverTracker`, so I think he tried to create/delete the folder only once[2] and to ensure it is closed in `ReceiverTracker`.
    
    I tested this with `ReceivedBlockTracker` stopped manually regardless of the state (the original proposal), and it passes without the changes here in `MapWithStateSuite.scala`[3]. Of course, it also passes with this change[4].
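    
    In case it helps, here is a rough, self-contained sketch of that idea. The stub names are hypothetical, not the actual `ReceiverTracker` code; it only illustrates stopping the block tracker regardless of whether the tracker ever started:
    
    ```scala
    object StopRegardlessOfStateSketch {
      // Hypothetical stand-in for ReceivedBlockTracker: stopping it releases the
      // write-ahead log / checkpoint files under checkpointDir.
      class BlockTrackerStub {
        def stop(): Unit = println("block tracker stopped, checkpoint files released")
      }
    
      class TrackerStub {
        private var started = false
        private val receivedBlockTracker = new BlockTrackerStub
    
        def stop(): Unit = {
          if (started) {
            // ... stop receivers / endpoints only if the tracker actually started ...
          }
          // Stop the block tracker unconditionally so the checkpoint directory is
          // always released, even if start() never completed (as seems to happen
          // on Windows here).
          receivedBlockTracker.stop()
          started = false
        }
      }
    
      def main(args: Array[String]): Unit = new TrackerStub().stop()
    }
    ```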
    
    
    
    [1] https://ci.appveyor.com/project/spark-test/spark/build/56-F88EDDAF-E576-4787-9530-A4185FC46B1E
    [2] https://ci.appveyor.com/project/spark-test/spark/build/57-test-MapWithStateSuite
    [3] https://ci.appveyor.com/project/spark-test/spark/build/58-test-MapWithStateSuite
    [4] https://ci.appveyor.com/project/spark-test/spark/build/59-test-MapWithStateSuite

