Github user gaborgsomogyi commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20888#discussion_r183067778
  
    --- Diff: sql/core/src/test/scala/org/apache/spark/sql/DataFrameRangeSuite.scala ---
    @@ -152,39 +154,53 @@ class DataFrameRangeSuite extends QueryTest with SharedSQLContext with Eventuall
       }
     
       test("Cancelling stage in a query with Range.") {
    -    val listener = new SparkListener {
    -      override def onJobStart(jobStart: SparkListenerJobStart): Unit = {
    -        eventually(timeout(10.seconds), interval(1.millis)) {
    -          assert(DataFrameRangeSuite.stageToKill > 0)
    +    // Save and restore the value because SparkContext is shared
    +    val savedInterruptOnCancel = sparkContext
    +      .getLocalProperty(SparkContext.SPARK_JOB_INTERRUPT_ON_CANCEL)
    +
    +    try {
    +      sparkContext.setLocalProperty(SparkContext.SPARK_JOB_INTERRUPT_ON_CANCEL, "true")
    +
    +      for (codegen <- Seq(true, false)) {
    +        // This countdown latch is used to make sure cancelStage is called in the listener for all the stages
    +        val latch = new CountDownLatch(2)
    --- End diff --
    
    I've just taken a look at the logs here:
    
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/89602/artifact/sql/core/target/unit-tests.log
    
    and as far as I can see, the fixed test passed properly.
    
    ```
    ===== FINISHED o.a.s.sql.DataFrameRangeSuite: 'Cancelling stage in a query with Range.' =====
    ```
    
    Hardcoding 2 is not really elegant, I admit. To answer your question: `SparkContext` is initialized in `TestSparkSession` this way: `SparkContext("local[2]", "test-sql-context", sparkConf.set("spark.sql.testkey", "true"))`.
    
    Because of this, and because both executor threads are put into a wait state, the `CountDownLatch(2)` is there:
    ```
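    // With "local[2]" two tasks run concurrently; each parks in wait() on its
    // partition iterator, so both executor threads block here until the stage is cancelled.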
    sparkContext.range(0, 10000L, numSlices = slices).mapPartitions { x =>
      x.synchronized {
        x.wait()
      }
      x
    }.toDF("id").agg(sum("id")).collect()
    ```
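    
    For reference, the latch usage in the fixed test has roughly this shape (a minimal sketch, not the exact PR code; `sparkContext` comes from the shared test session and the listener body is illustrative):
    
    ```scala
    import java.util.concurrent.CountDownLatch
    
    import org.apache.spark.scheduler.{SparkListener, SparkListenerTaskStart}
    
    // Counts down once per started task; with "local[2]" exactly two tasks
    // start (one per executor thread), hence the initial count of 2.
    val latch = new CountDownLatch(2)
    
    val listener = new SparkListener {
      override def onTaskStart(taskStart: SparkListenerTaskStart): Unit = {
        // Cancel the stage as soon as one of its tasks starts, then record
        // that this task reached the listener.
        sparkContext.cancelStage(taskStart.stageId)
        latch.countDown()
      }
    }
    sparkContext.addSparkListener(listener)
    // The test can then latch.await(...) to make sure cancelStage was issued
    // for both started tasks before asserting on the outcome.
    ```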


