Github user rdblue commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19394#discussion_r143060736
  
    --- Diff: sql/core/src/test/scala/org/apache/spark/sql/execution/metric/SQLMetricsSuite.scala ---
    @@ -228,7 +228,7 @@ class SQLMetricsSuite extends SparkFunSuite with SQLMetricsTestUtils with Shared
         testSparkPlanMetrics(df, 2, Map(
           1L -> (("BroadcastHashJoin", Map(
             "number of output rows" -> 2L,
    -        "avg hash probe (min, med, max)" -> "\n(1, 1, 1)"))))
    +        "avg hash probe (min, med, max)" -> "\n(1.1, 1.1, 1.1)"))))
    --- End diff --
    
    This test was incorrect. I debugged the metric and the value is consistently 1.1: there are 9 lookups and 10 probes, and 10/9 ≈ 1.11, which formats as 1.1. The test only passed before because the metrics check was hitting a [race condition](https://github.com/apache/spark/blob/master/sql/core/src/test/scala/org/apache/spark/sql/execution/metric/SQLMetricsTestUtils.scala#L165-L169) and never actually validated the value. If I change the expected values in master to 1.1, the tests still pass, which demonstrates that the check isn't really happening.
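    
    For illustration, here is a minimal Scala sketch of the arithmetic (not Spark's actual metric code; the object and method names are hypothetical) showing why 10 probes over 9 lookups reports as 1.1 once the average is formatted to one decimal place:
    
    ```scala
    // Hypothetical sketch: average hash probes per lookup, formatted to one decimal.
    // Spark's real metric pipeline differs; this only illustrates why 10 probes
    // over 9 lookups shows up as 1.1 in the "avg hash probe" metric string.
    object AvgHashProbeSketch {
      def formatAvgProbe(probes: Long, lookups: Long): String =
        f"${probes.toDouble / lookups}%.1f" // 10.0 / 9 = 1.111... -> "1.1"
    
      def main(args: Array[String]): Unit = {
        println(formatAvgProbe(10, 9)) // prints "1.1", matching the expected metric value
      }
    }
    ```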


---
