Github user seancxmao commented on a diff in the pull request:

    https://github.com/apache/spark/pull/23258#discussion_r240038550

--- Diff: sql/core/src/test/scala/org/apache/spark/sql/execution/metric/SQLMetricsSuite.scala ---
@@ -182,10 +182,13 @@ class SQLMetricsSuite extends SparkFunSuite with SQLMetricsTestUtils with Shared
   }

   test("Sort metrics") {
-    // Assume the execution plan is
-    // WholeStageCodegen(nodeId = 0, Range(nodeId = 2) -> Sort(nodeId = 1))
-    val ds = spark.range(10).sort('id)
-    testSparkPlanMetrics(ds.toDF(), 2, Map.empty)
+    // Assume the execution plan with node id is
+    // Sort(nodeId = 0)
+    //   Exchange(nodeId = 1)
+    //     LocalTableScan(nodeId = 2)
+    val df = Seq(1, 3, 2).toDF("id").sort('id)
+    testSparkPlanMetrics(df, 2, Map.empty)
--- End diff --

@cloud-fan This case tries to check the metrics of `SortExec`; however, these metrics (`sortTime`, `peakMemory`, `spillSize`) change each time the query is executed, so their values are not fixed. So far, what I do is check whether `SortExec` exists in the plan. Do you mean we should further check that these metric names exist, even though we can't know their values beforehand?
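For illustration only, here is a minimal sketch of the name-only check I have in mind. This is hypothetical code, not part of this PR, and it assumes it runs inside `SQLMetricsSuite`, where `spark` and `spark.implicits._` are in scope:

    import org.apache.spark.sql.execution.SortExec

    val df = Seq(1, 3, 2).toDF("id").sort('id)
    df.collect()  // force execution so the executed plan is materialized

    // Find the SortExec node in the executed plan.
    val sortExec = df.queryExecution.executedPlan.collectFirst {
      case s: SortExec => s
    }
    assert(sortExec.isDefined, "expected a SortExec node in the plan")

    // Assert that the metric *names* are registered, without asserting
    // their values, which are non-deterministic across runs.
    val expectedMetrics = Set("sortTime", "peakMemory", "spillSize")
    assert(expectedMetrics.subsetOf(sortExec.get.metrics.keySet))

This only pins down which metrics `SortExec` exposes, leaving their run-dependent values unchecked.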