Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22621#discussion_r222642107
  
    --- Diff: sql/core/src/test/scala/org/apache/spark/sql/execution/metric/SQLMetricsSuite.scala ---
    @@ -517,4 +517,57 @@ class SQLMetricsSuite extends SparkFunSuite with SQLMetricsTestUtils with Shared
       test("writing data out metrics with dynamic partition: parquet") {
         testMetricsDynamicPartition("parquet", "parquet", "t1")
       }
    +
    +  test("SPARK-25602: SparkPlan.getByteArrayRdd should not consume the 
input when not necessary") {
    +    def checkFilterAndRangeMetrics(
    +        df: DataFrame,
    +        filterNumOutputs: Int,
    +        rangeNumOutputs: Int): Unit = {
    +      var filter: FilterExec = null
    --- End diff ---
    
    In the future, if we need to catch more nodes, we should abstract this. But for now it's only range and filter, so I think it's OK.
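    
    A minimal sketch of the kind of abstraction mentioned above, for reference; the helper name `collectNodeMetrics` is hypothetical, and the sketch assumes the metrics of the first matching node are enough (it relies on `TreeNode.collectFirst` and `SQLMetric.value`, which already exist):
    
    ```scala
    import scala.reflect.ClassTag
    
    import org.apache.spark.sql.DataFrame
    import org.apache.spark.sql.execution.SparkPlan
    
    // Hypothetical helper: find the first plan node of the requested type in the
    // executed plan and return its metric values, instead of keeping a dedicated
    // `var` per node type (FilterExec, RangeExec, ...) in the test.
    def collectNodeMetrics[T <: SparkPlan : ClassTag](df: DataFrame): Map[String, Long] = {
      df.queryExecution.executedPlan.collectFirst {
        case node: T => node.metrics.mapValues(_.value).toMap
      }.getOrElse(Map.empty)
    }
    ```
    
    With a `ClassTag` in scope the `case node: T` match is checked at runtime, so a test could call `collectNodeMetrics[FilterExec](df)` and `collectNodeMetrics[RangeExec](df)` after the action instead of tracking each node in its own variable.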

