Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/10942#discussion_r51356299
  
    --- Diff: sql/hive/src/test/scala/org/apache/spark/sql/sources/BucketedReadSuite.scala ---
    @@ -59,6 +60,136 @@ class BucketedReadSuite extends QueryTest with SQLTestUtils with TestHiveSinglet
         }
       }
     
    +  // To verify bucket pruning, we compare the contents of remaining buckets (before filtering)
    +  // with the expectedAnswer.
    +  private def checkPrunedAnswers(
    +      bucketedDataFrame: DataFrame,
    +      expectedAnswer: DataFrame): Unit = {
    +    val rdd = bucketedDataFrame.queryExecution.executedPlan.find(_.isInstanceOf[PhysicalRDD])
    +    assert(rdd.isDefined)
    +    checkAnswer(
    +      expectedAnswer.orderBy(expectedAnswer.logicalPlan.output.map(attr => Column(attr)) : _*),
    +      rdd.get.executeCollectPublic().sortBy(_.toString()))
    +  }
    +
    +  test("read partitioning bucketed tables with bucket pruning filters") {
    +    val df = (10 until 50).map(i => (i % 5, i % 13 + 10, i.toString)).toDF("i", "j", "k")
    +
    +    withTable("bucketed_table") {
    +      // The number of buckets should be large enough to make sure each bucket contains
    --- End diff --
    
    ah got it. But I think we should improve the test instead of avoiding the problem, because working around it makes people think bucket pruning doesn't work when each bucket holds more than two distinct values. How about we check each RDD partition, make sure the should-be-pruned ones are empty, and also check the answer at the end?
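
    The suggested check could be sketched roughly like this. This is a toy simulation, not the actual Spark test code: `checkPrunedBuckets`, the partition layout, and the bucket indices are all made up for illustration. Each inner `Seq` stands for the rows one RDD partition (one per bucket) would return after pruning.

    ```scala
    object BucketPruningCheck {
      // Assert that every should-be-pruned partition is empty, then compare the
      // surviving rows with the expected answer, ignoring order (mirroring the
      // sortBy(_.toString) comparison used in checkPrunedAnswers above).
      def checkPrunedBuckets[T](
          partitions: Seq[Seq[T]],
          keptBuckets: Set[Int],
          expected: Seq[T]): Unit = {
        partitions.zipWithIndex.foreach { case (rows, index) =>
          if (!keptBuckets.contains(index)) {
            assert(rows.isEmpty,
              s"bucket $index should have been pruned but holds ${rows.size} rows")
          }
        }
        assert(
          partitions.flatten.sortBy(_.toString) == expected.sortBy(_.toString),
          "surviving rows do not match the expected answer")
      }

      def main(args: Array[String]): Unit = {
        // Toy data: 4 buckets, a filter that keeps only buckets 1 and 3.
        val partitions = Seq(Seq.empty[Int], Seq(11, 21), Seq.empty[Int], Seq(13, 23))
        checkPrunedBuckets(partitions, keptBuckets = Set(1, 3), expected = Seq(11, 13, 21, 23))
        println("ok")
      }
    }
    ```

    Checking per-partition emptiness catches the case where pruning silently does nothing, which a whole-answer comparison alone cannot distinguish from a full scan.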


