Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21859#discussion_r213560104

    --- Diff: core/src/main/scala/org/apache/spark/Partitioner.scala ---
    @@ -138,7 +138,8 @@ class RangePartitioner[K : Ordering : ClassTag, V](
         partitions: Int,
         rdd: RDD[_ <: Product2[K, V]],
         private var ascending: Boolean = true,
    -    val samplePointsPerPartitionHint: Int = 20)
    +    val samplePointsPerPartitionHint: Int = 20,
    +    needCacheSample: Boolean = false)
    --- End diff --

    Can we create a different `RangePartitioner` in Spark SQL? It's a little weird to make a change in the core module that only makes sense in the SQL module.
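To illustrate the reviewer's suggestion, here is a minimal, hypothetical sketch of how a SQL-side partitioner could own the sample-caching concern without touching core's `RangePartitioner`. All names below (`CachedSampleRangePartitioner`, the simplified `Partitioner` stand-in, the precomputed `rangeBounds`) are illustrative assumptions, not code from the PR:

```scala
import scala.reflect.ClassTag

// Simplified stand-in for the core Partitioner contract.
abstract class Partitioner extends Serializable {
  def numPartitions: Int
  def getPartition(key: Any): Int
}

// Hypothetical SQL-module partitioner: it could cache the sampled rows
// on its own side, leaving core's RangePartitioner signature unchanged.
// Here it simply partitions keys by precomputed range bounds.
class CachedSampleRangePartitioner[K: Ordering: ClassTag](
    override val numPartitions: Int,
    rangeBounds: Array[K]) extends Partitioner {

  private val ordering = implicitly[Ordering[K]]

  override def getPartition(key: Any): Int = {
    val k = key.asInstanceOf[K]
    // Linear scan over the bounds; real code would binary-search.
    var partition = 0
    while (partition < rangeBounds.length && ordering.gt(k, rangeBounds(partition))) {
      partition += 1
    }
    partition
  }
}
```

The point of the sketch is the module boundary: the sampling/caching behavior lives in the SQL-specific subclass, so the core constructor does not need an extra flag that core itself never uses.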