GitHub user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/7403#discussion_r38458181
  
    --- Diff: core/src/main/scala/org/apache/spark/rdd/PairRDDFunctions.scala ---
    @@ -565,12 +603,25 @@ class PairRDDFunctions[K, V](self: RDD[(K, V)])
       }
     
       /**
    -   * Simplified version of combineByKey that hash-partitions the resulting RDD using the
    -   * existing partitioner/parallelism level.
    +   * This method is here for backward compatibility. It
    +   * does not provide combiner classtag information to
    +   * the shuffle.
    +   *
    +   * @see [[combineByKeyWithClassTag]]
        */
       def combineByKey[C](createCombiner: V => C, mergeValue: (C, V) => C, mergeCombiners: (C, C) => C)
    -    : RDD[(K, C)] = self.withScope {
    -    combineByKey(createCombiner, mergeValue, mergeCombiners, defaultPartitioner(self))
    +    : RDD[(K, C)] = {
    +    combineByKeyWithClassTag(createCombiner, mergeValue, mergeCombiners)(null)
    +  }
    +
    +  /**
    +   * Simplified version of combineByKeyWithClassTag that hash-partitions the resulting RDD using the
    +   * existing partitioner/parallelism level.
    +   */
    +  def combineByKeyWithClassTag[C](createCombiner: V => C, mergeValue: (C, V) => C,
    +                                  mergeCombiners: (C, C) => C)
    --- End diff ---
    
    style:
    ```
    def combineByKeyWithClassTag[C](
        createCombiner: V => C,
        mergeValue: (C, V) => C,
        mergeCombiners: (C, C) => C)(implicit ct: ClassTag[C]): RDD[(K, C)] = self.withScope {
      ...
    }
    ```
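    For context, a minimal usage sketch of the classtag-aware method being added in this diff. The SparkContext setup, the sample data, and the choice of List[Int] as the combiner type are illustrative assumptions, not part of the PR:
    ```
    import org.apache.spark.{SparkConf, SparkContext}

    // Illustrative setup only; in spark-shell an `sc` already exists.
    val sc = new SparkContext(new SparkConf().setAppName("example").setMaster("local[2]"))
    val pairs = sc.parallelize(Seq(("a", 1), ("b", 2), ("a", 3)))

    // Collect each key's values into a List[Int]. The compiler supplies the
    // implicit ClassTag[List[Int]] to the shuffle, whereas the backward-compatible
    // combineByKey overload in the diff passes null for it.
    val combined = pairs.combineByKeyWithClassTag[List[Int]](
      (v: Int) => List(v),                         // createCombiner
      (c: List[Int], v: Int) => v :: c,            // mergeValue
      (c1: List[Int], c2: List[Int]) => c1 ::: c2  // mergeCombiners
    )
    combined.collect().foreach(println)  // e.g. (a,List(3, 1)) and (b,List(2))
    ```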


