GitHub user tdas commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2665#discussion_r20183390
  
    --- Diff: streaming/src/main/scala/org/apache/spark/streaming/api/java/JavaPairDStream.scala ---
    @@ -492,6 +509,26 @@ class JavaPairDStream[K, V](val dstream: DStream[(K, V)])(
         dstream.updateStateByKey(convertUpdateStateFunction(updateFunc), partitioner)
       }
     
    +  /**
    +   * Return a new "state" DStream where the state for each key is updated by applying
    +   * the given function on the previous state of the key and the new values of the key.
    +   * org.apache.spark.Partitioner is used to control the partitioning of each RDD.
    +   * @param updateFunc State update function. If `this` function returns None, then
    +   *                   corresponding state key-value pair will be eliminated.
    +   * @param partitioner Partitioner for controlling the partitioning of each RDD in the new
    +   *                    DStream.
    +   * @param initialRDD initial state value of each key.
    +   * @tparam S State type
    +   */
    +  def updateStateByKey[S](
    --- End diff --
    
    I guess this makes it different from the Scala API; the reason is that the Java API does not expose the iterator-based version of this API. As far as I remember, that was not done because it's tricky to write a Java function that deals with Scala iterators and all.
    
    Let's try to make the Java and Scala APIs consistent (minus the iterator-based ones). So could you please add a Scala API equivalent to this, that is, one taking (simple non-iterator function, partitioner, initial RDD)? Also a unit test for that, please :)
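    
    For concreteness, here is a minimal sketch of what that Scala overload could look like, assuming it sits in PairDStreamFunctions next to the existing variants, has access to the closure cleaner via `ssc.sc.clean` as the neighboring overloads do, and can delegate to the iterator-based overload taking an `initialRDD` that this PR adds (parameter and local names here are illustrative, not final):
    
        import scala.reflect.ClassTag
        
        import org.apache.spark.Partitioner
        import org.apache.spark.rdd.RDD
        import org.apache.spark.streaming.dstream.DStream
        
        // Inside PairDStreamFunctions[K, V]: adapt the simple per-key update
        // function to the iterator-based signature and delegate to the
        // initialRDD-aware overload, so no new state-tracking logic is needed.
        def updateStateByKey[S: ClassTag](
            updateFunc: (Seq[V], Option[S]) => Option[S],
            partitioner: Partitioner,
            initialRDD: RDD[(K, S)]
          ): DStream[(K, S)] = {
          val cleanedFunc = ssc.sc.clean(updateFunc)
          val iteratorFunc = (iterator: Iterator[(K, Seq[V], Option[S])]) => {
            // Keys whose update function returns None are dropped from the state.
            iterator.flatMap { case (k, values, state) =>
              cleanedFunc(values, state).map(s => (k, s))
            }
          }
          updateStateByKey(iteratorFunc, partitioner, rememberPartitioner = true, initialRDD)
        }
    
    A unit test could then seed an initial state RDD, feed a few batches through this overload, and assert on the collected state, along the lines of the existing updateStateByKey tests in BasicOperationsSuite.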

