[ https://issues.apache.org/jira/browse/SPARK-3712?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-3712.
------------------------------
          Resolution: Won't Fix
    Target Version/s:   (was: 1.1.1, 1.2.0)

Withdrawn in the PR by the proposer.

> add a new UpdateDStream to update an RDD dynamically
> ----------------------------------------------------
>
>                 Key: SPARK-3712
>                 URL: https://issues.apache.org/jira/browse/SPARK-3712
>             Project: Spark
>          Issue Type: Improvement
>          Components: Streaming
>            Reporter: uncleGen
>            Priority: Minor
>
> Maybe we can achieve the aim by using the "foreachRDD" function, but it feels
> awkward that way, because I need to pass a closure, like this:
>     val baseRDD = ...
>     var updatedRDD = ...
>     val inputStream = ...
>     val func = (rdd: RDD[T], t: Time) => {
>          updatedRDD = baseRDD.op(rdd)
>     }
>     inputStream.foreachRDD(func)
> In my PR, we can update an RDD like this:
>     val updateStream = inputStream.updateRDD(baseRDD, func).asInstanceOf[UpdateDStream[T, V, U]]
> and obtain the updatedRDD like this:
>     val updatedRDD = updateStream.getUpdatedRDD
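
For reference, a minimal self-contained sketch of the foreachRDD workaround quoted above. The socket source, the (word, 1) pair shape, and the union + reduceByKey merge are illustrative stand-ins for the abstract baseRDD and op in the original snippet; only the pattern (a var overwritten from inside foreachRDD each batch) comes from the report:

    import org.apache.spark.SparkConf
    import org.apache.spark.rdd.RDD
    import org.apache.spark.streaming.{Seconds, StreamingContext, Time}

    object ForeachRDDUpdateExample {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("ForeachRDDUpdateExample").setMaster("local[2]")
        val ssc = new StreamingContext(conf, Seconds(10))

        // A static "base" RDD; the contents here are purely illustrative.
        val baseRDD: RDD[(String, Int)] = ssc.sparkContext.parallelize(Seq(("a", 1), ("b", 2)))

        // Mutable reference that the foreachRDD closure overwrites on every batch.
        var updatedRDD: RDD[(String, Int)] = baseRDD

        // Words arriving on a socket, turned into (word, 1) pairs per batch.
        val inputStream = ssc.socketTextStream("localhost", 9999)
        val pairs = inputStream.flatMap(_.split(" ")).map(word => (word, 1))

        // Stand-in for the abstract "op" in the issue: merge each batch into the
        // accumulated RDD via union followed by reduceByKey.
        pairs.foreachRDD { (rdd: RDD[(String, Int)], t: Time) =>
          updatedRDD = updatedRDD.union(rdd).reduceByKey(_ + _)
        }

        ssc.start()
        ssc.awaitTermination()
      }
    }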



