foreachPartition hands the function each partition as an Iterator, not
individual elements, much as a DStream is a sequence of RDDs, not of
elements. I wouldn't expect to express an operation on a partition as if
it were elements.
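
For example, with the spark.range(10) Dataset from the thread below (a
minimal sketch; the explicit parameter types are what let the overloads
resolve):

  // Per element: the function sees each Long value.
  ds.foreach((x: Long) => println(x))

  // Per partition: the function sees an Iterator over the partition's
  // elements, so printing goes through the iterator.
  ds.foreachPartition((it: Iterator[Long]) => it.foreach(println))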

On Tue, Jul 5, 2016 at 2:47 PM, Jacek Laskowski <ja...@japila.pl> wrote:
> Sort of. Your example works, but could you do a mere
> ds.foreachPartition(println)? Why not? And why am I even seeing the
> Java version among the alternatives?
>
> scala> val ds = spark.range(10)
> ds: org.apache.spark.sql.Dataset[Long] = [id: bigint]
>
> scala> ds.foreachPartition(println)
> <console>:26: error: overloaded method value foreachPartition with alternatives:
>   (func: org.apache.spark.api.java.function.ForeachPartitionFunction[Long])Unit <and>
>   (f: Iterator[Long] => Unit)Unit
>  cannot be applied to (Unit)
>        ds.foreachPartition(println)
>           ^
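>
> My reading, an assumption about the mechanics rather than anything
> verified: with two overloads there is no single expected function type,
> so the compiler doesn't eta-expand println; it applies the zero-argument
> println() instead, and that call's Unit result is the "(Unit)" the error
> complains about. Spelling out the parameter type restores a single
> expected type:
>
>   scala> ds.foreachPartition((it: Iterator[Long]) => it.foreach(println))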
>
> Pozdrawiam,
> Jacek Laskowski
> ----
> https://medium.com/@jaceklaskowski/
> Mastering Apache Spark http://bit.ly/mastering-apache-spark
> Follow me at https://twitter.com/jaceklaskowski
>
>
> On Tue, Jul 5, 2016 at 3:32 PM, Sean Owen <so...@cloudera.com> wrote:
>> Do you not mean ds.foreachPartition(_.foreach(println)) or similar?
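>>
>> (A sketch, untested: with the Java overload in scope the underscore form
>> may fail to infer its parameter type too; annotating it should settle
>> things, e.g. ds.foreachPartition((_: Iterator[Long]).foreach(println)).)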
>>
>> On Tue, Jul 5, 2016 at 2:22 PM, Jacek Laskowski <ja...@japila.pl> wrote:
>>> Hi,
>>>
>>> This is with master built today. Why can't I call
>>> ds.foreachPartition(println)? Is using a type annotation the only way
>>> forward? I'd be so sad if that's the case.
>>>
>>> scala> ds.foreachPartition(println)
>>> <console>:28: error: overloaded method value foreachPartition with alternatives:
>>>   (func: org.apache.spark.api.java.function.ForeachPartitionFunction[Record])Unit <and>
>>>   (f: Iterator[Record] => Unit)Unit
>>>  cannot be applied to (Unit)
>>>        ds.foreachPartition(println)
>>>           ^
>>>
>>> scala> sc.version
>>> res9: String = 2.0.0-SNAPSHOT
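>>>
>>> FWIW, a workaround sketch (assuming the clash with the Java overload is
>>> the culprit): RDD.foreachPartition has only the Scala signature, so
>>> eta-expansion works if you drop to the RDD first:
>>>
>>>   scala> ds.rdd.foreachPartition(println)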
>>>
>>> Pozdrawiam,
>>> Jacek Laskowski
>>> ----
>>> https://medium.com/@jaceklaskowski/
>>> Mastering Apache Spark http://bit.ly/mastering-apache-spark
>>> Follow me at https://twitter.com/jaceklaskowski
>>>

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
