RDD.toLocalIterator returns the partitions one by one, but each partition is 
fully materialized, so evaluation is not lazy at the element level. Given the 
design of Spark, it is very hard to maintain iterator state across runJob 
calls.

  def toLocalIterator: Iterator[T] = {
    // Collect a single partition to the driver via a separate job.
    def collectPartition(p: Int): Array[T] = {
      sc.runJob(this, (iter: Iterator[T]) => iter.toArray, Seq(p), allowLocal = false).head
    }
    // The outer iterator is lazy, so each partition is fetched only when consumed.
    (0 until partitions.length).iterator.flatMap(i => collectPartition(i))
  }
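
For reference, a minimal usage sketch (assuming an existing SparkContext 
named sc; the RDD contents and partition count are just illustrative). 
Because the flatMap above is over a lazy iterator, a job runs only for the 
partitions that are actually consumed:

  val rdd = sc.parallelize(1 to 1000000, numSlices = 100)
  val it: Iterator[Int] = rdd.toLocalIterator
  // Only the partitions needed for the first 5 elements trigger jobs,
  // and at most one partition is held in driver memory at a time.
  it.take(5).foreach(println)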

Thanks.

Zhan Zhang

On Oct 29, 2014, at 3:43 AM, Yanbo Liang <yanboha...@gmail.com> wrote:

> RDD.toLocalIterator() is the suitable solution.
> But I doubt whether it conforms to the design principles of Spark and RDDs.
> All RDD transformations are computed lazily until they end with some action.
> 
> 2014-10-29 15:28 GMT+08:00 Sean Owen <so...@cloudera.com>:
> Call RDD.toLocalIterator()?
> 
> https://spark.apache.org/docs/latest/api/java/org/apache/spark/rdd/RDD.html
> 
> On Wed, Oct 29, 2014 at 4:15 AM, Dai, Kevin <yun...@ebay.com> wrote:
> > Hi, ALL
> >
> >
> >
> > I have an RDD[T]; can I use it like an iterator?
> >
> > That means I can compute every element of this RDD lazily.
> >
> >
> >
> > Best Regards,
> >
> > Kevin.
> 


