Thanks! Ya, that’s what I’m doing so far, but I wanted to see if it’s possible
to keep the tuples inside Spark for fault-tolerance purposes.
-A
From: Mark Hamstra [mailto:m...@clearstorydata.com]
Sent: March-28-14 10:45 AM
To: user@spark.apache.org
Subject: Re: function state lost when next RDD is processed
As long as the amount of state being passed is relatively small, it's
probably easiest to send it back to the driver and to introduce it into RDD
transformations as the zero value of a fold.
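To make that concrete, here is a minimal sketch of the pattern Mark describes, using plain Python lists to stand in for successive RDDs (names and data are illustrative, not from the thread). The state collected at the driver is fed back in as the zero value of each fold, which is the same semantics as Spark's `RDD.fold(zeroValue)(op)`:

```python
from functools import reduce

# Driver-side state carried across "batches" (each list stands in
# for one RDD in a sequence of transformations).
state = 0
batches = [[1, 2, 3], [4, 5], [6]]

for batch in batches:
    # Inject the previous state as the zero value of the fold,
    # mirroring rdd.fold(zeroValue)(op) in Spark: the accumulator
    # starts from `state` instead of a neutral element.
    state = reduce(lambda acc, x: acc + x, batch, state)

print(state)  # 21
```

The key point is that the state never lives inside a task closure (where it would be lost between RDDs); it round-trips through the driver and re-enters each fold as its starting value.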
On Fri, Mar 28, 2014 at 7:12 AM, Adrian Mocanu <amoc...@verticalscope.com> wrote:
I'd like to resurrect this thread since I don't have an answer yet.
From: Adrian Mocanu [mailto:amoc...@verticalscope.com]
Sent: March-27-14 10:04 AM
To: u...@spark.incubator.apache.org
Subject: function state lost when next RDD is processed
Is there a way to pass a custom function to Spark so that its state is kept when the next RDD is processed?