This is https://issues.apache.org/jira/browse/SPARK-10422, which has been
fixed in Spark 1.5.1.
On Wed, Oct 21, 2015 at 4:40 PM, Sourav Mazumder <
sourav.mazumde...@gmail.com> wrote:
> In 1.5.0, if I use randomSplit on a DataFrame I get this error.
>
> Here is the code snippet:
>
> val splitData
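For context, a minimal sketch of the kind of call being reported, assuming a SQLContext named sqlContext; the input path, weights, and seed are illustrative, not taken from the original snippet:

// randomSplit on a DataFrame returns an Array[DataFrame]
val df = sqlContext.read.json("hdfs:///path/to/data.json")
val splitData = df.randomSplit(Array(0.7, 0.3), seed = 42L)   // weights for the splits
val Array(training, test) = splitData
training.count()   // an action on one of the splits, to force evaluation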
Thank you very much! When will Spark 1.5.1 come out?
guoqing0...@yahoo.com.hk
From: Yin Huai
Date: 2015-09-12 04:49
To: guoqing0...@yahoo.com.hk
CC: user
Subject: Re: java.util.NoSuchElementException: key not found
Looks like you hit https://issues.apache.org/jira/browse/SPARK-10422. It
has been fixed in branch-1.5, and the 1.5.1 release will include it.
On Fri, Sep 11, 2015 at 3:35 AM, guoqing0...@yahoo.com.hk <
guoqing0...@yahoo.com.hk> wrote:
> Hi all ,
> After upgrading Spark to 1.5, Streaming throws
> java.util.NoSuchElementException
aha ok, thanks.
If I create different RDDs from a parent RDD and force evaluation
thread-by-thread, then it should presumably be fine, correct? Or do I need
to checkpoint the child RDDs as a precaution in case they need to be removed
from memory and recomputed?
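For concreteness, a rough sketch of the pattern described above, with illustrative names and sizes; whether it is actually safe is exactly what the reply below addresses:

sc.setCheckpointDir("/tmp/rdd-checkpoints")          // illustrative directory
val parent = sc.parallelize(1 to 1000000).cache()

// one child RDD per thread, each marked for checkpointing before it is evaluated
val threads = (0 until 4).map { i =>
  val child = parent.filter(_ % 4 == i)
  child.checkpoint()                                  // the precaution asked about
  new Thread(new Runnable { def run(): Unit = child.count() })
}
threads.foreach(_.start())
threads.foreach(_.join())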
On Sat, Feb 28, 2015 at 4:28 AM, Shi
RDD is not thread-safe. You should not use it from multiple threads.
Best Regards,
Shixiong Zhu
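A minimal sketch of the safer arrangement this implies, with illustrative names: keep all RDD calls on a single thread and let Spark parallelize each job across the cluster.

val parent = sc.textFile("hdfs:///path/to/input").map(_.length).cache()

// Avoid: several application threads calling actions such as parent.count()
// on the same RDD instance concurrently.

// Instead, drive the actions one after another from a single thread;
// each action is still executed in parallel by the executors.
val total   = parent.count()
val longest = parent.max()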
2015-02-27 23:14 GMT+08:00 rok :
> I'm seeing this java.util.NoSuchElementException: key not found exception
> pop up sometimes when I run operations on an RDD from multiple threads in a
> Python application.