I can reproduce it in spark-shell, but it works in a batch job. Looks like a
Spark REPL issue, likely because the REPL wraps each line in a wrapper object,
so a closure that captures temp also drags in the wrapper holding newSC (the
SparkContext), which is not serializable.
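
For reference, here is a minimal sketch of the same three lines as a
standalone batch application (object and app names are hypothetical; assumes
the master is supplied via spark-submit), which runs without the exception:

import org.apache.spark.{SparkConf, SparkContext}

object ReproBatch {
  def main(args: Array[String]): Unit = {
    // Master is expected to come from spark-submit, e.g. --master local[*]
    val sc = new SparkContext(new SparkConf().setAppName("ReproBatch"))

    val temp = 10
    val newSC = sc
    // In a compiled app, temp is a plain local, so the closure only
    // captures the Int, not a REPL wrapper object.
    val newRDD = newSC.parallelize(0 to 100).map(p => p + temp)

    println(newRDD.count())
    sc.stop()
  }
}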

On Thu, Mar 3, 2016 at 10:43 AM, Rahul Palamuttam <rahulpala...@gmail.com>
wrote:

> Hi All,
>
> We recently came across this issue when using the spark-shell and zeppelin.
> If we assign the SparkContext variable (sc) to a new variable and then
> reference another variable in an RDD lambda expression, we get a task not
> serializable exception.
>
> The following three lines of code illustrate this :
>
> val temp = 10
> val newSC = sc
> val newRDD = newSC.parallelize(0 to 100).map(p => p + temp)
>
> I am not sure if this is a known issue, or we should file a JIRA for it.
> We originally came across this bug in the SciSpark project.
>
> Best,
>
> Rahul P
>



-- 
Best Regards

Jeff Zhang
