Hi All,

We recently came across this issue when using the spark-shell and Zeppelin.
If we assign the SparkContext variable (sc) to a new variable and then
reference another variable in an RDD lambda expression, we get a "Task not
serializable" exception.

The following three lines of code illustrate this:

val temp = 10
val newSC = sc
val newRDD = newSC.parallelize(0 to 100).map(p => p + temp)
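
Our best guess is that the spark-shell wraps each line in a REPL-generated
object, so the closure capturing temp also pulls in the object that holds
newSC, and SparkContext itself is not serializable. Copying the value into a
local val inside a block seems to avoid the exception for us (just a sketch;
the names newRDD2 and localTemp are only illustrative):

val newRDD2 = {
  // copy the captured value so the closure only needs to serialize an Int,
  // not the REPL line object that also references newSC
  val localTemp = temp
  newSC.parallelize(0 to 100).map(p => p + localTemp)
}
newRDD2.count()  // runs without the "Task not serializable" exception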

I am not sure if this is a known issue or whether we should file a JIRA for it.
We originally came across this bug in the SciSpark project.

Best,

Rahul P
