[ https://issues.apache.org/jira/browse/SPARK-550?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Matthew Farrellee closed SPARK-550.
-----------------------------------
    Resolution: Done

> Hiding the default spark context in the spark shell creates serialization 
> issues
> --------------------------------------------------------------------------------
>
>                 Key: SPARK-550
>                 URL: https://issues.apache.org/jira/browse/SPARK-550
>             Project: Spark
>          Issue Type: Bug
>            Reporter: tjhunter
>
> I copy-pasted code along these lines into the spark shell:
> ...
> val sc = new SparkContext("local[%d]" format num_splits, "myframework")
> val my_rdd = sc.textFile(...)
> my_rdd.count()
> This crashes the shell with a java.io.NotSerializableException: spark.SparkContext.
> It took me a while to realize the failure came from the newly created 
> SparkContext shadowing the shell's default one. Maybe a warning or error 
> should be raised if the user tries to redefine sc?
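For reference, the likely mechanism: the Scala REPL wraps each input line in an
object, so a val sc that shadows the shell's built-in context can be pulled into
task closures, dragging the non-serializable SparkContext along with it. Below
is a minimal sketch of the workaround, reusing the shell's predefined sc instead
of constructing a new context (the file path and partition count are
illustrative, not from the original report):

    // Inside spark-shell: reuse the predefined `sc` rather than shadowing it
    // with `new SparkContext`, so task closures never capture a REPL line
    // object that holds a non-serializable context.
    val my_rdd = sc.textFile("data.txt", 4)  // "data.txt" and 4 are hypothetical
    my_rdd.count()                           // completes without NotSerializableException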



