[ https://issues.apache.org/jira/browse/SPARK-2003?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14049084#comment-14049084 ]

Diana Carroll commented on SPARK-2003:
--------------------------------------

Actually, you can't create a SparkContext in the REPL: one is already created
for you. Therefore this bug is moot in the REPL and only applies to a
"standalone" PySpark program.

As far as I know, in Scala the officially recommended way to
programmatically configure a SparkContext is "new SparkContext(sparkConf)".
Therefore the same should be true for PySpark.
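
For what it's worth, here is a minimal sketch of how a standalone PySpark
program can mirror that Scala pattern today, using the conf= keyword form
(the app name "blah" is just a placeholder):

    from pyspark import SparkConf, SparkContext

    # Build the configuration first; note that SparkConf() must be
    # instantiated before chaining setters.
    conf = SparkConf().setAppName("blah")

    # Pass it explicitly as the conf keyword argument; this is the
    # PySpark counterpart of Scala's "new SparkContext(sparkConf)".
    sc = SparkContext(conf=conf)

    rdd = sc.parallelize([1, 2, 3])
    print(rdd.count())  # 3
    sc.stop()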


On Tue, Jul 1, 2014 at 12:40 PM, Matthew Farrellee (JIRA) <j...@apache.org>
wrote:



> SparkContext(SparkConf) doesn't work in pyspark
> -----------------------------------------------
>
>                 Key: SPARK-2003
>                 URL: https://issues.apache.org/jira/browse/SPARK-2003
>             Project: Spark
>          Issue Type: Bug
>          Components: Documentation, PySpark
>    Affects Versions: 1.0.0
>            Reporter: Diana Carroll
>             Fix For: 1.0.1, 1.1.0
>
>
> Using SparkConf with SparkContext as described in the Programming Guide does 
> NOT work in Python:
> conf = SparkConf.setAppName("blah")
> sc = SparkContext(conf)
> When I tried I got 
> AttributeError: 'SparkConf' object has no attribute '_get_object_id'
> [This equivalent code in Scala works fine:
> val conf = new SparkConf().setAppName("blah")
> val sc = new SparkContext(conf)]
> I think this is because there's no equivalent for the Scala constructor 
> SparkContext(SparkConf).  
> Workaround:
> If I explicitly set the conf parameter in the python call, it does work:
> sconf = SparkConf.setAppName("blah")
> sc = SparkContext(conf=sconf)
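
For anyone hitting this, a small sketch contrasting the failing call with the
workaround, assuming SparkConf() is instantiated first (the quoted snippets
appear to omit the parentheses after SparkConf, which would fail earlier with
a different error):

    from pyspark import SparkConf, SparkContext

    conf = SparkConf().setAppName("blah")

    # Fails in PySpark 1.0.0: without the conf= keyword, the SparkConf
    # object is presumably bound to the positional master parameter, and
    # the Py4J bridge then raises
    #   AttributeError: 'SparkConf' object has no attribute '_get_object_id'
    # sc = SparkContext(conf)

    # Workaround (works): pass the configuration by keyword.
    sc = SparkContext(conf=conf)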


