[ https://issues.apache.org/jira/browse/SPARK-2003?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14046125#comment-14046125 ]
Matthew Farrellee commented on SPARK-2003:
------------------------------------------

I'm taking a look at this.

> SparkContext(SparkConf) doesn't work in pyspark
> -----------------------------------------------
>
> Key: SPARK-2003
> URL: https://issues.apache.org/jira/browse/SPARK-2003
> Project: Spark
> Issue Type: Bug
> Components: Documentation, PySpark
> Affects Versions: 1.0.0
> Reporter: Diana Carroll
>
> Using SparkConf with SparkContext as described in the Programming Guide does
> NOT work in Python:
>
> conf = SparkConf.setAppName("blah")
> sc = SparkContext(conf)
>
> When I tried I got:
>
> AttributeError: 'SparkConf' object has no attribute '_get_object_id'
>
> [This equivalent code in Scala works fine:
> val conf = new SparkConf().setAppName("blah")
> val sc = new SparkContext(conf)]
>
> I think this is because there's no equivalent for the Scala constructor
> SparkContext(SparkConf).
>
> Workaround:
> If I explicitly set the conf parameter in the python call, it does work:
>
> sconf = SparkConf.setAppName("blah")
> sc = SparkContext(conf=sconf)

--
This message was sent by Atlassian JIRA (v6.2#6252)
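For illustration only: the stand-in classes below are NOT the real PySpark API, just a minimal sketch of why the reported call fails. The assumption is that `SparkContext.__init__` takes `master` as its first positional parameter, so `SparkContext(conf)` binds the `SparkConf` to `master` instead of `conf`, while the documented-workaround form `SparkContext(conf=...)` binds it correctly. The `AttributeError` message is reproduced here by hand to mirror the report.

```python
# Hypothetical stand-ins mimicking the shape of the PySpark builder API,
# to show why SparkContext(conf) fails but SparkContext(conf=conf) works.

class SparkConf:
    """Stand-in for pyspark.SparkConf (builder-style setters)."""
    def __init__(self):
        self._app_name = None

    def setAppName(self, name):
        self._app_name = name
        return self  # return self so calls can be chained, as in Scala


class SparkContext:
    """Stand-in whose first positional parameter is `master`, not `conf`."""
    def __init__(self, master=None, appName=None, conf=None):
        if master is not None and not isinstance(master, str):
            # A SparkConf passed positionally lands in `master`; this
            # hand-written error mirrors the one in the bug report.
            raise AttributeError(
                "'SparkConf' object has no attribute '_get_object_id'")
        self.conf = conf


# Note the () after SparkConf -- instantiate before calling setAppName.
conf = SparkConf().setAppName("blah")

# Passing conf positionally raises, as reported:
#   SparkContext(conf)  ->  AttributeError

# The workaround: pass conf by keyword.
sc = SparkContext(conf=conf)
```

The keyword form works because it skips over the `master` and `appName` parameters entirely, which is exactly the workaround the report describes.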