[ https://issues.apache.org/jira/browse/SPARK-2003?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Diana Carroll reopened SPARK-2003:
----------------------------------

> SparkContext(SparkConf) doesn't work in pyspark
> -----------------------------------------------
>
>                 Key: SPARK-2003
>                 URL: https://issues.apache.org/jira/browse/SPARK-2003
>             Project: Spark
>          Issue Type: Bug
>          Components: Documentation, PySpark
>    Affects Versions: 1.0.0
>            Reporter: Diana Carroll
>             Fix For: 1.0.1, 1.1.0
>
> Using SparkConf with SparkContext as described in the Programming Guide does
> NOT work in Python:
>
> conf = SparkConf().setAppName("blah")
> sc = SparkContext(conf)
>
> When I tried, I got:
>
> AttributeError: 'SparkConf' object has no attribute '_get_object_id'
>
> [The equivalent code in Scala works fine:
>
> val conf = new SparkConf().setAppName("blah")
> val sc = new SparkContext(conf)]
>
> I think this is because there's no equivalent for the Scala constructor
> SparkContext(SparkConf).
>
> Workaround:
> If I explicitly pass the conf as a keyword argument in the Python call, it
> does work:
>
> sconf = SparkConf().setAppName("blah")
> sc = SparkContext(conf=sconf)

--
This message was sent by Atlassian JIRA
(v6.2#6252)
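The behavior described above can be sketched without PySpark installed. The key point is that in PySpark the first positional parameter of `SparkContext.__init__` is `master`, not `conf`, so `SparkContext(conf)` binds the SparkConf object to `master` and the Py4J layer later fails on it. The classes below are hypothetical stand-ins for illustration only, not the real PySpark source; only the positional-vs-keyword binding they demonstrate comes from the issue text.

```python
# Minimal sketch of the argument-binding pitfall (NOT the real PySpark code).
class SparkConf:
    """Stand-in for pyspark.SparkConf, just enough for the example."""
    def setAppName(self, name):
        self.app_name = name
        return self  # real SparkConf also returns self for chaining


class SparkContext:
    """Stand-in whose signature is modeled on PySpark's:
    SparkContext(master=None, appName=None, ..., conf=None)."""
    def __init__(self, master=None, appName=None, conf=None):
        if master is not None and not isinstance(master, str):
            # A positional SparkConf lands here as `master`; the real
            # PySpark fails later in Py4J with the same AttributeError
            # the reporter saw.
            raise AttributeError(
                "'SparkConf' object has no attribute '_get_object_id'")
        self.conf = conf


conf = SparkConf().setAppName("blah")
# SparkContext(conf)   # positional: conf is bound to `master` and fails
sc = SparkContext(conf=conf)  # keyword: works, matching the workaround
```

This is why the workaround in the report succeeds: the keyword form skips the `master` slot entirely.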