[ https://issues.apache.org/jira/browse/SPARK-2003?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14049456#comment-14049456 ]

Patrick Wendell commented on SPARK-2003:
----------------------------------------

If I understand correctly, [~dcarr...@cloudera.com] is asking us to change this 
API to make it more consistent with other languages. I don't see a way of doing 
that without breaking existing behavior for current users (which we can't do). 
In Python, constructors can't be overloaded by argument type the way they can 
in Java, because Python is dynamically typed. I'd guess this is why Matei 
didn't change it when he refactored the constructor to take a configuration.

For that reason I'm going to close this as "Won't Fix" - but if there is indeed 
a backwards-compatible way to do this, please feel free to re-open it with a 
proposal.
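
(For anyone who does want to propose one: a minimal sketch of what a 
backwards-compatible approach might look like, dispatching on the type of the 
first positional argument with an isinstance check. The Context class and its 
signature here are assumptions for illustration, not PySpark's actual code.)

from pyspark import SparkConf

class Context(object):
    # Hypothetical: accept either a master URL string or a SparkConf as
    # the first positional argument, without changing the behavior of
    # existing callers that pass master as a string or conf by keyword.
    def __init__(self, master=None, appName=None, conf=None):
        if isinstance(master, SparkConf):
            # Caller wrote Context(conf): shift it into the conf slot.
            conf = master
            master = None
        self.conf = conf if conf is not None else SparkConf()
        if master is not None:
            self.conf.setMaster(master)
        if appName is not None:
            self.conf.setAppName(appName)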


> SparkContext(SparkConf) doesn't work in pyspark
> -----------------------------------------------
>
>                 Key: SPARK-2003
>                 URL: https://issues.apache.org/jira/browse/SPARK-2003
>             Project: Spark
>          Issue Type: Bug
>          Components: Documentation, PySpark
>    Affects Versions: 1.0.0
>            Reporter: Diana Carroll
>             Fix For: 1.0.1, 1.1.0
>
>
> Using SparkConf with SparkContext as described in the Programming Guide does 
> NOT work in Python:
> conf = SparkConf().setAppName("blah")
> sc = SparkContext(conf)
> When I tried it, I got:
> AttributeError: 'SparkConf' object has no attribute '_get_object_id'
> [The equivalent code in Scala works fine:
> val conf = new SparkConf().setAppName("blah")
> val sc = new SparkContext(conf)]
> I think this is because there's no equivalent for the Scala constructor 
> SparkContext(SparkConf).  
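> (To illustrate: PySpark's first positional parameter is the master URL, so a 
> conf passed positionally lands in the wrong slot. A minimal sketch of the 
> binding problem, using a hypothetical Ctx class shaped like SparkContext's 
> signature:
>
> class Ctx(object):
>     def __init__(self, master=None, conf=None):
>         self.master = master
>         self.conf = conf
>
> c = Ctx(SparkConf().setAppName("blah"))
> # c.master is now the SparkConf object and c.conf is None; PySpark later
> # hands that non-string master to Py4J, hence the _get_object_id error.)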
> Workaround:
> If I explicitly pass the conf parameter by keyword in the Python call, it 
> does work:
> sconf = SparkConf().setAppName("blah")
> sc = SparkContext(conf=sconf)


