Github user vanzin commented on the issue:

    https://github.com/apache/spark/pull/14959
  
    It means that if you do this:
    
        val conf = new SparkConf()
        val sc = new SparkContext(conf)
    
    The internal SparkConf of the context will not be the same instance as 
`conf`; the context clones it. With the changes I reviewed, in the Python 
case, the internal conf of the context would be the same instance as the 
user's, which is different behavior.
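To make the distinction concrete, here is a minimal sketch (using hypothetical stand-in classes, not the real Spark API) of why the defensive copy matters: because the Scala-side SparkContext clones the conf it is given, mutating the user's conf afterwards does not leak into the context.

```python
class Conf:
    """Toy stand-in for SparkConf (hypothetical, for illustration only)."""

    def __init__(self):
        self._settings = {}

    def set(self, key, value):
        self._settings[key] = value
        return self

    def get(self, key, default=None):
        return self._settings.get(key, default)

    def clone(self):
        # Copy the settings so the clone is independent of the original.
        copy = Conf()
        copy._settings = dict(self._settings)
        return copy


class Context:
    """Toy stand-in for SparkContext (hypothetical)."""

    def __init__(self, conf):
        # Defensive copy, mirroring the Scala SparkContext's behavior:
        # the context keeps its own instance, not the caller's.
        self._conf = conf.clone()


conf = Conf().set("spark.app.name", "demo")
ctx = Context(conf)

# The user mutates their conf after constructing the context...
conf.set("spark.app.name", "changed")

# ...but the context's internal conf is unaffected.
print(ctx._conf.get("spark.app.name"))  # prints "demo"
print(conf.get("spark.app.name"))       # prints "changed"
```

If the internal conf were instead the user's own instance (as in the change under review), the second `set` above would silently alter the context's configuration as well.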


