Repository: spark
Updated Branches:
  refs/heads/master 65533c7ec -> 94c6c06ea
[FIX] do not load defaults when testing SparkConf in pyspark

The default constructor loads default properties, which can fail the test.

Author: Xiangrui Meng <m...@databricks.com>

Closes #775 from mengxr/pyspark-conf-fix and squashes the following commits:

83ef6c4 [Xiangrui Meng] do not load defaults when testing SparkConf in pyspark


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/94c6c06e
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/94c6c06e
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/94c6c06e

Branch: refs/heads/master
Commit: 94c6c06ea13032b80610b3f54401d2ef2aa4874a
Parents: 65533c7
Author: Xiangrui Meng <m...@databricks.com>
Authored: Wed May 14 14:57:17 2014 -0700
Committer: Reynold Xin <r...@apache.org>
Committed: Wed May 14 14:57:17 2014 -0700

----------------------------------------------------------------------
 python/pyspark/conf.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/94c6c06e/python/pyspark/conf.py
----------------------------------------------------------------------
diff --git a/python/pyspark/conf.py b/python/pyspark/conf.py
index 49b68d5..8eff4a2 100644
--- a/python/pyspark/conf.py
+++ b/python/pyspark/conf.py
@@ -33,7 +33,7 @@
 u'My app'
 >>> sc.sparkHome == None
 True
->>> conf = SparkConf()
+>>> conf = SparkConf(loadDefaults=False)
 >>> conf.setSparkHome("/path")
 <pyspark.conf.SparkConf object at ...>
 >>> conf.get("spark.home")
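The problem the patch addresses can be sketched without pyspark: if a constructor silently merges in ambient defaults (system properties, spark-defaults.conf), a doctest's output depends on the test machine's environment. The `FakeSparkConf` class below is a hypothetical stand-in for `pyspark.conf.SparkConf` (its `_AMBIENT` dict and class name are illustrative, not real pyspark internals); it only demonstrates why passing `loadDefaults=False` makes the doctest deterministic.

```python
class FakeSparkConf:
    """Hypothetical stand-in for pyspark.conf.SparkConf (illustrative only)."""

    # Simulated ambient defaults, standing in for values SparkConf would
    # pick up from the JVM system properties or spark-defaults.conf.
    _AMBIENT = {"spark.home": "/opt/spark-from-env"}

    def __init__(self, loadDefaults=True):
        # With loadDefaults=True the conf starts pre-populated from the
        # environment; with False it starts empty, as the fixed doctest needs.
        self._props = dict(self._AMBIENT) if loadDefaults else {}

    def setSparkHome(self, path):
        self._props["spark.home"] = path
        return self  # chainable, like the real SparkConf setters

    def get(self, key, default=None):
        return self._props.get(key, default)


# Mirrors the fixed doctest: no ambient state leaks in, so the
# result is the same on every machine.
conf = FakeSparkConf(loadDefaults=False)
conf.setSparkHome("/path")
print(conf.get("spark.home"))  # prints: /path

# The pre-fix constructor call would instead see the environment value,
# which is exactly what made the original doctest flaky.
print(FakeSparkConf().get("spark.home"))  # prints: /opt/spark-from-env
```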