Repository: spark
Updated Branches:
  refs/heads/master e59dd8fa0 -> 709f541dd


[DOCS] Update configuration.md

Changed $SPARK_HOME/conf/spark-default.conf to $SPARK_HOME/conf/spark-defaults.conf.

No testing necessary, as this is a documentation-only change.

Closes #22116 from KraFusion/patch-1.

Authored-by: Joey Krabacher <jkrabac...@gmail.com>
Signed-off-by: Sean Owen <sean.o...@databricks.com>


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/709f541d
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/709f541d
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/709f541d

Branch: refs/heads/master
Commit: 709f541dd0c41c2ae8c0871b2593be9100bfc4ee
Parents: e59dd8f
Author: Joey Krabacher <jkrabac...@gmail.com>
Authored: Thu Aug 16 16:47:52 2018 -0700
Committer: Sean Owen <sean.o...@databricks.com>
Committed: Thu Aug 16 16:47:52 2018 -0700

----------------------------------------------------------------------
 docs/configuration.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/709f541d/docs/configuration.md
----------------------------------------------------------------------
diff --git a/docs/configuration.md b/docs/configuration.md
index 9c4742a..0270dc2 100644
--- a/docs/configuration.md
+++ b/docs/configuration.md
@@ -2213,7 +2213,7 @@ Spark's classpath for each application. In a Spark cluster running on YARN, thes
 files are set cluster-wide, and cannot safely be changed by the application.
 
 The better choice is to use spark hadoop properties in the form of `spark.hadoop.*`. 
-They can be considered as same as normal spark properties which can be set in `$SPARK_HOME/conf/spark-default.conf`
+They can be considered as same as normal spark properties which can be set in `$SPARK_HOME/conf/spark-defaults.conf`
 
 In some cases, you may want to avoid hard-coding certain configurations in a `SparkConf`. For
 instance, Spark allows you to simply create an empty conf and set spark/spark hadoop properties.
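
For context on the passage corrected above, here is a minimal Scala sketch of the two routes it describes: putting a `spark.hadoop.*` entry in `$SPARK_HOME/conf/spark-defaults.conf`, or setting the same property programmatically on a `SparkConf`. The property name and value used below (`fs.defaultFS`, `hdfs://namenode:8020`) are illustrative and not part of this commit.

import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

// Equivalent to adding this line to $SPARK_HOME/conf/spark-defaults.conf:
//   spark.hadoop.fs.defaultFS   hdfs://namenode:8020
// Spark strips the "spark.hadoop." prefix and passes the remainder on to the
// per-application Hadoop Configuration.
val conf = new SparkConf()
  .setAppName("hadoop-props-example")
  .set("spark.hadoop.fs.defaultFS", "hdfs://namenode:8020") // illustrative value

val spark = SparkSession.builder().config(conf).getOrCreate()

// The setting is visible on the Hadoop side of the application:
println(spark.sparkContext.hadoopConfiguration.get("fs.defaultFS"))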


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
