Repository: spark
Updated Branches:
  refs/heads/master 680b33f16 -> 457dc9ccb
[MINOR][DOC] Improve the docs about how to correctly set configurations

## What changes were proposed in this pull request?

Spark provides several ways to set configurations: from a configuration file, from `spark-submit` command line options, or programmatically through the `SparkConf` class. It may confuse beginners why some configurations set through `SparkConf` cannot take effect. So here we add some docs to address this problem and let beginners know how to correctly set configurations.

## How was this patch tested?

N/A

Author: jerryshao <ss...@hortonworks.com>

Closes #18552 from jerryshao/improve-doc.

Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/457dc9cc
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/457dc9cc
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/457dc9cc

Branch: refs/heads/master
Commit: 457dc9ccbf8404fef6c1ebf8f82e59e4ba480a0e
Parents: 680b33f
Author: jerryshao <ss...@hortonworks.com>
Authored: Mon Jul 10 11:22:28 2017 +0800
Committer: Wenchen Fan <wenc...@databricks.com>
Committed: Mon Jul 10 11:22:28 2017 +0800

----------------------------------------------------------------------
 docs/configuration.md | 7 +++++++
 1 file changed, 7 insertions(+)
----------------------------------------------------------------------

http://git-wip-us.apache.org/repos/asf/spark/blob/457dc9cc/docs/configuration.md
----------------------------------------------------------------------
diff --git a/docs/configuration.md b/docs/configuration.md
index 6ca8424..91b5bef 100644
--- a/docs/configuration.md
+++ b/docs/configuration.md
@@ -95,6 +95,13 @@ in the `spark-defaults.conf` file. A few configuration keys have been renamed si
 versions of Spark; in such cases, the older key names are still accepted, but take lower
 precedence than any instance of the newer key.
+Spark properties can mainly be divided into two kinds: one kind is related to deployment, like
+`spark.driver.memory` and `spark.executor.instances`. Properties of this kind may not take effect
+when set programmatically through `SparkConf` at runtime, or the behavior depends on which cluster
+manager and deploy mode you choose, so it is suggested to set them through the configuration file
+or `spark-submit` command line options. The other kind is mainly related to Spark runtime control,
+like `spark.task.maxFailures`; properties of this kind can be set in either way.
+
 
 ## Viewing Spark Properties
 
 The application web UI at `http://<driver>:4040` lists Spark properties in the "Environment" tab.
---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
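
[Editorial note] The three configuration sources the patch mentions are resolved in a fixed precedence order, documented elsewhere in `configuration.md`: properties set directly on `SparkConf` take highest precedence, then flags passed to `spark-submit`, then options in `spark-defaults.conf`. The sketch below is a toy model of that lookup in plain Python, not Spark code; the `resolve` helper and the sample property maps are invented for illustration:

```python
def resolve(key, spark_defaults, submit_flags, spark_conf):
    """Toy model of Spark's property precedence: SparkConf beats
    spark-submit flags, which beat spark-defaults.conf."""
    for source in (spark_conf, submit_flags, spark_defaults):
        if key in source:
            return source[key]
    return None

# Hypothetical values from the three sources.
defaults = {"spark.task.maxFailures": "4", "spark.driver.memory": "1g"}
flags = {"spark.driver.memory": "2g"}          # e.g. spark-submit --conf spark.driver.memory=2g
conf = {"spark.task.maxFailures": "8"}         # e.g. SparkConf().set(...)

print(resolve("spark.task.maxFailures", defaults, flags, conf))  # "8"  (SparkConf wins)
print(resolve("spark.driver.memory", defaults, flags, conf))     # "2g" (flag beats default)
```

Note that for deploy-related properties such as `spark.driver.memory`, even the winning `SparkConf` value may arrive too late to matter (the driver JVM may already be running), which is exactly why the patch recommends the configuration file or `spark-submit` options for that kind of property.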