Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22504#discussion_r230467097

--- Diff: docs/configuration.md ---
@@ -266,6 +266,41 @@ of the most common options to set are:
     Only has effect in Spark standalone mode or Mesos cluster deploy mode.
   </td>
 </tr>
+<tr>
+  <td><code>spark.driver.log.dfsDir</code></td>
+  <td>(none)</td>
+  <td>
+    Base directory in which Spark driver logs are synced, if <code>spark.driver.log.persistToDfs.enabled</code>
+    is true. Within this base directory, Spark creates a sub-directory for each application and writes the
+    driver logs specific to that application into it. Users may want to set this to a unified location like
+    an HDFS directory so driver log files can be persisted for later use. This directory should allow any
+    Spark user to read/write files and the Spark History Server user to delete files. Additionally, older
+    logs in this directory are cleaned by the Spark History Server if
+    <code>spark.history.fs.driverlog.cleaner.enabled</code> is true or, if that is not configured, if
+    <code>spark.history.fs.cleaner.enabled</code> is true. Logs are cleaned once they are older than the
+    max age configured in <code>spark.history.fs.driverlog.cleaner.maxAge</code>, falling back to
+    <code>spark.history.fs.cleaner.maxAge</code> if not configured.
+  </td>
+</tr>
+<tr>
+  <td><code>spark.driver.log.persistToDfs.enabled</code></td>
+  <td>false</td>
+  <td>
+    If true, Spark applications running in client mode will write driver logs to the persistent storage
+    configured in <code>spark.driver.log.dfsDir</code>. If <code>spark.driver.log.dfsDir</code> is not
+    configured, driver logs will not be persisted. Additionally, enable the cleaner by setting
+    <code>spark.history.fs.driverlog.cleaner.enabled</code>
--- End diff --

Instead of mentioning the cleaner config, I'd add a link to the SHS config page. That makes it clear that the cleaner is not part of the application.
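As context for the settings under review, here is a minimal sketch of how the two application-side options and the SHS-side cleaner options from the diff might be combined in `spark-defaults.conf`. The HDFS path is a hypothetical example; any DFS directory readable/writable by Spark users and deletable by the History Server user would do.

```
# Application side (client mode): sync driver logs to a DFS directory.
spark.driver.log.persistToDfs.enabled        true
# Hypothetical example path; without this, driver logs are not persisted.
spark.driver.log.dfsDir                      hdfs:///user/spark/driverLogs

# Spark History Server side: clean up old driver logs.
spark.history.fs.driverlog.cleaner.enabled   true
# If unset, falls back to spark.history.fs.cleaner.maxAge.
spark.history.fs.driverlog.cleaner.maxAge    7d
```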