Github user squito commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22504#discussion_r226453812

--- Diff: docs/configuration.md ---
@@ -266,6 +266,37 @@ of the most common options to set are:
     Only has effect in Spark standalone mode or Mesos cluster deploy mode.
   </td>
 </tr>
+<tr>
+  <td><code>spark.driver.log.dfsDir</code></td>
+  <td>(none)</td>
+  <td>
+    Base directory in which Spark driver logs are synced, if spark.driver.log.syncToDfs.enabled is true.
+    Within this base directory, Spark creates a sub-directory for each application, and logs the driver logs
+    specific to the application in this directory. Users may want to set this to a unified location like an
+    HDFS directory so driver log files can be persisted for later usage. This directory should allow any spark
+    user to read/write files and the spark history server user to delete files. Additionally, older logs from
+    this directory are cleaned by Spark History Server if spark.history.fs.driverlog.cleaner.enabled is true.
+    They are cleaned if they are older than max age configured at spark.history.fs.driverlog.cleaner.maxAge.
+  </td>
+</tr>
+<tr>
+  <td><code>spark.driver.log.syncToDfs.enabled</code></td>
+  <td>false</td>
+  <td>
+    If true, spark application running in client mode will sync driver logs to a persistent storage, configured
--- End diff --

I guess I just don't like using the word "sync" here; it makes me think the logs are getting stored somewhere, just not synced to persistent storage, even when this is false. How about renaming the conf to "spark.driver.log.persistToDfs.enabled" and rewording this to

> If true, a Spark application running in client mode will write driver logs to a persistent storage, configured in spark.driver.log.dfsDir. If spark.driver.log.dfsDir is not configured, driver logs will not be persisted. Additionally, enable the cleaner by setting spark.history.fs.driverlog.cleaner.enabled to true.

?
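Under the reviewer's suggested renaming, enabling driver log persistence plus the history-server cleaner would look roughly like the sketch below in `spark-defaults.conf`. This is a hedged example: the `persistToDfs` name is only the reviewer's proposal at this point, and the HDFS path and max-age value are made-up placeholders.

```
# Persist client-mode driver logs to a DFS directory
# ("persistToDfs" is the name proposed in this review, not necessarily final)
spark.driver.log.persistToDfs.enabled       true
spark.driver.log.dfsDir                     hdfs:///user/spark/driverLogs   # placeholder path

# Let the Spark History Server clean up old driver logs
spark.history.fs.driverlog.cleaner.enabled  true
spark.history.fs.driverlog.cleaner.maxAge   7d   # placeholder retention
```

Note that if `spark.driver.log.dfsDir` is left unset, the enabled flag has no effect, since there is nowhere to write the logs.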