Repository: spark
Updated Branches:
  refs/heads/branch-2.2 9789d5c57 -> c5f559315


[SPARK-20521][DOC][CORE] The default of 'spark.worker.cleanup.appDataTtl' 
should be 604800 in spark-standalone.md

## What changes were proposed in this pull request?

Our deployment needs the worker directory cleanup cycle set to three days.
Following http://spark.apache.org/docs/latest/spark-standalone.html, I configured the 'spark.worker.cleanup.appDataTtl' parameter to 3 * 24 * 3600, mirroring the documented default.
When I then start the Spark service, the worker fails to start and its log shows the following error:

2017-04-28 15:02:03,306 INFO Utils: Successfully started service 'sparkWorker' on port 48728.
Exception in thread "main" java.lang.NumberFormatException: For input string: "3 * 24 * 3600"
        at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
        at java.lang.Long.parseLong(Long.java:430)
        at java.lang.Long.parseLong(Long.java:483)
        at scala.collection.immutable.StringLike$class.toLong(StringLike.scala:276)
        at scala.collection.immutable.StringOps.toLong(StringOps.scala:29)
        at org.apache.spark.SparkConf$$anonfun$getLong$2.apply(SparkConf.scala:380)
        at org.apache.spark.SparkConf$$anonfun$getLong$2.apply(SparkConf.scala:380)
        at scala.Option.map(Option.scala:146)
        at org.apache.spark.SparkConf.getLong(SparkConf.scala:380)
        at org.apache.spark.deploy.worker.Worker.<init>(Worker.scala:100)
        at org.apache.spark.deploy.worker.Worker$.startRpcEnvAndEndpoint(Worker.scala:730)
        at org.apache.spark.deploy.worker.Worker$.main(Worker.scala:709)
        at org.apache.spark.deploy.worker.Worker.main(Worker.scala)
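
For reference, the value has to be a plain number of seconds. Below is a minimal sketch of the intended 3-day setting, assuming the usual standalone setup of passing worker system properties through SPARK_WORKER_OPTS in conf/spark-env.sh (spark.worker.cleanup.enabled also has to be turned on for the TTL to take effect):

```sh
# conf/spark-env.sh (sketch): 3 days = 3 * 24 * 3600 = 259200 seconds
export SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=true -Dspark.worker.cleanup.appDataTtl=259200"
```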

**Because 7 * 24 * 3600 is given as a string and then force-converted to a long, the program fails.**

**So I think the documented default for this configuration should be the concrete long value 604800 rather than the expression 7 * 24 * 3600, because the expression misleads users into writing similar configurations and Spark then fails to start.**
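
As a minimal plain-Scala sketch (not Spark source) of the behaviour behind the stack trace above: SparkConf.getLong hands the raw property string to toLong, which only accepts a literal number.

```scala
object TtlParseDemo {
  def main(args: Array[String]): Unit = {
    // A literal number of seconds parses cleanly.
    println("604800".toLong)          // prints 604800
    // The expression from the old docs does not parse; this line throws
    // java.lang.NumberFormatException, matching the worker startup failure above.
    println("3 * 24 * 3600".toLong)
  }
}
```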

## How was this patch tested?
manual tests

Please review http://spark.apache.org/contributing.html before opening a pull 
request.

Author: 郭小龙 10207633 <guo.xiaolo...@zte.com.cn>
Author: guoxiaolong <guo.xiaolo...@zte.com.cn>
Author: guoxiaolongzte <guo.xiaolo...@zte.com.cn>

Closes #17798 from guoxiaolongzte/SPARK-20521.

(cherry picked from commit 4d99b95ad0d0c7ef909c8e492ec45e94cf0189b4)
Signed-off-by: Sean Owen <so...@cloudera.com>


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/c5f55931
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/c5f55931
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/c5f55931

Branch: refs/heads/branch-2.2
Commit: c5f559315c88935cd6ac76c6db97327f5d1ee669
Parents: 9789d5c
Author: 郭小龙 10207633 <guo.xiaolo...@zte.com.cn>
Authored: Sun Apr 30 09:06:25 2017 +0100
Committer: Sean Owen <so...@cloudera.com>
Committed: Sun Apr 30 09:06:34 2017 +0100

----------------------------------------------------------------------
 docs/spark-standalone.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/c5f55931/docs/spark-standalone.md
----------------------------------------------------------------------
diff --git a/docs/spark-standalone.md b/docs/spark-standalone.md
index 1c0b60f..34ced9e 100644
--- a/docs/spark-standalone.md
+++ b/docs/spark-standalone.md
@@ -242,7 +242,7 @@ SPARK_WORKER_OPTS supports the following system properties:
 </tr>
 <tr>
   <td><code>spark.worker.cleanup.appDataTtl</code></td>
-  <td>7 * 24 * 3600 (7 days)</td>
+  <td>604800 (7 days, 7 * 24 * 3600)</td>
   <td>
     The number of seconds to retain application work directories on each 
worker.  This is a Time To Live
     and should depend on the amount of available disk space you have.  
Application logs and jars are

