GitHub user guoxiaolongzte opened a pull request: https://github.com/apache/spark/pull/17798
[SPARK-20521][DOC][CORE]

## What changes were proposed in this pull request?

Our project needs the worker directory cleanup cycle set to three days. Following http://spark.apache.org/docs/latest/spark-standalone.html, I configured the `spark.worker.cleanup.appDataTtl` parameter as `3 * 24 * 3600`. When I started the Spark service, startup failed, and the worker log showed the following error:

```
2017-04-28 15:02:03,306 INFO Utils: Successfully started service 'sparkWorker' on port 48728.
Exception in thread "main" java.lang.NumberFormatException: For input string: "3 * 24 * 3600"
	at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
	at java.lang.Long.parseLong(Long.java:430)
	at java.lang.Long.parseLong(Long.java:483)
	at scala.collection.immutable.StringLike$class.toLong(StringLike.scala:276)
	at scala.collection.immutable.StringOps.toLong(StringOps.scala:29)
	at org.apache.spark.SparkConf$$anonfun$getLong$2.apply(SparkConf.scala:380)
	at org.apache.spark.SparkConf$$anonfun$getLong$2.apply(SparkConf.scala:380)
	at scala.Option.map(Option.scala:146)
	at org.apache.spark.SparkConf.getLong(SparkConf.scala:380)
	at org.apache.spark.deploy.worker.Worker.<init>(Worker.scala:100)
	at org.apache.spark.deploy.worker.Worker$.startRpcEnvAndEndpoint(Worker.scala:730)
	at org.apache.spark.deploy.worker.Worker$.main(Worker.scala:709)
	at org.apache.spark.deploy.worker.Worker.main(Worker.scala)
```

**Because `7 * 24 * 3600` is written as a string and then force-converted to a long, it breaks the program.**

**So the default value shown for this configuration should be the concrete long value 604800, not the expression `7 * 24 * 3600`. The expression form misleads users into writing similar configurations, which makes Spark fail to start.**

## How was this patch tested?

Manual tests.

Please review http://spark.apache.org/contributing.html before opening a pull request.
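The failure can be reproduced outside Spark: `SparkConf.getLong` ultimately delegates to `java.lang.Long.parseLong` (visible in the stack trace above), which accepts only a plain decimal literal, never an arithmetic expression. A minimal sketch (the class name is illustrative, not part of Spark):

```java
public class AppDataTtlParseDemo {
    public static void main(String[] args) {
        // A plain decimal literal parses fine: 7 days in seconds.
        long ok = Long.parseLong("604800");
        System.out.println("parsed: " + ok);

        // The documented expression form is just a string to the parser,
        // so it fails exactly as in the worker log above.
        try {
            Long.parseLong("3 * 24 * 3600");
        } catch (NumberFormatException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```

So a working three-day setting has to be the pre-computed number of seconds, e.g. `spark.worker.cleanup.appDataTtl 259200` (3 × 24 × 3600) in `spark-defaults.conf`.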
You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/guoxiaolongzte/spark SPARK-20521

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/17798.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #17798

----

commit d383efba12c66addb17006dea107bb0421d50bc3 (郭小龙 10207633 <guo.xiaolo...@zte.com.cn>, 2017-03-31T13:57:09Z)
    [SPARK-20177] Document about compression way has some little detail changes.

commit 3059013e9d2aec76def14eb314b6761bea0e7ca0 (郭小龙 10207633 <guo.xiaolo...@zte.com.cn>, 2017-04-01T01:38:02Z)
    [SPARK-20177] event log add a space

commit 555cef88fe09134ac98fd0ad056121c7df2539aa (guoxiaolongzte <guo.xiaolo...@zte.com.cn>, 2017-04-02T00:16:08Z)
    '/applications/[app-id]/jobs' in rest api, status should be [running|succeeded|failed|unknown]

commit 46bb1ad3ddd9fb55b5607ac4f20213a90186cfe9 (郭小龙 10207633 <guo.xiaolo...@zte.com.cn>, 2017-04-05T03:16:50Z)
    Merge branch 'master' of https://github.com/apache/spark into SPARK-20177

commit 0efb0dd9e404229cce638fe3fb0c966276784df7 (郭小龙 10207633 <guo.xiaolo...@zte.com.cn>, 2017-04-05T03:47:53Z)
    [SPARK-20218] '/applications/[app-id]/stages' in REST API, add description.

commit 0e37fdeee28e31fc97436dabd001d3c85c5a7794 (郭小龙 10207633 <guo.xiaolo...@zte.com.cn>, 2017-04-05T05:22:54Z)
    [SPARK-20218] '/applications/[app-id]/stages/[stage-id]' in REST API, remove redundant description.
commit 52641bb01e55b48bd9e8579fea217439d14c7dc7 (郭小龙 10207633 <guo.xiaolo...@zte.com.cn>, 2017-04-07T06:24:58Z)
    Merge branch 'SPARK-20218'

commit d3977c9cab0722d279e3fae7aacbd4eb944c22f6 (郭小龙 10207633 <guo.xiaolo...@zte.com.cn>, 2017-04-08T07:13:02Z)
    Merge branch 'master' of https://github.com/apache/spark

commit 137b90e5a85cde7e9b904b3e5ea0bb52518c4716 (郭小龙 10207633 <guo.xiaolo...@zte.com.cn>, 2017-04-10T05:13:40Z)
    Merge branch 'master' of https://github.com/apache/spark

commit 0fe5865b8022aeacdb2d194699b990d8467f7a0a (郭小龙 10207633 <guo.xiaolo...@zte.com.cn>, 2017-04-10T10:25:22Z)
    Merge branch 'SPARK-20190' of https://github.com/guoxiaolongzte/spark

commit cf6f42ac84466960f2232c025b8faeb5d7378fe1 (郭小龙 10207633 <guo.xiaolo...@zte.com.cn>, 2017-04-10T10:26:27Z)
    Merge branch 'master' of https://github.com/apache/spark

commit 685cd6b6e3799c7be65674b2670159ba725f0b8f (郭小龙 10207633 <guo.xiaolo...@zte.com.cn>, 2017-04-14T01:12:41Z)
    Merge branch 'master' of https://github.com/apache/spark

commit c716a9231e9ab117d2b03ba67a1c8903d8d9da93 (guoxiaolong <guo.xiaolo...@zte.com.cn>, 2017-04-17T06:57:21Z)
    Merge branch 'master' of https://github.com/apache/spark

commit 679cec36a968fbf995b567ca5f6f8cbd8e32673f (guoxiaolong <guo.xiaolo...@zte.com.cn>, 2017-04-19T07:20:08Z)
    Merge branch 'master' of https://github.com/apache/spark

commit 3c9387af84a8f39cf8c1ce19e15de99dfcaf0ca5 (guoxiaolong <guo.xiaolo...@zte.com.cn>, 2017-04-19T08:15:26Z)
    Merge branch 'master' of https://github.com/apache/spark

commit cb71f4462a0889cbb0843875b1e4cf14bcb0d020 (guoxiaolong <guo.xiaolo...@zte.com.cn>, 2017-04-20T05:52:06Z)
    Merge branch 'master' of https://github.com/apache/spark

commit ce92a7415a2026f5bf909820110a13750a0949e1 (guoxiaolong <guo.xiaolo...@zte.com.cn>, 2017-04-21T05:21:48Z)
    Merge branch 'master' of https://github.com/apache/spark

commit dd64342206041a8c3a282459e5f2b898dc558d89 (guoxiaolong <guo.xiaolo...@zte.com.cn>, 2017-04-21T08:44:25Z)
    Merge branch 'master' of https://github.com/apache/spark

commit bffd2bd00c6b0e20313756e133adca4c97707c67 (guoxiaolong <guo.xiaolo...@zte.com.cn>, 2017-04-28T01:36:29Z)
    Merge branch 'master' of https://github.com/apache/spark

commit 588d42a382345a071532ace1eab5457911f6aa46 (guoxiaolong <guo.xiaolo...@zte.com.cn>, 2017-04-28T05:02:36Z)
    Merge branch 'master' of https://github.com/apache/spark

commit b32b753b05543f76e3327a29c8daaeccb3186d1b (guoxiaolong <guo.xiaolo...@zte.com.cn>, 2017-04-28T06:57:11Z)
    [SPARK-20521] The default of 'spark.worker.cleanup.appDataTtl' should be 604800 in spark-standalone.md.

----