[GitHub] spark pull request #17798: [SPARK-20521][DOC][CORE]The default of 'spark.wor...

2017-04-30 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/spark/pull/17798



[GitHub] spark pull request #17798: [SPARK-20521][DOC][CORE]The default of 'spark.wor...

2017-04-29 Thread guoxiaolongzte
Github user guoxiaolongzte commented on a diff in the pull request:

https://github.com/apache/spark/pull/17798#discussion_r114063875
  
--- Diff: docs/spark-standalone.md ---
@@ -242,7 +242,7 @@ SPARK_WORKER_OPTS supports the following system properties:
 
 
   spark.worker.cleanup.appDataTtl
-  7 * 24 * 3600 (7 days)
+  604800 (7 days)
--- End diff --

Thank you for the code review; this will make it clearer for users.



[GitHub] spark pull request #17798: [SPARK-20521][DOC][CORE]

2017-04-29 Thread felixcheung
Github user felixcheung commented on a diff in the pull request:

https://github.com/apache/spark/pull/17798#discussion_r114058622
  
--- Diff: docs/spark-standalone.md ---
@@ -242,7 +242,7 @@ SPARK_WORKER_OPTS supports the following system properties:
 
 
   spark.worker.cleanup.appDataTtl
-  7 * 24 * 3600 (7 days)
+  604800 (7 days)
--- End diff --

could you change this to
`604800 (7 days, 7 * 24 * 3600)`
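
A quick throwaway check of the suggested wording (assuming only a Scala REPL; not part of the patch) confirms the two forms agree:

```scala
// 7 days expressed in seconds: 7 * 24 * 3600 = 604800.
println(7 * 24 * 3600)  // prints 604800
```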



[GitHub] spark pull request #17798: [SPARK-20521][DOC][CORE]

2017-04-28 Thread guoxiaolongzte
GitHub user guoxiaolongzte opened a pull request:

https://github.com/apache/spark/pull/17798

[SPARK-20521][DOC][CORE]

## What changes were proposed in this pull request?

Currently, our project needs the worker directory cleanup cycle set to three days.
Following http://spark.apache.org/docs/latest/spark-standalone.html, I set the 'spark.worker.cleanup.appDataTtl' parameter to 3 * 24 * 3600.
When I started the Spark service, startup failed, and the worker log showed the following error:

```
2017-04-28 15:02:03,306 INFO Utils: Successfully started service 'sparkWorker' on port 48728.
Exception in thread "main" java.lang.NumberFormatException: For input string: "3 * 24 * 3600"
    at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
    at java.lang.Long.parseLong(Long.java:430)
    at java.lang.Long.parseLong(Long.java:483)
    at scala.collection.immutable.StringLike$class.toLong(StringLike.scala:276)
    at scala.collection.immutable.StringOps.toLong(StringOps.scala:29)
    at org.apache.spark.SparkConf$$anonfun$getLong$2.apply(SparkConf.scala:380)
    at org.apache.spark.SparkConf$$anonfun$getLong$2.apply(SparkConf.scala:380)
    at scala.Option.map(Option.scala:146)
    at org.apache.spark.SparkConf.getLong(SparkConf.scala:380)
    at org.apache.spark.deploy.worker.Worker.<init>(Worker.scala:100)
    at org.apache.spark.deploy.worker.Worker$.startRpcEnvAndEndpoint(Worker.scala:730)
    at org.apache.spark.deploy.worker.Worker$.main(Worker.scala:709)
    at org.apache.spark.deploy.worker.Worker.main(Worker.scala)
```


**Because 7 * 24 * 3600 is passed as a string and then forcibly converted to a long, it breaks the program.**

**So I think the default value shown in the documentation should be a concrete long value, 604800, rather than the expression 7 * 24 * 3600. The expression form misleads users into writing similar configurations, which makes Spark fail to start.**
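
A minimal sketch of the parsing behavior (assuming only a bare `SparkConf`, not the actual Worker startup path; the object name is just for illustration):

```scala
import org.apache.spark.SparkConf

object AppDataTtlCheck {
  def main(args: Array[String]): Unit = {
    // Expression form: SparkConf.getLong parses the raw string with toLong,
    // which cannot evaluate arithmetic, so this throws NumberFormatException
    // (the same failure as in the worker log above).
    val bad = new SparkConf().set("spark.worker.cleanup.appDataTtl", "3 * 24 * 3600")
    try {
      bad.getLong("spark.worker.cleanup.appDataTtl", 7 * 24 * 3600)
    } catch {
      case e: NumberFormatException => println(s"expression form fails: ${e.getMessage}")
    }

    // Literal form: 259200 seconds is 3 * 24 * 3600 written out, and parses cleanly.
    val good = new SparkConf().set("spark.worker.cleanup.appDataTtl", "259200")
    val ttl = good.getLong("spark.worker.cleanup.appDataTtl", 7 * 24 * 3600)
    println(s"literal form: $ttl seconds")
  }
}
```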

## How was this patch tested?
manual tests

Please review http://spark.apache.org/contributing.html before opening a 
pull request.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/guoxiaolongzte/spark SPARK-20521

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/17798.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #17798


commit d383efba12c66addb17006dea107bb0421d50bc3
Author: 郭小龙 10207633 
Date:   2017-03-31T13:57:09Z

[SPARK-20177]Document about compression way has some little detail changes.

commit 3059013e9d2aec76def14eb314b6761bea0e7ca0
Author: 郭小龙 10207633 
Date:   2017-04-01T01:38:02Z

[SPARK-20177] event log add a space

commit 555cef88fe09134ac98fd0ad056121c7df2539aa
Author: guoxiaolongzte 
Date:   2017-04-02T00:16:08Z

'/applications/[app-id]/jobs' in rest api,status should be 
[running|succeeded|failed|unknown]

commit 46bb1ad3ddd9fb55b5607ac4f20213a90186cfe9
Author: 郭小龙 10207633 
Date:   2017-04-05T03:16:50Z

Merge branch 'master' of https://github.com/apache/spark into SPARK-20177

commit 0efb0dd9e404229cce638fe3fb0c966276784df7
Author: 郭小龙 10207633 
Date:   2017-04-05T03:47:53Z

[SPARK-20218]'/applications/[app-id]/stages' in REST API,add description.

commit 0e37fdeee28e31fc97436dabd001d3c85c5a7794
Author: 郭小龙 10207633 
Date:   2017-04-05T05:22:54Z

[SPARK-20218] '/applications/[app-id]/stages/[stage-id]' in REST API,remove 
redundant description.

commit 52641bb01e55b48bd9e8579fea217439d14c7dc7
Author: 郭小龙 10207633 
Date:   2017-04-07T06:24:58Z

Merge branch 'SPARK-20218'

commit d3977c9cab0722d279e3fae7aacbd4eb944c22f6
Author: 郭小龙 10207633 
Date:   2017-04-08T07:13:02Z

Merge branch 'master' of https://github.com/apache/spark

commit 137b90e5a85cde7e9b904b3e5ea0bb52518c4716
Author: 郭小龙 10207633 
Date:   2017-04-10T05:13:40Z

Merge branch 'master' of https://github.com/apache/spark

commit 0fe5865b8022aeacdb2d194699b990d8467f7a0a
Author: 郭小龙 10207633 
Date:   2017-04-10T10:25:22Z

Merge branch 'SPARK-20190' of https://github.com/guoxiaolongzte/spark

commit cf6f42ac84466960f2232c025b8faeb5d7378fe1
Author: 郭小龙 10207633 
Date:   2017-04-10T10:26:27Z

Merge branch 'master' of https://github.com/apache/spark

commit 685cd6b6e3799c7be65674b2670159ba725f0b8f
Author: 郭小龙 10207633