[GitHub] spark pull request #20857: [SPARK-23735]Optimize the document by adding an i...
Github user liuxianjiao commented on a diff in the pull request: https://github.com/apache/spark/pull/20857#discussion_r180397682 --- Diff: docs/configuration.md --- @@ -2288,6 +2288,13 @@ showDF(properties, numRows = 200, truncate = FALSE) on the receivers. + + spark.streaming.concurrentJobs + 1 + +The number of concurrent jobs. This parameter directly affects the number of threads in the jobExecutor thread pool. --- End diff -- @srowen I see, but this more or less helps us understand this configuration. We should not give up eating for fear of choking: rather than dropping the entry entirely, perhaps we could make the description more specific so that its behavior is better explained. --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
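The discussion above is about how spark.streaming.concurrentJobs sizes the jobExecutor thread pool. As a language-neutral illustration only (this is not Spark code, and concurrent_jobs is a hypothetical stand-in for the configuration value), the relationship between the setting and the thread pool can be sketched with Python's standard library:

```python
from concurrent.futures import ThreadPoolExecutor
import time

# Hypothetical stand-in for spark.streaming.concurrentJobs (default 1):
# it bounds how many "jobs" may run at the same time.
concurrent_jobs = 2

def run_job(job_id):
    # Simulate a streaming batch job doing some work.
    time.sleep(0.1)
    return f"job-{job_id} done"

# The pool size is the analogue of the thread count in the jobExecutor pool:
# with concurrent_jobs = 2, at most two jobs execute concurrently.
with ThreadPoolExecutor(max_workers=concurrent_jobs) as pool:
    results = list(pool.map(run_job, range(4)))

print(results)
```

Raising the pool size lets later batches start before earlier ones finish, which is the behavior the proposed documentation entry tries to describe.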
[GitHub] spark pull request #20857: [SPARK-23735]Optimize the document by adding an i...
GitHub user liuxianjiao opened a pull request: https://github.com/apache/spark/pull/20857 [SPARK-23735]Optimize the document by adding an important streaming configuration ## What changes were proposed in this pull request? Optimize the document by adding an important streaming configuration: spark.streaming.concurrentJobs. This parameter is quite important, but is missing from our current Spark documentation. ## How was this patch tested? (Please explain how this patch was tested. E.g. unit tests, integration tests, manual tests) (If this patch involves UI changes, please attach a screenshot; otherwise, remove this) Please review http://spark.apache.org/contributing.html before opening a pull request. You can merge this pull request into a Git repository by running: $ git pull https://github.com/liuxianjiao/spark SPARK-23735 Alternatively you can review and apply these changes as the patch at: https://github.com/apache/spark/pull/20857.patch To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message: This closes #20857 commit 91f3f9c10b07056a5fdeb5befa3fa8356ae501ee Author: liuxianjiao Date: 2018-03-19T09:00:24Z [SPARK-23735]Optimize the document by adding an important streaming configuration
[GitHub] spark issue #19142: When the number of attempting to restart receiver greate...
Github user liuxianjiao commented on the issue: https://github.com/apache/spark/pull/19142 @jerryshao So, what is the purpose of this empty 'else'?
[GitHub] spark issue #19142: When the number of attempting to restart receiver greate...
Github user liuxianjiao commented on the issue: https://github.com/apache/spark/pull/19142 @srowen Thanks for your reply! This PR lets users understand how the receiver restart behaves. To be honest, the empty 'else' is redundant, so I improved it with a log trace. Alternatively, if this PR is not worthwhile and the 'else' does nothing, can we remove it?
[GitHub] spark pull request #19142: When the number of attempting to restart receiver...
GitHub user liuxianjiao opened a pull request: https://github.com/apache/spark/pull/19142 When the number of attempts to restart the receiver is greater than 0, Spark does nothing in the 'else' branch, so I think we should log a trace to let users know why. You can merge this pull request into a Git repository by running: $ git pull https://github.com/liuxianjiao/spark master0905 Alternatively you can review and apply these changes as the patch at: https://github.com/apache/spark/pull/19142.patch To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message: This closes #19142 commit c4edc1b4304f5b540b576ea60e260f5caef303c2 Author: liuxianjiao Date: 2017-09-06T01:03:47Z [SPARK-21930]When the number of attempting to restart receiver greater than 0,spark do nothing in 'else'
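The kind of change this PR proposes can be sketched in a Spark-independent way. The function and names below (maybe_restart_receiver, attempts) are purely illustrative and are not the actual ReceiverTracker code; the point is only that the previously silent 'else' branch gains a log trace so the no-op becomes visible to users:

```python
import logging

logging.basicConfig(level=logging.DEBUG)
logger = logging.getLogger("receiver-restart")

def maybe_restart_receiver(attempts):
    """Illustrative sketch of the branch discussed in the PR."""
    if attempts == 0:
        logger.info("Starting receiver for the first time")
        return "started"
    else:
        # The PR's point: previously this branch did nothing silently.
        # Logging a trace lets users reading the logs see why no
        # immediate action was taken on this restart attempt.
        logger.debug("Restart attempt %d: no immediate action taken", attempts)
        return "no-op"

print(maybe_restart_receiver(0))
print(maybe_restart_receiver(2))
```

Whether to log or simply remove the empty branch is exactly the trade-off debated in the review comments above.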