[GitHub] spark pull request #20392: Update ApplicationMaster.scala

2018-01-24 Thread Sangrho
GitHub user Sangrho closed the pull request at:

https://github.com/apache/spark/pull/20392


---




[GitHub] spark pull request #20392: Update ApplicationMaster.scala

2018-01-24 Thread Sangrho
GitHub user Sangrho opened a pull request:

https://github.com/apache/spark/pull/20392

Update ApplicationMaster.scala

I have one question.
I think that when maxNumExecutorFailures is calculated, MAX_EXECUTOR_FAILURES is
already specified in the Spark documentation (as numExecutors * 2, with a
minimum of 3), so the annotation I added in the code may not be valid.
Could you please give me an answer?
Thank you.
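
For context, here is a minimal Scala sketch of the default calculation the question refers to, assuming the behavior described in the Spark docs (spark.yarn.max.executor.failures defaults to numExecutors * 2, with a minimum of 3). The helper names are hypothetical and paraphrase, rather than quote, the actual YARN ApplicationMaster code:

```scala
// Hypothetical sketch: the default maximum number of executor failures,
// per the documented behavior (numExecutors * 2, with a minimum of 3).
def defaultMaxNumExecutorFailures(numExecutors: Int): Int =
  math.max(3, 2 * numExecutors)

// If spark.yarn.max.executor.failures is set explicitly, it takes
// precedence over the computed default.
def maxNumExecutorFailures(conf: Map[String, String], numExecutors: Int): Int =
  conf.get("spark.yarn.max.executor.failures")
    .map(_.toInt)
    .getOrElse(defaultMaxNumExecutorFailures(numExecutors))

// e.g. maxNumExecutorFailures(Map.empty, 10) == 20
//      maxNumExecutorFailures(Map.empty, 1)  == 3
```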

## What changes were proposed in this pull request?

(Please fill in changes proposed in this fix)

## How was this patch tested?

(Please explain how this patch was tested. E.g. unit tests, integration 
tests, manual tests)
(If this patch involves UI changes, please attach a screenshot; otherwise, 
remove this)

Please review http://spark.apache.org/contributing.html before opening a 
pull request.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/Sangrho/spark master

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/20392.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #20392


commit 2eb87e032582f3b398997f3877d6f27ec2b1653e
Author: Josh LEE 
Date:   2018-01-25T04:53:41Z

Update ApplicationMaster.scala

I have one question.
I think that when maxNumExecutorFailures is calculated, MAX_EXECUTOR_FAILURES is
already specified in the Spark documentation (as numExecutors * 2, with a
minimum of 3), so the annotation I added in the code may not be valid.
Could you please give me an answer?
Thank you.




---
