[GitHub] spark pull request #14149: [SPARK-16435][YARN][MINOR] Add warning log if ini...

2016-07-13 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/spark/pull/14149





[GitHub] spark pull request #14149: [SPARK-16435][YARN][MINOR] Add warning log if ini...

2016-07-13 Thread jerryshao
Github user jerryshao commented on a diff in the pull request:

https://github.com/apache/spark/pull/14149#discussion_r70628729
  
--- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
@@ -2342,6 +2342,18 @@ private[spark] object Utils extends Logging {
    * Return the initial number of executors for dynamic allocation.
    */
   def getDynamicAllocationInitialExecutors(conf: SparkConf): Int = {
+    if (conf.get(DYN_ALLOCATION_INITIAL_EXECUTORS) < conf.get(DYN_ALLOCATION_MIN_EXECUTORS)) {
+      logWarning(s"${DYN_ALLOCATION_INITIAL_EXECUTORS.key} less than " +
+        s"${DYN_ALLOCATION_MIN_EXECUTORS.key} is invalid, will use " +
+        s"${DYN_ALLOCATION_MIN_EXECUTORS.key} to override.")
--- End diff --

That sounds good to me, let me update the patch.





[GitHub] spark pull request #14149: [SPARK-16435][YARN][MINOR] Add warning log if ini...

2016-07-13 Thread tgravescs
Github user tgravescs commented on a diff in the pull request:

https://github.com/apache/spark/pull/14149#discussion_r70628081
  
--- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
@@ -2342,6 +2342,18 @@ private[spark] object Utils extends Logging {
    * Return the initial number of executors for dynamic allocation.
    */
   def getDynamicAllocationInitialExecutors(conf: SparkConf): Int = {
+    if (conf.get(DYN_ALLOCATION_INITIAL_EXECUTORS) < conf.get(DYN_ALLOCATION_MIN_EXECUTORS)) {
+      logWarning(s"${DYN_ALLOCATION_INITIAL_EXECUTORS.key} less than " +
+        s"${DYN_ALLOCATION_MIN_EXECUTORS.key} is invalid, will use " +
+        s"${DYN_ALLOCATION_MIN_EXECUTORS.key} to override.")
--- End diff --

Right, but as a user I could interpret that to mean it's going to use the min 
executors for the initial number, which isn't necessarily true. I'm just 
worried it will confuse users. How about we change the phrasing to something 
like: "initial less than min is invalid, ignoring its setting, please update 
your configs." A similar message would work for the num executor instances check below.

Then, after the max calculation, we could simply print an info message like: 
"using initial executors = <value>, the max of initial, num executors, and min executors."

These warnings are turning into a lot of extra code/text.
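
A rough, standalone sketch of that suggestion (placeholder names, plain Ints instead of 
SparkConf entries, and println in place of the real logWarning/logInfo calls; not the 
actual patch):

```scala
// Minimal sketch of the suggested wording: warn that an invalid setting is
// ignored, then report the value actually chosen after the max calculation.
object InitialExecutorsSketch {
  def initialExecutors(
      initial: Option[Int],    // spark.dynamicAllocation.initialExecutors
      minExecutors: Int,       // spark.dynamicAllocation.minExecutors
      instances: Option[Int]   // spark.executor.instances
  ): Int = {
    if (initial.exists(_ < minExecutors)) {
      println("WARN: initialExecutors less than minExecutors is invalid, " +
        "ignoring its setting, please update your configs.")
    }
    if (instances.exists(_ < minExecutors)) {
      println("WARN: spark.executor.instances less than minExecutors is invalid, " +
        "ignoring its setting, please update your configs.")
    }
    // After the max calculation, report the value that will actually be used.
    val chosen = (Seq(minExecutors) ++ initial ++ instances).max
    println(s"INFO: using initial executors = $chosen, the max of " +
      "initialExecutors, spark.executor.instances and minExecutors.")
    chosen
  }
}
```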





[GitHub] spark pull request #14149: [SPARK-16435][YARN][MINOR] Add warning log if ini...

2016-07-13 Thread jerryshao
Github user jerryshao commented on a diff in the pull request:

https://github.com/apache/spark/pull/14149#discussion_r70625909
  
--- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
@@ -2342,6 +2342,18 @@ private[spark] object Utils extends Logging {
    * Return the initial number of executors for dynamic allocation.
    */
   def getDynamicAllocationInitialExecutors(conf: SparkConf): Int = {
+    if (conf.get(DYN_ALLOCATION_INITIAL_EXECUTORS) < conf.get(DYN_ALLOCATION_MIN_EXECUTORS)) {
+      logWarning(s"${DYN_ALLOCATION_INITIAL_EXECUTORS.key} less than " +
+        s"${DYN_ALLOCATION_MIN_EXECUTORS.key} is invalid, will use " +
+        s"${DYN_ALLOCATION_MIN_EXECUTORS.key} to override.")
--- End diff --

I think my original meaning is that `DYN_ALLOCATION_MIN_EXECUTORS` will override 
`DYN_ALLOCATION_INITIAL_EXECUTORS` if the former is larger. It is not related to num 
executor instances; num executor instances can still be larger than 
`DYN_ALLOCATION_MIN_EXECUTORS`.





[GitHub] spark pull request #14149: [SPARK-16435][YARN][MINOR] Add warning log if ini...

2016-07-13 Thread tgravescs
Github user tgravescs commented on a diff in the pull request:

https://github.com/apache/spark/pull/14149#discussion_r70625165
  
--- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
@@ -2342,6 +2342,18 @@ private[spark] object Utils extends Logging {
    * Return the initial number of executors for dynamic allocation.
    */
   def getDynamicAllocationInitialExecutors(conf: SparkConf): Int = {
+    if (conf.get(DYN_ALLOCATION_INITIAL_EXECUTORS) < conf.get(DYN_ALLOCATION_MIN_EXECUTORS)) {
+      logWarning(s"${DYN_ALLOCATION_INITIAL_EXECUTORS.key} less than " +
+        s"${DYN_ALLOCATION_MIN_EXECUTORS.key} is invalid, will use " +
+        s"${DYN_ALLOCATION_MIN_EXECUTORS.key} to override.")
--- End diff --

So this isn't necessarily true; it could use num executor instances if that 
is higher than min. I think we should say it uses the max of the other two.







[GitHub] spark pull request #14149: [SPARK-16435][YARN][MINOR] Add warning log if ini...

2016-07-13 Thread tgravescs
Github user tgravescs commented on a diff in the pull request:

https://github.com/apache/spark/pull/14149#discussion_r70625202
  
--- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
@@ -2342,6 +2342,18 @@ private[spark] object Utils extends Logging {
    * Return the initial number of executors for dynamic allocation.
    */
   def getDynamicAllocationInitialExecutors(conf: SparkConf): Int = {
+    if (conf.get(DYN_ALLOCATION_INITIAL_EXECUTORS) < conf.get(DYN_ALLOCATION_MIN_EXECUTORS)) {
+      logWarning(s"${DYN_ALLOCATION_INITIAL_EXECUTORS.key} less than " +
+        s"${DYN_ALLOCATION_MIN_EXECUTORS.key} is invalid, will use " +
+        s"${DYN_ALLOCATION_MIN_EXECUTORS.key} to override.")
+    }
+
+    if (conf.get(EXECUTOR_INSTANCES).getOrElse(0) < conf.get(DYN_ALLOCATION_MIN_EXECUTORS)) {
+      logWarning(s"${EXECUTOR_INSTANCES.key} less than " +
+        s"${DYN_ALLOCATION_MIN_EXECUTORS.key} is invalid, will use " +
+        s"${DYN_ALLOCATION_MIN_EXECUTORS.key} to override.")
--- End diff --

similar to above





[GitHub] spark pull request #14149: [SPARK-16435][YARN][MINOR] Add warning log if ini...

2016-07-12 Thread rdblue
Github user rdblue commented on a diff in the pull request:

https://github.com/apache/spark/pull/14149#discussion_r70469463
  
--- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
@@ -2342,6 +2342,12 @@ private[spark] object Utils extends Logging {
    * Return the initial number of executors for dynamic allocation.
    */
   def getDynamicAllocationInitialExecutors(conf: SparkConf): Int = {
+    if (conf.get(DYN_ALLOCATION_INITIAL_EXECUTORS) < conf.get(DYN_ALLOCATION_MIN_EXECUTORS)) {
+      logWarning(s"${DYN_ALLOCATION_INITIAL_EXECUTORS.key} less than " +
+        s"${DYN_ALLOCATION_MIN_EXECUTORS.key} is invalid, will use " +
+        s"${DYN_ALLOCATION_MIN_EXECUTORS} instead.")
--- End diff --

I agree that either spark.executor.instances or initialExecutors being less than 
the minimum should produce a warning.





[GitHub] spark pull request #14149: [SPARK-16435][YARN][MINOR] Add warning log if ini...

2016-07-12 Thread jerryshao
Github user jerryshao commented on a diff in the pull request:

https://github.com/apache/spark/pull/14149#discussion_r70448559
  
--- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
@@ -2342,6 +2342,12 @@ private[spark] object Utils extends Logging {
    * Return the initial number of executors for dynamic allocation.
    */
   def getDynamicAllocationInitialExecutors(conf: SparkConf): Int = {
+    if (conf.get(DYN_ALLOCATION_INITIAL_EXECUTORS) < conf.get(DYN_ALLOCATION_MIN_EXECUTORS)) {
+      logWarning(s"${DYN_ALLOCATION_INITIAL_EXECUTORS.key} less than " +
+        s"${DYN_ALLOCATION_MIN_EXECUTORS.key} is invalid, will use " +
+        s"${DYN_ALLOCATION_MIN_EXECUTORS} instead.")
--- End diff --

I didn't consider user-specified executor instances; I'm not sure whether we 
should also take that configuration into consideration.





[GitHub] spark pull request #14149: [SPARK-16435][YARN][MINOR] Add warning log if ini...

2016-07-12 Thread tgravescs
Github user tgravescs commented on a diff in the pull request:

https://github.com/apache/spark/pull/14149#discussion_r70450332
  
--- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
@@ -2342,6 +2342,12 @@ private[spark] object Utils extends Logging {
    * Return the initial number of executors for dynamic allocation.
    */
   def getDynamicAllocationInitialExecutors(conf: SparkConf): Int = {
+    if (conf.get(DYN_ALLOCATION_INITIAL_EXECUTORS) < conf.get(DYN_ALLOCATION_MIN_EXECUTORS)) {
+      logWarning(s"${DYN_ALLOCATION_INITIAL_EXECUTORS.key} less than " +
+        s"${DYN_ALLOCATION_MIN_EXECUTORS.key} is invalid, will use " +
+        s"${DYN_ALLOCATION_MIN_EXECUTORS} instead.")
--- End diff --

Executor instances is what gets set if the user passes --num-executors, which I 
can easily see happening. If min is already set and is greater, then they differ 
and we will use min, so we should warn here as well.
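
For example (the values here are made up), the conflicting configuration described above 
looks like this when expressed through SparkConf, e.g. in a spark-shell session:

```scala
import org.apache.spark.SparkConf

// Hypothetical conflict: --num-executors 4 becomes spark.executor.instances = 4,
// but minExecutors is 10, so dynamic allocation effectively starts from 10 and a
// warning would help the user notice the mismatch.
val conf = new SparkConf()
  .set("spark.dynamicAllocation.enabled", "true")
  .set("spark.dynamicAllocation.minExecutors", "10")
  .set("spark.executor.instances", "4")
```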





[GitHub] spark pull request #14149: [SPARK-16435][YARN][MINOR] Add warning log if ini...

2016-07-12 Thread tgravescs
Github user tgravescs commented on a diff in the pull request:

https://github.com/apache/spark/pull/14149#discussion_r70444399
  
--- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
@@ -2342,6 +2342,12 @@ private[spark] object Utils extends Logging {
    * Return the initial number of executors for dynamic allocation.
    */
   def getDynamicAllocationInitialExecutors(conf: SparkConf): Int = {
+    if (conf.get(DYN_ALLOCATION_INITIAL_EXECUTORS) < conf.get(DYN_ALLOCATION_MIN_EXECUTORS)) {
+      logWarning(s"${DYN_ALLOCATION_INITIAL_EXECUTORS.key} less than " +
+        s"${DYN_ALLOCATION_MIN_EXECUTORS.key} is invalid, will use " +
+        s"${DYN_ALLOCATION_MIN_EXECUTORS} instead.")
--- End diff --

Also note that the comment above isn't exactly correct, because if the 
number of executor instances is greater than the min, it would use that number 
instead of the min executors.





[GitHub] spark pull request #14149: [SPARK-16435][YARN][MINOR] Add warning log if ini...

2016-07-12 Thread tgravescs
Github user tgravescs commented on a diff in the pull request:

https://github.com/apache/spark/pull/14149#discussion_r70444156
  
--- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
@@ -2342,6 +2342,12 @@ private[spark] object Utils extends Logging {
    * Return the initial number of executors for dynamic allocation.
    */
   def getDynamicAllocationInitialExecutors(conf: SparkConf): Int = {
+    if (conf.get(DYN_ALLOCATION_INITIAL_EXECUTORS) < conf.get(DYN_ALLOCATION_MIN_EXECUTORS)) {
+      logWarning(s"${DYN_ALLOCATION_INITIAL_EXECUTORS.key} less than " +
+        s"${DYN_ALLOCATION_MIN_EXECUTORS.key} is invalid, will use " +
+        s"${DYN_ALLOCATION_MIN_EXECUTORS} instead.")
--- End diff --

So this doesn't cover the case where the user specified executor instances less 
than the minimum; it seems like we should put out a warning for that as well.






[GitHub] spark pull request #14149: [SPARK-16435][YARN][MINOR] Add warning log if ini...

2016-07-12 Thread tgravescs
Github user tgravescs commented on a diff in the pull request:

https://github.com/apache/spark/pull/14149#discussion_r70442221
  
--- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
@@ -2342,6 +2342,12 @@ private[spark] object Utils extends Logging {
    * Return the initial number of executors for dynamic allocation.
    */
   def getDynamicAllocationInitialExecutors(conf: SparkConf): Int = {
+    if (conf.get(DYN_ALLOCATION_INITIAL_EXECUTORS) < conf.get(DYN_ALLOCATION_MIN_EXECUTORS)) {
--- End diff --

This isn't going to warn if executor instances was specified and is less 
than the minimum. We could simply move this check until after the max and see 
whether the number that is going to be returned is less than min executors.
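
One way to read that suggestion, sketched with plain Ints and placeholder names 
(not the real Utils.scala code): compute the max first, then warn only when a 
user-supplied value is being bumped up to the minimum.

```scala
// Rough sketch; 0 stands for "not configured".
def chooseInitial(initial: Int, instances: Int, minExecutors: Int): Int = {
  val chosen = Seq(minExecutors, initial, instances).max
  // Check after the max calculation: did the user ask for fewer executors
  // than the minimum allows?
  val requested = Seq(initial, instances).filter(_ > 0)
  if (requested.nonEmpty && requested.max < minExecutors) {
    println(s"WARN: configured initial executors are below minExecutors; using $chosen.")
  }
  chosen
}
```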





[GitHub] spark pull request #14149: [SPARK-16435][YARN][MINOR] Add warning log if ini...

2016-07-12 Thread jerryshao
GitHub user jerryshao opened a pull request:

https://github.com/apache/spark/pull/14149

[SPARK-16435][YARN][MINOR] Add warning log if initialExecutors is less than minExecutors

## What changes were proposed in this pull request?

Currently, if `spark.dynamicAllocation.initialExecutors` is less than 
`spark.dynamicAllocation.minExecutors`, Spark will automatically pick 
minExecutors without any warning, whereas in 1.6 Spark would throw an exception 
if configured like this. So here I propose to add a warning log if these 
parameters are configured invalidly.
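
As context, a simplified sketch of the existing behaviour described here (not a 
verbatim copy of Utils.scala, which reads the same settings through Spark's 
internal config constants; the defaults below mirror the documented ones): the 
initial executor count is the max of the three settings, so a too-small 
initialExecutors is silently bumped up.

```scala
import org.apache.spark.SparkConf

object InitialExecutorsBehaviour {
  // Returns the max of minExecutors, initialExecutors and executor instances;
  // nothing is logged when initialExecutors loses out, which is what this PR changes.
  def dynamicAllocationInitialExecutors(conf: SparkConf): Int = {
    val min = conf.getInt("spark.dynamicAllocation.minExecutors", 0)
    val initial = conf.getInt("spark.dynamicAllocation.initialExecutors", min)
    val instances = conf.getInt("spark.executor.instances", 0)
    Seq(min, initial, instances).max
  }
}
```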


## How was this patch tested?

Unit test added to verify the scenario.




You can merge this pull request into a Git repository by running:

$ git pull https://github.com/jerryshao/apache-spark SPARK-16435

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/14149.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #14149


commit f08eabdc9c20f29dd3a007784a4b893172bdc2db
Author: jerryshao 
Date:   2016-07-12T08:20:04Z

Add warning log if initialExecutors is less than minExecutors



