[GitHub] dongjoon-hyun commented on a change in pull request #23415: [SPARK-26445][CORE] Use ConfigEntry for hardcoded configs for driver/executor categories.

2019-01-02 Thread GitBox
dongjoon-hyun commented on a change in pull request #23415: [SPARK-26445][CORE] 
Use ConfigEntry for hardcoded configs for driver/executor categories.
URL: https://github.com/apache/spark/pull/23415#discussion_r244924182
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -141,6 +153,8 @@ package object config {
 .bytesConf(ByteUnit.MiB)
 .createOptional
 
+  private[spark] val CORES_MAX = ConfigBuilder("spark.cores.max").intConf.createOptional
 
 Review comment:
   ~I'm wondering if we can use `.createWithDefault(1)` like `EXECUTOR_CORES`?~
   Never mind. I found different use cases in `StandaloneSchedulerBackend`.
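   For context, a minimal sketch (not part of the PR) of the distinction behind the comment above, assuming the standalone backend needs to tell "unset" apart from any concrete value; `defaultCores` is a hypothetical stand-in for the master-side fallback:

     // An OptionalConfigEntry[Int] reads back as Option[Int], so "unset" stays observable.
     val coresMax: Option[Int] = conf.get(CORES_MAX)       // None when the user did not set it
     val maxCores: Int = coresMax.getOrElse(defaultCores)  // hypothetical fallback value
     // `.createWithDefault(1)` would collapse "unset" into 1 and silently change scheduling.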


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] dongjoon-hyun commented on a change in pull request #23415: [SPARK-26445][CORE] Use ConfigEntry for hardcoded configs for driver/executor categories.

2019-01-02 Thread GitBox
dongjoon-hyun commented on a change in pull request #23415: [SPARK-26445][CORE] 
Use ConfigEntry for hardcoded configs for driver/executor categories.
URL: https://github.com/apache/spark/pull/23415#discussion_r244924272
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -141,6 +153,8 @@ package object config {
 .bytesConf(ByteUnit.MiB)
 .createOptional
 
+  private[spark] val CORES_MAX = ConfigBuilder("spark.cores.max").intConf.createOptional
 
 Review comment:
   Also, could you add `.doc("..")`?
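   One possible shape for the requested doc string, sketched here with wording adapted from the public `docs/configuration.md` entry for `spark.cores.max` (not necessarily the wording the PR used):

     private[spark] val CORES_MAX = ConfigBuilder("spark.cores.max")
       .doc("When running on a standalone deploy cluster or a Mesos cluster in " +
         "coarse-grained sharing mode, the maximum amount of CPU cores to request " +
         "for the application from across the cluster.")
       .intConf
       .createOptional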







[GitHub] dongjoon-hyun commented on a change in pull request #23415: [SPARK-26445][CORE] Use ConfigEntry for hardcoded configs for driver/executor categories.

2019-01-02 Thread GitBox
dongjoon-hyun commented on a change in pull request #23415: [SPARK-26445][CORE] 
Use ConfigEntry for hardcoded configs for driver/executor categories.
URL: https://github.com/apache/spark/pull/23415#discussion_r244919000
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -130,7 +138,11 @@ package object config {
   private[spark] val EXECUTOR_USER_CLASS_PATH_FIRST =
     ConfigBuilder("spark.executor.userClassPathFirst").booleanConf.createWithDefault(false)
 
-  private[spark] val EXECUTOR_MEMORY = ConfigBuilder("spark.executor.memory")
+  private[spark] val EXECUTOR_CORES = ConfigBuilder(SparkLauncher.EXECUTOR_CORES)
 
 Review comment:
   Could you describe this pattern a little more in the PR description? This PR seems to differ from the PR description (and the other related PRs) because the key strings live in `SparkLauncher` instead of `org.apache.spark.internal.config` (although we use these configurations in other places).
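   A sketch of the pattern being discussed, assuming the key constant already exists in the public launcher API (`SparkLauncher.EXECUTOR_CORES` is the string "spark.executor.cores"):

     import org.apache.spark.launcher.SparkLauncher

     // Build the ConfigEntry against the launcher constant instead of a
     // hardcoded string, so the key is defined in exactly one place.
     private[spark] val EXECUTOR_CORES = ConfigBuilder(SparkLauncher.EXECUTOR_CORES)
       .intConf
       .createWithDefault(1)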





[GitHub] dongjoon-hyun commented on a change in pull request #23415: [SPARK-26445][CORE] Use ConfigEntry for hardcoded configs for driver/executor categories.

2019-01-02 Thread GitBox
dongjoon-hyun commented on a change in pull request #23415: [SPARK-26445][CORE] 
Use ConfigEntry for hardcoded configs for driver/executor categories.
URL: https://github.com/apache/spark/pull/23415#discussion_r244917210
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/ExecutorAllocationManager.scala
 ##
 @@ -127,7 +127,7 @@ private[spark] class ExecutorAllocationManager(
   // allocation is only supported for YARN and the default number of cores per executor in YARN is
   // 1, but it might need to be attained differently for different cluster managers
   private val tasksPerExecutorForFullParallelism =
-    conf.getInt("spark.executor.cores", 1) / conf.getInt("spark.task.cpus", 1)
+    conf.get(EXECUTOR_CORES) / conf.getInt("spark.task.cpus", 1)
 
 Review comment:
   Hi, @kiszk. This seems likely to cause many conflicts with your PR, because your PR touches `spark.task.cpus`.
   Is it okay for us to proceed with @ueshin's PR first?
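   A small equivalence sketch for the change above, assuming `EXECUTOR_CORES` is defined as `ConfigBuilder(SparkLauncher.EXECUTOR_CORES).intConf.createWithDefault(1)`:

     // Both reads resolve "spark.executor.cores" with a default of 1; the
     // ConfigEntry version just moves the key and default out of the call site.
     val before = conf.getInt("spark.executor.cores", 1)
     val after = conf.get(EXECUTOR_CORES)
     assert(before == after)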








[GitHub] dongjoon-hyun commented on a change in pull request #23415: [SPARK-26445][CORE] Use ConfigEntry for hardcoded configs for driver/executor categories.

2018-12-31 Thread GitBox
dongjoon-hyun commented on a change in pull request #23415: [SPARK-26445][CORE] 
Use ConfigEntry for hardcoded configs for driver/executor categories.
URL: https://github.com/apache/spark/pull/23415#discussion_r244628594
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/SparkConf.scala
 ##
 @@ -503,12 +503,12 @@ class SparkConf(loadDefaults: Boolean) extends Cloneable with Logging with Seria
   logWarning(msg)
 }
 
-val executorOptsKey = "spark.executor.extraJavaOptions"
-val executorClasspathKey = "spark.executor.extraClassPath"
-val driverOptsKey = "spark.driver.extraJavaOptions"
-val driverClassPathKey = "spark.driver.extraClassPath"
-val driverLibraryPathKey = "spark.driver.extraLibraryPath"
-val sparkExecutorInstances = "spark.executor.instances"
+val executorOptsKey = EXECUTOR_JAVA_OPTIONS.key
+val executorClasspathKey = EXECUTOR_CLASS_PATH.key
+val driverOptsKey = DRIVER_JAVA_OPTIONS.key
+val driverClassPathKey = DRIVER_CLASS_PATH.key
+val driverLibraryPathKey = DRIVER_LIBRARY_PATH.key
+val sparkExecutorInstances = EXECUTOR_INSTANCES.key
 
 Review comment:
   It seems that `sparkExecutorInstances` is not used. Can we clean up unused 
variables in this PR?
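   A sketch of the suggested cleanup, assuming nothing else in the method reads the local:

     val executorOptsKey = EXECUTOR_JAVA_OPTIONS.key
     val executorClasspathKey = EXECUTOR_CLASS_PATH.key
     val driverOptsKey = DRIVER_JAVA_OPTIONS.key
     val driverClassPathKey = DRIVER_CLASS_PATH.key
     val driverLibraryPathKey = DRIVER_LIBRARY_PATH.key
     // sparkExecutorInstances was never read, so the migration can drop the
     // local entirely instead of renaming it to EXECUTOR_INSTANCES.key.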




