dongjoon-hyun commented on a change in pull request #23415: [SPARK-26445][CORE] Use ConfigEntry for hardcoded configs for driver/executor categories. URL: https://github.com/apache/spark/pull/23415#discussion_r244628594
########## File path: core/src/main/scala/org/apache/spark/SparkConf.scala ##########

```diff
@@ -503,12 +503,12 @@ class SparkConf(loadDefaults: Boolean) extends Cloneable with Logging with Seria
       logWarning(msg)
     }

-    val executorOptsKey = "spark.executor.extraJavaOptions"
-    val executorClasspathKey = "spark.executor.extraClassPath"
-    val driverOptsKey = "spark.driver.extraJavaOptions"
-    val driverClassPathKey = "spark.driver.extraClassPath"
-    val driverLibraryPathKey = "spark.driver.extraLibraryPath"
-    val sparkExecutorInstances = "spark.executor.instances"
+    val executorOptsKey = EXECUTOR_JAVA_OPTIONS.key
+    val executorClasspathKey = EXECUTOR_CLASS_PATH.key
+    val driverOptsKey = DRIVER_JAVA_OPTIONS.key
+    val driverClassPathKey = DRIVER_CLASS_PATH.key
+    val driverLibraryPathKey = DRIVER_LIBRARY_PATH.key
+    val sparkExecutorInstances = EXECUTOR_INSTANCES.key
```

Review comment:

It seems that `sparkExecutorInstances` is not used. Can we clean up the unused variable in this PR?

----------------------------------------------------------------
This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.
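The diff above replaces hardcoded config-key strings with references to typed config entries. As a rough illustration of why that pattern helps, here is a minimal, standalone sketch of a `ConfigEntry`-style class: the names (`ConfigEntry`, `EXECUTOR_INSTANCES`) mirror Spark's, but this is a simplified illustration under my own assumptions, not Spark's actual implementation.

```scala
// Hypothetical, simplified sketch of the ConfigEntry pattern: each config
// key is declared exactly once as a typed entry with a default, and call
// sites reference `ENTRY.key` instead of repeating a raw string literal.
final case class ConfigEntry[T](key: String, default: T) {
  // Read the value from a settings map, falling back to the default.
  def readFrom(settings: Map[String, String])(parse: String => T): T =
    settings.get(key).map(parse).getOrElse(default)
}

object ExampleConfig {
  // Single point of definition for the key and its default value.
  val EXECUTOR_INSTANCES: ConfigEntry[Int] =
    ConfigEntry("spark.executor.instances", 2)
}

object Demo extends App {
  val settings = Map("spark.executor.instances" -> "8")

  // Call sites use `.key`, so a typo in the key name becomes a compile
  // error (unknown identifier) rather than a silently ignored setting.
  println(ExampleConfig.EXECUTOR_INSTANCES.key)
  println(ExampleConfig.EXECUTOR_INSTANCES.readFrom(settings)(_.toInt))
}
```

The reviewer's point follows naturally from this pattern: once call sites go through the entry object directly, a local alias like `sparkExecutorInstances` that nothing reads is dead code and can be deleted.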