Github user lgrcyanny commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19079#discussion_r135943288
  
    --- Diff: core/src/main/scala/org/apache/spark/SparkEnv.scala ---
    @@ -393,7 +393,7 @@ object SparkEnv extends Logging {
         // Add a reference to tmp dir created by driver, we will delete this tmp dir when stop() is
         // called, and we only need to do it for driver. Because driver may run as a service, and if we
         // don't delete this tmp dir when sc is stopped, then will create too many tmp dirs.
    -    if (isDriver) {
    +    if (isDriver && conf.getOption("spark.submit.deployMode").getOrElse("client") == "client") {
    --- End diff --
    
    Originally, my version was
    ```
    conf.get("spark.submit.deployMode", "client") == "client"
    ```
    Then I referred to the SparkContext#deployMode function, which uses
    ```
    conf.getOption("spark.submit.deployMode").getOrElse("client")
    ```
    I just wanted to keep the same style as SparkContext.
    Which one is better?
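
    For what it's worth, the two forms should behave identically when the key is absent; a minimal sketch of the equivalence, using a plain Scala `Map` as a stand-in for `SparkConf` (an assumption for illustration only):
    ```scala
    // Stand-in for SparkConf, backed by an immutable Map (illustrative only).
    val conf = Map("spark.master" -> "local")

    // Form 1: lookup with an inline default.
    val mode1 = conf.getOrElse("spark.submit.deployMode", "client")

    // Form 2: Option-based lookup plus getOrElse, as in SparkContext#deployMode.
    val mode2 = conf.get("spark.submit.deployMode").getOrElse("client")

    // Both fall back to "client" when the key is not set.
    assert(mode1 == mode2)
    ```
    So the choice is purely stylistic; matching the surrounding SparkContext code seems reasonable.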

