Github user dennishuo commented on a diff in the pull request:

    https://github.com/apache/spark/pull/8072#discussion_r37323794
  
    --- Diff: yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala ---
    @@ -163,6 +163,23 @@ private[spark] class Client(
         appContext.setQueue(args.amQueue)
         appContext.setAMContainerSpec(containerContext)
         appContext.setApplicationType("SPARK")
    +    sparkConf.getOption(CONF_SPARK_YARN_APPLICATION_TAGS)
    --- End diff ---
    
    Thanks for the suggestion! I'm a bit rusty in Scala, so I may be 
misapplying some idioms here, but my intent on this line was to use foreach 
(and map/filter) only to access the "None or single element" of the Option, as 
documented at http://www.scala-lang.org/api/current/index.html#scala.Option . 
The foreach here iterates over at most the single wrapped element, which 
happens to itself be a collection, rather than over the elements of the inner 
collection. So there shouldn't be any way for it to cause multiple log 
statements or multiple reflection-based lookups of the method.
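    
    To illustrate the distinction (a standalone sketch, not the actual patch: the config key and the tag-parsing body here are hypothetical stand-ins):
    
    ```scala
    object OptionForeachDemo {
      def main(args: Array[String]): Unit = {
        val conf = Map("spark.yarn.tags" -> "tag1,tag2,tag3")

        var invocations = 0
        // conf.get returns Option[String]; foreach runs its body at most once,
        // on the single wrapped value -- not once per tag in the string.
        conf.get("spark.yarn.tags").foreach { raw =>
          invocations += 1
          val tags = raw.split(",").map(_.trim).toSet
          println(s"tags=$tags")
        }
        assert(invocations == 1)

        // With a missing key the Option is None and the body never runs.
        conf.get("no.such.key").foreach { _ => invocations += 1 }
        assert(invocations == 1)
      }
    }
    ```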
    
    I also wanted to err on the side of minimizing behavioral changes for 
existing setups: if the Option is None, this chaining as a monad avoids ever 
needing to look up the method, or even invoking any of the option-processing 
methods like StringUtils.getTrimmedStringCollection.
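    
    A minimal sketch of that short-circuiting behavior (parseTags below is a hypothetical stand-in for the option-processing and reflection-based steps):
    
    ```scala
    object OptionChainDemo {
      var parseCalls = 0

      // Stand-in for an expensive option-processing step we want to skip
      // entirely when the config value is absent.
      def parseTags(raw: String): Set[String] = {
        parseCalls += 1
        raw.split(",").map(_.trim).filter(_.nonEmpty).toSet
      }

      def main(args: Array[String]): Unit = {
        val present: Option[String] = Some("a, b")
        val absent: Option[String] = None

        // For Some, each stage runs exactly once on the wrapped value.
        present.map(parseTags).filter(_.nonEmpty).foreach(tags => println(tags))
        assert(parseCalls == 1)

        // For None, map/filter/foreach all pass the None through untouched,
        // so parseTags is never invoked at all.
        absent.map(parseTags).filter(_.nonEmpty).foreach(tags => println(tags))
        assert(parseCalls == 1)
      }
    }
    ```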
    
    I could add a comment to make it clearer that the map/filter/foreach 
operates on the Option, as opposed to the Collection, if that'd help.
    
    Anyhow, I'll be happy to apply the reversal and start with the method 
lookup as you suggest if you prefer; I just wanted to hear your thoughts on 
this Option usage first.

