Github user squito commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20657#discussion_r172581601
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/SparkHadoopUtil.scala ---
    @@ -144,7 +145,8 @@ class SparkHadoopUtil extends Logging {
       private[spark] def addDelegationTokens(tokens: Array[Byte], sparkConf: SparkConf) {
         UserGroupInformation.setConfiguration(newConfiguration(sparkConf))
         val creds = deserialize(tokens)
    -    logInfo(s"Adding/updating delegation tokens ${dumpTokens(creds)}")
    +    logInfo("Updating delegation tokens for current user.")
    --- End diff ---
    
    yeah, I was thinking it might be handy to log the token info on the driver
    and the executors as well, sort of like an RPC id, so you could correlate the
    log lines if there was ever a delay in propagation, or a failure to reach one
    executor or something, since you're choosing to always log something here
    anyway.  Still, your call.
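    
    Just to sketch the kind of thing I mean (not part of this PR, and the
    `tokenSummary` helper below is hypothetical; `dumpTokens` already produces
    something similar): a small helper on Hadoop's public `Credentials` API that
    logs only the token kinds and services, never the secrets:
    
        import org.apache.hadoop.security.Credentials
        import scala.collection.JavaConverters._
        
        object TokenLogging {
          // Hypothetical helper: summarize the tokens carried by a Credentials
          // object (kind and service only, no secret material) so the same token
          // update can be matched up across driver and executor logs.
          def tokenSummary(creds: Credentials): String = {
            creds.getAllTokens.asScala
              .map(t => s"${t.getKind} for ${t.getService}")
              .mkString("[", ", ", "]")
          }
        }
        
        // e.g. logInfo(s"Updating delegation tokens for current user: " +
        //   TokenLogging.tokenSummary(creds))
    
    Grepping for the same summary string on the driver and on each executor would
    then show whether, and when, a given update actually reached everywhere.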


---
