Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/16923#discussion_r102361677
  
    --- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveClientImpl.scala ---
    @@ -106,21 +106,33 @@ private[hive] class HiveClientImpl(
     
         // Set up kerberos credentials for UserGroupInformation.loginUser within
         // current class loader
    -    // Instead of using the spark conf of the current spark context, a new
    -    // instance of SparkConf is needed for the original value of spark.yarn.keytab
    -    // and spark.yarn.principal set in SparkSubmit, as yarn.Client resets the
    -    // keytab configuration for the link name in distributed cache
         if (sparkConf.contains("spark.yarn.principal") && sparkConf.contains("spark.yarn.keytab")) {
           val principalName = sparkConf.get("spark.yarn.principal")
    -      val keytabFileName = sparkConf.get("spark.yarn.keytab")
    -      if (!new File(keytabFileName).exists()) {
    -        throw new SparkException(s"Keytab file: ${keytabFileName}" +
    -          " specified in spark.yarn.keytab does not exist")
    -      } else {
    -        logInfo("Attempting to login to Kerberos" +
    -          s" using principal: ${principalName} and keytab: ${keytabFileName}")
    -        UserGroupInformation.loginUserFromKeytab(principalName, keytabFileName)
    +      val keytabFileName = {
    +        val keytab = sparkConf.get("spark.yarn.keytab")
    +        if (new File(keytab).exists()) {
    +          keytab
    +        } else {
    +          // Instead of using the spark conf of the current spark context, a new
    +          // instance of SparkConf is needed for the original value of spark.yarn.keytab
    +          // set in SparkSubmit, as yarn.Client resets the keytab configuration for the link name
    +          // in distributed cache, and this will make Spark driver fail to get correct keytab
    +          // path in yarn client mode.
    +          val originKeytab = new SparkConf().get("spark.yarn.keytab")
    +          require(originKeytab != null,
    +            "spark.yarn.keytab is not configured, this is unexpected")
    +          if (new File(originKeytab).exists()) {
    +            originKeytab
    +          } else {
    +            throw new SparkException(s"Keytab file: $originKeytab " +
    +              s"specified in spark.yarn.keytab does not exist")
    +          }
    +        }
           }
    +
    +      logInfo("Attempting to login to Kerberos" +
    +        s" using principal: ${principalName} and keytab: ${keytabFileName}")
    +      UserGroupInformation.loginUserFromKeytab(principalName, keytabFileName)
    --- End diff ---
    
    Ok, I think I can see a cleaner solution than what I proposed. Sorry for flip-flopping on this.
    
    Instead of overwriting the `KEYTAB` value in Client.scala, how about:
    - keep the keytab name in an instance variable
    - don't update SparkConf, and use the instance variable in the `distribute` call (around L470)
    - when writing the AM conf, overwrite `KEYTAB.key` in the properties instance (around L708)
    
    That avoids the second config, and keeps all the code that handles the different keytab locations in one place (Client.scala), without requiring changes anywhere else.
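    
    A rough, self-contained sketch of that flow (the class, field, and method names below are illustrative stand-ins for Client.scala's actual `distribute` and conf-writing code, not the real implementation):
    
    ```scala
    import java.io.File
    import java.util.{Properties, UUID}
    
    // Minimal model of the suggestion: remember the renamed keytab in an
    // instance variable and only write it into the AM-side properties.
    class ClientSketch(conf: Map[String, String]) {
    
      // Link name of the keytab in the distributed cache; never written back
      // into the submitted configuration.
      private var amKeytabFileName: Option[String] = None
    
      def prepareLocalResources(): Unit = {
        conf.get("spark.yarn.keytab").foreach { keytab =>
          val destName = new File(keytab).getName + "-" + UUID.randomUUID()
          amKeytabFileName = Some(destName)
          distribute(keytab, destName)   // ship the keytab under the link name
        }
      }
    
      // Stand-in for copying a file into the distributed cache.
      private def distribute(srcPath: String, destName: String): Unit =
        println(s"distributing $srcPath as $destName")
    
      // Stand-in for writing the AM conf: only here is the keytab key
      // overwritten, so the driver-side conf keeps the original local path.
      def writeAmConf(): Properties = {
        val props = new Properties()
        conf.foreach { case (k, v) => props.setProperty(k, v) }
        amKeytabFileName.foreach(props.setProperty("spark.yarn.keytab", _))
        props
      }
    }
    ```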

