Github user BruceXu1991 commented on a diff in the pull request: https://github.com/apache/spark/pull/20034#discussion_r158472814

--- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveClientImpl.scala ---
@@ -186,7 +186,7 @@ private[hive] class HiveClientImpl(

   /** Returns the configuration for the current session. */
   def conf: HiveConf = state.getConf

-  private val userName = state.getAuthenticator.getUserName
+  private val userName = conf.getUser
--- End diff --

Yes, I hit this problem using MySQL as the Hive metastore. What's more, when I execute `DESCRIBE FORMATTED spark_22846`, a `NullPointerException` occurs:

```
> DESCRIBE FORMATTED offline.spark_22846;
Error: java.lang.NullPointerException (state=,code=0)
```

Here is the detailed stack trace:

```
17/12/22 18:18:10 ERROR SparkExecuteStatementOperation: Error executing query, currentState RUNNING,
java.lang.NullPointerException
	at scala.collection.immutable.StringOps$.length$extension(StringOps.scala:47)
	at scala.collection.immutable.StringOps.length(StringOps.scala:47)
	at scala.collection.IndexedSeqOptimized$class.isEmpty(IndexedSeqOptimized.scala:27)
	at scala.collection.immutable.StringOps.isEmpty(StringOps.scala:29)
	at scala.collection.TraversableOnce$class.nonEmpty(TraversableOnce.scala:111)
	at scala.collection.immutable.StringOps.nonEmpty(StringOps.scala:29)
	at org.apache.spark.sql.catalyst.catalog.CatalogTable.toLinkedHashMap(interface.scala:301)
	at org.apache.spark.sql.execution.command.DescribeTableCommand.describeFormattedTableInfo(tables.scala:559)
	at org.apache.spark.sql.execution.command.DescribeTableCommand.run(tables.scala:537)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:58)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:56)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:67)
	at org.apache.spark.sql.Dataset.<init>(Dataset.scala:183)
	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:68)
	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:767)
	at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:691)
```

The NPE happens because `owner` is null: at line 301, `owner.nonEmpty` ends up calling `String.length` on a null reference. The relevant source code is below:

```
def toLinkedHashMap: mutable.LinkedHashMap[String, String] = {
  ...
  // line 301:
  if (owner.nonEmpty) map.put("Owner", owner)
  ...
}
```
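For reference, a minimal null-safe sketch of that guard (an illustration only, not the patch under review; `putOwner` is a hypothetical helper, not code from `CatalogTable`). Wrapping the possibly-null value in `Option` skips the entry whether the metastore returns null or an empty string:

```scala
import scala.collection.mutable

// Hypothetical standalone sketch of the guard in toLinkedHashMap.
// `owner` stands in for the possibly-null value read from the metastore.
def putOwner(map: mutable.LinkedHashMap[String, String], owner: String): Unit = {
  // Option(null) == None, so both null and empty owners are skipped safely.
  Option(owner).filter(_.nonEmpty).foreach(map.put("Owner", _))
}

val map = mutable.LinkedHashMap[String, String]()
putOwner(map, null)    // no entry added, no NPE
putOwner(map, "alice") // adds "Owner" -> "alice"
```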