GitHub user gatorsmile opened a pull request:

    https://github.com/apache/spark/pull/19395

    [SPARK-22171] [SQL] Describe Table Extended Failed when Table Owner is Empty

    ## What changes were proposed in this pull request?
    
    Users can hit a `java.lang.NullPointerException` when a table was created by
    Hive and the table owner returned from the Hive metastore is `null`.
    `DESC EXTENDED` then fails with the error:
    
    > SQLExecutionException: java.lang.NullPointerException
    >   at scala.collection.immutable.StringOps$.length$extension(StringOps.scala:47)
    >   at scala.collection.immutable.StringOps.length(StringOps.scala:47)
    >   at scala.collection.IndexedSeqOptimized$class.isEmpty(IndexedSeqOptimized.scala:27)
    >   at scala.collection.immutable.StringOps.isEmpty(StringOps.scala:29)
    >   at scala.collection.TraversableOnce$class.nonEmpty(TraversableOnce.scala:111)
    >   at scala.collection.immutable.StringOps.nonEmpty(StringOps.scala:29)
    >   at org.apache.spark.sql.catalyst.catalog.CatalogTable.toLinkedHashMap(interface.scala:300)
    >   at org.apache.spark.sql.execution.command.DescribeTableCommand.describeFormattedTableInfo(tables.scala:565)
    >   at org.apache.spark.sql.execution.command.DescribeTableCommand.run(tables.scala:543)
    >   at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:66)
    
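    The NPE is raised when `CatalogTable.toLinkedHashMap` calls `.nonEmpty` on the
    `null` owner string. The short Scala sketch below is illustrative only (the
    object and variable names are assumptions, not the code changed by this
    patch); it shows why the call blows up and one null-safe way to guard it:

        // Illustrative sketch; not the actual Spark source touched by this patch.
        object NullOwnerSketch extends App {
          // Owner value as it can come back from the Hive metastore.
          val owner: String = null

          // owner.nonEmpty is routed through scala.collection.immutable.StringOps
          // and throws java.lang.NullPointerException, matching the trace above:
          // val boom = owner.nonEmpty

          // Null-safe guard: treat a null owner as an empty string before checking.
          val safeOwner = Option(owner).getOrElse("")
          if (safeOwner.nonEmpty) println(s"Owner: $safeOwner")
          else println("No owner set; skipping the Owner row")
        }
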
    ## How was this patch tested?
    Added a unit test case.
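
    As a rough, self-contained stand-in for the kind of check the new unit test
    should cover (all names below are hypothetical and it does not use the real
    `CatalogTable` API): describing a table whose owner is `null` must no longer
    throw.

        // Hypothetical check of the intended behaviour; not the test added by this patch.
        object DescribeNullOwnerCheck extends App {
          // Stand-in for the owner handling in the DESC EXTENDED output.
          def ownerRow(owner: String): Option[(String, String)] =
            Option(owner).filter(_.nonEmpty).map("Owner" -> _)

          assert(ownerRow(null).isEmpty)                      // must not throw an NPE
          assert(ownerRow("").isEmpty)
          assert(ownerRow("alice").contains("Owner" -> "alice"))
          println("null-owner handling checks passed")
        }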

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/gatorsmile/spark desc

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/19395.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #19395
    
----
commit 9b2cc4c5df6211e2fb1e995d95340c9f84b7e5a4
Author: gatorsmile <gatorsm...@gmail.com>
Date:   2017-09-29T23:36:08Z

    fix

----


---
