Felix Cheung created SPARK-20188:
------------------------------------

             Summary: Catalog recoverPartitions should allow specifying the database name
                 Key: SPARK-20188
                 URL: https://issues.apache.org/jira/browse/SPARK-20188
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 2.1.0
            Reporter: Felix Cheung


Currently, Catalog.recoverPartitions only takes a tableName parameter:

https://github.com/apache/spark/blob/9effc2cdcb3d68db8b6b5b3abd75968633b583c8/sql/core/src/main/scala/org/apache/spark/sql/catalog/Catalog.scala#L397

But it throws an exception when the table is not in the default database.
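
A minimal sketch of the failing call, run in a spark-shell session where {{spark}} is the SparkSession (the database and table names here are illustrative, assuming a partitioned table "foo" was created in a non-default database "mydb"):

// set up a partitioned table outside the default database
spark.sql("CREATE DATABASE IF NOT EXISTS mydb")
spark.sql("CREATE TABLE mydb.foo (id INT, part STRING) USING parquet PARTITIONED BY (part)")

// fails: the bare table name is resolved against the current ('default') database,
// and there is no parameter to point the call at 'mydb'
spark.catalog.recoverPartitions("foo")

This produces: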

Caused by: org.apache.spark.sql.catalyst.analysis.NoSuchTableException: Table or view 'foo' not found in database 'default';
        at org.apache.spark.sql.catalyst.catalog.SessionCatalog.requireTableExists(SessionCatalog.scala:154)
        at org.apache.spark.sql.catalyst.catalog.SessionCatalog.getTableMetadata(SessionCatalog.scala:317)
        at org.apache.spark.sql.execution.command.AlterTableRecoverPartitionsCommand.run(ddl.scala:563)
        at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:58)
        at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:56)
        at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:74)
        at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
        at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
        at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:135)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
        at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:132)
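
Until the API accepts a database argument, two possible workarounds (an untested sketch, using the same illustrative names as above): switch the session's current database before calling the API, or fall back to the SQL form, which does accept a database-qualified table name:

// workaround 1: make 'mydb' the current database so the bare name resolves there
spark.catalog.setCurrentDatabase("mydb")
spark.catalog.recoverPartitions("foo")

// workaround 2: the equivalent SQL command takes a qualified name directly
spark.sql("ALTER TABLE mydb.foo RECOVER PARTITIONS")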


