[ https://issues.apache.org/jira/browse/SPARK-35629?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17356343#comment-17356343 ]
Apache Spark commented on SPARK-35629:
--------------------------------------

User 'ulysses-you' has created a pull request for this issue:
https://github.com/apache/spark/pull/32768

> Drop database should check if exists
> ------------------------------------
>
>                 Key: SPARK-35629
>                 URL: https://issues.apache.org/jira/browse/SPARK-35629
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.2.0
>            Reporter: XiDuo You
>            Priority: Minor
>
> Currently, executing `drop database test` throws an unfriendly error message:
> {code:java}
> Error in query: org.apache.hadoop.hive.metastore.api.NoSuchObjectException: test
> org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.metastore.api.NoSuchObjectException: test
> 	at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:112)
> 	at org.apache.spark.sql.hive.HiveExternalCatalog.dropDatabase(HiveExternalCatalog.scala:200)
> 	at org.apache.spark.sql.catalyst.catalog.ExternalCatalogWithListener.dropDatabase(ExternalCatalogWithListener.scala:53)
> 	at org.apache.spark.sql.catalyst.catalog.SessionCatalog.dropDatabase(SessionCatalog.scala:273)
> 	at org.apache.spark.sql.execution.command.DropDatabaseCommand.run(ddl.scala:111)
> 	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:75)
> 	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:73)
> 	at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:84)
> 	at org.apache.spark.sql.Dataset.$anonfun$logicalPlan$1(Dataset.scala:228)
> 	at org.apache.spark.sql.Dataset.$anonfun$withAction$1(Dataset.scala:3707)
> {code}

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
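The improvement the issue asks for — verifying the database exists before delegating the drop to the metastore, so the user sees a clear database-level error instead of a raw `NoSuchObjectException` — can be sketched outside Spark. The snippet below is a minimal, hypothetical Python model of that behavior, not Spark's actual implementation; the `Catalog` class and `NoSuchDatabaseException` name are illustrative assumptions only.

```python
# Hypothetical sketch of "drop database should check if exists".
# Not Spark code: Catalog and NoSuchDatabaseException are illustrative names.

class NoSuchDatabaseException(Exception):
    """Friendly, database-level error instead of a raw metastore exception."""
    def __init__(self, db: str):
        super().__init__(f"Database '{db}' not found")

class Catalog:
    def __init__(self):
        self._databases = {"default": {}}

    def drop_database(self, db: str, if_exists: bool = False) -> None:
        # Check existence up front so the caller gets a clear error,
        # rather than an opaque NoSuchObjectException bubbling up
        # from the underlying metastore client.
        if db not in self._databases:
            if if_exists:
                return  # DROP DATABASE IF EXISTS: silently a no-op
            raise NoSuchDatabaseException(db)
        del self._databases[db]
```

With this up-front check, dropping a missing database surfaces "Database 'test' not found", while the `if_exists` flag (mirroring SQL's `DROP DATABASE IF EXISTS`) makes the same call a silent no-op.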