[ https://issues.apache.org/jira/browse/SPARK-35629?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Gengliang Wang updated SPARK-35629:
-----------------------------------
    Summary: Use better exception type if database doesn't exist on `drop database`  (was: Drop database should check if exists)

> Use better exception type if database doesn't exist on `drop database`
> ----------------------------------------------------------------------
>
>                 Key: SPARK-35629
>                 URL: https://issues.apache.org/jira/browse/SPARK-35629
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.2.0
>            Reporter: XiDuo You
>            Assignee: XiDuo You
>            Priority: Minor
>             Fix For: 3.2.0
>
>
> Currently, executing `drop database test` throws an unfriendly error message.
> {code:java}
> Error in query: org.apache.hadoop.hive.metastore.api.NoSuchObjectException: test
> org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.metastore.api.NoSuchObjectException: test
>   at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:112)
>   at org.apache.spark.sql.hive.HiveExternalCatalog.dropDatabase(HiveExternalCatalog.scala:200)
>   at org.apache.spark.sql.catalyst.catalog.ExternalCatalogWithListener.dropDatabase(ExternalCatalogWithListener.scala:53)
>   at org.apache.spark.sql.catalyst.catalog.SessionCatalog.dropDatabase(SessionCatalog.scala:273)
>   at org.apache.spark.sql.execution.command.DropDatabaseCommand.run(ddl.scala:111)
>   at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:75)
>   at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:73)
>   at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:84)
>   at org.apache.spark.sql.Dataset.$anonfun$logicalPlan$1(Dataset.scala:228)
>   at org.apache.spark.sql.Dataset.$anonfun$withAction$1(Dataset.scala:3707)
> {code}
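>
> For reference, a minimal user-side guard (a sketch, not the fix tracked by this issue) can already avoid surfacing the raw Hive exception by checking database existence through the public Catalog API before dropping; the `local[*]` session setup below is only illustrative.
> {code:scala}
> import org.apache.spark.sql.SparkSession
>
> // Illustrative session; any Hive-enabled SparkSession reproduces the scenario.
> val spark = SparkSession.builder()
>   .appName("drop-database-example")
>   .master("local[*]")
>   .enableHiveSupport()
>   .getOrCreate()
>
> // Check existence first so a missing database surfaces as a plain message
> // instead of org.apache.hadoop.hive.metastore.api.NoSuchObjectException.
> if (spark.catalog.databaseExists("test")) {
>   spark.sql("DROP DATABASE test")
> } else {
>   println("Database 'test' does not exist")
> }
>
> // Plain SQL sidesteps the error entirely:
> spark.sql("DROP DATABASE IF EXISTS test")
> {code}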