[ https://issues.apache.org/jira/browse/SPARK-30104?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Wenchen Fan reassigned SPARK-30104:
-----------------------------------

    Assignee: Terry Kim

> global temp db name can be used as a table name under v2 catalog
> ----------------------------------------------------------------
>
>                 Key: SPARK-30104
>                 URL: https://issues.apache.org/jira/browse/SPARK-30104
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Terry Kim
>            Assignee: Terry Kim
>            Priority: Major
>
> Currently, 'global_temp' can be used in certain commands (CREATE) but not in
> others (DESCRIBE), because the catalog lookup logic only considers the first
> element of the multi-part name and always routes to the session catalog when
> that element equals 'global_temp'.
> For example:
> {code:java}
> // Assume "spark.sql.globalTempDatabase" is set to "global_temp".
> sql("CREATE TABLE testcat.t (id bigint, data string) USING foo")
> sql("CREATE TABLE testcat.global_temp (id bigint, data string) USING foo")
> sql("USE testcat")
>
> sql("DESCRIBE TABLE t").show
> +---------------+---------+-------+
> |       col_name|data_type|comment|
> +---------------+---------+-------+
> |             id|   bigint|       |
> |           data|   string|       |
> |               |         |       |
> | # Partitioning|         |       |
> |Not partitioned|         |       |
> +---------------+---------+-------+
>
> sql("DESCRIBE TABLE global_temp").show
> org.apache.spark.sql.AnalysisException: Table not found: global_temp;;
> 'DescribeTable 'UnresolvedV2Relation [global_temp],
> org.apache.spark.sql.connector.InMemoryTableSessionCatalog@2f1af64f,
> `global_temp`, false
>     at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.failAnalysis(CheckAnalysis.scala:47)
>     at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.failAnalysis$(CheckAnalysis.scala:46)
>     at org.apache.spark.sql.catalyst.analysis.Analyzer.failAnalysis(Analyzer.scala:122)
> {code}

--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
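The failure mode described above can be illustrated with a minimal sketch. This is not Spark's actual resolution code; `resolveCatalog`, `LookupSketch`, and the hard-coded catalog names are hypothetical stand-ins that only model the behavior the report describes: resolution keys on the first part of the multi-part name, so any table literally named "global_temp" in a v2 catalog gets misrouted to the session catalog.

```scala
// Sketch (assumed, not Spark source) of the flawed lookup described in the
// issue: the first name part alone decides which catalog handles the name.
object LookupSketch {
  // Default value of the spark.sql.globalTempDatabase config.
  val globalTempDatabase = "global_temp"

  // Returns the catalog a multi-part identifier is resolved against.
  def resolveCatalog(nameParts: Seq[String], currentCatalog: String): String =
    if (nameParts.head == globalTempDatabase) {
      // Bug modeled here: this branch fires even when "global_temp" is a
      // plain table name in the current v2 catalog, not a temp-view prefix.
      "session"
    } else {
      currentCatalog
    }

  def main(args: Array[String]): Unit = {
    // After `USE testcat`, "t" resolves against testcat as expected...
    assert(resolveCatalog(Seq("t"), "testcat") == "testcat")
    // ...but the table named "global_temp" is sent to the session catalog,
    // which is why DESCRIBE TABLE global_temp fails with "Table not found".
    assert(resolveCatalog(Seq("global_temp"), "testcat") == "session")
    println("sketch ok")
  }
}
```

Under this model, CREATE happens to work because it takes a different code path, while DESCRIBE goes through the lookup shown here and fails.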