[jira] [Commented] (SPARK-32963) empty string should be consistent for schema name in SparkGetSchemasOperation
[ https://issues.apache.org/jira/browse/SPARK-32963?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17199933#comment-17199933 ]

Apache Spark commented on SPARK-32963:
--------------------------------------

User 'yaooqinn' has created a pull request for this issue:
https://github.com/apache/spark/pull/29834

> empty string should be consistent for schema name in SparkGetSchemasOperation
> -----------------------------------------------------------------------------
>
>                 Key: SPARK-32963
>                 URL: https://issues.apache.org/jira/browse/SPARK-32963
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.0.1, 3.1.0
>            Reporter: Kent Yao
>            Priority: Major
>
> When the schema name is an empty string, it is treated as ".*" and can match
> all databases in the catalog. However, it cannot match the global temp view
> database, because on that code path the empty string is not converted to ".*".

--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
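The inconsistency the ticket describes can be sketched as follows. This is a hedged illustration in Python, not Spark's actual Scala code: the helper `to_catalog_regex`, the sample database list, and the `global_temp` name are assumptions standing in for the catalog lookup and the global temp view check in SparkGetSchemasOperation.

```python
import re

def to_catalog_regex(pattern: str) -> str:
    """Hypothetical helper mirroring the catalog path described in the
    ticket: an empty (or missing) schema pattern is widened to '.*'."""
    if not pattern:
        return ".*"
    # JDBC-style wildcards: '%' -> '.*', '_' -> '.'
    return pattern.replace("%", ".*").replace("_", ".")

databases = ["default", "sales"]        # sample catalog databases (assumed)
GLOBAL_TEMP_DB = "global_temp"          # Spark's global temp view database

schema_pattern = ""  # empty schema name from a JDBC GetSchemas call

# Catalog path: the empty pattern is first converted to '.*', so it
# matches every database.
catalog_matches = [db for db in databases
                   if re.fullmatch(to_catalog_regex(schema_pattern), db)]

# Global temp view path (the inconsistency): the raw pattern is used
# directly, so the empty string never matches 'global_temp'.
global_temp_matches = re.fullmatch(schema_pattern, GLOBAL_TEMP_DB) is not None
```

Under these assumptions, `catalog_matches` contains every database while `global_temp_matches` is false, which is the inconsistent behavior the pull request above addresses.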