[ 
https://issues.apache.org/jira/browse/SPARK-44455?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gengliang Wang reassigned SPARK-44455:
--------------------------------------

    Assignee: Runyao.Chen

> SHOW CREATE TABLE does not quote identifiers with special characters
> --------------------------------------------------------------------
>
>                 Key: SPARK-44455
>                 URL: https://issues.apache.org/jira/browse/SPARK-44455
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.4.0, 3.4.1
>            Reporter: Runyao.Chen
>            Assignee: Runyao.Chen
>            Priority: Major
>
> Create a table with special characters:
> ```
> CREATE CATALOG `a_catalog-with+special^chars`;
> CREATE SCHEMA `a_catalog-with+special^chars`.`a_schema-with+special^chars`;
> CREATE TABLE `a_catalog-with+special^chars`.`a_schema-with+special^chars`.`table1` (
>   id int,
>   feat1 varchar(255),
>   CONSTRAINT pk PRIMARY KEY (id, feat1)
> );
> ```
> Then run SHOW CREATE TABLE:
> ```
> SHOW CREATE TABLE `a_catalog-with+special^chars`.`a_schema-with+special^chars`.`table1`;
> ```
> The response is:
> ```
> createtab_stmt "CREATE TABLE a_catalog-with+special^chars.a_schema-with+special^chars.table1 (
>   id INT NOT NULL,
>   feat1 VARCHAR(255) NOT NULL,
>   CONSTRAINT pk PRIMARY KEY (id, feat1))
> USING delta
> TBLPROPERTIES (
>   'delta.minReaderVersion' = '1',
>   'delta.minWriterVersion' = '2')"
> ```
> As you can see, the catalog, schema, and table identifiers in the response are 
> not quoted with backticks. As a result, if a user copies and pastes this CREATE 
> TABLE statement to recreate the table, it fails with:
> {{[INVALID_IDENTIFIER] The identifier a_catalog-with is invalid. Please, 
> consider quoting it with back-quotes as `a_catalog-with`. (line 1, pos 22)}}
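> For reference, the output would be expected to quote every identifier with 
> backticks, along these lines (illustrative only; the column nullability and 
> TBLPROPERTIES shown are taken from the output above):
> ```
> CREATE TABLE `a_catalog-with+special^chars`.`a_schema-with+special^chars`.`table1` (
>   id INT NOT NULL,
>   feat1 VARCHAR(255) NOT NULL,
>   CONSTRAINT pk PRIMARY KEY (id, feat1))
> USING delta
> TBLPROPERTIES (
>   'delta.minReaderVersion' = '1',
>   'delta.minWriterVersion' = '2');
> ```
> This quoted form can be pasted back to recreate the table without hitting the 
> INVALID_IDENTIFIER error.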


