[ https://issues.apache.org/jira/browse/FLINK-21494?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Zheng Hu updated FLINK-21494:
-----------------------------
    Attachment: stacktrace.txt

> Could not execute statement 'USE `default`' in Flink SQL client
> ---------------------------------------------------------------
>
>                 Key: FLINK-21494
>                 URL: https://issues.apache.org/jira/browse/FLINK-21494
>             Project: Flink
>          Issue Type: Bug
>          Components: Table SQL / Client
>    Affects Versions: 1.12.1
>            Reporter: Zheng Hu
>            Priority: Major
>             Fix For: 1.12.2, 1.13.0
>
>         Attachments: stacktrace.txt
>
>
> I have two databases in my Iceberg catalog: one is `default`, the other is `test_db`. However, I cannot switch to the `default` database because of a Flink SQL parser bug:
> {code}
> Flink SQL> show databases;
> default
> test_db
>
> Flink SQL> use `default`;
> [ERROR] Could not execute SQL statement. Reason:
> org.apache.flink.sql.parser.impl.ParseException: Incorrect syntax near the keyword 'USE' at line 1, column 1.
> Was expecting one of:
>     "ABS" ...
>     "ALTER" ...
>     "ARRAY" ...
>     "AVG" ...
>     "CALL" ...
>     "CARDINALITY" ...
>     "CASE" ...
>     "CAST" ...
>     "CEIL" ...
>     "CEILING" ...
>     "CHAR_LENGTH" ...
>     "CHARACTER_LENGTH" ...
>     "CLASSIFIER" ...
>     "COALESCE" ...
>     "COLLECT" ...
>     "CONVERT" ...
>     "COUNT" ...
>     "COVAR_POP" ...
>     "COVAR_SAMP" ...
>     "CREATE" ...
>     "CUME_DIST" ...
>     "CURRENT" ...
>     "CURRENT_CATALOG" ...
>     "CURRENT_DATE" ...
>     "CURRENT_DEFAULT_TRANSFORM_GROUP" ...
>     "CURRENT_PATH" ...
>     "CURRENT_ROLE" ...
>     "CURRENT_SCHEMA" ...
>     "CURRENT_TIME" ...
>     "CURRENT_TIMESTAMP" ...
>     "CURRENT_USER" ...
>     "CURSOR" ...
>     "DATE" ...
>     "DELETE" ...
>     "DENSE_RANK" ...
>     "DESCRIBE" ...
>     "DROP" ...
>     "ELEMENT" ...
>     "EVERY" ...
>     "EXISTS" ...
>     "EXP" ...
>     "EXPLAIN" ...
>     "EXTRACT" ...
>     "FALSE" ...
>     "FIRST_VALUE" ...
>     "FLOOR" ...
>     "FUSION" ...
>     "GROUPING" ...
>     "HOUR" ...
>     "INSERT" ...
>     "INTERSECTION" ...
>     "INTERVAL" ...
>     "JSON_ARRAY" ...
>     "JSON_ARRAYAGG" ...
>     "JSON_EXISTS" ...
>     "JSON_OBJECT" ...
>     "JSON_OBJECTAGG" ...
>     "JSON_QUERY" ...
>     "JSON_VALUE" ...
>     "LAG" ...
>     "LAST_VALUE" ...
>     "LEAD" ...
>     "LEFT" ...
>     "LN" ...
>     "LOCALTIME" ...
>     "LOCALTIMESTAMP" ...
>     "LOWER" ...
>     "MATCH_NUMBER" ...
>     "MAX" ...
>     "MERGE" ...
>     "MIN" ...
>     "MINUTE" ...
>     "MOD" ...
>     "MONTH" ...
>     "MULTISET" ...
>     "NEW" ...
>     "NEXT" ...
>     "NOT" ...
>     "NTH_VALUE" ...
>     "NTILE" ...
>     "NULL" ...
>     "NULLIF" ...
>     "OCTET_LENGTH" ...
>     "OVERLAY" ...
>     "PERCENT_RANK" ...
>     "PERIOD" ...
>     "POSITION" ...
>     "POWER" ...
>     "PREV" ...
>     "RANK" ...
>     "REGR_COUNT" ...
>     "REGR_SXX" ...
>     "REGR_SYY" ...
>     "RESET" ...
>     "RIGHT" ...
>     "ROW" ...
>     "ROW_NUMBER" ...
>     "RUNNING" ...
>     "SECOND" ...
>     "SELECT" ...
>     "SESSION_USER" ...
>     "SET" ...
>     "SOME" ...
>     "SPECIFIC" ...
>     "SQRT" ...
>     "STDDEV_POP" ...
>     "STDDEV_SAMP" ...
>     "SUBSTRING" ...
>     "SUM" ...
>     "SYSTEM_USER" ...
>     "TABLE" ...
>     "TIME" ...
>     "TIMESTAMP" ...
>     "TRANSLATE" ...
>     "TRIM" ...
>     "TRUE" ...
>     "TRUNCATE" ...
>     "UNKNOWN" ...
>     "UPDATE" ...
>     "UPPER" ...
>     "UPSERT" ...
>     "USER" ...
>     "VALUES" ...
>     "VAR_POP" ...
>     "VAR_SAMP" ...
>     "WITH" ...
>     "YEAR" ...
>     <UNSIGNED_INTEGER_LITERAL> ...
>     <APPROX_NUMERIC_LITERAL> ...
>     <DECIMAL_NUMERIC_LITERAL> ...
>     <BINARY_STRING_LITERAL> ...
>     <QUOTED_STRING> ...
>     <PREFIXED_STRING_LITERAL> ...
>     <UNICODE_STRING_LITERAL> ...
>     <BIG_QUERY_DOUBLE_QUOTED_STRING> ...
>     <BIG_QUERY_QUOTED_STRING> ...
>     "(" ...
>     <LBRACE_D> ...
>     <LBRACE_T> ...
>     <LBRACE_TS> ...
>     <LBRACE_FN> ...
>     "?" ...
>     "+" ...
>     "-" ...
>     <BRACKET_QUOTED_IDENTIFIER> ...
>     <QUOTED_IDENTIFIER> ...
>     <BACK_QUOTED_IDENTIFIER> ...
>     <HYPHENATED_IDENTIFIER> ...
>     <IDENTIFIER> ...
>     <UNICODE_QUOTED_IDENTIFIER> ...
>     "SHOW" ...
>     "USE" <IDENTIFIER> ...
>     "USE" <HYPHENATED_IDENTIFIER> ...
>     "USE" <QUOTED_IDENTIFIER> ...
>     "USE" <BACK_QUOTED_IDENTIFIER> ...
>     "USE" <BRACKET_QUOTED_IDENTIFIER> ...
>     "USE" <UNICODE_QUOTED_IDENTIFIER> ...
> {code}
> Switching to `test_db` works fine:
> {code}
> Flink SQL> use `test_db`;
> Flink SQL> show tables;
> [INFO] Result was empty.
> {code}

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
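A side note on the expected-token list: the grammar itself advertises `"USE" <BACK_QUOTED_IDENTIFIER>` as a legal production, yet the back-quoted `default` is still rejected, which suggests the problem sits in how the statement is matched or pre-processed before (or alongside) the generated parser. As a purely illustrative, self-contained sketch — hypothetical code, not Flink's actual implementation — the following shows how a naive regex-based command matcher rejects a back-quoted database name, while a slightly extended pattern accepts it and strips the back-quotes:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class UseParserSketch {
    // Naive pattern: only bare identifiers after USE, so `default` never matches.
    public static final Pattern NAIVE =
            Pattern.compile("USE\\s+(\\w+)\\s*;?", Pattern.CASE_INSENSITIVE);

    // Extended pattern: additionally accepts a back-quoted identifier and
    // strips the back-quotes when extracting the database name.
    public static final Pattern FIXED =
            Pattern.compile("USE\\s+(?:`([^`]+)`|(\\w+))\\s*;?", Pattern.CASE_INSENSITIVE);

    // Returns the database name if the statement matches the pattern, otherwise null.
    public static String parse(Pattern p, String stmt) {
        Matcher m = p.matcher(stmt.trim());
        if (!m.matches()) {
            return null;
        }
        for (int i = 1; i <= m.groupCount(); i++) {
            if (m.group(i) != null) {
                return m.group(i);
            }
        }
        return null;
    }

    public static void main(String[] args) {
        System.out.println(parse(NAIVE, "use `default`;")); // null: the naive matcher rejects it
        System.out.println(parse(FIXED, "use `default`;")); // default
        System.out.println(parse(FIXED, "use test_db;"));   // test_db
    }
}
```

This also mirrors the observed behavior: a bare identifier like `test_db` goes through either way, and only the reserved word forced into back-quotes trips the naive path.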