Here's the full stack trace.

    => org.apache.flink.table.api.SqlParserException: SQL parse failed. Encountered "." at line 1, column 336.
Was expecting one of:
    ")" ...
    "," ...

       org.apache.flink.table.planner.parse.CalciteParser.parse(CalciteParser.java:56)
       org.apache.flink.table.planner.calcite.FlinkPlannerImpl$ToRelContextImpl.expandView(FlinkPlannerImpl.scala:270)
       org.apache.calcite.plan.ViewExpanders$1.expandView(ViewExpanders.java:52)
       org.apache.flink.table.planner.catalog.SqlCatalogViewTable.convertToRel(SqlCatalogViewTable.java:58)
       org.apache.flink.table.planner.plan.schema.ExpandingPreparingTable.expand(ExpandingPreparingTable.java:59)
       [...]
     Caused by: org.apache.calcite.sql.parser.SqlParseException: Encountered "." at line 1, column 336.
Was expecting one of:
    ")" ...
    "," ...

       org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.convertException(FlinkSqlParserImpl.java:462)
       org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.normalizeException(FlinkSqlParserImpl.java:225)
       org.apache.calcite.sql.parser.SqlParser.handleException(SqlParser.java:140)
       org.apache.calcite.sql.parser.SqlParser.parseQuery(SqlParser.java:155)
       org.apache.calcite.sql.parser.SqlParser.parseStmt(SqlParser.java:180)
       [...]
     Caused by: org.apache.flink.sql.parser.impl.ParseException: Encountered "." at line 1, column 336.
Was expecting one of:
    ")" ...
    "," ...

       org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.generateParseException(FlinkSqlParserImpl.java:40981)
       org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.jj_consume_token(FlinkSqlParserImpl.java:40792)
       org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.ParenthesizedSimpleIdentifierList(FlinkSqlParserImpl.java:25220)
       org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.Expression3(FlinkSqlParserImpl.java:19925)
       org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.Expression2b(FlinkSqlParserImpl.java:19581)
       [...]



On Wed, Nov 23, 2022 at 1:55 PM Dan Hill <quietgol...@gmail.com> wrote:

> If I remove the "TEMPORARY VIEW" and just inline the SQL, this works
> fine.  This seems like a bug with temporary views.
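>
> For reference, the inlined version that works looks roughly like this (a sketch reconstructed from the view body quoted below, not copied verbatim from my test):
>
> INSERT INTO `test_content_metrics`
> SELECT
>     DATE_FORMAT(TUMBLE_ROWTIME(rowtime, INTERVAL '1' DAY), 'yyyy-MM-dd'),
>     ROW(
>         platform_id,
>         content_id
>     )
> FROM content_event
> GROUP BY
>     platform_id,
>     content_id,
>     TUMBLE(rowtime, INTERVAL '1' DAY)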
>
> On Wed, Nov 23, 2022 at 1:38 PM Dan Hill <quietgol...@gmail.com> wrote:
>
>> Looks related to this issue.
>> https://lists.apache.org/thread/1sb5bos6tjv39fh0wjkvmvht0824r4my
>>
>> In my case, it doesn't seem like a sink issue.  Even if I change my
>> minicluster test to just SELECT * from the view, it fails the same way.
>>
>> CREATE TEMPORARY VIEW `test_content_metrics_view` AS
>> SELECT
>>     DATE_FORMAT(TUMBLE_ROWTIME(rowtime, INTERVAL '1' DAY), 'yyyy-MM-dd'),
>>     ROW(
>>         platform_id,
>>         content_id
>>     )
>> FROM content_event
>> GROUP BY
>>     platform_id,
>>     content_id,
>>     TUMBLE(rowtime, INTERVAL '1' DAY)
>>
>> SELECT * FROM test_content_metrics_view
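>>
>> If the ROW constructor inside the view is the trigger, I'd expect an even smaller view, without the window or aggregation, to fail the same way. A sketch (untested; `row_only_view` is a made-up name):
>>
>> CREATE TEMPORARY VIEW `row_only_view` AS
>> SELECT ROW(platform_id, content_id)
>> FROM content_event
>>
>> SELECT * FROM `row_only_view`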
>>
>>
>>
>> On Wed, Nov 23, 2022 at 1:19 PM Dan Hill <quietgol...@gmail.com> wrote:
>>
>>> I upgraded to Flink v1.16.0 and I get the same error.
>>>
>>> On Wed, Nov 23, 2022 at 9:47 AM Dan Hill <quietgol...@gmail.com> wrote:
>>>
>>>> For the error `Encountered "." at line 1, column 119.`, here are the
>>>> confusing parts:
>>>>
>>>> 1. The error happens when I execute the last part of the SQL query:
>>>>
>>>> INSERT INTO `test_content_metrics`
>>>> SELECT * FROM `test_content_metrics_view`
>>>>
>>>> 2. Line 1, column 119 doesn't exist in that SQL statement.
>>>> 3. None of the SQL that I've written has a period "." in it (a guess
>>>> about where it comes from is below).
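>>>>
>>>> My guess, based on the FlinkPlannerImpl.expandView frame in the trace: the failing parse isn't of my INSERT statement at all, but of the view's stored (expanded) definition, which gets rewritten onto a single line with fully qualified column names. That would explain both the "line 1, column 119" offset and the unexpected ".". If that's right, the text being re-parsed would look roughly like this (a hypothetical reconstruction, not actual planner output; `default_catalog`/`default_database` are assumed defaults):
>>>>
>>>> SELECT DATE_FORMAT(TUMBLE_ROWTIME(`content_event`.`rowtime`, INTERVAL '1' DAY), 'yyyy-MM-dd'), (`content_event`.`platform_id`, `content_event`.`content_id`) FROM `default_catalog`.`default_database`.`content_event` GROUP BY `content_event`.`platform_id`, `content_event`.`content_id`, TUMBLE(`content_event`.`rowtime`, INTERVAL '1' DAY)
>>>>
>>>> If the ROW keyword really is dropped when the view is written back out, `(a.b, c.d)` would then be parsed as a parenthesized list of simple identifiers, which matches the ParenthesizedSimpleIdentifierList frame in the stack and would be exactly where a "." is unexpected.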
>>>>
>>>>
>>>>
>>>> On Wed, Nov 23, 2022 at 8:32 AM Dan Hill <quietgol...@gmail.com> wrote:
>>>>
>>>>> I'm using Flink 1.14.4
>>>>>
>>>>> On Wed, Nov 23, 2022, 02:28 yuxia <luoyu...@alumni.sjtu.edu.cn> wrote:
>>>>>
>>>>>> Hi, Dan.
>>>>>> I'm wondering what kind of error you would expect instead. IMO, most
>>>>>> engines report parse errors this way: they tell you which unexpected
>>>>>> token was encountered.
>>>>>>
>>>>>> Best regards,
>>>>>> Yuxia
>>>>>>
>>>>>> ------------------------------
>>>>>> *From: *"Dan Hill" <quietgol...@gmail.com>
>>>>>> *To: *"User" <user@flink.apache.org>
>>>>>> *Sent: *Wednesday, November 23, 2022, 1:55:20 PM
>>>>>> *Subject: *Weird Flink SQL error
>>>>>>
>>>>>> Hi.  I'm hitting a cryptic Flink SQL parser error.  Is there a
>>>>>> way to get better errors for Flink SQL?  I'm hitting it when I wrap some
>>>>>> of the fields in an inner ROW.
>>>>>>
>>>>>>
>>>>>> *Works*
>>>>>>
>>>>>> CREATE TEMPORARY VIEW `test_content_metrics_view` AS
>>>>>> SELECT
>>>>>>     DATE_FORMAT(TUMBLE_ROWTIME(rowtime, INTERVAL '1' DAY), 'yyyy-MM-dd'),
>>>>>>     platform_id,
>>>>>>     content_id
>>>>>> FROM content_event
>>>>>> GROUP BY
>>>>>>     platform_id,
>>>>>>     content_id,
>>>>>>     TUMBLE(rowtime, INTERVAL '1' DAY)
>>>>>>
>>>>>> CREATE TABLE test_content_metrics (
>>>>>>    dt STRING NOT NULL,
>>>>>>    `platform_id` BIGINT,
>>>>>>    `content_id` STRING
>>>>>> ) PARTITIONED BY (dt) WITH (
>>>>>>    'connector' = 'filesystem',
>>>>>>    'path' = 'etl/test_content_metrics',
>>>>>>    'format' = 'json'
>>>>>> )
>>>>>>
>>>>>> INSERT INTO `test_content_metrics`
>>>>>> SELECT * FROM `test_content_metrics_view`
>>>>>>
>>>>>>
>>>>>> *Fails*
>>>>>>
>>>>>> Wrapping a couple of the fields in a ROW causes the following exception.
>>>>>>
>>>>>>      Caused by: org.apache.flink.sql.parser.impl.ParseException: Encountered "." at line 1, column 119.
>>>>>> Was expecting one of:
>>>>>>     ")" ...
>>>>>>     "," ...
>>>>>>
>>>>>>        org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.generateParseException(FlinkSqlParserImpl.java:40981)
>>>>>>        org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.jj_consume_token(FlinkSqlParserImpl.java:40792)
>>>>>>        org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.ParenthesizedSimpleIdentifierList(FlinkSqlParserImpl.java:25220)
>>>>>>        org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.Expression3(FlinkSqlParserImpl.java:19925)
>>>>>>        org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.Expression2b(FlinkSqlParserImpl.java:19581)
>>>>>>        [...]
>>>>>>
>>>>>>
>>>>>> CREATE TEMPORARY VIEW `test_content_metrics_view` AS
>>>>>> SELECT
>>>>>>     DATE_FORMAT(TUMBLE_ROWTIME(rowtime, INTERVAL '1' DAY), 'yyyy-MM-dd'),
>>>>>>     ROW(
>>>>>>         platform_id,
>>>>>>         content_id
>>>>>>     )
>>>>>> FROM content_event
>>>>>> GROUP BY
>>>>>>     platform_id,
>>>>>>     content_id,
>>>>>>     TUMBLE(rowtime, INTERVAL '1' DAY)
>>>>>>
>>>>>> CREATE TABLE test_content_metrics (
>>>>>>    dt STRING NOT NULL,
>>>>>>    `body` ROW(
>>>>>>        `platform_id` BIGINT,
>>>>>>        `content_id` STRING
>>>>>>    )
>>>>>> ) PARTITIONED BY (dt) WITH (
>>>>>>    'connector' = 'filesystem',
>>>>>>    'path' = 'etl/test_content_metrics',
>>>>>>    'format' = 'json'
>>>>>> )
>>>>>>
>>>>>> INSERT INTO `test_content_metrics`
>>>>>> SELECT * FROM `test_content_metrics_view`
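>>>>>>
>>>>>> For completeness, the variant I'd try next is naming the view columns to match the sink schema; I haven't confirmed whether it behaves any differently (the `dt` and `body` aliases below just mirror the sink table):
>>>>>>
>>>>>> CREATE TEMPORARY VIEW `test_content_metrics_view` AS
>>>>>> SELECT
>>>>>>     DATE_FORMAT(TUMBLE_ROWTIME(rowtime, INTERVAL '1' DAY), 'yyyy-MM-dd') AS dt,
>>>>>>     ROW(platform_id, content_id) AS body
>>>>>> FROM content_event
>>>>>> GROUP BY
>>>>>>     platform_id,
>>>>>>     content_id,
>>>>>>     TUMBLE(rowtime, INTERVAL '1' DAY)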
>>>>>>
>>>>>>
>>>>>>
