[ https://issues.apache.org/jira/browse/SPARK-26905?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17129904#comment-17129904 ]

Takeshi Yamamuro edited comment on SPARK-26905 at 6/10/20, 1:20 AM:
--------------------------------------------------------------------

Thanks for checking them, Maxim! Ah, I see — I think we forgot to add `TYPE` to
the ANSI non-reserved list. But IIRC we couldn't simply add `SEMI`, `MINUS`,
and `ANTI` to the non-reserved list because of this discussion:
[https://github.com/apache/spark/pull/23259#pullrequestreview-205855655]
We probably need to refactor the parser before we can add them as non-reserved.


was (Author: maropu):
Thanks for checking them, Maxim! Ah, I see. I think we forgot to add `TYPE` in 
the ANSI non-reserved list. But, IIUC we couldn't simply add `SEMI`, `MINUS`, 
and `ANTI` in the non-reserved list because of the discussion: 
[https://github.com/apache/spark/pull/23259#pullrequestreview-205855655] 
Probably, we need to refactor the parser before we add them as non-reserved. 
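For context on why those three keywords are harder than `TYPE`: a non-reserved
keyword may also be used as an identifier (e.g. a table alias), so once `SEMI`
is non-reserved, `FROM a SEMI JOIN b` has two competing readings — a semi-join
of `a` and `b`, or a plain join where `SEMI` is an alias for `a`. A toy sketch
of that ambiguity (not Spark's actual parser; the keyword sets and the
`parses_of` helper are illustrative assumptions only):

```python
# Toy model, NOT Spark's real grammar: non-reserved keywords double as
# identifiers, which makes join modifiers ambiguous in the FROM clause.
NON_RESERVED = {"TYPE"}            # hypothetical: safe to use as identifiers
JOIN_MODIFIERS = {"SEMI", "ANTI"}  # what if these became non-reserved too?

def parses_of(tokens):
    """Enumerate plausible readings of `FROM t <kw> JOIN u`."""
    t, kw, _join, u = tokens
    readings = []
    if kw in JOIN_MODIFIERS:
        # Reading 1: kw modifies the join type.
        readings.append(f"{t} {kw}-JOIN {u}")
    if kw in NON_RESERVED | JOIN_MODIFIERS:
        # Reading 2: kw is merely an alias for the left table.
        readings.append(f"({t} AS {kw}) JOIN {u}")
    return readings

# TYPE is unambiguous (only the alias reading exists) ...
print(parses_of(["a", "TYPE", "JOIN", "b"]))
# ... but SEMI would admit two readings, so the parser must disambiguate.
print(parses_of(["a", "SEMI", "JOIN", "b"]))
```

This is the shape of the problem the comment alludes to: the grammar has to
resolve such double readings before `SEMI`/`MINUS`/`ANTI` can safely move to
the non-reserved list, hence the suggested parser refactoring.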

> Revisit reserved/non-reserved keywords based on the ANSI SQL standard
> ---------------------------------------------------------------------
>
>                 Key: SPARK-26905
>                 URL: https://issues.apache.org/jira/browse/SPARK-26905
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Xiao Li
>            Priority: Major
>         Attachments: spark-ansiNonReserved.txt, spark-keywords-list.txt, 
> spark-nonReserved.txt, spark-strictNonReserved.txt, 
> sql2016-02-nonreserved.txt, sql2016-02-reserved.txt, 
> sql2016-09-nonreserved.txt, sql2016-09-reserved.txt, 
> sql2016-14-nonreserved.txt, sql2016-14-reserved.txt
>
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
