[ 
https://issues.apache.org/jira/browse/SPARK-26905?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17133215#comment-17133215
 ] 

Takeshi Yamamuro edited comment on SPARK-26905 at 6/11/20, 12:15 PM:
---------------------------------------------------------------------

Ah, I just noticed that they are already reserved in the ANSI mode (the documentation 
says so, too: 
[https://github.com/apache/spark/blob/master/docs/sql-ref-ansi-compliance.md]);
{code:java}
scala> sql("SET spark.sql.ansi.enabled=false")

scala> sql("create table t1 (anti int)")
res10: org.apache.spark.sql.DataFrame = []

scala> sql("create table t2 (semi int)")
res11: org.apache.spark.sql.DataFrame = []
scala> sql("create table t3 (minus int)")
res12: org.apache.spark.sql.DataFrame = []

scala> sql("SET spark.sql.ansi.enabled=true")

scala> sql("create table t4 (anti int)")
org.apache.spark.sql.catalyst.parser.ParseException:
no viable alternative at input 'anti'(line 1, pos 17)

== SQL ==
create table t4 (anti int)
-----------------^^^

scala> sql("create table t5 (semi int)")
org.apache.spark.sql.catalyst.parser.ParseException:
no viable alternative at input 'semi'(line 1, pos 17)

== SQL ==
create table t5 (semi int)
-----------------^^^

scala> sql("create table t6 (minus int)")
org.apache.spark.sql.catalyst.parser.ParseException:
no viable alternative at input 'minus'(line 1, pos 17)

== SQL ==
create table t6 (minus int)
-----------------^^^
{code}
Actually, the Spark reserved keywords in the ANSI mode are computed as the set 
difference `spark-keywords-list.txt − spark-ansiNonReserved.txt − "strict 
non-reserved keywords"`. Yay, I think we already follow the SQL:2016 standard, haha
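The set-difference computation described above can be sketched as follows. Note the keyword sets here are small hypothetical samples for illustration only; the real inputs are the attached spark-keywords-list.txt, spark-ansiNonReserved.txt, and spark-strictNonReserved.txt files:

```python
# Sketch of the reserved-keyword computation: reserved (ANSI mode) =
# all parser keywords - ansiNonReserved - strictNonReserved.
# All three sets below are hypothetical samples, NOT the real Spark lists.

all_keywords = {"ANTI", "SEMI", "MINUS", "SELECT", "LIMIT", "NATURAL"}

# Keywords that stay non-reserved even in ANSI mode (sample).
ansi_non_reserved = {"LIMIT"}

# "Strict non-reserved" keywords (sample).
strict_non_reserved = {"NATURAL"}

reserved_in_ansi_mode = all_keywords - ansi_non_reserved - strict_non_reserved
print(sorted(reserved_in_ansi_mode))  # ANTI, MINUS, SELECT, SEMI remain reserved
```

With the sample inputs, ANTI, SEMI, and MINUS land in the reserved set, which matches the ParseException behavior shown in the REPL session above when `spark.sql.ansi.enabled=true`.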



> Revisit reserved/non-reserved keywords based on the ANSI SQL standard
> ---------------------------------------------------------------------
>
>                 Key: SPARK-26905
>                 URL: https://issues.apache.org/jira/browse/SPARK-26905
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Xiao Li
>            Priority: Major
>         Attachments: spark-ansiNonReserved.txt, spark-keywords-list.txt, 
> spark-nonReserved.txt, spark-strictNonReserved.txt, 
> sql2016-02-nonreserved.txt, sql2016-02-reserved.txt, 
> sql2016-09-nonreserved.txt, sql2016-09-reserved.txt, 
> sql2016-14-nonreserved.txt, sql2016-14-reserved.txt
>
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)
