[ https://issues.apache.org/jira/browse/SPARK-20964?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16405962#comment-16405962 ]
Alex Ott edited comment on SPARK-20964 at 3/20/18 8:35 AM:
-----------------------------------------------------------

Just want to add another example of a query that is rejected by SQLite but works fine in Spark SQL:

{{SELECT state, count(*) FROM user_addresses *where* group by state;}}

Here the *where* keyword is treated as a table alias. Although that conforms to the SQL specification, it silently hides the mistake I made in this query: forgetting to add the condition expression after WHERE. A minimal reproduction sketch follows at the end of this message.

> Make some keywords reserved along with the ANSI/SQL standard
> -------------------------------------------------------------
>
>                 Key: SPARK-20964
>                 URL: https://issues.apache.org/jira/browse/SPARK-20964
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.1.1
>            Reporter: Takeshi Yamamuro
>            Priority: Minor
>
> Spark currently has many non-reserved words that are essentially reserved in the ANSI/SQL standard (http://developer.mimer.se/validator/sql-reserved-words.tml):
> https://github.com/apache/spark/blob/master/sql/catalyst/src/main/antlr4/org/apache/spark/sql/catalyst/parser/SqlBase.g4#L709
> This is because many datasources (for instance, twitter4j) unfortunately use reserved keywords for column names (see [~hvanhovell]'s comments: https://github.com/apache/spark/pull/18079#discussion_r118842186).
> We might fix this issue in a future major release.
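To make the failure mode concrete, here is a minimal, self-contained reproduction sketch. The sample rows and column names are hypothetical (only the table name user_addresses and the query come from the comment above); the point is that the statement parses, with *where* taken as a table alias, instead of failing with a syntax error:

{code:scala}
import org.apache.spark.sql.SparkSession

object WhereAsAliasDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("where-as-alias")
      .getOrCreate()
    import spark.implicits._

    // Hypothetical sample data; any table with a `state` column works.
    Seq(("alice", "CA"), ("bob", "NY"), ("carol", "CA"))
      .toDF("name", "state")
      .createOrReplaceTempView("user_addresses")

    // The condition after WHERE was forgotten, yet the query runs:
    // Spark's parser treats `where` as an alias for user_addresses,
    // so we get unfiltered per-state counts instead of a parse error.
    spark.sql("SELECT state, count(*) FROM user_addresses where GROUP BY state").show()

    spark.stop()
  }
}
{code}

SQLite, by contrast, rejects the same statement outright, because it requires a condition expression to follow the WHERE keyword.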