[ https://issues.apache.org/jira/browse/SPARK-30049?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16998331#comment-16998331 ]
Jason Darrell Lowe commented on SPARK-30049:
--------------------------------------------

Found some time to track this down to the following commit, which first regressed the parsing behavior:

{noformat}
commit 148cd26799c69ab9cfdc2b3b8000a194c12518b8 (HEAD, refs/bisect/bad)
Author: Yuming Wang <yumw...@ebay.com>
Date:   Sat Oct 12 22:21:14 2019 -0700

    [SPARK-26321][SQL] Port HIVE-15297: Hive should not split semicolon within quoted string literals

    ## What changes were proposed in this pull request?

    This pr port [HIVE-15297](https://issues.apache.org/jira/browse/HIVE-15297) to fix **spark-sql** should not split semicolon within quoted string literals.

    ## How was this patch tested?

    unit tests and manual tests:

    Closes #25018 from wangyum/SPARK-26321.

    Authored-by: Yuming Wang <yumw...@ebay.com>
    Signed-off-by: Yuming Wang <wgy...@gmail.com>
{noformat}

[~yumwang] would you mind taking a look?

> SQL fails to parse when comment contains an unmatched quote character
> ---------------------------------------------------------------------
>
>                 Key: SPARK-30049
>                 URL: https://issues.apache.org/jira/browse/SPARK-30049
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Jason Darrell Lowe
>            Priority: Major
>
> A SQL statement that contains a comment with an unmatched quote character can
> lead to a parse error. These queries parsed correctly in older versions of
> Spark.
> For example, here's an excerpt from an interactive spark-sql session
> on a recent Spark 3.0.0-SNAPSHOT build (commit
> e23c135e568d4401a5659bc1b5ae8fc8bf147693):
> {noformat}
> spark-sql> SELECT 1 -- someone's comment here
>          > ;
> Error in query:
> extraneous input ';' expecting <EOF>(line 2, pos 0)
>
> == SQL ==
> SELECT 1 -- someone's comment here
> ;
> ^^^
> {noformat}

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
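For readers following along, the failure mode described above is easy to model outside Spark: a statement splitter that tracks quote state (the HIVE-15297 behavior) but is unaware of {{--}} line comments treats the apostrophe in {{someone's}} as an opening quote, so the terminating semicolon is never recognized as a statement boundary. The sketch below is a hypothetical, minimal Python model of that behavior; it is not Spark's actual splitter (which is Scala code in the spark-sql CLI driver), and all names in it are illustrative only:

```python
# Hypothetical sketch of the SPARK-30049 failure mode, NOT Spark's actual code:
# this splitter honors quoted strings (the HIVE-15297 behavior) but ignores
# "--" line comments, so an apostrophe inside a comment opens a quote that
# never closes and the trailing ";" is swallowed instead of ending a statement.
def split_statements_naive(script):
    statements = []
    buf = []
    in_quote = None  # the currently open quote character, or None
    for ch in script:
        if in_quote:
            if ch == in_quote:
                in_quote = None  # matching close quote
        elif ch in ("'", '"'):
            in_quote = ch        # opens a quote even inside a comment: the bug
        elif ch == ';':
            statements.append(''.join(buf).strip())
            buf = []
            continue             # statement boundary: drop the semicolon
        buf.append(ch)
    tail = ''.join(buf).strip()
    if tail:
        statements.append(tail)
    return statements

# Quoted semicolons are correctly kept inside one statement...
print(split_statements_naive("SELECT ';'; SELECT 2"))  # ["SELECT ';'", 'SELECT 2']

# ...but an unmatched quote in a comment swallows the terminator, so the ";"
# reaches the SQL parser as part of the statement, matching the reported
# "extraneous input ';'" error:
print(split_statements_naive("SELECT 1 -- someone's comment here\n;"))
```

A comment-aware splitter would, on seeing {{--}} outside a quoted string, skip to the end of the line before resuming quote tracking, which is roughly what a correct fix for this regression has to do.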