[jira] [Assigned] (SPARK-20399) Can't use same regex pattern between 1.6 and 2.x due to unescaped sql string in parser

2017-05-11 Thread Wenchen Fan (JIRA)

 [ https://issues.apache.org/jira/browse/SPARK-20399?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Wenchen Fan reassigned SPARK-20399:
---

Assignee: Liang-Chi Hsieh

> Can't use same regex pattern between 1.6 and 2.x due to unescaped sql string in parser
> ---------------------------------------------------------------------------------------
>
> Key: SPARK-20399
> URL: https://issues.apache.org/jira/browse/SPARK-20399
> Project: Spark
>  Issue Type: Bug
>  Components: SQL
> Affects Versions: 2.2.0
> Reporter: Liang-Chi Hsieh
> Assignee: Liang-Chi Hsieh
> Fix For: 2.2.0
>
>
> A new SQL parser was introduced in Spark 2.0. It seems to cause an issue
> with regex pattern strings.
> The following code reproduces it:
> {code}
> val data = Seq("\u0020\u0021\u0023", "abc")
> val df = data.toDF()
> // 1st usage: works in 1.6
> // Let parser parse pattern string
> val rlike1 = df.filter("value rlike '^\\x20[\\x20-\\x23]+$'")
> // 2nd usage: works in 1.6, 2.x
> // Call Column.rlike so the pattern string is a literal which doesn't go through the parser
> val rlike2 = df.filter($"value".rlike("^\\x20[\\x20-\\x23]+$"))
> // In 2.x, we need to add extra backslashes to make the regex pattern parse correctly
> val rlike3 = df.filter("value rlike '^\\\\x20[\\\\x20-\\\\x23]+$'")
> {code}
> Because the parser unescapes SQL strings, the first usage, which works in 1.6,
> no longer works in 2.0. To make it work, we have to add extra backslashes.
> It is quite strange that the same regex pattern string can't be used in both
> usages. I think we should not unescape the regex pattern string.
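For reference, a minimal sketch (not part of the original report) tracing how the pattern string passes through each layer of unescaping in 2.x; the doubled backslashes in the parsed form compensate for the parser's own unescaping. The column name and sample values follow the snippet above, and spark.implicits._ is assumed to be in scope (as in spark-shell):

{code}
// Sketch only: how the pattern reaches the regex engine in Spark 2.x.
val df = Seq("\u0020\u0021\u0023", "abc").toDF()   // default column name: "value"

// Scala source literal:            "value rlike '^\\\\x20[\\\\x20-\\\\x23]+$'"
// String handed to the SQL parser: value rlike '^\\x20[\\x20-\\x23]+$'
// Pattern after parser unescaping: ^\x20[\x20-\x23]+$   (what the regex engine sees)
val viaParser = df.filter("value rlike '^\\\\x20[\\\\x20-\\\\x23]+$'")

// Column.rlike bypasses the SQL parser, so only Scala-level escaping applies
val viaColumn = df.filter($"value".rlike("^\\x20[\\x20-\\x23]+$"))

// Both filters should keep only the row " !#" (\u0020\u0021\u0023).
{code}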



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Assigned] (SPARK-20399) Can't use same regex pattern between 1.6 and 2.x due to unescaped sql string in parser

2017-04-23 Thread Apache Spark (JIRA)

 [ https://issues.apache.org/jira/browse/SPARK-20399?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-20399:


Assignee: (was: Apache Spark)




[jira] [Assigned] (SPARK-20399) Can't use same regex pattern between 1.6 and 2.x due to unescaped sql string in parser

2017-04-23 Thread Apache Spark (JIRA)

 [ https://issues.apache.org/jira/browse/SPARK-20399?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-20399:


Assignee: Apache Spark
