[jira] [Commented] (SPARK-10634) The spark sql fails if the where clause contains a string with " in it.

2016-10-03 Thread Apache Spark (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-10634?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15543059#comment-15543059
 ] 

Apache Spark commented on SPARK-10634:
--

User 'dilipbiswal' has created a pull request for this issue:
https://github.com/apache/spark/pull/15332

> The spark sql fails if the where clause contains a string with " in it.
> ---
>
> Key: SPARK-10634
> URL: https://issues.apache.org/jira/browse/SPARK-10634
> Project: Spark
>  Issue Type: Bug
>  Components: SQL
>Affects Versions: 1.3.1
>Reporter: Prachi Burathoki
>
> When running a SQL query in which the WHERE clause contains a string with "
> in it, the SQL parser throws an error.
> Caused by: java.lang.RuntimeException: [1.127] failure: ``)'' expected but 
> identifier test found
> SELECT clistc215647292, corc1749453704, candc1501025950, SYSIBM_ROW_NUMBER 
> FROM TABLE_1 WHERE ((clistc215647292 = "this is a "test""))
>   
> ^
>   at scala.sys.package$.error(package.scala:27)
>   at 
> org.apache.spark.sql.catalyst.AbstractSparkSQLParser.apply(AbstractSparkSQLParser.scala:40)
>   at 
> org.apache.spark.sql.SQLContext$$anonfun$2.apply(SQLContext.scala:134)
>   at 
> org.apache.spark.sql.SQLContext$$anonfun$2.apply(SQLContext.scala:134)
>   at 
> org.apache.spark.sql.SparkSQLParser$$anonfun$org$apache$spark$sql$SparkSQLParser$$others$1.apply(SparkSQLParser.scala:96)
>   at 
> org.apache.spark.sql.SparkSQLParser$$anonfun$org$apache$spark$sql$SparkSQLParser$$others$1.apply(SparkSQLParser.scala:95)
>   at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:136)
>   at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:135)
>   at 
> scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
>   at 
> scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
>   at 
> scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
>   at 
> scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
>   at 
> scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
>   at 
> scala.util.parsing.combinator.Parsers$Failure.append(Parsers.scala:202)
>   at 
> scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
>   at 
> scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
>   at 
> scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
>   at 
> scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:891)
>   at 
> scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:891)
>   at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
>   at 
> scala.util.parsing.combinator.Parsers$$anon$2.apply(Parsers.scala:890)
>   at 
> scala.util.parsing.combinator.PackratParsers$$anon$1.apply(PackratParsers.scala:110)
>   at 
> org.apache.spark.sql.catalyst.AbstractSparkSQLParser.apply(AbstractSparkSQLParser.scala:38)
>   at 
> org.apache.spark.sql.SQLContext$$anonfun$parseSql$1.apply(SQLContext.scala:138)
>   at 
> org.apache.spark.sql.SQLContext$$anonfun$parseSql$1.apply(SQLContext.scala:138)
>   at scala.Option.getOrElse(Option.scala:120)
>   at org.apache.spark.sql.SQLContext.parseSql(SQLContext.scala:138)
>   at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:933)
>   at 
> com.ibm.is.drs.engine.spark.sql.task.SQLQueryTask.createTargetRDD(SQLQueryTask.java:106)
>   at 
> com.ibm.is.drs.engine.spark.sql.SQLQueryNode.createTargetRDD(SQLQueryNode.java:93)
>   at com.ibm.is.drs.engine.spark.sql.SQLNode.doExecute(SQLNode.java:153)
>   at com.ibm.is.drs.engine.spark.api.BaseNode.execute(BaseNode.java:291)
>   at 
> com.ibm.is.drs.engine.spark.api.SessionContext.applyDataShaping(SessionContext.java:840)
>   at 
> com.ibm.is.drs.engine.spark.api.SessionContext.applyDataShaping(SessionContext.java:752)
>   at 
> com.ibm.is.drs.engine.spark.api.SparkRefineEngine.applyDataShaping(SparkRefineEngine.java:1011)
>   ... 31 more



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-10634) The spark sql fails if the where clause contains a string with " in it.

2016-10-03 Thread Apache Spark (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-10634?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15543104#comment-15543104
 ] 

Apache Spark commented on SPARK-10634:
--

User 'dilipbiswal' has created a pull request for this issue:
https://github.com/apache/spark/pull/15334




[jira] [Commented] (SPARK-10634) The spark sql fails if the where clause contains a string with " in it.

2015-09-16 Thread Sean Owen (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-10634?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14768938#comment-14768938
 ] 

Sean Owen commented on SPARK-10634:
---

Shouldn't the " be escaped in some way? or else I'm not sure how the parser 
would know the end of the literal from a quote inside the literal.
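A minimal sketch (not Spark's actual grammar) of why the unescaped quote trips the parser: a lexer that treats " as the only string delimiter closes the literal at the first embedded quote, so `test` surfaces as a stray identifier. The token rules here are hypothetical and simplified.

```python
import re

def tokenize(sql):
    # Hypothetical, simplified token rules: double-quoted string literal,
    # word-like identifier, or single symbol character.
    return re.findall(r'"[^"]*"|\w+|[^\s\w]', sql)

tokens = tokenize('clistc215647292 = "this is a "test""')
# The literal closes at the quote before `test`, so `test` comes out as an
# identifier -- matching the reported error: `)` expected but identifier test found.
print(tokens)
```

This mirrors the caret position in the error message: the parser consumes `"this is a "` as a complete literal and then has no rule for the identifier that follows.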




[jira] [Commented] (SPARK-10634) The spark sql fails if the where clause contains a string with " in it.

2015-09-16 Thread Prachi Burathoki (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-10634?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14769011#comment-14769011
 ] 

Prachi Burathoki commented on SPARK-10634:
--

I tried escaping with \, but I still get the same error:

Caused by: java.lang.RuntimeException: [1.130] failure: ``)'' expected but 
identifier test found

SELECT clistc1426336010, corc2125646118, candc2031403851, SYSIBM_ROW_NUMBER 
FROM TABLE_1 WHERE ((clistc1426336010 = "this is a \"test\""))

 ^

[jira] [Commented] (SPARK-10634) The spark sql fails if the where clause contains a string with " in it.

2015-09-16 Thread Sean Owen (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-10634?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14769057#comment-14769057
 ] 

Sean Owen commented on SPARK-10634:
---

Don't you need to use "" to escape double quotes? I'm actually not sure what 
Spark supports, but that's not uncommon in SQL. I don't think you're escaping 
the quotes here.
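The standard-SQL convention mentioned here represents an embedded quote by doubling it. A small helper sketching that convention (whether Spark 1.3.1's parser accepts it is exactly what is in question in this thread):

```python
def escape_by_doubling(value, quote='"'):
    # Standard-SQL style: an embedded quote character is written twice,
    # then the whole value is wrapped in the quote character.
    return quote + value.replace(quote, quote * 2) + quote

literal = escape_by_doubling('this is a "test"')
print(literal)  # "this is a ""test"""
```

Under this convention the failing predicate would be written as `clistc215647292 = "this is a ""test"""`.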




[jira] [Commented] (SPARK-10634) The spark sql fails if the where clause contains a string with " in it.

2015-09-16 Thread Prachi Burathoki (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-10634?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14790634#comment-14790634
 ] 

Prachi Burathoki commented on SPARK-10634:
--

I tried escaping" with both \ and ".But got the same error.But I've found a 
workaround, in the where clause i changed it to listc = 'this is a"test"'  and 
that works.
Thanks
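The workaround above sidesteps the problem by switching the literal to single quotes, so embedded double quotes need no escaping at all. A sketch of building the predicate that way (the helper name and the doubling of embedded single quotes are assumptions, following the common SQL convention, not something confirmed for this Spark version):

```python
def sql_string_literal(value):
    # Single-quoted literal: embedded double quotes pass through untouched;
    # any embedded single quote is doubled, per common SQL convention.
    return "'" + value.replace("'", "''") + "'"

where = 'clistc215647292 = ' + sql_string_literal('this is a "test"')
print(where)  # clistc215647292 = 'this is a "test"'
```

This only works as long as the value does not itself need to be a double-quoted literal for other reasons; here the double quotes are just data, so the single-quoted form is sufficient.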
