[ https://issues.apache.org/jira/browse/SPARK-13297?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15145566#comment-15145566 ]

Xiu (Joe) Guo commented on SPARK-13297:
---------------------------------------

It looks like this problem is fixed in the current [master branch|https://github.com/apache/spark/tree/42d656814f756599a2bc426f0e1f32bd4cc4470f]:

{code}
scala> val columnName = "col`s"
columnName: String = col`s

scala> val rows = List(Row("foo"), Row("bar"))
rows: List[org.apache.spark.sql.Row] = List([foo], [bar])

scala> val schema = StructType(Seq(StructField(columnName, StringType)))
schema: org.apache.spark.sql.types.StructType = 
StructType(StructField(col`s,StringType,true))

scala> val rdd = sc.parallelize(rows)
rdd: org.apache.spark.rdd.RDD[org.apache.spark.sql.Row] = 
ParallelCollectionRDD[0] at parallelize at <console>:28

scala> val df = sqlContext.createDataFrame(rdd, schema)
df: org.apache.spark.sql.DataFrame = [col`s: string]

scala> val selectingColumnName = "`" + columnName.replace("`", "``") + "`"
selectingColumnName: String = `col``s`

scala> selectingColumnName
res0: String = `col``s`

scala> val selectedDf = df.selectExpr(selectingColumnName)
selectedDf: org.apache.spark.sql.DataFrame = [col`s: string]

scala> selectedDf.show
+-----+
|col`s|
+-----+
|  foo|
|  bar|
+-----+
{code}
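The quoting rule exercised in the transcript above can be captured in a small helper. This is just a sketch, and the name {{quoteColumnName}} is hypothetical, not a Spark API; it wraps a column name in backticks and doubles any embedded backticks, which is the same MySQL-style convention the transcript shows working with {{selectExpr}}.

{code}
// Hypothetical helper (not part of Spark): quote a column name for use in
// selectExpr by surrounding it with backticks and doubling any backticks
// that appear inside the name itself.
def quoteColumnName(name: String): String =
  "`" + name.replace("`", "``") + "`"

// Example: quoteColumnName("col`s") yields `col``s`, matching the
// selectingColumnName value built by hand in the transcript above.
{code}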

> [SQL] Backticks cannot be escaped in column names
> -------------------------------------------------
>
>                 Key: SPARK-13297
>                 URL: https://issues.apache.org/jira/browse/SPARK-13297
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.6.0
>            Reporter: Grzegorz Chilkiewicz
>            Priority: Minor
>
> We want to use backticks to escape spaces and minus signs in column names.
> Is there no way to escape a backtick when the column name itself is 
> surrounded by backticks?
> This is not documented in: 
> http://spark.apache.org/docs/latest/sql-programming-guide.html
> In MySQL there is a way: double the backtick. That trick, however, does not 
> work in Spark SQL.
> Am I correct, or am I missing something? Is there a way to escape a backtick 
> inside a column name when the name is surrounded by backticks?
> Code to reproduce the problem:
> https://github.com/grzegorz-chilkiewicz/SparkSqlEscapeBacktick



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
