[ https://issues.apache.org/jira/browse/SPARK-18502?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15697210#comment-15697210 ]
Takeshi Yamamuro commented on SPARK-18502:
------------------------------------------

Could you give us a simple query that reproduces this? I tried a simple query, and it passed:

{code}
scala> val df = Seq(("a", 1), ("b", 2), ("c", 1), ("d", 5)).toDF("`k`ey`", "value")
df: org.apache.spark.sql.DataFrame = [`k`ey`: string, value: int]

scala> df.show
+------+-----+
|`k`ey`|value|
+------+-----+
|     a|    1|
|     b|    2|
|     c|    1|
|     d|    5|
+------+-----+
{code}

> Spark does not handle columns that contain backquote (`)
> ---------------------------------------------------------
>
>                 Key: SPARK-18502
>                 URL: https://issues.apache.org/jira/browse/SPARK-18502
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>            Reporter: Barry Becker
>            Priority: Minor
>
> I know that if a column contains dots or hyphens we can put backquotes/backticks around it, but what if the column name itself contains a backtick (`)? Can the backtick be escaped by some means?
> Here is an example of the sort of error I see:
> {code}
> org.apache.spark.sql.AnalysisException: syntax error in attribute name: `Invoice`Date`;
> org.apache.spark.sql.catalyst.analysis.UnresolvedAttribute$.e$1(unresolved.scala:99)
> org.apache.spark.sql.catalyst.analysis.UnresolvedAttribute$.parseAttributeName(unresolved.scala:109)
> org.apache.spark.sql.catalyst.analysis.UnresolvedAttribute$.quotedString(unresolved.scala:90)
> org.apache.spark.sql.Column.<init>(Column.scala:113)
> org.apache.spark.sql.Column$.apply(Column.scala:36)
> org.apache.spark.sql.functions$.min(functions.scala:407)
> com.mineset.spark.vizagg.vizbin.strategies.DateBinStrategy.getDateExtent(DateBinStrategy.scala:158)
> {code}
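
For what it's worth, a query along the following lines might hit the reported error. This is only a guess from the stack trace ({{functions.min}} builds a {{Column}} from the quoted attribute name via {{UnresolvedAttribute.parseAttributeName}}), and the doubled-backtick escape at the end is an untested assumption, not something confirmed in the report:

{code}
// Untested sketch of a possible reproduction, following the stack trace above:
// functions.min(columnName) parses the name as a quoted attribute.
import org.apache.spark.sql.functions.min

val df2 = Seq(("2016-01-01", 1), ("2016-02-01", 2)).toDF("Invoice`Date", "value")

// The backtick inside the quoted name is unescaped, so parseAttributeName
// should fail with "syntax error in attribute name: `Invoice`Date`".
df2.select(min("`Invoice`Date`"))

// If the parser follows the usual SQL convention of doubling the quote
// character inside a quoted identifier, this form may work instead (untested):
df2.select(min("`Invoice``Date`"))
{code}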