[ https://issues.apache.org/jira/browse/SPARK-6898?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14641446#comment-14641446 ]
Wenchen Fan commented on SPARK-6898:
------------------------------------

[~dsdinter] I tried your case on 1.4.1:
{code}
sqlContext.jsonRDD(sc.makeRDD("""{"a": {"c.b": 1}, "b.$q": [{"a@!.q": 1}], "q.w": {"w.i&": [1]}}""" :: Nil)).registerTempTable("t")
sqlContext.sql("SELECT a.`c.b`, `b.$q`[0].`a@!.q`, `q.w`.`w.i&`[0] FROM t")
{code}
It ran without a problem. Did you use JDBC or the Thrift server to run the query? Can you copy and paste the full stack trace of this issue?

> Special chars in column names is broken
> ---------------------------------------
>
>                 Key: SPARK-6898
>                 URL: https://issues.apache.org/jira/browse/SPARK-6898
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>            Reporter: Wenchen Fan
>            Assignee: Wenchen Fan
>             Fix For: 1.4.0
>
>
> This function was added a long time ago, but it is not complete: it fails
> if a column name contains ".".
> {code}
> test("SPARK-3483 Special chars in column names") {
>   val data = sparkContext.parallelize(
>     Seq("""{"key?number1": "value1", "key.number2": "value2"}"""))
>   jsonRDD(data).registerTempTable("records")
>   sql("SELECT `key?number1`, `key.number2` FROM records")
> }
> {code}
> This test will fail.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
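The queries above rely on Spark SQL's identifier quoting: an unquoted `a.b` is parsed as field access on column `a`, whereas a backtick-quoted `` `a.b` `` names a single column whose name contains a dot. As a minimal sketch of that convention (a hypothetical helper, not Spark's actual implementation), a column name can be made safe to embed in a SELECT list like this:

```python
def quote_identifier(name):
    """Wrap a column name in backticks so Spark SQL treats it as one
    identifier; embedded backticks are escaped by doubling, following
    Spark SQL's backtick-quoting convention."""
    return "`" + name.replace("`", "``") + "`"

# Names containing ".", "?", "$", "&" etc. become usable as-is:
print(quote_identifier("key.number2"))  # `key.number2`
print(quote_identifier("b.$q"))         # `b.$q`
```

With this kind of quoting, `SELECT `key?number1`, `key.number2` FROM records` resolves each backtick-quoted token as a whole column name instead of splitting on the dot.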