[ https://issues.apache.org/jira/browse/SPARK-8616?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14611811#comment-14611811 ]
David Sabater edited comment on SPARK-8616 at 7/2/15 11:39 AM:
---------------------------------------------------------------
I would assume the error here is the lack of support for column names containing characters like " ,;{}()=" (this includes whitespace, which was my initial issue). If we are OK with restricting column names this way, we just need to improve the error message when the exception is raised. I would suggest revisiting this on the mailing list to see what opinions are out there.

> SQLContext doesn't handle tricky column names when loading from JDBC
> --------------------------------------------------------------------
>
>                 Key: SPARK-8616
>                 URL: https://issues.apache.org/jira/browse/SPARK-8616
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.4.0
>       Environment: Ubuntu 14.04, Sqlite 3.8.7, Spark 1.4.0
>            Reporter: Gergely Svigruha
>
> Reproduce:
> - create a table in a relational database (in my case sqlite) with a column
>   name containing a space:
>
>   CREATE TABLE my_table (id INTEGER, "tricky column" TEXT);
>
> - try to create a DataFrame using that table:
>
>   sqlContext.read.format("jdbc").options(Map(
>     "url" -> "jdbc:sqlite:...",
>     "dbtable" -> "my_table")).load()
>
>   java.sql.SQLException: [SQLITE_ERROR] SQL error or missing database
>   (no such column: tricky)
>
> According to the SQL spec this should be valid:
> http://savage.net.au/SQL/sql-99.bnf.html#delimited%20identifier
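Until this is addressed, a possible workaround sketch (not a tested fix): pass a parenthesized subquery as the "dbtable" option, so the delimited identifier is quoted and aliased inside the database itself and the SQL Spark generates only ever references a plain column name. The alias tricky_column and the database path below are illustrative assumptions, not part of the original report.

    // Hypothetical workaround sketch for the reproduce case above.
    // The inner query quotes "tricky column" and exposes it under the plain
    // alias tricky_column, so Spark's generated SELECT never has to quote it.
    // "/path/to/my.db" and "tricky_column" are placeholders.
    val df = sqlContext.read.format("jdbc").options(Map(
      "url" -> "jdbc:sqlite:/path/to/my.db",
      "dbtable" -> """(SELECT id, "tricky column" AS tricky_column FROM my_table)"""
    )).load()

This sidesteps rather than fixes the quoting problem; a proper fix would presumably have the JDBC data source emit delimited identifiers itself.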