[jira] [Assigned] (SPARK-14460) DataFrameWriter JDBC doesn't Quote/Escape column names
[ https://issues.apache.org/jira/browse/SPARK-14460?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-14460:

    Assignee: Apache Spark

> DataFrameWriter JDBC doesn't Quote/Escape column names
> ------------------------------------------------------
>
>                 Key: SPARK-14460
>                 URL: https://issues.apache.org/jira/browse/SPARK-14460
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.6.1
>            Reporter: Sean Rose
>            Assignee: Apache Spark
>              Labels: easyfix
>
> When I try to write a DataFrame which contains a column with a space in it
> ("Patient Address"), I get an error: java.sql.BatchUpdateException: Incorrect
> syntax near 'Address'
> I believe the issue is that JdbcUtils.insertStatement isn't quoting/escaping
> column names. JdbcDialect has the "quoteIdentifier" method, which could be
> called.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-14460) DataFrameWriter JDBC doesn't Quote/Escape column names
[ https://issues.apache.org/jira/browse/SPARK-14460?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-14460:

    Assignee: (was: Apache Spark)

> DataFrameWriter JDBC doesn't Quote/Escape column names
> ------------------------------------------------------
>
>                 Key: SPARK-14460
>                 URL: https://issues.apache.org/jira/browse/SPARK-14460
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.6.1
>            Reporter: Sean Rose
>              Labels: easyfix
>
> When I try to write a DataFrame which contains a column with a space in it
> ("Patient Address"), I get an error: java.sql.BatchUpdateException: Incorrect
> syntax near 'Address'
> I believe the issue is that JdbcUtils.insertStatement isn't quoting/escaping
> column names. JdbcDialect has the "quoteIdentifier" method, which could be
> called.
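The fix the reporter suggests can be sketched as follows. This is a minimal, self-contained illustration, not Spark's actual implementation: the `JdbcDialect` trait and `SqlServerDialect` object below are simplified stand-ins for the real classes in `org.apache.spark.sql.jdbc`, and `insertStatement` is a hypothetical helper showing where `quoteIdentifier` would be called when building the `INSERT` statement.

```scala
// Simplified stand-in for Spark's JdbcDialect (assumption: the real trait
// lives in org.apache.spark.sql.jdbc and double-quotes identifiers by default).
trait JdbcDialect {
  def quoteIdentifier(colName: String): String = s""""$colName""""
}

// SQL Server wraps identifiers in square brackets, which is why the
// reporter's "Patient Address" column fails without quoting.
object SqlServerDialect extends JdbcDialect {
  override def quoteIdentifier(colName: String): String = s"[$colName]"
}

// Hypothetical version of JdbcUtils.insertStatement that quotes each
// column name through the dialect instead of interpolating it raw.
def insertStatement(table: String, columns: Seq[String], dialect: JdbcDialect): String = {
  val cols = columns.map(dialect.quoteIdentifier).mkString(", ")
  val placeholders = columns.map(_ => "?").mkString(", ")
  s"INSERT INTO $table ($cols) VALUES ($placeholders)"
}
```

With the SQL Server dialect, a column containing a space is emitted as `[Patient Address]`, so the generated `INSERT INTO patients ([Patient Address]) VALUES (?)` no longer triggers the "Incorrect syntax near 'Address'" error.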