[jira] [Commented] (SPARK-13283) Spark doesn't escape column names when creating table on JDBC
[ https://issues.apache.org/jira/browse/SPARK-13283?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16848667#comment-16848667 ]

Hyukjin Kwon commented on SPARK-13283:
--------------------------------------

It was closed only because all of its Affects Versions are EOL. See http://apache-spark-developers-list.1001551.n3.nabble.com/Resolving-all-JIRAs-affecting-EOL-releases-td27238.html

> Spark doesn't escape column names when creating table on JDBC
> -------------------------------------------------------------
>
>                 Key: SPARK-13283
>                 URL: https://issues.apache.org/jira/browse/SPARK-13283
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.6.0
>            Reporter: Maciej Bryński
>            Priority: Major
>              Labels: bulk-closed
>
> Hi,
> I have the following problem: one of the columns of my DataFrame is named 'from'.
> {code}
> root
>  |-- from: decimal(20,0) (nullable = true)
> {code}
> When I save it to a MySQL database I get this error:
> {code}
> Py4JJavaError: An error occurred while calling o183.jdbc.
> : com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'from DECIMAL(20,0) , ' at line 1
> {code}
> I think the problem is that Spark doesn't escape column names with the ` sign when creating the table:
> {code}
> `from`
> {code}

--
This message was sent by Atlassian JIRA (v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Commented] (SPARK-13283) Spark doesn't escape column names when creating table on JDBC
[ https://issues.apache.org/jira/browse/SPARK-13283?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16848660#comment-16848660 ]

Maciej Bryński commented on SPARK-13283:
----------------------------------------

[~hyukjin.kwon] What does the resolution "Incomplete" mean?
[jira] [Commented] (SPARK-13283) Spark doesn't escape column names when creating table on JDBC
[ https://issues.apache.org/jira/browse/SPARK-13283?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15149396#comment-15149396 ]

Apache Spark commented on SPARK-13283:
--------------------------------------

User 'xguo27' has created a pull request for this issue:
https://github.com/apache/spark/pull/11224

--
This message was sent by Atlassian JIRA (v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Commented] (SPARK-13283) Spark doesn't escape column names when creating table on JDBC
[ https://issues.apache.org/jira/browse/SPARK-13283?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15149387#comment-15149387 ]

Xiu (Joe) Guo commented on SPARK-13283:
---------------------------------------

Yes, it is a different problem from [SPARK-13297|https://issues.apache.org/jira/browse/SPARK-13297]. We should escape the column name based on the JdbcDialect.
[jira] [Commented] (SPARK-13283) Spark doesn't escape column names when creating table on JDBC
[ https://issues.apache.org/jira/browse/SPARK-13283?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15148200#comment-15148200 ]

Maciej Bryński commented on SPARK-13283:
----------------------------------------

Yep. For MySQL it could look like this:
{code}
sb.append(s", `$name` $typ $nullable")
{code}
And for other RDBMSes (ANSI double quotes):
{code}
sb.append(s""", "$name" $typ $nullable""")
{code}
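Putting the two variants above together: the following is an illustrative sketch of a quoting-aware column-list builder for CREATE TABLE, with a pluggable identifier quoter. The helper name `schemaString` and its signature are made up for this example; this is not the actual Spark JdbcUtils code.

```scala
// Sketch: build the column-definition list for CREATE TABLE, delegating
// identifier quoting to a function supplied by the dialect.
def schemaString(
    fields: Seq[(String, String, Boolean)], // (name, SQL type, nullable)
    quote: String => String): String = {
  val sb = new StringBuilder()
  fields.foreach { case (name, typ, nullable) =>
    val nullClause = if (nullable) "" else " NOT NULL"
    sb.append(s", ${quote(name)} $typ$nullClause")
  }
  if (sb.isEmpty) "" else sb.substring(2) // drop the leading ", "
}

// MySQL-style backticks keep the reserved word 'from' valid:
val mysql = schemaString(Seq(("from", "DECIMAL(20,0)", true)), n => s"`$n`")
// ANSI-style double quotes for most other databases:
val ansi = schemaString(Seq(("from", "DECIMAL(20,0)", true)), n => "\"" + n + "\"")
```

Either way the generated DDL fragment is `` `from` DECIMAL(20,0) `` (or the double-quoted equivalent), which MySQL and ANSI-compliant databases respectively accept even though `from` is a reserved word.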
[jira] [Commented] (SPARK-13283) Spark doesn't escape column names when creating table on JDBC
[ https://issues.apache.org/jira/browse/SPARK-13283?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15148196#comment-15148196 ]

Adrian Wang commented on SPARK-13283:
-------------------------------------

So the problem here is that "from" is a reserved word in MySQL, but we fail to keep the backticks around it, right?
[jira] [Commented] (SPARK-13283) Spark doesn't escape column names when creating table on JDBC
[ https://issues.apache.org/jira/browse/SPARK-13283?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15148189#comment-15148189 ]

Maciej Bryński commented on SPARK-13283:
----------------------------------------

No, it's not fixed. The problem is in:
https://github.com/apache/spark/blob/0d42292f6a2dbe626e8f6a50e6c61dd79533f235/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JdbcUtils.scala#L255
The trick is that different RDBMSes use different quoting characters. Most of them use ", but MySQL uses `. So we have to add the quote character to the dialect.
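The dialect-level hook described above could be sketched as follows. The `Dialect` trait and `quoteIdentifier` method here are illustrative stand-ins, not Spark's actual JdbcDialect API:

```scala
// Sketch only: a per-dialect identifier-quoting hook, assuming a
// hypothetical Dialect trait (not Spark's real JdbcDialect).
trait Dialect {
  // ANSI SQL default: double-quoted identifiers.
  def quoteIdentifier(name: String): String = "\"" + name + "\""
}

object MySQLDialect extends Dialect {
  // MySQL quotes identifiers with backticks instead.
  override def quoteIdentifier(name: String): String = s"`$name`"
}
```

With such a hook, the CREATE TABLE builder stays dialect-agnostic, and reserved words like `from` are escaped correctly for each database.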
[jira] [Commented] (SPARK-13283) Spark doesn't escape column names when creating table on JDBC
[ https://issues.apache.org/jira/browse/SPARK-13283?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15148171#comment-15148171 ]

Adrian Wang commented on SPARK-13283:
-------------------------------------

See the comments on SPARK-13297; this has been fixed in the master branch.