[ https://issues.apache.org/jira/browse/SPARK-16387?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15364870#comment-15364870 ]
Dongjoon Hyun commented on SPARK-16387:
---------------------------------------

Oh, it means Pull Request. Since you already know the `JdbcDialect` class, I think you can make a code patch for that.

> Reserved SQL words are not escaped by JDBC writer
> -------------------------------------------------
>
>                 Key: SPARK-16387
>                 URL: https://issues.apache.org/jira/browse/SPARK-16387
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.0
>            Reporter: Lev
>
> Here is the code:
>
> import java.util.Properties
> import org.apache.spark.SparkConf
> import org.apache.spark.sql.{SaveMode, SparkSession}
>
> object Main extends App {
>   val sqlSession = SparkSession.builder().config(new SparkConf()
>     .setAppName("Sql Test").set("spark.app.id", "SQLTest")
>     .set("spark.master", "local[2]")
>     .set("spark.ui.enabled", "false")
>     .setJars(Seq("/mysql/mysql-connector-java-5.1.38.jar"))
>   ).getOrCreate()
>   import sqlSession.implicits._
>
>   val localprops = new Properties
>   localprops.put("user", "xxxx")
>   localprops.put("password", "xxxx")
>
>   // The column name "order" is a reserved SQL word.
>   val df = sqlSession.createDataset(Seq("a", "b", "c")).toDF("order")
>   val writer = df.write.mode(SaveMode.Append)
>   writer.jdbc("jdbc:mysql://localhost:3306/test3", "jira_test", localprops)
> }
>
> The resulting error is:
>
> com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'order TEXT )' at line 1
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>   at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
>
> Clearly, the reserved word `order` has to be quoted.
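A minimal, self-contained sketch of the kind of patch suggested in the comment above. The helper names quoteIdentifier and schemaString, and the per-URL quoting rule, are assumptions for illustration only; an actual fix would live inside Spark's `JdbcDialect` and JDBC writer rather than in user code:

    // Sketch only: quoteIdentifier and schemaString are hypothetical
    // helpers mirroring what a dialect-aware JDBC writer would need to do.
    object QuotingSketch {

      // MySQL quotes identifiers with backticks; ANSI SQL uses double quotes.
      def quoteIdentifier(colName: String, url: String): String =
        if (url.startsWith("jdbc:mysql")) s"`$colName`"
        else "\"" + colName + "\""

      // Build the column list of a CREATE TABLE statement from
      // (name, sqlType) pairs, quoting every column name so that
      // reserved words such as "order" are safe.
      def schemaString(fields: Seq[(String, String)], url: String): String =
        fields
          .map { case (name, sqlType) => s"${quoteIdentifier(name, url)} $sqlType" }
          .mkString(", ")

      def main(args: Array[String]): Unit = {
        val url = "jdbc:mysql://localhost:3306/test3"
        // For the DataFrame in this issue, the writer should emit:
        //   CREATE TABLE jira_test (`order` TEXT)
        println(s"CREATE TABLE jira_test (${schemaString(Seq("order" -> "TEXT"), url)})")
      }
    }

With quoting in place, the generated CREATE TABLE statement parses on MySQL instead of failing near 'order TEXT )'.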