[ https://issues.apache.org/jira/browse/SPARK-19296?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15831033#comment-15831033 ]
Paul Wu edited comment on SPARK-19296 at 1/20/17 9:52 PM:
----------------------------------------------------------
We found this Util very useful in general (much, much better than primitive JDBC) and have been using it since 1.5.x; we didn't realize it is internal. It will be a big regret for us not to be able to use it, but it has become a pain for us now. For code quality purposes, please at least refactor the code to eliminate the duplicated args.

was (Author: zwu....@gmail.com):
We found this Util very useful in general (much, much better than primitive JDBC) and have been using it since 1.3.x; we didn't realize it is internal. It will be a big regret for us not to be able to use it, but it has become a pain for us now. For code quality purposes, please at least refactor the code to eliminate the duplicated args.

> Awkward changes for JdbcUtils.saveTable in Spark 2.1.0
> ------------------------------------------------------
>
> Key: SPARK-19296
> URL: https://issues.apache.org/jira/browse/SPARK-19296
> Project: Spark
> Issue Type: Improvement
> Components: SQL
> Affects Versions: 2.1.0
> Reporter: Paul Wu
> Priority: Minor
>
> The change from JdbcUtils.saveTable(DataFrame, String, String, Properties) to
> JdbcUtils.saveTable(DataFrame, String, String, JDBCOptions) is not only
> incompatible with previous versions (so previous Java code won't compile) but
> also introduced a silly code change: one has to specify url and table twice,
> like this:
>
> JDBCOptions jdbcOptions = new JDBCOptions(url, table, map);
> JdbcUtils.saveTable(ds, url, table, jdbcOptions);
>
> Why does one have to supply the same things, url and table, twice? (If you
> don't specify them in both places, an exception will be thrown.)

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
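To make the complaint concrete, here is a minimal, self-contained Java sketch of the duplication problem. The classes `JdbcOptions`, `saveTable210`, and `saveTableRefactored` are hypothetical stand-ins for the Spark internals (not the real `org.apache.spark.sql.execution.datasources.jdbc` classes), and the H2 URL is an assumed example; the point is only the call-site shape and the refactor the commenter asks for.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of SPARK-19296: the 2.1.0-style call forces the caller to repeat
// url and table, which the options object already carries.
public class SaveTableSketch {

    // Stand-in for JDBCOptions: it already holds the url and table name.
    static final class JdbcOptions {
        final String url;
        final String table;
        final Map<String, String> extra;

        JdbcOptions(String url, String table, Map<String, String> extra) {
            this.url = url;
            this.table = table;
            this.extra = extra;
        }
    }

    // Shape of the 2.1.0-style API: url and table must be passed again,
    // and a mismatch with the options object is an error.
    static String saveTable210(String url, String table, JdbcOptions options) {
        if (!url.equals(options.url) || !table.equals(options.table)) {
            throw new IllegalArgumentException("url/table must match JDBCOptions");
        }
        return "saved " + table + " via " + url;
    }

    // A refactored shape with no duplication: everything comes from the options.
    static String saveTableRefactored(JdbcOptions options) {
        return "saved " + options.table + " via " + options.url;
    }

    public static void main(String[] args) {
        String url = "jdbc:h2:mem:demo";   // hypothetical connection string
        String table = "people";
        JdbcOptions opts = new JdbcOptions(url, table, new HashMap<>());

        // 2.1.0 style: url and table appear twice at the call site.
        System.out.println(saveTable210(url, table, opts));

        // Refactored style: each value is stated once, inside the options.
        System.out.println(saveTableRefactored(opts));
    }
}
```

The refactored overload is exactly the "eliminate the duplicated args" request: since `JDBCOptions` is constructed from `url` and `table` anyway, a single-argument `saveTable(options)` removes both the repetition and the mismatch exception.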