[ https://issues.apache.org/jira/browse/SPARK-11953?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15025538#comment-15025538 ]
Siva Gudavalli commented on SPARK-11953:
----------------------------------------

I agree. It depends on how we define SaveMode.Append. Looking for an option similar to insertIntoJDBC in 1.4.1.

> CLONE - Sparksql-1.4.1 DataFrameWrite.jdbc() SaveMode.Append Bug
> ----------------------------------------------------------------
>
>                 Key: SPARK-11953
>                 URL: https://issues.apache.org/jira/browse/SPARK-11953
>             Project: Spark
>          Issue Type: Bug
>          Components: Java API, Spark Submit, SQL
>    Affects Versions: 1.4.1, 1.5.1
>         Environment: Spark standalone cluster
>            Reporter: Siva Gudavalli
>
> In Spark 1.3.1 we had two methods, createJDBCTable and insertIntoJDBC.
> They were replaced with write.jdbc() in Spark 1.4.1.
> When we specify SaveMode.Append, we are letting the application know that
> there is a table in the database, which means "tableExists = true", and we
> do not need to perform "JdbcUtils.tableExists(conn, table)".
> Please let me know if you think differently.
> Regards,
> Shiv
>
> def jdbc(url: String, table: String, connectionProperties: Properties): Unit = {
>   val conn = JdbcUtils.createConnection(url, connectionProperties)
>   try {
>     var tableExists = JdbcUtils.tableExists(conn, table)
>     if (mode == SaveMode.Ignore && tableExists) {
>       return
>     }
>     if (mode == SaveMode.ErrorIfExists && tableExists) {
>       sys.error(s"Table $table already exists.")
>     }
>     if (mode == SaveMode.Overwrite && tableExists) {
>       JdbcUtils.dropTable(conn, table)
>       tableExists = false
>     }
>     // Create the table if the table didn't exist.
>     if (!tableExists) {
>       val schema = JDBCWriteDetails.schemaString(df, url)
>       val sql = s"CREATE TABLE $table ($schema)"
>       conn.prepareStatement(sql).executeUpdate()
>     }
>   } finally {
>     conn.close()
>   }
>   JDBCWriteDetails.saveTable(df, url, table, connectionProperties)
> }

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
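The argument in the report amounts to: when the caller declares SaveMode.Append, the table is assumed to already exist, so the JdbcUtils.tableExists(conn, table) round trip (and the CREATE TABLE fallback) could be skipped, mirroring the old insertIntoJDBC behavior. Below is a minimal, self-contained sketch of that mode dispatch only; SaveMode and needsExistenceCheck here are hypothetical stand-ins for illustration, not Spark's org.apache.spark.sql.SaveMode or any actual Spark API, and all JDBC calls are elided:

```scala
// Stand-in for Spark's SaveMode, just enough for the dispatch below.
sealed trait SaveMode
object SaveMode {
  case object Append        extends SaveMode
  case object Overwrite     extends SaveMode
  case object ErrorIfExists extends SaveMode
  case object Ignore        extends SaveMode
}

// Hypothetical helper: should the writer probe the database for the table?
// Under the proposal, Append trusts the caller that the table exists and
// skips the probe; the other modes still need it to decide what to do.
def needsExistenceCheck(mode: SaveMode): Boolean = mode match {
  case SaveMode.Append => false // go straight to saveTable(), like insertIntoJDBC
  case _               => true  // Ignore / ErrorIfExists / Overwrite must probe
}
```

This would avoid the existence query entirely on the Append path, which matters on databases where the probe statement itself fails, while leaving the other three modes unchanged.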