[ https://issues.apache.org/jira/browse/SPARK-8386?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14947815#comment-14947815 ]

Peter Haumer commented on SPARK-8386:
-------------------------------------

Huaxin Gao, sorry for not replying earlier. It slipped through the cracks. 

I had stopped using the Spark JDBC framework completely because of this bug and 
implemented my own, as I need to support DB2, Derby, SQL Server, and Oracle; I ran 
into this issue with SQL Server and DB2. DB2's LIMIT syntax is quite different: 
http://www-01.ibm.com/support/knowledgecenter/SSEPGG_10.5.0/com.ibm.db2.luw.sql.ref.doc/doc/r0059212.html?lang=en.
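
For illustration (not taken verbatim from the linked page): the row limit that 
MySQL and PostgreSQL spell as LIMIT is written with FETCH FIRST on DB2 and with 
TOP on SQL Server, so a single hard-coded probe query cannot work on all of them. 
The table name below is just a placeholder.

// Same intent (return at most one row), spelled differently per DBMS:
String limitStyle     = "SELECT 1 FROM MY_TABLE LIMIT 1";                 // MySQL, PostgreSQL
String db2Style       = "SELECT 1 FROM MY_TABLE FETCH FIRST 1 ROW ONLY";  // DB2 (and Derby)
String sqlServerStyle = "SELECT TOP 1 1 FROM MY_TABLE";                   // SQL Server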
 

However, I ended up using the JDBC metadata API, which admittedly does not perform 
well on every DBMS; it might be better to provide different queries for each DBMS 
here:

// Check for the table through the JDBC metadata instead of a probe query.
final ResultSet resultSet = connection.getMetaData().getTables(null, schemaName,
        tableName, null);
if (resultSet.next()) {
    return true;
}
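
The surrounding method filled in for completeness (a sketch; the class and method 
names below are mine, not actual code from my app or from Spark):

import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;

public final class JdbcTableCheck {
    // Resolve table existence through DatabaseMetaData instead of a
    // dialect-specific probe query such as "SELECT 1 FROM t LIMIT 1".
    public static boolean tableExists(Connection connection, String schemaName,
                                      String tableName) throws SQLException {
        try (ResultSet resultSet = connection.getMetaData()
                .getTables(null, schemaName, tableName, null)) {
            return resultSet.next(); // any row back means the table already exists
        }
    }
}

Note that getTables matches against the identifier case stored in the catalog, so 
on DB2, Derby, and Oracle the schema and table names typically have to be passed 
in upper case.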

> DataFrame and JDBC regression
> -----------------------------
>
>                 Key: SPARK-8386
>                 URL: https://issues.apache.org/jira/browse/SPARK-8386
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.4.0
>         Environment: RHEL 7.1
>            Reporter: Peter Haumer
>            Priority: Critical
>
> I have an ETL app that appends the new results found at each run to a JDBC 
> table. In 1.3.1 I did this:
> testResultsDF.insertIntoJDBC(CONNECTION_URL, TABLE_NAME, false);
> When I do this now in 1.4, it complains that the "object" 'TABLE_NAME' already 
> exists. I get this even if I switch overwrite to true. I also tried this:
> testResultsDF.write().mode(SaveMode.Append).jdbc(CONNECTION_URL, TABLE_NAME, 
> connectionProperties);
> and got the same error. The first run works, creating the new table and adding 
> data successfully, but on a second run the JDBC driver tells me that the table 
> already exists. Even SaveMode.Overwrite gives me the same error. 
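
For completeness, the 1.4-style append call from the description above with the 
connection properties spelled out (the credential values here are placeholders, 
not the real ones):

import java.util.Properties;
import org.apache.spark.sql.SaveMode;

Properties connectionProperties = new Properties();
connectionProperties.setProperty("user", "dbUser");         // placeholder credentials
connectionProperties.setProperty("password", "dbPassword");

// Append this run's results to the existing JDBC table; on 1.4.0 the second run
// fails with "table already exists" even with SaveMode.Append or SaveMode.Overwrite.
testResultsDF.write()
             .mode(SaveMode.Append)
             .jdbc(CONNECTION_URL, TABLE_NAME, connectionProperties);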



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
