[ https://issues.apache.org/jira/browse/SPARK-11623?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15024245#comment-15024245 ]

Siva Gudavalli commented on SPARK-11623:
----------------------------------------

Sean Owen, this is different from LIMIT. 

It seems write.jdbc does not honor SaveMode.Append.

Inside DataFrameWriter.scala, if we want to insert records into an existing
table, there is no way around the existence check.

No matter what, write.jdbc always verifies whether the table exists.
SaveMode.Append should bypass the check, i.e. behave like the old
InsertIntoJDBC path.
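
For example, I would expect a call like the following to append rows without
ever attempting CREATE TABLE. A minimal sketch; the url, table name, and
credentials are placeholders:

    import java.util.Properties
    import org.apache.spark.sql.SaveMode

    val props = new Properties()
    props.setProperty("user", "...")      // placeholder credentials
    props.setProperty("password", "...")  // placeholder credentials

    // Append should insert into the existing table without any CREATE TABLE
    // attempt; in 1.4.1/1.5.1 the tableExists check still runs first.
    df.write.mode(SaveMode.Append).jdbc("jdbc:mysql://host:3306/db", "existing_table", props)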

As things stand, there is no way to insert data into an existing table through
write.jdbc; the only way around it that I can see is to drop down to plain
JDBC, as in the sketch below.
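
A rough, untested sketch of that workaround, assuming a hypothetical
two-column table (id INT, name VARCHAR); the url, user, and password values
are placeholders:

    import java.sql.DriverManager

    val url = "jdbc:mysql://host:3306/db"  // placeholder
    val user = "..."                       // placeholder
    val password = "..."                   // placeholder

    df.foreachPartition { rows =>
      // One JDBC connection per partition, closed when the partition is done.
      val conn = DriverManager.getConnection(url, user, password)
      val stmt = conn.prepareStatement("INSERT INTO existing_table VALUES (?, ?)")
      try {
        rows.foreach { row =>
          stmt.setInt(1, row.getInt(0))        // hypothetical column 1: INT
          stmt.setString(2, row.getString(1))  // hypothetical column 2: VARCHAR
          stmt.executeUpdate()
        }
      } finally {
        conn.close()
      }
    }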

Regards
Shiv

> Sparksql-1.4.1 DataFrameWrite.jdbc() bug
> ----------------------------------------
>
>                 Key: SPARK-11623
>                 URL: https://issues.apache.org/jira/browse/SPARK-11623
>             Project: Spark
>          Issue Type: Bug
>          Components: Java API, Spark Submit, SQL
>    Affects Versions: 1.4.1, 1.5.1
>         Environment: Spark stand alone cluster
>            Reporter: Nguyen Van Nghia
>
> I am running spark-submit on Windows 8.1 with a Spark standalone cluster
> (one worker and one master). The job throws an exception in the
> DataFrameWriter.jdbc(..) Scala function.
> We found that the following check:
>                          var tableExists = JdbcUtils.tableExists(conn, table)
> always returns false even if the table was already created.
> That drives the function into creating the table from the specified
> DataFrame, and the generated CREATE TABLE statement fails with a SQL syntax
> error. We located the SQL execution statement here:
>       if (!tableExists) {
>         val schema = JDBCWriteDetails.schemaString(df, url)
>         val sql = s"CREATE TABLE $table ($schema)"
>         conn.prepareStatement(sql).executeUpdate() // this statement fails with a SQL syntax error
>       }
> This happens with Spark 1.4.1 and Spark 1.5.1 (our dev environment).
> Please help!


