I'm not certain, but I think this is a bug as of 1.5.

Spark uses the LIMIT keyword to check whether a table exists:
https://github.com/apache/spark/blob/branch-1.5/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JdbcUtils.scala#L48

If your database does not support the LIMIT keyword (such as SQL Server), the
check fails, Spark concludes the table does not exist, and it tries to create it:
https://github.com/apache/spark/blob/branch-1.5/sql/core/src/main/scala/org/apache/spark/sql/DataFrameWriter.scala#L272-L275
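The check behind those links is roughly the following (a paraphrased sketch, not the exact source; `existsQuery` is a helper I introduced for illustration):

```scala
import java.sql.Connection
import scala.util.Try

// Hedged paraphrase of JdbcUtils.tableExists on branch-1.5: Spark probes the
// table with a LIMIT query and treats ANY failure as "table does not exist".
def existsQuery(table: String): String = s"SELECT 1 FROM $table LIMIT 1"

def tableExists(conn: Connection, table: String): Boolean =
  // On SQL Server the LIMIT syntax itself is rejected, so the query throws,
  // the Try fails, and this returns false even though the table is there.
  Try(conn.prepareStatement(existsQuery(table)).executeQuery().next()).isSuccess
```

SQL Server uses `SELECT TOP 1 1 FROM ...` rather than `LIMIT`, so the probe dies with a syntax error, `tableExists` returns false, and the subsequent CREATE TABLE collides with the table that was already there.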

This issue has already been fixed, and the fix will be released in 1.6:
https://issues.apache.org/jira/browse/SPARK-9078


--
Cheon

2015-12-09 22:54 GMT+09:00 kali.tumm...@gmail.com <kali.tumm...@gmail.com>:

> Hi Spark Contributors,
>
> I am trying to append data to a target table using the
> df.write.mode("append") functionality, but Spark throws a "table already
> exists" exception.
>
> Is there a fix scheduled in a later Spark release? I am using Spark 1.5.
>
> val sourcedfmode=sourcedf.write.mode("append")
> sourcedfmode.jdbc(TargetDBinfo.url,TargetDBinfo.table,targetprops)
>
> Full Code:-
>
> https://github.com/kali786516/ScalaDB/blob/master/src/main/java/com/kali/db/SaprkSourceToTargetBulkLoad.scala
>
> Spring Config File:-
>
> https://github.com/kali786516/ScalaDB/blob/master/src/main/resources/SourceToTargetBulkLoad.xml
>
>
> Thanks
> Sri
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/spark-data-frame-write-mode-append-bug-tp25650.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>
