Hi Buddy,

   I have used both. DataFrame.write.jdbc is the older API and only works
when you provide a table name; it won't work with custom queries.
DataFrame.write.format("jdbc") is more generic and works not only with a
table name but also with custom queries, so I recommend using
DataFrame.write.format("jdbc").
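
For reference, here is a minimal sketch of the two write paths being compared
in the thread below. The connection url, credentials and table name are made-up
placeholders, and dataFrame is assumed to be any existing DataFrame; only the
API shapes matter here.

import org.apache.spark.sql.SaveMode

// Placeholder connection details -- substitute your own url/credentials.
val dbUrl = "jdbc:postgresql://localhost:5432/mydb"
val props = new java.util.Properties()
props.setProperty("user", "me")
props.setProperty("password", "secret")

// Form 1: the dedicated jdbc(...) writer method.
dataFrame.write.mode(SaveMode.Overwrite).jdbc(dbUrl, "my_table", props)

// Form 2: the generic format("jdbc") path -- same intent, but a separate
// code path, and the one that raised the "does not allow create table as
// select" exception reported below.
dataFrame.write.mode(SaveMode.Overwrite)
  .format("jdbc")
  .option("url", dbUrl)
  .option("dbtable", "my_table")
  .option("user", "me")
  .option("password", "secret")
  .save()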

Cheers!
Rabin



On Wed, Jul 6, 2016 at 10:35 PM, Dragisa Krsmanovic <dragi...@ticketfly.com>
wrote:

> I was expecting to get the same results with both:
>
> dataFrame.write.mode(SaveMode.Overwrite).jdbc(dbUrl, "my_table", props)
>
> and
>
> dataFrame.write.mode(SaveMode.Overwrite).format("jdbc")
>   .options(opts).option("dbtable", "my_table").save()
>
>
> In the first example, it behaves as expected. It creates a new table and
> populates it with the rows from DataFrame.
>
> In the second case, I get exception:
> org.apache.spark.sql.execution.datasources.jdbc.DefaultSource does not
> allow create table as select.
>
> Looking at the Spark source, it looks like there is a completely separate
> implementation for format("jdbc") and for jdbc(...).
>
> I find that confusing. Unfortunately, the documentation is rather sparse, and
> one finds this discrepancy only through trial and error.
>
> Is there a plan to deprecate one of the forms? Or to allow the same
> functionality for both?
>
> I tried both 1.6 and 2.0-preview.
> --
>
> Dragiša Krsmanović | Platform Engineer | Ticketfly
>
> dragi...@ticketfly.com
>
> @ticketfly <https://twitter.com/ticketfly> | ticketfly.com/blog |
> facebook.com/ticketfly
>
