Right now, it is up to the source implementation to decide what to do. I
think path-based tables (with no metastore component) treat an append as an
implicit create.
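
For example, with the built-in Parquet source an Append to a path that does
not exist yet simply creates it (a quick sketch; the path and sample data
below are only for illustration):

    import org.apache.spark.sql.{SaveMode, SparkSession}

    val spark = SparkSession.builder()
      .appName("append-example")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    val df = Seq((1, "a"), (2, "b")).toDF("id", "value")

    // First write: /tmp/append_example does not exist, so Append behaves
    // like an implicit create and writes the first files.
    df.write.mode(SaveMode.Append).parquet("/tmp/append_example")

    // Second write: the path now exists, so new files are added alongside
    // the existing ones.
    df.write.mode(SaveMode.Append).parquet("/tmp/append_example")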

If you're thinking that relying on sources to interpret SaveMode is bad for
consistent behavior, I agree. That's why the community adopted a proposal
to standardize logical plans and the behavior expected of data sources for
the v2 API:
<https://docs.google.com/document/d/1gYm5Ji2Mge3QBdOliFV5gSPTKlX4q1DCBXIkiyMv62A/edit?ts=5a987801#heading=h.m45webtwxf2d>

On Thu, Nov 8, 2018 at 11:53 PM Shubham Chaurasia <shubh.chaura...@gmail.com>
wrote:

> Hi,
>
> For SaveMode.Append, the doc
> https://spark.apache.org/docs/latest/sql-data-sources-load-save-functions.html#save-modes
> says
>
> *When saving a DataFrame to a data source, if data/table already exists,
> contents of the DataFrame are expected to be appended to existing data*
>
> However, it does not specify the behavior when the table does not exist.
> Does it throw an exception, create the table, or is it a no-op?
>
> Thanks,
> Shubham
>


-- 
Ryan Blue
Software Engineer
Netflix
