[ 
https://issues.apache.org/jira/browse/SPARK-13699?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15181804#comment-15181804
 ] 

Xiao Li commented on SPARK-13699:
---------------------------------

After some research: we can NOT truncate a table that was created with the 
EXTERNAL keyword, because all of its data resides outside the Hive metastore. 

[~yhuai] Is that the reason why we chose to drop and then recreate the Hive 
table, instead of truncating it, when the mode is SaveMode.Overwrite?
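The trade-off under discussion can be sketched with a toy model (plain Python, not Spark or Hive APIs; the `Metastore` class, its fields, and the "grants" metadata are illustrative assumptions only): truncating empties a table while preserving its metadata, drop-and-recreate loses that metadata, and truncate must be rejected for EXTERNAL tables because their data is not managed by the metastore.

```python
# Hypothetical model (NOT Spark/Hive code) of why SPARK-13699 matters:
# drop-and-recreate discards table metadata that TRUNCATE would keep,
# and TRUNCATE is disallowed on EXTERNAL tables because their data
# lives outside the metastore's control.

class Metastore:
    def __init__(self):
        # name -> {"external": bool, "grants": set, "rows": list}
        self.tables = {}

    def create(self, name, external=False, grants=None):
        self.tables[name] = {"external": external,
                             "grants": set(grants or ()), "rows": []}

    def truncate(self, name):
        t = self.tables[name]
        if t["external"]:
            raise ValueError("cannot TRUNCATE an EXTERNAL table")
        t["rows"].clear()          # data gone, metadata (grants) preserved

    def overwrite_by_drop(self, name):
        external = self.tables[name]["external"]
        del self.tables[name]      # drop loses grants and other metadata
        self.create(name, external=external)

ms = Metastore()
ms.create("tgt_table", grants={"analyst"})
ms.truncate("tgt_table")
print(ms.tables["tgt_table"]["grants"])   # {'analyst'} -- preserved

ms.overwrite_by_drop("tgt_table")
print(ms.tables["tgt_table"]["grants"])   # set() -- lost by drop/recreate

ms.create("ext_table", external=True)
try:
    ms.truncate("ext_table")
except ValueError as e:
    print(e)                      # cannot TRUNCATE an EXTERNAL table
```

This only models the semantics being debated; the real drop-and-recreate path in Spark's saveAsTable additionally avoids the EXTERNAL-table restriction that a truncate-based overwrite would hit.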

> Spark SQL drops the table in "overwrite" mode while writing into table
> ----------------------------------------------------------------------
>
>                 Key: SPARK-13699
>                 URL: https://issues.apache.org/jira/browse/SPARK-13699
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 1.6.0
>            Reporter: Dhaval Modi
>         Attachments: stackTrace.txt
>
>
> Hi,
> While writing a dataframe to a Hive table with the SaveMode.Overwrite option,
> e.g.
> tgtFinal.write.mode(SaveMode.Overwrite).saveAsTable("tgt_table")
> sqlContext drops the table instead of truncating it, which causes an error
> while overwriting.
> The attached stack trace and commands reproduce the issue.
> Thanks & Regards,
> Dhaval



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
