[ 
https://issues.apache.org/jira/browse/SPARK-13699?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15182031#comment-15182031
 ] 

Dhaval Modi commented on SPARK-13699:
-------------------------------------

TGT_TABLE DDL:
CREATE TABLE IF NOT EXISTS tgt_table (col1 string, col2 int, col3 timestamp, 
col4 decimal(4,1), batchId string, currInd string, startDate timestamp, endDate 
timestamp, updateDate timestamp) stored as orc;

SRC_TABLE DDL:
CREATE TABLE IF NOT EXISTS src_table (col1 int, col2 int, col3 timestamp, col4 
decimal(4,1)) stored as orc;


INSERT STMT:
insert into table src_table values('1',1,'2016-2-3 00:00:00',23.1);
insert into table src_table values('2',1,'2016-2-3 00:00:00',23.1);
insert into table tgt_table values('1',2,'2016-2-3 00:00:00',23.1, '13', 'Y', 
'2016-2-3 00:00:00', '2016-2-3 00:00:00', '2016-2-3 00:00:00');
insert into table tgt_table values('1',3,'2016-2-3 00:00:00',23.1, '13', 'N', 
'2016-2-1 00:00:00', '2016-2-1 00:00:00', '2016-2-3 00:00:00');
insert into table tgt_table values('3',3,'2016-2-3 00:00:00',23.1, '13', 'Y', 
'2016-2-1 00:00:00', '2016-2-1 00:00:00', '2016-2-3 00:00:00');
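
With the tables above in place, the failing overwrite can be sketched from spark-shell (Spark 1.6.x with Hive support). This is a minimal sketch: `tgtFinal` here is simply a re-read of the target table, standing in for whatever DataFrame the reporter writes back; the test is omitted because the snippet needs a running Hive-enabled Spark deployment.

```scala
// spark-shell sketch; assumes the Hive tables created above already exist
// and sqlContext is a HiveContext (the spark-shell default in 1.6).
import org.apache.spark.sql.SaveMode

// tgtFinal is hypothetical -- any DataFrame derived from tgt_table will do.
val tgtFinal = sqlContext.table("tgt_table")

// With SaveMode.Overwrite, saveAsTable drops tgt_table and recreates it
// instead of truncating it, which fails when the DataFrame being written
// still reads from tgt_table (see the attached stackTrace.txt).
tgtFinal.write.mode(SaveMode.Overwrite).saveAsTable("tgt_table")
```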


> Spark SQL drops the table in "overwrite" mode while writing into table
> ----------------------------------------------------------------------
>
>                 Key: SPARK-13699
>                 URL: https://issues.apache.org/jira/browse/SPARK-13699
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 1.6.0
>            Reporter: Dhaval Modi
>         Attachments: stackTrace.txt
>
>
> Hi,
> While writing a DataFrame to a Hive table with the "SaveMode.Overwrite" option, e.g.
> tgtFinal.write.mode(SaveMode.Overwrite).saveAsTable("tgt_table")
> sqlContext drops the table instead of truncating it.
> This causes an error while overwriting.
> Adding stacktrace & commands to reproduce the issue,
> Thanks & Regards,
> Dhaval



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
