[ https://issues.apache.org/jira/browse/SPARK-6130?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Michael Armbrust resolved SPARK-6130.
-------------------------------------
    Resolution: Fixed
 Fix Version/s: 1.4.0

Issue resolved by pull request 4865
[https://github.com/apache/spark/pull/4865]

> support if not exists for insert overwrite into partition in hiveQl
> -------------------------------------------------------------------
>
>                 Key: SPARK-6130
>                 URL: https://issues.apache.org/jira/browse/SPARK-6130
>             Project: Spark
>          Issue Type: New Feature
>          Components: SQL
>            Reporter: Adrian Wang
>             Fix For: 1.4.0
>
>
> Standard syntax:
> INSERT OVERWRITE TABLE tablename1 [PARTITION (partcol1=val1, partcol2=val2 ...) [IF NOT EXISTS]] select_statement1 FROM from_statement;
> INSERT INTO TABLE tablename1 [PARTITION (partcol1=val1, partcol2=val2 ...)] select_statement1 FROM from_statement;
>
> Hive extension (multiple inserts):
> FROM from_statement
> INSERT OVERWRITE TABLE tablename1 [PARTITION (partcol1=val1, partcol2=val2 ...) [IF NOT EXISTS]] select_statement1
> [INSERT OVERWRITE TABLE tablename2 [PARTITION ... [IF NOT EXISTS]] select_statement2]
> [INSERT INTO TABLE tablename2 [PARTITION ...] select_statement2] ...;
> FROM from_statement
> INSERT INTO TABLE tablename1 [PARTITION (partcol1=val1, partcol2=val2 ...)] select_statement1
> [INSERT INTO TABLE tablename2 [PARTITION ...] select_statement2]
> [INSERT OVERWRITE TABLE tablename2 [PARTITION ... [IF NOT EXISTS]] select_statement2] ...;
>
> Hive extension (dynamic partition inserts):
> INSERT OVERWRITE TABLE tablename PARTITION (partcol1[=val1], partcol2[=val2] ...) select_statement FROM from_statement;
> INSERT INTO TABLE tablename PARTITION (partcol1[=val1], partcol2[=val2] ...) select_statement FROM from_statement;

--
This message was sent by Atlassian JIRA (v6.3.4#6332)
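As a concrete illustration of the clause this issue adds, the following is a minimal sketch of the standard-syntax form above; the table names, partition column, and values are hypothetical:

```sql
-- Hypothetical tables: 'logs' (partitioned by ds) and 'staging_logs'.
-- With IF NOT EXISTS, the overwrite is skipped if the target partition
-- already exists, so previously loaded data for ds='2015-03-01' is kept.
INSERT OVERWRITE TABLE logs PARTITION (ds='2015-03-01') IF NOT EXISTS
SELECT id, msg FROM staging_logs;
```

Without IF NOT EXISTS, the same statement would unconditionally replace the contents of the ds='2015-03-01' partition.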