[ https://issues.apache.org/jira/browse/SPARK-5774?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14653287#comment-14653287 ]
Murtaza Kanchwala commented on SPARK-5774:
------------------------------------------

Setting spark.hadoop.validateOutputSpecs=false makes Spark overwrite the files. Is there any way to configure Spark to append to files at the same path instead of overwriting them?

> Support save RDD append to file
> -------------------------------
>
>                 Key: SPARK-5774
>                 URL: https://issues.apache.org/jira/browse/SPARK-5774
>             Project: Spark
>          Issue Type: New Feature
>          Components: Spark Core
>    Affects Versions: 1.3.0
>            Reporter: Yanbo Liang
>
> Currently, RDD.saveAsTextFile only supports writing to a path that does not
> already contain output. In some cases we need to append an RDD to an existing
> file. For example, when executing the SQL command "INSERT INTO ...", we need
> to append the RDD to an existing file.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
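Spark has no built-in append mode for saveAsTextFile; validateOutputSpecs=false only disables the existing-output check so the job can overwrite. A common workaround is to write each job's output to a unique subdirectory under a shared base path and treat the base path as the logical, "appended-to" dataset. The sketch below illustrates that layout with the standard library only (no Spark required); the function names, the batch= naming scheme, and the single part-00000 file per batch are illustrative assumptions, not Spark API:

```python
import os
import tempfile

def save_batch(base_dir: str, batch_id: str, lines: list) -> str:
    """Write one batch into its own subdirectory, mimicking the
    'unique output path per job' workaround. Like saveAsTextFile with
    validateOutputSpecs enabled, this refuses to reuse an existing path."""
    out_dir = os.path.join(base_dir, f"batch={batch_id}")
    os.makedirs(out_dir)  # raises FileExistsError if the batch was already written
    with open(os.path.join(out_dir, "part-00000"), "w") as f:
        f.write("\n".join(lines) + "\n")
    return out_dir

def read_all(base_dir: str) -> list:
    """Read the logical dataset back by scanning every batch subdirectory,
    the way Spark/Hadoop readers consume all part files under a path."""
    result = []
    for sub in sorted(os.listdir(base_dir)):
        with open(os.path.join(base_dir, sub, "part-00000")) as f:
            result.extend(f.read().splitlines())
    return result

base = tempfile.mkdtemp()
save_batch(base, "001", ["a", "b"])  # first "append"
save_batch(base, "002", ["c"])       # second "append", no overwrite
print(read_all(base))                # ['a', 'b', 'c']
```

In Spark terms, each saveAsTextFile call targets base_dir + "/batch=" + id, and downstream jobs read the whole base directory; no existing output is ever overwritten.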